Science.gov

Sample records for advanced computational capabilities

  1. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980s to the present day. The current status of code-development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is given, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance the accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  2. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  3. Advanced multi-dimensional deterministic transport computational capability for safety analysis of pebble-bed reactors

    NASA Astrophysics Data System (ADS)

    Tyobeka, Bismark Mzubanzi

A coupled neutron transport thermal-hydraulics code system with both diffusion and transport theory capabilities is presented. At the heart of the coupled code is a powerful neutronics solver, based on a neutron transport theory approach, powered by DORT-TD, the time-dependent extension of the well-known DORT code. DORT-TD uses a fully implicit time-integration scheme and is coupled via a general interface to THERMIX-DIREKT, an HTR-specific two-dimensional core thermal-hydraulics code. Feedback is accounted for by interpolating multigroup cross sections from pre-generated libraries structured over user-specified discrete sets of thermal-hydraulic parameters, e.g. fuel and moderator temperatures. The coupled code system is applied to two HTGR designs, the PBMR 400 MW and the PBMR 268 MW. Steady-state conditions and several design-basis transients are modeled in an effort to discern the adequacy of neutron diffusion theory against the more accurate but computationally expensive neutron transport theory. It turns out that there are small but significant differences in the results from the two theories. It is concluded that diffusion theory can be used with a high degree of confidence for the PBMR, provided that more than two energy groups are used and that the results are checked against a lower-order transport solution, especially for safety-analysis purposes. The end product of this thesis is a high-fidelity, state-of-the-art computer code system with multiple capabilities to analyze all PBMR safety-related transients in an accurate and efficient manner.
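The feedback step in this abstract, interpolating multigroup cross sections from libraries tabulated at discrete fuel and moderator temperatures, amounts to a table lookup with interpolation. The sketch below is a minimal bilinear version under assumed names and data layout; it is an illustration of the general technique, not the actual DORT-TD interface:

```python
from bisect import bisect_right

def interp_xs(table, t_grid, m_grid, t_fuel, t_mod):
    """Bilinearly interpolate a cross section from a library tabulated
    on discrete fuel/moderator temperature points (names illustrative).
    table[i][j] holds the value at (t_grid[i], m_grid[j])."""
    def bracket(grid, x):
        # Find the interval containing x and the interpolation weight.
        i = min(max(bisect_right(grid, x) - 1, 0), len(grid) - 2)
        w = (x - grid[i]) / (grid[i + 1] - grid[i])
        return i, min(max(w, 0.0), 1.0)  # clamp to the table bounds
    i, wt = bracket(t_grid, t_fuel)
    j, wm = bracket(m_grid, t_mod)
    return ((1 - wt) * (1 - wm) * table[i][j]
            + wt * (1 - wm) * table[i + 1][j]
            + (1 - wt) * wm * table[i][j + 1]
            + wt * wm * table[i + 1][j + 1])
```

Production lattice codes use richer parameterizations (burnup, density, boron, etc.), but the principle is the same: evaluate cheap interpolants at run time instead of regenerating cross sections.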

  4. Development of Computational Capabilities to Predict the Corrosion Wastage of Boiler Tubes in Advanced Combustion Systems

    SciTech Connect

    Kung, Steven; Rapp, Robert

    2014-08-31

    A comprehensive corrosion research project consisting of pilot-scale combustion testing and long-term laboratory corrosion study has been successfully performed. A pilot-scale combustion facility available at Brigham Young University was selected and modified to enable burning of pulverized coals under the operating conditions typical for advanced coal-fired utility boilers. Eight United States (U.S.) coals were selected for this investigation, with the test conditions for all coals set to have the same heat input to the combustor. In addition, the air/fuel stoichiometric ratio was controlled so that staged combustion was established, with the stoichiometric ratio maintained at 0.85 in the burner zone and 1.15 in the burnout zone. The burner zone represented the lower furnace of utility boilers, while the burnout zone mimicked the upper furnace areas adjacent to the superheaters and reheaters. From this staged combustion, approximately 3% excess oxygen was attained in the combustion gas at the furnace outlet. During each of the pilot-scale combustion tests, extensive online measurements of the flue gas compositions were performed. In addition, deposit samples were collected at the same location for chemical analyses. Such extensive gas and deposit analyses enabled detailed characterization of the actual combustion environments existing at the lower furnace walls under reducing conditions and those adjacent to the superheaters and reheaters under oxidizing conditions in advanced U.S. coal-fired utility boilers. The gas and deposit compositions were then carefully simulated in a series of 1000-hour laboratory corrosion tests, in which the corrosion performances of different commercial candidate alloys and weld overlays were evaluated at various temperatures for advanced boiler systems. Results of this laboratory study led to significant improvement in understanding of the corrosion mechanisms operating on the furnace walls as well as superheaters and reheaters in

  5. Advanced CLIPS capabilities

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1991-01-01

The C Language Integrated Production System (CLIPS) is a forward-chaining rule-based language developed by NASA. CLIPS was designed specifically to provide high portability, low cost, and easy integration with external systems. The current release of CLIPS, version 4.3, is used by over 2500 users throughout the public and private sectors. The primary addition to the next release of CLIPS, version 5.0, will be the CLIPS Object Oriented Language (COOL). The major capabilities of COOL are: class definition with multiple inheritance and no restrictions on the number, types, or cardinality of slots; message passing, which allows procedural code bundled with an object to be executed; and query functions, which allow groups of instances to be examined and manipulated. In addition to COOL, numerous other enhancements have been added to CLIPS, including: generic functions (which allow different pieces of procedural code to be executed depending upon the types or classes of the arguments); integer and double-precision data type support; multiple conflict-resolution strategies; global variables; logical dependencies; type checking on facts; full ANSI compiler support; and incremental reset for rules.
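The forward-chaining model underlying CLIPS can be illustrated with a minimal fixed-point loop: repeatedly fire any rule whose conditions are satisfied by the current fact set until nothing new can be derived. The sketch below is plain Python, not CLIPS syntax; it omits pattern matching and conflict resolution entirely, and the rule and fact names are invented:

```python
def forward_chain(facts, rules):
    """Naive forward chaining: fire rules whose conditions hold until
    no rule adds a new fact (a fixed point is reached).
    Each rule is a (conditions, conclusion) pair."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if set(conditions) <= facts and conclusion not in facts:
                facts.add(conclusion)  # rule fires, asserting a new fact
                changed = True
    return facts

# Hypothetical rules for illustration only.
rules = [
    ({"bird"}, "has-feathers"),
    ({"bird", "healthy"}, "can-fly"),
]
```

A real production system such as CLIPS avoids re-testing every rule on every cycle by using an incremental matching network (Rete), which is what makes large rule bases practical.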

  6. Advances in time-domain electromagnetic simulation capabilities through the use of overset grids and massively parallel computing

    NASA Astrophysics Data System (ADS)

    Blake, Douglas Clifton

A new methodology is presented for conducting numerical simulations of electromagnetic scattering and wave-propagation phenomena on massively parallel computing platforms. A process rooted in the Finite-Volume Time-Domain (FVTD) technique is constructed to create a simulation capability that is both versatile and practical. In terms of versatility, the method is platform independent, is easily modifiable, and is capable of solving a large number of problems with no alterations. In terms of practicality, the method is sophisticated enough to solve problems of engineering significance and is not limited to mere academic exercises. To achieve this capability, techniques are integrated from several scientific disciplines, including computational fluid dynamics, computational electromagnetics, and parallel computing. The end result is the first FVTD solver capable of utilizing the highly flexible overset-gridding process in a distributed-memory computing environment. In the process of creating this capability, work is accomplished to conduct the first study designed to quantify the effects of domain-decomposition dimensionality on the parallel performance of hyperbolic partial-differential-equation solvers; to develop a new method of partitioning a computational domain composed of overset grids; and to provide the first detailed assessment of the applicability of overset grids to the field of computational electromagnetics. Using these new methods and capabilities, results from a large number of wave propagation and scattering simulations are presented. The overset-grid FVTD algorithm is demonstrated to produce results of comparable accuracy to single-grid simulations while simultaneously shortening the grid-generation process and increasing the flexibility and utility of the FVTD technique. Furthermore, the new domain-decomposition approaches developed for overset grids are shown to be capable of producing partitions that are better load balanced and
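The load-balancing objective behind such partitioning can be illustrated with a simple greedy heuristic: assign each grid block, largest first, to the currently least-loaded processor. This is a generic sketch under assumed names, not the author's actual overset-grid decomposition method, which must also account for inter-grid communication:

```python
import heapq

def assign_blocks(block_sizes, n_procs):
    """Greedy longest-processing-time assignment of grid blocks to
    processors: each block (largest first) goes to the processor with
    the smallest accumulated load. Returns {block: processor}."""
    heap = [(0, p) for p in range(n_procs)]  # (current load, processor id)
    heapq.heapify(heap)
    assignment = {}
    for block, size in sorted(block_sizes.items(), key=lambda kv: -kv[1]):
        load, p = heapq.heappop(heap)   # least-loaded processor
        assignment[block] = p
        heapq.heappush(heap, (load + size, p))
    return assignment
```

Greedy assignment like this is a classic bin-balancing heuristic; real decompositions additionally split large blocks and weigh the surface area of partition boundaries, since that determines message volume.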

  7. Computational physics and applied mathematics capability review June 8-10, 2010 (Advance materials to committee members)

    SciTech Connect

    Lee, Stephen R

    2010-01-01

Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability; evaluate the integration of this capability across the Laboratory and within the scientific community; examine the relevance of this capability to the Laboratory's programs; and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has had a long and unique history at the Laboratory since its inception in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations, from general numerical modeling, to coupled multi-physics simulations, to detailed domain-science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections): (1) Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions, which are key, historical, and fundamental strengths of the Laboratory; (2) Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution of partial differential equations

  8. Overview of ASC Capability Computing System Governance Model

    SciTech Connect

    Doebling, Scott W.

    2012-07-11

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  9. Advancing Test Capabilities at NASA Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Bell, James

    2015-01-01

NASA maintains twelve major wind tunnels at three field centers, capable of providing flows at Mach numbers from 0.1 to 10 and unit Reynolds numbers up to 45×10⁶ per meter. The maintenance and enhancement of these facilities is handled through a unified management structure under NASA's Aeronautics Evaluation and Test Capabilities (AETC) project. The AETC facilities are: the 11x11 transonic and 9x7 supersonic wind tunnels at NASA Ames; the 10x10 and 8x6 supersonic wind tunnels, 9x15 low-speed tunnel, Icing Research Tunnel, and Propulsion Simulator Laboratory, all at NASA Glenn; and the National Transonic Facility, Transonic Dynamics Tunnel, LAL aerothermodynamics laboratory, 8-Foot High Temperature Tunnel, and 14x22 low-speed tunnel, all at NASA Langley. This presentation describes the primary AETC facilities and their current capabilities, as well as improvements planned over the next five years. These improvements fall into three categories. The first is operations and maintenance improvements designed to increase the efficiency and reliability of the wind tunnels. These include new (possibly composite) fan blades at several facilities, new temperature control systems, and new and much more capable facility data systems. The second category is facility capability advancements. These include significant improvements to optical access in wind tunnel test sections at Ames, improvements to test section acoustics at Glenn and Langley, the development of a Supercooled Large Droplet capability for icing research, and the development of an icing capability for large engine testing. The final category consists of test technology enhancements which provide value across multiple facilities. These include projects to increase balance accuracy, provide NIST-traceable calibration characterization for wind tunnels, and advance optical instruments for Computational Fluid Dynamics (CFD) validation. Taken as a whole, these individual projects provide significant

  10. Advances in Time-Domain Electromagnetic Simulation Capabilities Through the Use of Overset Grids and Massively Parallel Computing

    DTIC Science & Technology

    1997-03-01

...to construct their computer codes (written largely in FORTRAN) to exploit this type of architecture. Towards the end of the 1980s, however, vector... for exploiting parallel architectures using both single and overset grids in conjunction with typical grid-based PDE solvers in general and FVTD... Furthermore, it naturally exploits the means by which the electric and magnetic fields are related through the curl operators. Unfortunately, although stag...

  11. Development of coupled SCALE4.2/GTRAN2 computational capability for advanced MOX fueled assembly designs

    SciTech Connect

    Vujic, J.; Greenspan, E.; Slater, Postma, T.; Casher, G.; Soares, I.; Leal, L.

    1995-05-01

An advanced assembly code system that can efficiently and accurately analyze various designs (current and advanced) proposed for plutonium disposition is being developed by "marrying" two existing state-of-the-art methodologies, GTRAN2 and SCALE 4.2. The resulting code system, GT-SCALE, possesses several unique characteristics: exact 2D representation of a complete fuel assembly, while preserving the heterogeneity of each of its pin cells; flexibility in the energy group structure, the present upper limit being 218 groups; a comprehensive cross-section library and material database; and accurate burnup calculations. GT-SCALE is expected to be very useful for a wide variety of applications, including the analysis of very heterogeneous UO2-fueled LWR fuel assemblies, of hexagonal fuel assemblies such as those of the Russian LWRs, and of fuel assemblies for HTGRs, as well as for criticality-safety analysis and for calculation of the source term of spent fuel.

  12. NASA capabilities roadmap: advanced telescopes and observatories

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee D.

    2005-01-01

The NASA Advanced Telescopes and Observatories (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories spanning all electromagnetic bands, from x-rays to millimeter waves, as well as gravitational waves. It has derived capability priorities from current and developing Science Mission Directorate (SMD) strategic roadmaps and, where appropriate, has ensured their consistency with other NASA strategic and capability roadmaps. Technology topics include optics; wavefront sensing and control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structures for observatories; and the infrastructure essential to future space telescopes and observatories.

  13. Computational capabilities of physical systems.

    PubMed

    Wolpert, David H

    2002-01-01

In this paper strong limits on the accuracy of real-world physical computation are established. To derive these results, a non-Turing machine formulation of physical computation is used. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out every computational task in the subset of such tasks that could potentially be posed to C. This means in particular that there cannot be a physical computer that can be assured of correctly "processing information faster than the universe does." Because this result holds independently of how or if the computer is physically coupled to the rest of the universe, it also means that there cannot exist an infallible, general-purpose observation apparatus, nor an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or nonclassical, and/or obey chaotic dynamics. They also hold even if one could use an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing machine (TM). After deriving these results, analogs of the TM Halting theorem are derived for the novel kind of computer considered in this paper, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analog of algorithmic information complexity, "prediction complexity," is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task. This is analogous to the "encoding" bound governing how much the algorithmic information complexity of a TM calculation can differ for two reference universal TMs. It is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike

  14. Advanced Computer Typography.

    DTIC Science & Technology

    1981-12-01

ADVANCED COMPUTER TYPOGRAPHY, by A. V. Hershey. Naval Postgraduate School, Monterey, California. Final report NPS012-81-005, December 1981 (period covered: Dec 1979 - Dec 1981). Unclassified.

  15. Advanced-capability alkaline fuel cell powerplant

    NASA Astrophysics Data System (ADS)

    Deronck, Henry J.

    The alkaline fuel cell powerplant utilized in the Space Shuttle Orbiter has established an excellent performance and reliability record over the past decade. Recent AFC technology programs have demonstrated significant advances in cell durability and power density. These capabilities provide the basis for substantial improvement of the Orbiter powerplant, enabling new mission applications as well as enhancing performance in the Orbiter. Improved durability would extend the powerplant's time between overhaul fivefold, and permit longer-duration missions. The powerplant would also be a strong candidate for lunar/planetary surface power systems. Higher power capability would enable replacement of the Orbiter's auxiliary power units with electric motors, and benefits mass-critical applications such as the National AeroSpace Plane.

  16. DOE's Computer Incident Advisory Capability (CIAC)

    SciTech Connect

    Schultz, E.

    1990-09-01

Computer security is essential in maintaining quality in the computing environment. Computer security incidents, however, are becoming more sophisticated. The DOE Computer Incident Advisory Capability (CIAC) team was formed primarily to assist DOE sites in responding to computer security incidents. Among CIAC's other responsibilities are gathering and distributing information to DOE sites, providing training workshops, coordinating with other agencies, response teams, and vendors, creating guidelines for incident handling, and developing software tools. CIAC has already provided considerable assistance to DOE sites faced with virus infections and worm and hacker attacks, has issued over 40 information bulletins, and has developed and presented a workshop on incident handling. CIAC's experience in helping sites has produced several lessons learned, including the need to follow effective procedures to avoid virus infections in small systems and the need for sound password management and system administration in networked systems. CIAC's activity and scope will expand in the future.

  17. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  18. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  19. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically confined plasmas, with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPPs). A good example is the effective usage of the full power of multi-teraflop MPPs to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 
The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract

  20. Application of advanced electronics to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Carney, P. C.

    1980-01-01

    Advancements in hardware and software technology are summarized with specific emphasis on spacecraft computer capabilities. Available state of the art technology is reviewed and candidate architectures are defined.

  1. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

This paper presents cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest advances in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. The simulation must also handle multiprocessing across advanced technologies and distributed applications, including remote-ship scenarios and the automation of ship operations.

  2. Validating DOE's Office of Science "capability" computing needs.

    SciTech Connect

    Mattern, Peter L.; Camp, William J.; Leland, Robert W.; Barsis, Edwin Howard

    2004-07-01

    A study was undertaken to validate the 'capability' computing needs of DOE's Office of Science. More than seventy members of the community provided information about algorithmic scaling laws, so that the impact of having access to Petascale capability computers could be assessed. We have concluded that the Office of Science community has described credible needs for Petascale capability computing.

  3. Advances in Computational Astrophysics

    SciTech Connect

    Calder, Alan C.; Kouzes, Richard T.

    2009-03-01

This is the guest editors' introduction to a special issue of Computing in Science and Engineering. I was invited to serve as guest editor along with a colleague from Stony Brook; Alan and I wrote this introduction and edited the four papers published in the special issue.

  4. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. Modeling and simulation (M&S) environments and infrastructure.

  5. Epidermal electronics with advanced capabilities in near-field communication.

    PubMed

    Kim, Jeonghyun; Banks, Anthony; Cheng, Huanyu; Xie, Zhaoqian; Xu, Sheng; Jang, Kyung-In; Lee, Jung Woo; Liu, Zhuangjian; Gutruf, Philipp; Huang, Xian; Wei, Pinghung; Liu, Fei; Li, Kan; Dalal, Mitul; Ghaffari, Roozbeh; Feng, Xue; Huang, Yonggang; Gupta, Sanjay; Paik, Ungyu; Rogers, John A

    2015-02-25

    Epidermal electronics with advanced capabilities in near field communications (NFC) are presented. The systems include stretchable coils and thinned NFC chips on thin, low modulus stretchable adhesives, to allow seamless, conformal contact with the skin and simultaneous capabilities for wireless interfaces to any standard, NFC-enabled smartphone, even under extreme deformation and after/during normal daily activities.

  6. Advanced Telescopes and Observatories Capability Roadmap Presentation to the NRC

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This viewgraph presentation provides an overview of the NASA Advanced Planning and Integration Office (APIO) roadmap for developing technological capabilities for telescopes and observatories in the following areas: Optics; Wavefront Sensing and Control and Interferometry; Distributed and Advanced Spacecraft; Large Precision Structures; Cryogenic and Thermal Control Systems; Infrastructure.

  7. Advanced Capabilities for Wind Tunnel Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Kegelman, Jerome T.; Danehy, Paul M.; Schwartz, Richard J.

    2010-01-01

Wind tunnel testing methods and test technologies for the 21st century using advanced capabilities are presented. These capabilities are necessary to capture more accurate, higher-quality test results by reducing uncertainties in testing and to facilitate verification of computational tools for design. This paper discusses near-term developments underway in ground testing capabilities, which will enhance the quality of information on both the test article and airstream flow details. Also discussed is a selection of new capability investments that have been made to accommodate such developments. Examples include advanced experimental methods for measuring the test gas itself; efficient experiment methodologies, including quality-assurance strategies within the test; and increasing test-result information density by using extensive optical visualization together with computed flow-field results. These points apply both to major investments in existing tunnel capabilities and to entirely new capabilities.

  8. Advanced Simulation Capability for Environmental Management (ASCEM) Phase II Demonstration

    SciTech Connect

    Freshley, M.; Hubbard, S.; Flach, G.; Freedman, V.; Agarwal, D.; Andre, B.; Bott, Y.; Chen, X.; Davis, J.; Faybishenko, B.; Gorton, I.; Murray, C.; Moulton, D.; Meyer, J.; Rockhold, M.; Shoshani, A.; Steefel, C.; Wainwright, H.; Waichler, S.

    2012-09-28

    In 2009, the National Academies of Science (NAS) reviewed and validated the U.S. Department of Energy Office of Environmental Management (EM) Technology Program in its publication, Advice on the Department of Energy’s Cleanup Technology Roadmap: Gaps and Bridges. The NAS report outlined prioritization needs for the Groundwater and Soil Remediation Roadmap, concluded that contaminant behavior in the subsurface is poorly understood, and recommended further research in this area as a high priority. To address this NAS concern, the EM Office of Site Restoration began supporting the development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific approach that uses an integration of toolsets for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM modeling toolset is modular and open source. It is divided into three thrust areas: Multi-Process High Performance Computing (HPC), Platform and Integrated Toolsets, and Site Applications. The ASCEM toolsets will facilitate integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. During fiscal year 2012, the ASCEM project continued to make significant progress in capabilities development. Capability development occurred in both the Platform and Integrated Toolsets and Multi-Process HPC Simulator areas. The new Platform and Integrated Toolsets capabilities provide the user an interface and the tools necessary for end-to-end model development that includes conceptual model definition, data management for model input, model calibration and uncertainty analysis, and model output processing including visualization. The new HPC Simulator capabilities target increased functionality of process model representations, toolsets for interaction with the Platform, and model confidence testing and verification for

  9. White Paper on Institutional Capability Computing Requirements

    SciTech Connect

    Kissel, L; McCoy, M G; Seager, M K

    2002-01-29

    This paper documents the need for a rapid, order-of-magnitude increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory. This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction, if not preeminence, by 2006. We believe that it is possible for LLNL institutional scientists to gain access late this year to a new system with a capacity roughly 80% to 200% that of the 12-TF/s (twelve trillion floating-point operations per second) ASCI White system for a cost that is an order of magnitude lower than the White system. This platform could be used for first-class science-of-scale computing and for the development of aggressive, strategically chosen applications that can challenge the near PF/s (petaflop/s, a thousand trillion floating-point operations per second) scale systems ASCI is working to bring to the LLNL unclassified environment in 2005. As the distilled scientific requirements data presented in this document indicate, great computational science is being done at LLNL--the breadth of accomplishment is amazing. The computational efforts make it clear what a unique national treasure this Laboratory has become. While the projects cover a wide and varied application space, they share three elements--they represent truly great science, they have broad impact on the Laboratory's major technical programs, and they depend critically on big computers.

  10. F/A-18 FAST Offers Advanced System Test Capability

    NASA Video Gallery

    NASA's Dryden Flight Research Center has modified an F/A-18A Hornet aircraft with additional research flight control computer systems for use as a Full-scale Advanced Systems Test Bed. Previously f...

  11. Central control element expands computer capability

    NASA Technical Reports Server (NTRS)

    Easton, R. A.

    1975-01-01

    Redundant processing and multiprocessing modes can be obtained from one computer by using logic configuration. Configuration serves as central control element which can automatically alternate between high-capacity multiprocessing mode and high-reliability redundant mode using dynamic mode switching in real time.
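    The mode switch described above can be illustrated with a small sketch (the function names and the voting scheme are illustrative, not taken from the original NASA design): in redundant mode every processor computes the same task and a majority vote masks a faulty unit, while in multiprocessing mode independent tasks are spread across processors.

```python
from collections import Counter

def run_redundant(task, processors):
    """High-reliability mode: every processor computes the same task;
    a majority vote over the results masks a single faulty unit."""
    results = [p(task) for p in processors]
    return Counter(results).most_common(1)[0][0]

def run_multiprocessing(tasks, processors):
    """High-capacity mode: independent tasks are spread across processors."""
    return [p(t) for p, t in zip(processors, tasks)]

def control_element(mode, payload, processors):
    """Central control element: dynamically switches between the two modes."""
    if mode == "redundant":
        return run_redundant(payload, processors)
    return run_multiprocessing(payload, processors)

# Three processors, one of which is faulty (always off by one).
good = lambda x: x * x
faulty = lambda x: x * x + 1
procs = [good, good, faulty]

print(control_element("redundant", 4, procs))                # 16 (vote masks the fault)
print(control_element("multiprocessing", [1, 2, 3], procs))  # [1, 4, 10]
```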

  12. Advanced Post-Irradiation Examination Capabilities Alternatives Analysis Report

    SciTech Connect

    Jeff Bryan; Bill Landman; Porter Hill

    2012-12-01

    An alternatives analysis was performed for the Advanced Post-Irradiation Examination Capabilities (APIEC) project in accordance with U.S. Department of Energy (DOE) Order DOE O 413.3B, “Program and Project Management for the Acquisition of Capital Assets”. The analysis considered six major alternatives: (1) no action; (2) modify existing DOE facilities, with capabilities distributed among multiple locations; (3) modify existing DOE facilities, with capabilities consolidated at a few locations; (4) construct a new facility; (5) commercial partnership; and (6) international partnerships. Based on the alternatives analysis documented herein, it is recommended to DOE that the advanced post-irradiation examination capabilities be provided by a new facility constructed at the Materials and Fuels Complex at the Idaho National Laboratory.

  13. Computational capabilities of multilayer committee machines

    NASA Astrophysics Data System (ADS)

    Neirotti, J. P.; Franco, L.

    2010-11-01

    We obtained an analytical expression for the computational complexity of many-layered committee machines with a finite number of hidden layers (L < ∞), using the generalization complexity measure introduced by Franco et al (2006) IEEE Trans. Neural Netw. 17 578. Although our result is valid in the large-size limit and for an ultrametric overlap synaptic matrix, it provides a useful tool for inferring the architecture a network must have to reproduce an arbitrary realizable Boolean function.
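    For readers unfamiliar with the model, the simplest member of this family, a two-layer committee machine, can be sketched as follows (the weights are illustrative, not from the paper): a hidden layer of sign perceptrons casts ±1 votes, and the output unit takes the sign of their unweighted sum.

```python
import numpy as np

def committee_machine(x, W):
    """Two-layer committee machine: a hidden layer of sign perceptrons
    whose +/-1 votes are combined by an unweighted majority (sign of sum)."""
    hidden = np.sign(W @ x)          # K hidden votes in {-1, +1}
    return int(np.sign(hidden.sum()))

# Hypothetical hidden-layer weights for a 3-bit +/-1 input
# (K = 3 is odd, so the committee vote can never tie).
W = np.array([[ 1.0,  1.0,  1.0],
              [ 1.0, -1.0,  1.0],
              [-1.0,  1.0,  1.0]])

print(committee_machine(np.array([ 1.0, -1.0,  1.0]), W))  # votes +1, +1, -1 -> +1
print(committee_machine(np.array([-1.0, -1.0, -1.0]), W))  # votes -1, -1, +1 -> -1
```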

  14. Summary of NASA Advanced Telescope and Observatory Capability Roadmap

    NASA Technical Reports Server (NTRS)

    Stahl, H. Phil; Feinberg, Lee

    2006-01-01

    The NASA Advanced Telescope and Observatory (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories operating in all electromagnetic bands, from x-rays to millimeter waves, and including gravity-waves. It lists capability priorities derived from current and developing Space Missions Directorate (SMD) strategic roadmaps. Technology topics include optics; wavefront sensing and control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structure for observatories; and the infrastructure essential to future space telescopes and observatories.

  15. Summary of NASA Advanced Telescope and Observatory Capability Roadmap

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Feinberg, Lee

    2007-01-01

    The NASA Advanced Telescope and Observatory (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories operating in all electromagnetic bands, from x-rays to millimeter waves, and including gravity-waves. It lists capability priorities derived from current and developing Space Missions Directorate (SMD) strategic roadmaps. Technology topics include optics; wavefront sensing and control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structure for observatories; and the infrastructure essential to future space telescopes and observatories.

  16. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  17. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment falls short of providing real-time, predictive information for power system operation because of several limiting factors: the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment, whose primary objective is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. The paper presents the algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.

  18. Patriot Advanced Capability-3 Missile Segment Enhancement (PAC-3 MSE)

    DTIC Science & Technology

    2015-12-01

    Selected Acquisition Report (SAR), RCS: DD-A&T(Q&A)823-492, for the Patriot Advanced Capability-3 Missile Segment Enhancement (PAC-3 MSE), as of FY 2017.

  19. CHARACTERIZATION OF THE ADVANCED RADIOGRAPHIC CAPABILITY FRONT END ON NIF

    SciTech Connect

    Haefner, C; Heebner, J; Dawson, J; Fochs, S; Shverdin, M; Crane, J K; Kanz, V K; Halpin, J; Phan, H; Sigurdsson, R; Brewer, W; Britten, J; Brunton, G; Clark, W; Messerly, M J; Nissen, J D; Nguyen, H; Shaw, B; Hackel, R; Hermann, M; Tietbohl, G; Siders, C W; Barty, C J

    2009-07-15

    We have characterized the Advanced Radiographic Capability injection laser system and demonstrated that it meets performance requirements for upcoming National Ignition Facility fusion experiments. Pulse compression was achieved with a scaled-down replica of the meter-scale-grating ARC compressor, and sub-ps pulse duration was demonstrated at the joule level.

  20. 2005 White Paper on Institutional Capability Computing Requirements

    SciTech Connect

    Carnes, B; McCoy, M; Seager, M

    2006-01-20

    This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but will diverge from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management.
The second challenge is related to the balance

  1. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  2. Advanced Simulation Capability for Environmental Management (ASCEM): Early Site Demonstration

    SciTech Connect

    Meza, Juan; Hubbard, Susan; Freshley, Mark D.; Gorton, Ian; Moulton, David; Denham, Miles E.

    2011-03-07

    The U.S. Department of Energy Office of Environmental Management, Technology Innovation and Development (EM-32), is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high performance computing tool will facilitate integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. As part of the initial development process, a series of demonstrations were defined to test ASCEM components and provide feedback to developers, engage end users in applications, and lead to an outcome that would benefit the sites. The demonstration was implemented for a sub-region of the Savannah River Site General Separations Area that includes the F-Area Seepage Basins. The physical domain included the unsaturated and saturated zones in the vicinity of the seepage basins and Fourmile Branch, using an unstructured mesh fit to the hydrostratigraphy and topography of the site. The calculations modeled variably saturated flow and the resulting flow field was used in simulations of the advection of non-reactive species and the reactive-transport of uranium. As part of the demonstrations, a new set of data management, visualization, and uncertainty quantification tools were developed to analyze simulation results and existing site data. These new tools can be used to provide summary statistics, including information on which simulation parameters were most important in the prediction of uncertainty and to visualize the relationships between model input and output.

  3. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    PubMed

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in <2 minutes, store information for ≥ 30 frames and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. 
The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible

  4. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

    This report documents a special study to define a 32-bit, radiation-hardened, SEU-tolerant flight computer architecture and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing-node architecture is defined; each node may consist of a multi-chip processor as needed. The modular, building-block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  5. Advances in National Capabilities for Consequence Assessment Modeling of Airborne Hazards

    SciTech Connect

    Nasstrom, J; Sugiyama, G; Foster, K; Larsen, S; Kosovic, B; Eme, B; Walker, H; Goldstein, P; Lundquist, J; Pobanz, B; Fulton, J

    2007-11-26

    This paper describes ongoing advancement of airborne hazard modeling capabilities in support of multiple agencies through the National Atmospheric Release Advisory Center (NARAC) and the Interagency Atmospheric Modeling and Atmospheric Assessment Center (IMAAC). A suite of software tools developed by Lawrence Livermore National Laboratory (LLNL) and collaborating organizations includes simple stand-alone, local-scale plume modeling tools for end user's computers, Web- and Internet-based software to access advanced 3-D flow and atmospheric dispersion modeling tools and expert analysis from the national center at LLNL, and state-of-the-science high-resolution urban models and event reconstruction capabilities.

  6. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.

  7. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  8. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    NASA Astrophysics Data System (ADS)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges: software often requires a specific technology stack to operate, which can include software, operating system, and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution: technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers, and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
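    The container-plus-metadata approach sketched above amounts to archiving, alongside the software itself, a machine-readable record of the technology stack it needs. A minimal illustration follows; the manifest schema, tool name, and image reference are hypothetical, not from the paper:

```python
import json
import platform

def preservation_manifest(name, entrypoint, image_ref):
    """Record the technology stack needed to re-run an archived tool.
    The schema is a hypothetical minimal example, not a standard."""
    return {
        "software": name,
        "entrypoint": entrypoint,
        "container_image": image_ref,  # e.g. an OCI image reference/digest
        "runtime": {
            "python": platform.python_version(),
            "os": platform.system(),
            "machine": platform.machine(),
        },
    }

manifest = preservation_manifest(
    "orbit-analysis", "python analyze.py",
    "registry.example.org/orbit@sha256:...")
print(json.dumps(manifest, indent=2))
```

    In practice the manifest would be stored next to the container image in the archive, so a future user can locate both the frozen stack and the instructions for invoking it.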

  9. ADVANCED SIMULATION CAPABILITY FOR ENVIRONMENTAL MANAGEMENT- CURRENT STATUS AND PHASE II DEMONSTRATION RESULTS

    SciTech Connect

    Seitz, R.

    2013-02-26

    The U.S. Department of Energy (USDOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and High-Performance Computing (HPC) Multiprocess Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial toolsets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  10. ADVANCED SIMULATION CAPABILITY FOR ENVIRONMENTAL MANAGEMENT – CURRENT STATUS AND PHASE II DEMONSTRATION RESULTS

    SciTech Connect

    Seitz, Roger; Freshley, Mark D.; Dixon, Paul; Hubbard, Susan S.; Freedman, Vicky L.; Flach, Gregory P.; Faybishenko, Boris; Gorton, Ian; Finsterle, Stefan A.; Moulton, John D.; Steefel, Carl I.; Marble, Justin

    2013-06-27

    The U.S. Department of Energy (USDOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and High-Performance Computing (HPC) Multiprocess Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial toolsets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  11. Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2

    NASA Technical Reports Server (NTRS)

    Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)

    1998-01-01

    The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.

  12. Combining human and computer interpretation capabilities to analyze ERTS imagery

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.

    1973-01-01

    The human photointerpreter and the computer have complementary capabilities that are exploited in a computer-based data analysis system developed at the Forestry Remote Sensing Laboratory, University of California. This system is designed to optimize the process of extracting resource information from ERTS images. The human has the ability to quickly delineate gross differences in land classes, such as wildland, urban, and agriculture on appropriate ERTS images, and to further break these gross classes into meaningful subclasses. The computer, however, can more efficiently analyze point-by-point spectral information and localized textural information which can result in a much more detailed agricultural or wildland classification based on species composition and/or plant association. These human and computer capabilities have been integrated through the use of an inexpensive small scale computer dedicated to the interactive preprocessing of the human inputs and the display of raw ERTS images and computer classified images. The small computer is linked to a large scale computer system wherein the bulk of the statistical work and the automatic point-by-point classification is done.

  13. Computer routine adds plotting capabilities to existing programs

    NASA Technical Reports Server (NTRS)

    Harris, J. C.; Linnekin, J. S.

    1966-01-01

    PLOTAN, a generalized plot analysis routine written for the IBM 7094 computer, minimizes the difficulties in adding plot capabilities to large existing programs. PLOTAN is used in conjunction with a binary tape writing routine and has the ability to plot any variable on the intermediate binary tape as a function of any other.
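
    A modern sketch of the PLOTAN idea (reading records from an intermediate binary tape and selecting any variable as a function of any other) might look as follows. The three-field record layout and field names are assumptions for illustration only, since the original routine ran on the IBM 7094:

```python
import struct
from io import BytesIO

# Assumed record layout for the intermediate binary tape: three doubles
# per record. Field names are hypothetical.
RECORD = struct.Struct("<3d")
FIELDS = {"time": 0, "altitude": 1, "velocity": 2}

def write_tape(records):
    """Stand-in for the binary tape writing routine used alongside PLOTAN."""
    buf = BytesIO()
    for rec in records:
        buf.write(RECORD.pack(*rec))
    return buf.getvalue()

def extract_xy(tape, x, y):
    """Select any tape variable as a function of any other, ready to plot."""
    xs, ys = [], []
    for vals in RECORD.iter_unpack(tape):
        xs.append(vals[FIELDS[x]])
        ys.append(vals[FIELDS[y]])
    return xs, ys
```

    The returned pair of lists can then be handed to any plotting backend, which is the "adds plotting capabilities to existing programs" step.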

  14. Computer-Based Measurement of Intellectual Capabilities. Final Report.

    ERIC Educational Resources Information Center

    Weiss, David J.

    During 1975-1979 this research into the potential of computerized adaptive testing to reduce errors in the measurement of human capabilities used Marine recruits for a live-testing validity comparison of computerized adaptive and conventional tests. The program purposes were to: (1) identify the most useful computer-based adaptive testing…

  15. Fluidic technology: adding control, computation, and sensing capability to microfluidics

    NASA Astrophysics Data System (ADS)

    Drzewiecki, Tadeusz M.; Macia, Narciso F.

    2003-04-01

    This paper presents an overview of fluidic technology - a technology that provides an additional dimension to conventional microfluidic technology by adding sensing, computation (both analog and digital) and control. The US Army Diamond Ordnance Fuze Labs officially recognized fluidics as a comprehensive technology comparable to electronics with its announcement in 1959. Because fluidic elements have very few or no moving parts, the technology provides significant operational advantages in harsh environments (EMI, radiation, high temperature and vibration). It also offers advantages when dealing with fluid variables (flow, pressure, density, viscosity, etc.) by eliminating the need for interfaces. With the elimination of the inertia and friction associated with moving parts, there are even greater advantages as a result of higher speed of operation. Where mechanical and micromechanical devices may be limited to only hundreds of hertz, true microfluidic systems can operate at tens of thousands of hertz. We discuss the fundamental principles of jet deflection amplification and vortex modulation and present circuit building blocks such as the laminar proportional amplifier, vortex valve, oscillators, and positive-feedback digital components. Most importantly, we present and discuss three specific applications illustrating the power of fluidics in microfluidics and MEMS. These are: a gas analyzer-on-a-chip, capable of simultaneous analysis of multiple gas mixtures with clinical accuracies; an intermittent oxygen delivery system that provides supplemental oxygen to ambulatory patients through a nasal cannula; and an array of vortex microvalves capable of controlling propellants for micropropulsion systems or for the temporal and spatial modulation of fuel for the optimal control of gas turbine combustors. A sampling of other fluidic-in-microfluidic applications is also mentioned, including pressure and acoustic amplification (a kosher public address system is

  16. Brookhaven National Laboratory's capabilities for advanced analyses of cyber threats

    SciTech Connect

    DePhillips, M. P.

    2014-01-01

    BNL has several ongoing, mature, and successful programs and areas of core scientific expertise that readily could be modified to address problems facing national security and efforts by the IC related to securing our nation’s computer networks. In supporting these programs, BNL houses an expansive, scalable infrastructure built exclusively for transporting, storing, and analyzing large disparate data-sets. Our ongoing research projects on various infrastructural issues in computer science undoubtedly would be relevant to national security. Furthermore, BNL frequently partners with researchers in academia and industry worldwide to foster unique and innovative ideas for expanding research opportunities and extending our insights. Because the basic science conducted at BNL is unique, such projects have led to advanced techniques, unlike any others, to support our mission of discovery. Many of them are modular techniques, thus making them ideal for abstraction and retrofitting to other uses including those facing national security, specifically the safety of the nation’s cyber space.

  17. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer based-systems with the CAD/CAM packages evaluated. (CW)

  18. Advanced Query and Data Mining Capabilities for MaROS

    NASA Technical Reports Server (NTRS)

    Wang, Paul; Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Hy, Franklin H.

    2013-01-01

    The Mars Relay Operational Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay network. These tools span several levels of the software architecture: a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. As part of MaROS, the innovators have developed and implemented a feature set that operates on several levels of the software architecture. This new feature is an advanced querying capability, accessible through either the Web-based user interface or a back-end REST interface, for reaching all of the data gathered from the network. This software is not meant to replace the REST interface, but to augment and expand the range of available data. The current REST interface provides specific data that is used by the MaROS Web application to display and visualize the information; however, the returned information from the REST interface has typically been pre-processed to return only a subset of the entire information within the repository, particularly only the information that is of interest to the GUI (graphical user interface). The new, advanced query and data mining capabilities allow users to retrieve the raw data and/or to perform their own data processing. The query language used to access the repository is a restricted subset of the structured query language (SQL) that can be built safely from the Web user interface, or entered as freeform SQL by a user. The results are returned in a CSV (Comma Separated Values) format for easy exporting to third-party tools and applications that can be used for data mining or user-defined visualization and interpretation. This is the first time that a service is capable of providing access to all cross-project relay data from a single Web resource.
Because MaROS contains the data for a variety of missions from the Mars network, which span both NASA and ESA, the software also establishes an access control list (ACL) on each data record.
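
    The notion of a safely built, restricted SQL subset with CSV export can be sketched as follows. This is a minimal illustration, assuming a hypothetical column whitelist; the actual MaROS repository schema is not described here:

```python
import csv
import io

# Hypothetical column whitelist; names are illustrative assumptions.
ALLOWED_COLUMNS = {"mission", "pass_id", "data_volume"}

def build_query(columns, where_column=None):
    """Assemble a query from whitelisted parts only, mirroring the idea of
    a restricted SQL subset that can be built safely from a Web UI."""
    cols = sorted(c for c in columns if c in ALLOWED_COLUMNS)
    if not cols:
        raise ValueError("no permitted columns requested")
    sql = "SELECT " + ", ".join(cols) + " FROM relay_data"
    if where_column is not None:
        if where_column not in ALLOWED_COLUMNS:
            raise ValueError("column not permitted in WHERE clause")
        sql += " WHERE " + where_column + " = ?"  # value bound as a parameter
    return sql

def to_csv(header, rows):
    """Return query results as CSV for export to third-party tools."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(header)
    writer.writerows(rows)
    return out.getvalue()
```

    Keeping freeform input out of the generated SQL (values are bound as parameters, identifiers come only from the whitelist) is what makes such a subset safe to expose from a Web interface.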

  19. Purple Computational Environment With Mappings to ACE Requirements for the General Availability User Environment Capabilities

    SciTech Connect

    Barney, B; Shuler, J

    2006-08-21

    Purple is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Lawrence Livermore National Laboratory (LLNL). The Purple Computational Environment documents the capabilities and the environment provided for the FY06 LLNL Level 1 General Availability Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories, but also documents needs of the LLNL and Alliance users working in the unclassified environment. Additionally, the Purple Computational Environment maps the provided capabilities to the Trilab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the General Availability user environment capabilities of the ASC community. Appendix A lists these requirements and includes a description of ACE requirements met and those requirements that are not met for each section of this document. The Purple Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the Tri-lab community.

  20. Defense Science Board Report on Advanced Computing

    DTIC Science & Technology

    2009-03-01

    complex computational issues are pursued, and that several vendors remain at the leading edge of supercomputing capability in the U.S. ... pursuing the ASC program to help assure that HPC advances are available to the broad national security community. As in the past, many ... apply HPC to technical problems related to weapons physics, but that are entirely unclassified. Examples include explosive astrophysical

  1. Capabilities and Limitations of Infinite-Time Computation

    NASA Astrophysics Data System (ADS)

    Long, James Thomas, III

    The relatively new field of infinitary computability strives to characterize the capabilities and limitations of infinite-time computation; that is, computations of potentially transfinite length. Throughout our work, we focus on the prototypical model of infinitary computation: Hamkins and Lewis' infinite-time Turing machine (ITTM), which generalizes the classical Turing machine model in a natural way. This dissertation adopts a novel approach to this study: whereas most of the literature, starting with Hamkins and Lewis' debut of the ITTM model, pursues set-theoretic questions using a set-theoretic approach, we employ arguments that are truly computational in character. Indeed, we fully utilize analogues of classical results from finitary computability, such as the s-m-n Theorem and existence of universal machines, and for the most part, judiciously restrict our attention to the classical setting of computations over the natural numbers. In Chapter 2 of this dissertation, we state, and derive, as necessary, the aforementioned analogues of the classical results, as well as some useful constructs for ITTM programming. With this due paid, the subsequent work in Chapters 3 and 4 requires little in the way of programming, and that programming which is required in Chapter 5 is dramatically streamlined. In Chapter 3, we formulate two analogues of one of Rado's busy beaver functions from classical computability, and show, in analogy with Rado's results, that they grow faster than a wide class of infinite-time computable functions. Chapter 4 is tasked with developing a system of ordinal notations via a natural approach involving infinite-time computation, as well as an associated fast-growing hierarchy of functions over the natural numbers. We then demonstrate that the busy beaver functions from Chapter 3 grow faster than the functions which appear in a significant portion of this hierarchy.
Finally, we debut, in Chapter 5, two enhancements of the ITTM model which can self
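
    The fast-growing hierarchy mentioned in the abstract generalizes a classical finite-index construction. A minimal sketch of those finite levels, assuming the standard recursion f_0(n) = n + 1 and f_{k+1}(n) = the n-fold iterate of f_k starting at n (the dissertation's hierarchy is indexed by infinite-time-computable ordinal notations, well beyond this):

```python
def fast_growing(k, n):
    """Finite levels of a fast-growing hierarchy:
    f_0(n) = n + 1, and f_{k+1}(n) applies f_k n times starting from n.
    Hence f_1(n) = 2n and f_2(n) = n * 2**n, already outpacing any
    fixed polynomial as k increases."""
    if k == 0:
        return n + 1
    result = n
    for _ in range(n):
        result = fast_growing(k - 1, result)
    return result
```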

  2. Advanced SAR simulator with multi-beam interferometric capabilities

    NASA Astrophysics Data System (ADS)

    Reppucci, Antonio; Márquez, José; Cazcarra, Victor; Ruffini, Giulio

    2014-10-01

    State-of-the-art simulations are of great interest when designing a new instrument, studying the imaging mechanisms of a given scenario, or designing inversion algorithms, as they allow one to analyze and understand the effects of different instrument configurations and target compositions. In the framework of studies of a new instrument devoted to the estimation of ocean surface movements using Synthetic Aperture Radar along-track interferometry (SAR-ATI), an end-to-end simulator has been developed. The simulator, built in a highly modular way to allow easy integration of different processing features, deals with all the basic operations involved in an end-to-end scenario. This includes the computation of the position and velocity of the platform (airborne/spaceborne) and the geometric parameters defining the SAR scene, the surface definition, the backscattering computation, the atmospheric attenuation, the instrument configuration, and the simulation of the transmission/reception chains and the raw data. In addition, the simulator provides an InSAR processing suite and a sea surface movement retrieval module. Up to four beams (each one composed of a monostatic and a bistatic channel) can be activated. Each channel provides raw data and SLC images, with the possibility of choosing between Stripmap and ScanSAR modes. Moreover, the software offers the possibility of radiometric sensitivity analysis and error analysis due to atmospheric disturbances, instrument noise, interferogram phase noise, and platform velocity and attitude variations. In this paper, the architecture and the capabilities of this simulator will be presented. Meaningful simulation examples will be shown.
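
    The core of a sea-surface movement retrieval from along-track interferometry can be illustrated with the textbook phase-to-velocity relation. This is a schematic under assumed sign conventions and a simple monostatic lag model, not the simulator's actual retrieval module:

```python
from math import pi

def ati_radial_velocity(phase, wavelength, baseline, platform_velocity):
    """Invert the textbook along-track interferometry relation
    phi = (4 * pi / wavelength) * v_r * tau, with lag tau = B / v_p,
    for the surface radial velocity v_r. Sign conventions and bistatic
    factors vary by instrument, so treat this as a sketch only."""
    tau = baseline / platform_velocity  # time lag between the two acquisitions
    return phase * wavelength / (4 * pi * tau)
```

    For example, at X-band-like parameters (3 cm wavelength, 7.5 m baseline, 7.5 km/s platform), an interferometric phase of pi radians corresponds to a radial surface velocity of 7.5 m/s.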

  3. Advanced Simulation Capability for Environmental Management - Current Status and Phase II Demonstration Results - 13161

    SciTech Connect

    Seitz, Roger R.; Flach, Greg; Freshley, Mark D.; Freedman, Vicky; Gorton, Ian; Dixon, Paul; Moulton, J. David; Hubbard, Susan S.; Faybishenko, Boris; Steefel, Carl I.; Finsterle, Stefan; Marble, Justin

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Tool-sets and High-Performance Computing (HPC) Multi-process Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, tool-sets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial tool-sets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  4. Advanced capabilities for in situ planetary mass spectrometry

    NASA Astrophysics Data System (ADS)

    Arevalo, R. D., Jr.; Mahaffy, P. R.; Brinckerhoff, W. B.; Getty, S.; Benna, M.; van Amerom, F. H. W.; Danell, R.; Pinnick, V. T.; Li, X.; Grubisic, A.; Cornish, T.; Hovmand, L.

    2015-12-01

    NASA GSFC has delivered highly capable quadrupole mass spectrometers (QMS) for missions to Venus (Pioneer Venus), Jupiter (Galileo), Saturn/Titan (Cassini-Huygens), Mars (MSL and MAVEN), and the Moon (LADEE). Our understanding of the Solar System has been expanded significantly by these exceedingly versatile yet low risk and cost efficient instruments. GSFC has developed more recently a suite of advanced instrument technologies promising enhanced science return while selectively leveraging heritage designs. Relying on a traditional precision QMS, the Analysis of Gas Evolved from Samples (AGES) instrument measures organic inventory, determines exposure age and establishes the absolute timing of deposition/petrogenesis of interrogated samples. The Mars Organic Molecule Analyzer (MOMA) aboard the ExoMars 2018 rover employs a two-dimensional ion trap, built analogously to heritage QMS rod assemblies, which can support dual ionization sources, selective ion enrichment and tandem mass spectrometry (MS/MS). The same miniaturized analyzer serves as the core of the Linear Ion Trap Mass Spectrometer (LITMS) instrument, which offers negative ion detection (switchable polarity) and an extended mass range (>2000 Da). Time-of-flight mass spectrometers (TOF-MS) have been interfaced to a range of laser sources to progress high-sensitivity laser ablation and desorption methods for analysis of inorganic and non-volatile organic compounds, respectively. The L2MS (two-step laser mass spectrometer) enables the desorption of neutrals and/or prompt ionization at IR (1.0 up to 3.1 µm, with an option for tunability) or UV wavelengths (commonly 266 or 355 nm). For the selective ionization of specific classes of organics, such as aromatic hydrocarbons, a second UV laser may be employed to decouple the desorption and ionization steps and limit molecular fragmentation. Mass analyzers with substantially higher resolving powers (up to m/Δm > 100,000), such as the Advanced Resolution Organic

  5. Predictive Capability Maturity Model for computational modeling and simulation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.
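
    A maturity assessment over the six PCMM elements is essentially a small table of levels. The sketch below assumes the report's four levels are numbered 0-3 and reports the minimum as an overall floor; that aggregation convention is adopted here for illustration only, since the PCMM itself does not prescribe one:

```python
# The six contributing elements from the PCMM report; levels assumed 0-3.
PCMM_ELEMENTS = (
    "representation and geometric fidelity",
    "physics and material model fidelity",
    "code verification",
    "solution verification",
    "model validation",
    "uncertainty quantification and sensitivity analysis",
)

def pcmm_table(scores):
    """Build an assessment table, defaulting unassessed elements to level 0,
    and return it with the minimum level as an illustrative 'floor'
    (an M&S effort is only as mature as its weakest element)."""
    table = {}
    for element in PCMM_ELEMENTS:
        level = scores.get(element, 0)
        if level not in (0, 1, 2, 3):
            raise ValueError("maturity level must be 0, 1, 2, or 3")
        table[element] = level
    return table, min(table.values())
```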

  6. A new type of GC-MS with advanced capabilities

    NASA Astrophysics Data System (ADS)

    Fialkov, Alexander B.; Steiner, Urs; Jones, Larry; Amirav, Aviv

    2006-03-01

    We have combined the benefits of a supersonic molecular beam interface and its related fly-through electron ionization (EI) ion source with the advanced features of the Varian 1200L gas chromatography-mass spectrometry (GC-MS) and mass spectrometry-mass spectrometry (MS-MS) system, resulting in a new and powerful GC-MS platform with record-setting performance. Electron ionization of vibrationally cold molecules in supersonic molecular beams (SMB) (cold EI) provided mass spectra with an enhanced molecular ion, yet with good library search results and superior identification probabilities. We found that high GC column flow rates lower the elution temperature for any given compound. This allows much larger molecules to elute at the maximum temperature of standard columns. We analyzed a mixture of heavy linear-chain hydrocarbons all the way to C84H170, with a molecular weight of 1179.3 amu, using a 4 m x 0.25 mm i.d. column and a 32 ml/min He flow rate. Furthermore, we obtained a dominant molecular ion for all these compounds. The lower elution temperatures also greatly enhance the ability to analyze very thermally labile compounds such as carbamate pesticides. The experimental 1200 system is capable of triple-quadrupole-based MS-MS. We found that MS-MS on the molecular ion is much more effective than on fragment ions, and thus the enhancement of the molecular ion directly improves the MS-MS sensitivity. Fast GC-MS analysis was also explored, based on very high column flow rates for fast splitless injections without affecting the sensitivity, and on the high system selectivity due to the combination of the enhanced molecular ion and MS-MS. We demonstrate a GC-MS-MS analysis of diazinon, spiked at 10 ng/g in a mixed fruit and vegetable extract, lasting only a few seconds. The feature of an enhanced molecular ion provides significant enhancement in the detection sensitivity via SIM and RSIM on the molecular ion.
While octafluoronaphthalene (OFN) detection limit of below 1 fg in SIM mode is shown, the

  7. Aerodynamic Analyses Requiring Advanced Computers, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  8. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  9. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  10. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  11. Advanced Telescopes and Observatories and Scientific Instruments and Sensors Capability Roadmaps: General Background and Introduction

    NASA Technical Reports Server (NTRS)

    Coulter, Dan; Bankston, Perry

    2005-01-01

    Agency objectives are: Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date).

  12. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management in general tries to organize and make available important know-how, whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes, and by reducing time-to-market in Research & Development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Collaborative computing therefore provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative Web-based environment called Postdoc with workflow capabilities. Postdoc is "government-off-the-shelf" document management software developed at NASA Ames Research Center (ARC).

  13. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  14. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of the viewgraphs. The following areas of hand carried computers and mobile workstation technology are covered: background, applications, high end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  15. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    SciTech Connect

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce; Wade, Doug; Hoang, Thuc

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resource, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  16. Advanced Mesh-Enabled Monte carlo capability for Multi-Physics Reactor Analysis

    SciTech Connect

    Wilson, Paul; Evans, Thomas; Tautges, Tim

    2012-12-24

    This project will accumulate high-precision fluxes throughout reactor geometry on a non-orthogonal grid of cells to support multi-physics coupling, in order to more accurately calculate parameters such as reactivity coefficients and to generate multi-group cross sections. This work will be based upon recent developments to incorporate advanced geometry and mesh capability in a modular Monte Carlo toolkit with computational science technology that is in use in related reactor simulation software development. Coupling this capability with production-scale Monte Carlo radiation transport codes can provide advanced and extensible test-beds for these developments. Continuous energy Monte Carlo methods are generally considered to be the most accurate computational tool for simulating radiation transport in complex geometries, particularly neutron transport in reactors. Nevertheless, there are several limitations for their use in reactor analysis. Most significantly, there is a trade-off between the fidelity of results in phase space, statistical accuracy, and the amount of computer time required for simulation. Consequently, to achieve an acceptable level of statistical convergence in high-fidelity results required for modern coupled multi-physics analysis, the required computer time makes Monte Carlo methods prohibitive for design iterations and detailed whole-core analysis. More subtly, the statistical uncertainty is typically not uniform throughout the domain, and the simulation quality is limited by the regions with the largest statistical uncertainty. In addition, the formulation of neutron scattering laws in continuous energy Monte Carlo methods makes it difficult to calculate adjoint neutron fluxes required to properly determine important reactivity parameters. Finally, most Monte Carlo codes available for reactor analysis have relied on orthogonal hexahedral grids for tallies that do not conform to the geometric boundaries and are thus generally not well
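
    The non-uniform statistical uncertainty described above is easy to demonstrate with a toy tally. The sketch below scores samples on a 1-D mesh and estimates each cell's relative error as roughly 1/sqrt(hits); it is an illustration of the statistics, not the project's transport toolkit:

```python
import random
from math import sqrt

def tally_flux(samples, n_cells):
    """Tally Monte Carlo scores on a 1-D unit mesh and estimate the
    per-cell relative statistical uncertainty (~ 1/sqrt(hits)).
    The worst-sampled cell limits the overall simulation quality."""
    counts = [0] * n_cells
    for x in samples:                      # x in [0, 1): scoring position
        counts[min(int(x * n_cells), n_cells - 1)] += 1
    rel_err = [1.0 / sqrt(c) if c else float("inf") for c in counts]
    return counts, rel_err

random.seed(1)
counts, rel_err = tally_flux([random.random() for _ in range(10000)], 10)
```

    Refining the mesh or biasing sampling toward important regions shifts where the largest relative errors land, which is exactly the trade-off between phase-space fidelity and statistical accuracy noted above.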

  17. Advancing Capabilities for Understanding the Earth System Through Intelligent Systems, the NSF Perspective

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Zanzerkia, E. E.; Munoz-Avila, H.

    2015-12-01

    The National Science Foundation (NSF) Directorate for Geosciences (GEO) and Directorate for Computer and Information Science (CISE) acknowledge the significant scientific challenges required to understand the fundamental processes of the Earth system, within the atmospheric and geospace, Earth, ocean and polar sciences, and across those boundaries. A broad view of the opportunities and directions for GEO are described in the report "Dynamic Earth: GEO imperative and Frontiers 2015-2020." Many of the aspects of geosciences research, highlighted both in this document and other community grand challenges, pose novel problems for researchers in intelligent systems. Geosciences research will require solutions for data-intensive science, advanced computational capabilities, and transformative concepts for visualizing, using, analyzing and understanding geo phenomena and data. Opportunities for the scientific community to engage in addressing these challenges are available and being developed through NSF's portfolio of investments and activities. The NSF-wide initiative, Cyberinfrastructure Framework for 21st Century Science and Engineering (CIF21), looks to accelerate research and education through new capabilities in data, computation, software and other aspects of cyberinfrastructure. EarthCube, a joint program between GEO and the Advanced Cyberinfrastructure Division, aims to create a well-connected and facile environment to share data and knowledge in an open, transparent, and inclusive manner, thus accelerating our ability to understand and predict the Earth system. EarthCube's mission opens an opportunity for collaborative research on novel information systems enhancing and supporting geosciences research efforts. NSF encourages true, collaborative partnerships between scientists in computer sciences and the geosciences to meet these challenges.

  18. Analysis of Alternatives (AoA) of Open Collaboration and Research Capabilities: Collaboration in Research and Engineering in Advanced Technology and Education and the High-Performance Computing Innovation Center (HPCIC) on the LVOC.

    SciTech Connect

    Vrieling, P. Douglas

    2016-01-01

    The Livermore Valley Open Campus (LVOC), a joint initiative of the National Nuclear Security Administration (NNSA), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL), enhances the national security missions of NNSA by promoting greater collaboration between world-class scientists at the national security laboratories, and their partners in industry and academia. Strengthening the science, technology, and engineering (ST&E) base of our nation is one of the NNSA’s top goals. By conducting coordinated and collaborative programs, LVOC enhances both the NNSA and the broader national science and technology base, and helps to ensure the health of core capabilities at LLNL and SNL. These capabilities must remain strong to enable the laboratories to execute their primary mission for NNSA.

  19. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing Modernization Program (HPCMP) and the NASA Advanced Supercomputing (NAS) Division, a study is conducted to assess the role of supercomputers in the computational aeroelasticity of aerospace vehicles. The study is based mostly on responses to a web-based questionnaire designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented for assigning a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  20. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Atluri, Satya N.

    1987-01-01

    The development status and applicational range of techniques in computational structural mechanics (CSM) are evaluated with a view to advances in computational models for material behavior, discrete-element technology, quality assessment, the control of numerical simulations of structural response, hybrid analysis techniques, techniques for large-scale optimization, and the impact of new computing systems on CSM. Primary pacers of CSM development encompass prediction and analysis of novel materials for structural components, computational strategies for large-scale structural calculations, and the assessment of response prediction reliability together with its adaptive improvement.

  1. Combining advanced imaging processing and low cost remote imaging capabilities

    NASA Astrophysics Data System (ADS)

    Rohrer, Matthew J.; McQuiddy, Brian

    2008-04-01

    Target images are very important for evaluating the situation when Unattended Ground Sensors (UGS) are deployed. These images add a significant amount of information for determining the difference between hostile and non-hostile activities, the number of targets in an area, the difference between animals and people, the movement dynamics of targets, and when specific activities of interest are taking place. The imaging capabilities of UGS systems need to provide images only of target activity, not images with no targets in the field of view. Current UGS remote imaging systems are neither optimized for target processing nor low cost. In this paper, McQ describes an architectural and technological approach that significantly improves the processing of images to provide target information while reducing the cost of the intelligent remote imaging capability.

  2. Advanced Computational Techniques for Power Tube Design.

    DTIC Science & Technology

    1986-07-01

    fixturing applications, in addition to the existing computer-aided engineering capabilities. Helix TWT Manufacturing has implemented a tooling and fixturing...illustrates the major features of this computer network. The backbone of our system is a Sytek Broadband Network (LAN) which interconnects terminals and...automatic network analyzer (FANA) which electrically characterizes the slow-wave helices of traveling-wave tubes (TWTs), both for engineering design

  3. Advanced Crew Personal Support Computer (CPSC) task

    NASA Technical Reports Server (NTRS)

    Muratore, Debra

    1991-01-01

    The topics are presented in view graph form and include: background; objectives of task; benefits to the Space Station Freedom (SSF) Program; technical approach; baseline integration; and growth and evolution options. The objective is to: (1) introduce new computer technology into the SSF Program; (2) augment core computer capabilities to meet additional mission requirements; (3) minimize risk in upgrading technology; and (4) provide a low cost way to enhance crew and ground operations support.

  4. Unclassified Computing Capability: User Responses to a Multiprogrammatic and Institutional Computing Questionnaire

    SciTech Connect

    McCoy, M; Kissel, L

    2002-01-29

    We are experimenting with a new computing model to be applied to a new computer dedicated to that model. Several LLNL science teams now have computational requirements, evidenced by the mature scientific applications developed over the past five-plus years, that far exceed the capability of the institution's computing resources. Thus, there is increased demand for dedicated, powerful parallel computational systems. Computation can, in the coming year, potentially field a capability system that is low cost because it will be based on a model that employs open source software and because it will use PC (IA32-P4) hardware. This incurs significant computer science risk regarding stability and system features but also presents great opportunity. We believe the risks can be managed, but the existence of risk cannot be ignored. In order to justify the budget for this system, we need to make the case that it serves science and, through serving science, serves the institution. That is the point of the meeting and the White Paper that we are proposing to prepare. The questions asked and the responses received are presented in this report.

  5. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  6. Advanced simulation capability for environmental management - current status and future applications

    SciTech Connect

    Freshley, Mark; Scheibe, Timothy; Robinson, Bruce; Moulton, J. David; Dixon, Paul; Marble, Justin; Gerdes, Kurt; Stockton, Tom; Seitz, Roger; Black, Paul

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater (EM-12), is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach currently aimed at understanding and predicting contaminant fate and transport in natural and engineered systems. ASCEM is a modular, open-source, high-performance computing tool. It will be used to facilitate integrated approaches to modeling and site characterization, and to provide robust, standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of capabilities, with current emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and the High-Performance Computing (HPC) multi-process simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model and continuing through management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with the Platform, and verification and model confidence testing. The integration of the Platform and HPC capabilities was tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities in 2012. The current maturity of the ASCEM computational and analysis capabilities has afforded the opportunity for collaborative efforts to develop decision analysis tools to support and optimize radioactive waste disposal. Recent advances in computerized decision analysis frameworks provide the perfect opportunity to bring this capability into ASCEM. This will allow radioactive waste

  7. Advancing NASA's Satellite Control Capabilities: More than Just Better Technology

    NASA Technical Reports Server (NTRS)

    Smith, Danford

    2008-01-01

    This viewgraph presentation reviews the work of the Goddard Mission Services Evolution Center (GMSEC) in the development of NASA's satellite control capabilities. The purpose of the presentation is to provide a quick overview of NASA's Goddard Space Flight Center and its approach to coordinating ground system resources and development activities across many different missions. NASA Goddard's work in developing and managing current and future space exploration missions is highlighted. The GMSEC was established to coordinate ground and flight data systems development and services, to create a new standard ground system for many missions, and to reflect the reality that business reengineering and mindset changes were just as important as the technology.

  8. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  9. The Advanced Communications Technology Satellite (ACTS) capabilities for serving science

    NASA Technical Reports Server (NTRS)

    Meyer, Thomas R.

    1990-01-01

    Results of research on potential science applications of the NASA Advanced Communications Technology Satellite (ACTS) are presented. Discussed here are: (1) general research on communications related issues; (2) a survey of science-related activities and programs in the local area; (3) interviews of selected scientists and associated telecommunications support personnel whose projects have communications requirements; (4) analysis of linkages between ACTS functionality and science user communications activities and modes of operation; and (5) an analysis of survey results and the projection of conclusions to a national scale.

  10. Computable general equilibrium model fiscal year 2014 capability development report

    SciTech Connect

    Edwards, Brian Keith; Boero, Riccardo

    2016-05-11

    This report provides an overview of the development of the NISAC CGE economic modeling capability since 2012. This capability enhances NISAC's economic modeling and analysis tools, enabling them to answer a broader set of questions than was previously possible. In particular, CGE modeling captures how the different sectors of the economy (households, businesses, government, etc.) interact to allocate resources, and it captures these interactions when estimating the economic impacts of the kinds of events NISAC often analyzes.

  11. DOE accelerated strategic computing initiative: challenges and opportunities for predictive materials simulation capabilities

    SciTech Connect

    Mailhiot, C.

    1997-10-01

    In response to the unprecedented national security challenges derived from the end of nuclear testing, the Defense Programs of the Department of Energy has developed a long-term strategic plan based on a vigorous Science-Based Stockpile Stewardship (SBSS) program. The main objective of the SBSS program is to ensure confidence in the performance, safety, and reliability of the stockpile on the basis of a fundamental science-based approach. A central element of this approach is the development of predictive, full-physics, full-scale computer simulation tools. As a critical component of the SBSS program, the Accelerated Strategic Computing Initiative (ASCI) was established to provide the required advances in computer platforms and to enable predictive, physics-based simulation technologies. Foremost among the key elements needed to develop predictive simulation capabilities, the development of improved physics-based materials models has been universally identified as one of the highest-priority, highest-leverage activity. We indicate some of the materials modeling issues of relevance to stockpile materials and illustrate how the ASCI program will enable the tools necessary to advance the state-of-the-art in the field of computational condensed matter and materials physics.

  12. Advanced Computing Architectures for Cognitive Processing

    DTIC Science & Technology

    2009-07-01

    Peterson, Gregory D.

  13. Advancing Space Weather Modeling Capabilities at the CCMC

    NASA Astrophysics Data System (ADS)

    Mays, M. Leila; Kuznetsova, Maria; Boblitt, Justin; Chulaki, Anna; MacNeice, Peter; Mendoza, Michelle; Mullinix, Richard; Pembroke, Asher; Pulkkinen, Antti; Rastaetter, Lutz; Shim, Ja Soon; Taktakishvili, Aleksandre; Wiegand, Chiu; Zheng, Yihua

    2016-04-01

    The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) serves as a community access point to an expanding collection of state-of-the-art space environment models and as a hub for collaborative development of the next generation of space weather forecasting systems. In partnership with model developers and the international research and operational communities, the CCMC integrates new data streams and models from diverse sources into end-to-end space weather predictive systems, identifies weak links in data-model and model-model coupling, and leads community efforts to fill those gaps. The presentation will focus on the latest model installations at the CCMC and on advances in CCMC-led community-wide model validation projects.

  14. Reach and get capability in a computing environment

    DOEpatents

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
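
The interaction pattern in this abstract is concrete enough to sketch. The class and method names below are illustrative stand-ins, not taken from the patent; the point is only the sequence: reach records the current location, the user navigates away, and get copies the object and returns automatically.

```python
# Hypothetical sketch of the "reach and get" interaction; all names are
# illustrative, not from the patent.
class Environment:
    def __init__(self):
        self.location = "desktop"                       # user's current location
        self.containers = {"desktop": [], "folder": ["report.txt"]}
        self._reach_from = None

    def reach(self):
        # Invoking "reach" remembers the location it was invoked from.
        self._reach_from = self.location

    def navigate(self, where):
        self.location = where

    def get(self, obj):
        # "Get" copies the object into the reach location and
        # automatically navigates back there.
        if self._reach_from is None:
            raise RuntimeError("reach was not invoked")
        self.containers[self._reach_from].append(obj)
        self.location = self._reach_from
        self._reach_from = None

env = Environment()
env.reach()                 # invoked at "desktop"
env.navigate("folder")      # user browses elsewhere
env.get("report.txt")       # copy lands at the reach location
print(env.location, env.containers["desktop"])
# → desktop ['report.txt']
```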

  15. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    SciTech Connect

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.; Miley, Terri B.; Nichols, William E.; Strenge, Dennis L.

    2004-09-14

    This document contains detailed user instructions for the suite of utility codes developed for Rev. 1 of the Systems Assessment Capability.

  16. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
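
The Swendsen-Wang cluster update mentioned above can be sketched in a standard textbook form for the 2D Ising model; this is not code from the LANL project, just the generic algorithm: activate bonds between aligned neighbors with probability 1 - exp(-2β), build clusters, and flip each cluster with probability 1/2.

```python
import numpy as np

def swendsen_wang_step(spins, beta, rng):
    """One Swendsen-Wang cluster update for a 2D Ising model
    with periodic boundaries (J = 1, no external field)."""
    L = spins.shape[0]
    p_bond = 1.0 - np.exp(-2.0 * beta)   # bond activation probability

    # Union-find over lattice sites, with path halving
    parent = np.arange(L * L)
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    # Activate bonds between aligned nearest neighbours
    for x in range(L):
        for y in range(L):
            i = x * L + y
            for dx, dy in ((1, 0), (0, 1)):
                nx, ny = (x + dx) % L, (y + dy) % L
                if spins[x, y] == spins[nx, ny] and rng.random() < p_bond:
                    union(i, nx * L + ny)

    # Flip each cluster with probability 1/2
    flip = {}
    flat = spins.ravel()                 # view: flipping flat flips spins
    for i in range(L * L):
        r = find(i)
        if r not in flip:
            flip[r] = rng.random() < 0.5
        if flip[r]:
            flat[i] = -flat[i]
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(16, 16))
for _ in range(20):
    swendsen_wang_step(spins, beta=0.5, rng=rng)
print(abs(spins.mean()))                 # magnetization magnitude
```

Because entire clusters flip at once, the update decorrelates configurations near the critical point far faster than single-spin Metropolis sweeps, which is why the report's authors built on it.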

  17. Verification, validation, and predictive capability in computational engineering and physics.

    SciTech Connect

    Oberkampf, William Louis; Hirsch, Charles; Trucano, Timothy Guy

    2003-02-01

    Developers of computer codes, analysts who use the codes, and decision makers who rely on the results of the analyses face a critical question: How should confidence in modeling and simulation be critically assessed? Verification and validation (V&V) of computational simulations are the primary methods for building and quantifying this confidence. Briefly, verification is the assessment of the accuracy of the solution to a computational model. Validation is the assessment of the accuracy of a computational simulation by comparison with experimental data. In verification, the relationship of the simulation to the real world is not an issue. In validation, the relationship between computation and the real world, i.e., experimental data, is the issue.
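
One routine verification activity consistent with the definition above is checking the observed order of accuracy against the formal order using solutions on successively refined grids. The sketch below uses a generic Richardson-style estimate on a toy problem; it is an assumed illustration, not a procedure taken from this report.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from solutions on three grids
    with constant refinement ratio r (standard Richardson estimate)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# Toy verification case: central-difference derivative of sin(x) at x = 1,
# which is formally second-order accurate in h.
def central_diff(h, x=1.0):
    return (math.sin(x + h) - math.sin(x - h)) / (2 * h)

f1, f2, f3 = central_diff(0.1), central_diff(0.05), central_diff(0.025)
p = observed_order(f1, f2, f3, r=2.0)
print(round(p, 2))   # close to the formal order 2
```

If the observed order fails to match the formal order as the grid is refined, the discrepancy points to a coding or discretization error, which is exactly the kind of solution-accuracy assessment the abstract calls verification.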

  18. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  19. Frontiers of research in advanced computations

    SciTech Connect

    1996-07-01

    The principal mission of the Institute for Scientific Computing Research is to foster interactions among LLNL researchers, universities, and industry on selected topics in scientific computing. In the area of computational physics, the Institute has developed a new algorithm, GaPH, to help scientists understand the chemistry of turbulent and driven plasmas or gases at far less cost than other methods. New low-frequency electromagnetic models better describe the plasma etching and deposition characteristics of a computer chip in the making. A new method for modeling realistic curved boundaries within an orthogonal mesh is resulting in a better understanding of the physics associated with such boundaries and much quicker solutions. All these capabilities are being developed for massively parallel implementation, which is an ongoing focus of Institute researchers. Other groups within the Institute are developing novel computational methods to address a range of other problems. Examples include feature detection and motion recognition by computer, improved monitoring of blood oxygen levels, and entirely new models of human joint mechanics and prosthetic devices.

  20. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.
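
The triplex-to-duplex-to-simplex downmoding described above can be illustrated with a toy output voter. The fault-detection logic here (majority vote in triplex, a stand-in tie-break on duplex miscompare) is a simplification for illustration and is not drawn from the Boeing study.

```python
class RedundantComputer:
    """Toy model of triplex -> duplex -> simplex reconfiguration via
    output voting; names and fault logic are illustrative only."""
    def __init__(self):
        self.healthy = [0, 1, 2]    # channel IDs still in the voting set

    def vote(self, outputs):
        vals = [outputs[c] for c in self.healthy]
        if len(self.healthy) == 3:
            # Triplex: majority vote; a channel outvoted by the
            # other two is declared faulty and dropped.
            for c in list(self.healthy):
                others = [outputs[o] for o in self.healthy if o != c]
                if others[0] == others[1] != outputs[c]:
                    self.healthy.remove(c)
                    return others[0]
            return vals[0]
        if len(self.healthy) == 2:
            # Duplex: a miscompare cannot be voted out; a stand-in
            # self-test decision drops the higher-numbered channel.
            if vals[0] != vals[1]:
                self.healthy.pop()
            return vals[0]
        return vals[0]              # simplex: trust the lone channel

arcs = RedundantComputer()
print(arcs.vote({0: 5, 1: 9, 2: 5}))   # → 5  (channel 1 outvoted)
print(arcs.healthy)                    # → [0, 2]
```

A fuller model along ARCS lines would also retest a dropped channel and restore it to the voting set when the fault proves transient (the "redundancy recovery" in the abstract).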

  1. Advances in Electromagnetic Modelling through High Performance Computing

    SciTech Connect

    Ko, K.; Folwell, N.; Ge, L.; Guetz, A.; Lee, L.; Li, Z.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.; Xiao, L.; /SLAC

    2006-03-29

    Under the DOE SciDAC project on Accelerator Science and Technology, a suite of electromagnetic codes has been under development at SLAC that are based on unstructured grids for higher accuracy, and use parallel processing to enable large-scale simulation. The new modeling capability is supported by SciDAC collaborations on meshing, solvers, refinement, optimization and visualization. These advances in computational science are described and the application of the parallel eigensolver Omega3P to the cavity design for the International Linear Collider is discussed.

  2. Computational fluid dynamics capability for the solid fuel ramjet projectile

    NASA Astrophysics Data System (ADS)

    Nusca, Michael J.; Chakravarthy, Sukumar R.; Goldberg, Uriel C.

    1988-12-01

    A computational fluid dynamics solution of the Navier-Stokes equations has been applied to the internal and external flow of inert solid-fuel ramjet projectiles. Computational modeling reveals internal flowfield details not attainable by flight or wind tunnel measurements, thus contributing to the current investigation into the flight performance of solid-fuel ramjet projectiles. The present code employs numerical algorithms of the total variation diminishing (TVD) type. Computational solutions indicate the importance of several special features of the code, including the zonal grid framework, the TVD scheme, and a recently developed backflow turbulence model. The solutions are compared with results of internal surface pressure measurements. As demonstrated by these comparisons, the use of a backflow turbulence model distinguishes between satisfactory and poor flowfield predictions.
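
A minimal example of the TVD idea named in the abstract (not the ramjet code itself): a second-order, minmod-limited update for 1D linear advection. The limiter prevents the scheme from creating new extrema at discontinuities, which is the defining TVD property.

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter, the classic ingredient of TVD schemes."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def tvd_advect(u, c, steps):
    """Second-order TVD (MUSCL/minmod) scheme for linear advection
    u_t + a u_x = 0 on a periodic grid; c = a*dt/dx is the CFL number (a > 0)."""
    for _ in range(steps):
        # Limited slope in each cell, then the upwind interface state
        s = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
        u_face = u + 0.5 * (1.0 - c) * s      # value at the right interface
        flux = c * u_face                      # flux scaled by dt/dx
        u = u - (flux - np.roll(flux, 1))      # conservative update
    return u

x = np.linspace(0, 1, 100, endpoint=False)
u0 = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)   # square wave
u = tvd_advect(u0.copy(), c=0.5, steps=40)
print(float(u.min()), float(u.max()))  # bounded by the initial extrema (TVD)
```

An unlimited second-order scheme would overshoot and ring at the jumps; the minmod limiter trades a little smearing for monotone profiles, which is why TVD methods suited the shock-laden ramjet flowfields.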

  3. Stretching Capabilities: Children with Disabilities Playing TV and Computer Games

    ERIC Educational Resources Information Center

    Wasterfors, David

    2011-01-01

    Intervention studies show that if children with disabilities play motion-controlled TV and computer games for training purposes their motivation increases and their training becomes more intensive, but why this happens has not been explained. This article addresses this question with the help of ethnographic material from a public project in…

  4. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  5. Learning from a Computer Tutor with Natural Language Capabilities

    ERIC Educational Resources Information Center

    Michael, Joel; Rovick, Allen; Glass, Michael; Zhou, Yujian; Evens, Martha

    2003-01-01

    CIRCSIM-Tutor is a computer tutor designed to carry out a natural language dialogue with a medical student. Its domain is the baroreceptor reflex, the part of the cardiovascular system that is responsible for maintaining a constant blood pressure. CIRCSIM-Tutor's interaction with students is modeled after the tutoring behavior of two experienced…

  6. Current capabilities and future directions in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A summary of significant findings is given, followed by specific recommendations for future directions of emphasis for computational fluid dynamics development. The discussion is organized into three application areas: external aerodynamics, hypersonics, and propulsion, followed by a turbulence modeling synopsis.

  7. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    .../Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department...

  8. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  9. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  10. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  11. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  12. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  13. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  14. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  15. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  16. National Research Council Dialogue to Assess Progress on NASA's Advanced Modeling, Simulation and Analysis Capability and Systems Engineering Capability Roadmap Development

    NASA Technical Reports Server (NTRS)

    Aikins, Jan

    2005-01-01

    Contents include the following: General Background and Introduction of Capability Roadmaps. Agency Objective. Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date).

  17. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, carbides, as well as some rare earth materials whose 4f eletrons provide a simplified framework for understanding complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer based simulations and avoid costly experiments.

  18. Computable general equilibrium model fiscal year 2013 capability development report

    SciTech Connect

    Edwards, Brian Keith; Rivera, Michael Kelly; Boero, Riccardo

    2016-05-17

This report documents continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), first developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine the model's simulation properties in more detail.
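The baseline-vs-alternative comparison described above can be illustrated with a deliberately minimal sketch. This is not the NISAC NCGEM model; it is a hypothetical one-sector toy in which output follows a Cobb-Douglas production function and a 20-percent cut is applied to total factor productivity, mirroring the structure of the report's scenario.

```python
def output(tfp: float, labor: float, capital: float, alpha: float = 0.3) -> float:
    """Cobb-Douglas production: Y = A * K^alpha * L^(1 - alpha)."""
    return tfp * capital**alpha * labor**(1 - alpha)

# Baseline simulation vs. an alternative with a 20% productivity reduction.
baseline = output(tfp=1.0, labor=100.0, capital=50.0)
shocked  = output(tfp=0.8, labor=100.0, capital=50.0)

# The economic impact is the difference between the two simulations.
impact_pct = 100.0 * (shocked - baseline) / baseline
print(f"Output changes by {impact_pct:.1f}%")  # TFP enters linearly here, so exactly -20.0%
```

In a full CGE model the impact would differ across industries and States as prices and factor allocations adjust; in this one-sector toy the shock passes through one-for-one.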

  19. Developing Materials Processing to Performance Modeling Capabilities and the Need for Exascale Computing Architectures (and Beyond)

    SciTech Connect

    Schraad, Mark William; Luscher, Darby Jon

    2016-09-06

Additive Manufacturing techniques are presenting the Department of Energy and the NNSA Laboratories with new opportunities to consider novel component production and repair processes, and to manufacture materials with tailored response and optimized performance characteristics. Additive Manufacturing technologies already are being applied to primary NNSA mission areas, including Nuclear Weapons. These mission areas are adapting to these new manufacturing methods because of potential advantages, such as smaller manufacturing footprints, reduced needs for specialized tooling, an ability to embed sensing, novel part repair options, an ability to accommodate complex geometries, and lighter weight materials. To realize the full potential of Additive Manufacturing as a game-changing technology for the NNSA’s national security missions, however, significant progress must be made in several key technical areas. In addition to advances in engineering design, process optimization and automation, and accelerated feedstock design and manufacture, significant progress must be made in modeling and simulation. First and foremost, a more mature understanding of the process-structure-property-performance relationships must be developed. Because Additive Manufacturing processes change the nature of a material’s structure below the engineering scale, new models are required to predict materials response across the spectrum of relevant length scales, from the atomistic to the continuum. New diagnostics will be required to characterize materials response across these scales. Not just models but also advanced algorithms, next-generation codes, and advanced computer architectures will be required to complement the associated modeling activities. Based on preliminary work in each of these areas, a strong argument can be made for the need for Exascale computing architectures if a legitimate predictive capability is to be developed.

  20. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

The power grid is becoming far more complex as its evolution meets an information revolution. With the penetration of smart grid technologies, the grid is evolving at an unprecedented speed, and the information infrastructure is being fundamentally improved by a large number of smart meters and sensors that produce orders of magnitude more data. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high-performance computing as one of the foundational technologies for developing the algorithms and tools needed to manage the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high-performance computing, and presents examples of how high-performance computing might be used for future smart grid operation and planning.

  1. Bringing Together Computational and Experimental Capabilities at the Crystal Scale

    SciTech Connect

    Barton, N R; Bernier, J V; Edmiston, J K

    2009-07-23

Many phenomena of interest occur at the scale of crystals or are controlled by events happening at the crystalline scale. Examples include allotropic phase transformations in metals and pore collapse in energetic crystals. The research community is increasingly able to make detailed experimental observations at the crystalline scale and to inform crystal-scale models using lower-length-scale computational tools. In situ diffraction techniques are pushing toward finer spatial and temporal resolution. Molecular and dislocation dynamics calculations are now able to directly inform mechanisms at the crystalline scale. Taken together, these factors give crystal-based continuum models the ability to rationalize experimental observations, investigate competition among physical processes, and, when appropriately formulated and calibrated, predict behaviors. We will present an overview of current efforts, with emphasis on recent work investigating phase transformations and twinning in metals.

  2. Advances in Cross-Cutting Ideas for Computational Climate Science

    SciTech Connect

    Ng, Esmond; Evans, Katherine J.; Caldwell, Peter; Hoffman, Forrest M.; Jackson, Charles; Kerstin, Van Dam; Leung, Ruby; Martin, Daniel F.; Ostrouchov, George; Tuminaro, Raymond; Ullrich, Paul; Wild, S.; Williams, Samuel

    2017-01-01

This report presents results from the DOE-sponsored workshop titled ``Advancing X-Cutting Ideas for Computational Climate Science Workshop,'' known as AXICCS, held on September 12--13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, to promote collaboration among the diverse scientists in attendance, and to brainstorm about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could significantly advance climate science emerged from these discussions. These include (1) process-resolving models to provide insight into important processes and features of interest and to inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) methods for including, organizing, and managing increasingly connected model components that increase model fidelity yet also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling

  3. Advanced missions safety. Volume 3: Appendices. Part 1: Space shuttle rescue capability

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The space shuttle rescue capability is analyzed as a part of the advanced mission safety study. The subjects discussed are: (1) mission evaluation, (2) shuttle configurations and performance, (3) performance of shuttle-launched tug system, (4) multiple pass grazing reentry from lunar orbit, (5) ground launched ascent and rendezvous time, (6) cost estimates, and (7) parallel-burn space shuttle configuration.

  4. Performance Measurements of the Injection Laser System Configured for Picosecond Scale Advanced Radiographic Capability

    SciTech Connect

    Haefner, L C; Heebner, J E; Dawson, J W; Fochs, S N; Shverdin, M Y; Crane, J K; Kanz, K V; Halpin, J M; Phan, H H; Sigurdsson, R J; Brewer, S W; Britten, J A; Brunton, G K; Clark, W J; Messerly, M J; Nissen, J D; Shaw, B H; Hackel, R P; Hermann, M R; Tietbohl, G L; Siders, C W; Barty, C J

    2009-10-23

    We have characterized the Advanced Radiographic Capability injection laser system and demonstrated that it meets performance requirements for upcoming National Ignition Facility fusion experiments. Pulse compression was achieved with a scaled down replica of the meter-scale grating ARC compressor and sub-ps pulse duration was demonstrated at the Joule-level.

  5. Advanced high-performance computer system architectures

    NASA Astrophysics Data System (ADS)

    Vinogradov, V. I.

    2007-02-01

The convergence of computer systems and communication technologies is moving toward switched, high-performance, modular system architectures built on high-speed switched interconnections. Multi-core processors offer a more promising path to high-performance systems, and traditional parallel-bus system architectures (VME/VXI, cPCI/PXI) are giving way to newer, higher-speed serial switched interconnections. The fundamentals of system-architecture development are a compact modular component strategy, low-power processors, new serial high-speed interface chips on the board, and a high-speed switched fabric for SAN architectures. An overview of advanced modular concepts and new international standards for developing high-performance embedded and compact modular systems for real-time applications is given.

  6. Intelligent operating systems for autonomous robots: Real-time capabilities on a hypercube super-computer

    SciTech Connect

    Einstein, J.R.; Barhen, J.; Jefferson, D.

    1986-01-01

Autonomous robots which must perform time-critical tasks in hostile environments require computers which can perform many asynchronous tasks at extremely high speeds. Certain hypercube multiprocessors have many of the required attributes, but their operating systems must be provided with special functions to improve the capability of the system to respond rapidly to unpredictable events. A "virtual-time" shell, under design for addition to the Vertex operating system of the NCUBE hypercube computer, and having such capabilities, is described.

  7. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  8. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  9. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy... Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  10. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about 70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  11. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    EPA Science Inventory

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends

    A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  12. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... final report, Advanced Networking update Status from Computer Science COV Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR awards Data-intensive...

  13. The AEDC aerospace chamber 7V: An advanced test capability for infrared surveillance and seeker sensors

    NASA Technical Reports Server (NTRS)

    Simpson, W. R.

    1994-01-01

    An advanced sensor test capability is now operational at the Air Force Arnold Engineering Development Center (AEDC) for calibration and performance characterization of infrared sensors. This facility, known as the 7V, is part of a broad range of test capabilities under development at AEDC to provide complete ground test support to the sensor community for large-aperture surveillance sensors and kinetic kill interceptors. The 7V is a state-of-the-art cryo/vacuum facility providing calibration and mission simulation against space backgrounds. Key features of the facility include high-fidelity scene simulation with precision track accuracy and in-situ target monitoring, diffraction limited optical system, NIST traceable broadband and spectral radiometric calibration, outstanding jitter control, environmental systems for 20 K, high-vacuum, low-background simulation, and an advanced data acquisition system.

  14. Computer graphics for management: An abstract of capabilities and applications of the EIS system

    NASA Technical Reports Server (NTRS)

    Solem, B. J.

    1975-01-01

The Executive Information Services (EIS) system, developed as a computer-based, time-sharing tool for making and implementing management decisions and including computer graphics capabilities, is described. The following resources are available through the EIS languages: a centralized corporate/government data base, customized and working data bases, report writing, general computational capability, specialized routines, modeling/programming capability, and graphics. Nearly all EIS graphs can be created by a single, on-line instruction. A large number of options are available, such as selection of graphic form, line control, shading, placement on the page, multiple images on a page, control of scaling and labeling, plotting of cumulative data sets, optional grid lines, and stacked charts. Examples of areas in which the EIS system may be used include research, estimating services, planning, budgeting, performance measurement, and national computer hook-up negotiations.

  15. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    SciTech Connect

    Vigil,Benny Manuel; Ballance, Robert; Haskell, Karen

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  16. Computational and design methods for advanced imaging

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.

This dissertation merges the optical design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science, and attempts to expand the solution space available to the optical designer. This dissertation is divided into two parts: the first discusses a new active-illumination depth-sensing modality, while the second part discusses a passive-illumination system called plenoptic, or lightfield, imaging. The new depth-sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage of this method is that it permits the illumination and imaging axes to be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements is presented, including full system raytraces of raw plenoptic images, Zernike compression techniques for raw image files, and non-uniform lenslet arrays to compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer is proposed based on the plenoptic configuration.
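The depth-through-controlled-aberration idea described above can be sketched with a toy example: the projected pattern carries a known, depth-dependent aberration, and measuring that aberration in the image lets one invert the calibration to recover depth. The linear blur-vs-depth calibration below is a hypothetical stand-in for the real, measured aberration-vs-depth relationship, not the dissertation's actual model.

```python
def blur_at_depth(z_mm: float, a: float = 0.02, b: float = 1.0) -> float:
    """Hypothetical calibration: projected-spot blur (pixels) vs. depth (mm)."""
    return a * z_mm + b

def depth_from_blur(blur_px: float, a: float = 0.02, b: float = 1.0) -> float:
    """Invert the known calibration to estimate depth from a measured blur."""
    return (blur_px - b) / a

true_depth = 500.0                    # mm
measured = blur_at_depth(true_depth)  # what the unmodified camera would observe
print(depth_from_blur(measured))      # recovers the depth of this point
```

The real system measures many points at once from a single image of the aberrated pattern; the key property illustrated here is that the inversion needs only the known calibration, so the illumination and imaging axes can be coincident.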

  17. The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering

    NASA Technical Reports Server (NTRS)

    Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen

    2006-01-01

    This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered to "To identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains 5 sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) advanced Uncertainty Models, (4) Virtual Testing Models, and (5) space-based Robotics Manufacture and Servicing Models.

  18. In-Situ Creep Testing Capability for the Advanced Test Reactor

    SciTech Connect

    B. G. Kim; J. L. Rempe; D. L. Knudson; K. G. Condie; B. H. Sencer

    2012-09-01

    An instrumented creep testing capability is being developed for specimens irradiated in Pressurized Water Reactor (PWR) coolant conditions at the Advanced Test Reactor (ATR). The test rig has been developed such that samples will be subjected to stresses ranging from 92 to 350 MPa at temperatures between 290 and 370 °C up to at least 2 dpa (displacement per atom). The status of Idaho National Laboratory (INL) efforts to develop the test rig in-situ creep testing capability for the ATR is described. In addition to providing an overview of in-pile creep test capabilities available at other test reactors, this paper reports efforts by INL to evaluate a prototype test rig in an autoclave at INL’s High Temperature Test Laboratory (HTTL). Initial data from autoclave tests with 304 stainless steel (304 SS) specimens are reported.

  19. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between ground commercial computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real time, embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processsing images obtained by a Mars Rover.

  20. Some Recent Advances in Computer Graphics.

    ERIC Educational Resources Information Center

    Whitted, Turner

    1982-01-01

    General principles of computer graphics are reviewed, including discussions of display hardware, geometric modeling, algorithms, and applications in science, computer-aided design, flight training, communications, business, art, and entertainment. (JN)

  1. An evaluation of Java's I/O capabilities for high-performance computing.

    SciTech Connect

    Dickens, P. M.; Thakur, R.

    2000-11-10

    Java is quickly becoming the preferred language for writing distributed applications because of its inherent support for programming on distributed platforms. In particular, Java provides compile-time and run-time security, automatic garbage collection, inherent support for multithreading, support for persistent objects and object migration, and portability. Given these significant advantages of Java, there is a growing interest in using Java for high-performance computing applications. To be successful in the high-performance computing domain, however, Java must have the capability to efficiently handle the significant I/O requirements commonly found in high-performance computing applications. While there has been significant research in high-performance I/O using languages such as C, C++, and Fortran, there has been relatively little research into the I/O capabilities of Java. In this paper, we evaluate the I/O capabilities of Java for high-performance computing. We examine several approaches that attempt to provide high-performance I/O--many of which are not obvious at first glance--and investigate their performance in both parallel and multithreaded environments. We also provide suggestions for expanding the I/O capabilities of Java to better support the needs of high-performance computing applications.
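The kind of comparison the paper describes, timing the same sequential read workload through different I/O paths, can be illustrated with a short sketch. This sketch is in Python rather than the Java of the study, and the file size and request size are arbitrary choices for illustration.

```python
import os
import tempfile
import time

def time_read(path: str, buffering: int) -> float:
    """Time a sequential read of the whole file in 4 KiB requests."""
    start = time.perf_counter()
    with open(path, "rb", buffering=buffering) as f:
        while f.read(4096):
            pass
    return time.perf_counter() - start

# Create a 1 MiB test file.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(1 << 20))
    path = tmp.name

unbuffered = time_read(path, buffering=0)        # each request hits the OS directly
buffered   = time_read(path, buffering=1 << 16)  # 64 KiB user-space buffer
print(f"unbuffered: {unbuffered:.4f}s, buffered: {buffered:.4f}s")
os.remove(path)
```

As the paper notes for Java, which path wins is often not obvious at first glance; buffering, threading, and the underlying file system all interact, which is why measurement in both parallel and multithreaded settings matters.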

  2. Advanced EVA Capabilities: A Study for NASA's Revolutionary Aerospace Systems Concept Program

    NASA Technical Reports Server (NTRS)

    Hoffman, Stephen J.

    2004-01-01

This report documents the results of a study carried out as part of NASA's Revolutionary Aerospace Systems Concepts Program examining the future technology needs of extravehicular activities (EVAs). The intent of this study is to produce a comprehensive report that identifies various design concepts for human-related advanced EVA systems necessary to achieve the goals of supporting future space exploration and development customers in free space and on planetary surfaces for space missions in the post-2020 timeframe. The design concepts studied and evaluated are not limited to anthropomorphic space suits, but include a wide range of human-enhancing EVA technologies as well as consideration of coordination and integration with advanced robotics. The goal of the study effort is to establish a baseline technology "road map" that identifies and describes an investment and technical development strategy, including recommendations that will lead to future enhanced synergistic human/robot EVA operations. The eventual use of this study effort is to focus evolving performance capabilities of various EVA system elements toward the goal of providing high performance human operational capabilities for a multitude of future space applications and destinations. The data collected for this study indicate a rich and diverse history of systems that have been developed to perform a variety of EVA tasks, indicating what is possible. However, the data gathered for this study also indicate a paucity of new concepts and technologies for advanced EVA missions - at least any that researchers are willing to discuss in this type of forum.

  3. Advances in Sensitivity Analysis Capabilities with SCALE 6.0 and 6.1

    SciTech Connect

    Rearden, Bradley T; Petrie Jr, Lester M; Williams, Mark L

    2010-01-01

The sensitivity and uncertainty analysis sequences of SCALE compute the sensitivity of k_eff to each constituent multigroup cross section using perturbation theory based on forward and adjoint transport computations with several available codes. Versions 6.0 and 6.1 of SCALE, released in 2009 and 2010, respectively, include important additions to the TSUNAMI-3D sequence, which computes forward and adjoint solutions in multigroup with the KENO Monte Carlo codes. Previously, sensitivity calculations were performed with the simple and efficient geometry capabilities of KENO V.a, but now calculations can also be performed with the generalized geometry code KENO-VI. TSUNAMI-3D requires spatial refinement of the angular flux moment solutions for the forward and adjoint calculations. These refinements are most efficiently achieved with the use of a mesh accumulator. For SCALE 6.0, a more flexible mesh accumulator capability has been added to the KENO codes, enabling varying granularity of the spatial refinement to optimize the calculation for different regions of the system model. The new mesh capabilities allow the efficient calculation of larger models than were previously possible. Additional improvements in the TSUNAMI calculations were realized in the computation of implicit effects of resonance self-shielding on the final sensitivity coefficients. Multigroup resonance self-shielded cross sections are accurately computed with SCALE's robust deterministic continuous-energy treatment for the resolved and thermal energy range and with Bondarenko shielding factors elsewhere, including the unresolved resonance range. However, the sensitivities of the self-shielded cross sections to the parameters input to the calculation are quantified using only full-range Bondarenko factors.
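As a rough, hypothetical illustration of the quantity these sequences compute (though not of their adjoint-based method), the relative sensitivity S = (x/k)(dk/dx) can be estimated by direct perturbation of a one-group infinite-medium model; the one-group formula and all cross-section values below are invented for the sketch:

```python
def k_inf(nu_sig_f, sig_a):
    # Infinite-medium one-group multiplication factor: k = nu*sigma_f / sigma_a.
    return nu_sig_f / sig_a

def sensitivity(f, x, rel_eps=1e-6):
    # Relative sensitivity S = (x/k) * dk/dx via central differences --
    # the quantity TSUNAMI obtains far more efficiently from
    # forward/adjoint transport solutions and perturbation theory.
    h = x * rel_eps
    dk = (f(x + h) - f(x - h)) / (2 * h)
    return x / f(x) * dk

nu_sig_f, sig_a = 0.0104, 0.0096  # hypothetical one-group constants
S_fission = sensitivity(lambda s: k_inf(s, sig_a), nu_sig_f)
S_capture = sensitivity(lambda s: k_inf(nu_sig_f, s), sig_a)
print(round(S_fission, 6), round(S_capture, 6))  # analytically +1 and -1
```

For this toy model the analytic sensitivities are +1 (fission production) and -1 (absorption), which the finite-difference estimate reproduces.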

  4. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  5. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  6. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out every computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, a 'physical computation' analogue of algorithmic information complexity.

  7. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task.

  8. Computational Investigation of Impact Energy Absorption Capability of Polyurea Coatings via Deformation-Induced Glass Transition

    DTIC Science & Technology

    2010-01-01

Keywords: polyurea; computational analysis; glass transition; blast/impact energy absorption coating. A number of experimental investigations reported in the open literature have indicated that the application of polyurea coatings can substantially improve blast and ballistic impact...

  9. The Advanced Test Reactor Irradiation Capabilities Available as a National Scientific User Facility

    SciTech Connect

    S. Blaine Grover

    2008-09-01

The Advanced Test Reactor (ATR) is one of the world’s premier test reactors for performing long term, high flux, and/or large volume irradiation test programs. The ATR is a very versatile facility with a wide variety of experimental test capabilities for providing the environment needed in an irradiation experiment. These capabilities include simple capsule experiments, instrumented and/or temperature-controlled experiments, and pressurized water loop experiment facilities. Monitoring systems have also been utilized to monitor different parameters, such as fission gases for fuel experiments, to measure specimen performance during irradiation. ATR’s control system provides a stable axial flux profile throughout each reactor operating cycle, and allows the thermal and fast neutron fluxes to be controlled separately in different sections of the core. The ATR irradiation positions vary in diameter from 16 mm to 127 mm over an active core height of 1.2 m. This paper discusses the different irradiation capabilities with examples of different experiments and the cost/benefit issues related to each capability. The recent designation of ATR as a national scientific user facility will make the ATR much more accessible at very low to no cost for research by universities and possibly commercial entities.

  10. Advances in Engine Test Capabilities at the NASA Glenn Research Center's Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Pachlhofer, Peter M.; Panek, Joseph W.; Dicki, Dennis J.; Piendl, Barry R.; Lizanich, Paul J.; Klann, Gary A.

    2006-01-01

The Propulsion Systems Laboratory at the National Aeronautics and Space Administration (NASA) Glenn Research Center is one of the premier U.S. facilities for research on advanced aeropropulsion systems. The facility can simulate a wide range of altitude and Mach number conditions while supplying the aeropropulsion system with all the support services necessary to operate at those conditions. Test data are recorded on a combination of steady-state and high-speed data-acquisition systems. Recently a number of upgrades were made to the facility to meet demanding new requirements for the latest aeropropulsion concepts and to improve operational efficiency. Improvements were made to data-acquisition systems, facility and engine-control systems, test-condition simulation systems, video capture and display capabilities, and personnel training procedures. This paper discusses the facility's capabilities, recent upgrades, and planned future improvements.

  11. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing `traditional crimes' such as embezzlements, thefts, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.

  12. Application of advanced computational technology to propulsion CFD

    NASA Astrophysics Data System (ADS)

    Szuch, John R.

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.

  13. Computation of Viscous Flow about Advanced Projectiles.

    DTIC Science & Technology

    1983-09-09

    Domain". Journal of Comp. Physics, Vol. 8, 1971, pp. 392-408. 10. Thompson , J . F ., Thames, F. C., and Mastin, C. M., "Automatic Numerical Generation of...computations, USSR Comput. Math. Math. Phys., 12, 2 (1972), 182-195. I~~ll A - - 18. Thompson , J . F ., F. C. Thames, and C. M. Mastin, Automatic

  14. Performance capabilities of a JPL dual-arm advanced teleoperation system

    NASA Technical Reports Server (NTRS)

    Szakaly, Z. F.; Bejczy, A. K.

    1991-01-01

The system comprises: (1) two PUMA 560 robot arms, each equipped with the latest JPL developed smart hands which contain 3-D force/moment and grasp force sensors; (2) two general purpose force reflecting hand controllers; (3) an NS32016 microprocessor-based distributed computing system together with JPL developed universal motor controllers; (4) graphics display of sensor data; (5) capabilities for time delay experiments; and (6) automatic data recording capabilities. Several different types of control modes are implemented on this system using different feedback control techniques. Some of the control modes and the related feedback control techniques are described, and the achievable control performance for tracking position and force trajectories is reported. The interaction between position and force trajectory tracking is illustrated. The best performance is obtained by using a novel, task space error feedback technique.

  15. Characterization of the Temperature Capabilities of Advanced Disk Alloy ME3

    NASA Technical Reports Server (NTRS)

Gabb, Timothy P.; Telesman, Jack; Kantzos, Peter T.; O'Connor, Kenneth

    2002-01-01

    The successful development of an advanced powder metallurgy disk alloy, ME3, was initiated in the NASA High Speed Research/Enabling Propulsion Materials (HSR/EPM) Compressor/Turbine Disk program in cooperation with General Electric Engine Company and Pratt & Whitney Aircraft Engines. This alloy was designed using statistical screening and optimization of composition and processing variables to have extended durability at 1200 F in large disks. Disks of this alloy were produced at the conclusion of the program using a realistic scaled-up disk shape and processing to enable demonstration of these properties. The objective of the Ultra-Efficient Engine Technologies disk program was to assess the mechanical properties of these ME3 disks as functions of temperature in order to estimate the maximum temperature capabilities of this advanced alloy. These disks were sectioned, machined into specimens, and extensively tested. Additional sub-scale disks and blanks were processed and selectively tested to explore the effects of several processing variations on mechanical properties. Results indicate the baseline ME3 alloy and process can produce 1300 to 1350 F temperature capabilities, dependent on detailed disk and engine design property requirements.

  16. 10 CFR 830 Major Modification Determination for the Advanced Test Reactor Remote Monitoring and Management Capability

    SciTech Connect

    Bohachek, Randolph Charles

    2015-09-01

The Advanced Test Reactor (ATR; TRA-670), which is located in the ATR Complex at Idaho National Laboratory, was constructed in the 1960s for the purpose of irradiating reactor fuels and materials. Other irradiation services, such as radioisotope production, are also performed at ATR. While ATR is safely fulfilling current mission requirements, assessments are continuing. These assessments are intended to identify areas in which to provide defense-in-depth and improve safety for ATR. One of the assessments performed by an independent group of nuclear industry experts recommended that a remote accident management capability be provided. The report stated that: “contemporary practice in commercial power reactors is to provide a remote shutdown station or stations to allow shutdown of the reactor and management of long-term cooling of the reactor (i.e., management of reactivity, inventory, and cooling) should the main control room be disabled (e.g., due to a fire in the control room or affecting the control room).” This project will install remote reactor monitoring and management capabilities for ATR. Remote capabilities will allow for post-scram reactor management and monitoring in the event the main Reactor Control Room (RCR) must be evacuated.

  17. AXIS: an instrument for imaging Compton radiographs using the Advanced Radiography Capability on the NIF.

    PubMed

    Hall, G N; Izumi, N; Tommasini, R; Carpenter, A C; Palmer, N E; Zacharias, R; Felker, B; Holder, J P; Allen, F V; Bell, P M; Bradley, D; Montesanti, R; Landen, O L

    2014-11-01

    Compton radiography is an important diagnostic for Inertial Confinement Fusion (ICF), as it provides a means to measure the density and asymmetries of the DT fuel in an ICF capsule near the time of peak compression. The AXIS instrument (ARC (Advanced Radiography Capability) X-ray Imaging System) is a gated detector in development for the National Ignition Facility (NIF), and will initially be capable of recording two Compton radiographs during a single NIF shot. The principal reason for the development of AXIS is the requirement for significantly improved detection quantum efficiency (DQE) at high x-ray energies. AXIS will be the detector for Compton radiography driven by the ARC laser, which will be used to produce Bremsstrahlung X-ray backlighter sources over the range of 50 keV-200 keV for this purpose. It is expected that AXIS will be capable of recording these high-energy x-rays with a DQE several times greater than other X-ray cameras at NIF, as well as providing a much larger field of view of the imploded capsule. AXIS will therefore provide an image with larger signal-to-noise that will allow the density and distribution of the compressed DT fuel to be measured with significantly greater accuracy as ICF experiments are tuned for ignition.

  18. AXIS: An instrument for imaging Compton radiographs using the Advanced Radiography Capability on the NIF

    SciTech Connect

    Hall, G. N. Izumi, N.; Tommasini, R.; Carpenter, A. C.; Palmer, N. E.; Zacharias, R.; Felker, B.; Holder, J. P.; Allen, F. V.; Bell, P. M.; Bradley, D.; Montesanti, R.; Landen, O. L.

    2014-11-15

    Compton radiography is an important diagnostic for Inertial Confinement Fusion (ICF), as it provides a means to measure the density and asymmetries of the DT fuel in an ICF capsule near the time of peak compression. The AXIS instrument (ARC (Advanced Radiography Capability) X-ray Imaging System) is a gated detector in development for the National Ignition Facility (NIF), and will initially be capable of recording two Compton radiographs during a single NIF shot. The principal reason for the development of AXIS is the requirement for significantly improved detection quantum efficiency (DQE) at high x-ray energies. AXIS will be the detector for Compton radiography driven by the ARC laser, which will be used to produce Bremsstrahlung X-ray backlighter sources over the range of 50 keV–200 keV for this purpose. It is expected that AXIS will be capable of recording these high-energy x-rays with a DQE several times greater than other X-ray cameras at NIF, as well as providing a much larger field of view of the imploded capsule. AXIS will therefore provide an image with larger signal-to-noise that will allow the density and distribution of the compressed DT fuel to be measured with significantly greater accuracy as ICF experiments are tuned for ignition.

  19. A Ground Testbed to Advance US Capability in Autonomous Rendezvous and Docking Project

    NASA Technical Reports Server (NTRS)

    D'Souza, Chris

    2014-01-01

    This project will advance the Autonomous Rendezvous and Docking (AR&D) GNC system by testing it on hardware, particularly in a flight processor, with a goal of testing it in IPAS with the Waypoint L2 AR&D scenario. The entire Agency supports development of a Commodity for Autonomous Rendezvous and Docking (CARD) as outlined in the Agency-wide Community of Practice whitepaper entitled: "A Strategy for the U.S. to Develop and Maintain a Mainstream Capability for Automated/Autonomous Rendezvous and Docking in Low Earth Orbit and Beyond". The whitepaper establishes that 1) the US is in a continual state of AR&D point-designs and therefore there is no US "off-the-shelf" AR&D capability in existence today, 2) the US has fallen behind our foreign counterparts particularly in the autonomy of AR&D systems, 3) development of an AR&D commodity is a national need that would benefit NASA, our commercial partners, and DoD, and 4) an initial estimate indicates that the development of a standardized AR&D capability could save the US approximately $60M for each AR&D project and cut each project's AR&D flight system implementation time in half.

  20. Development of high-lift wing modifications for an advanced capability EA-6B aircraft

    NASA Technical Reports Server (NTRS)

    Waggoner, Edgar G.

    1990-01-01

NASA-Langley has been engaged in a development program aimed at improving the maneuvering capabilities of the EA-6B electronic countermeasures aircraft; one objective of this effort is the investigation of relatively simple wing design modifications which could yield improved low-speed, high-lift performance with minimum degradation of higher-speed performance. Various two- and three-dimensional low-speed and transonic CFD techniques have accordingly been used during the design effort, which involved leading-edge slat and trailing-edge flap contour evaluations by both computation and wind tunnel experiment. Significant low-speed maximum-lift enhancements were obtained without cruise-speed deterioration.

  1. Computing Algorithms for Nuffield Advanced Physics.

    ERIC Educational Resources Information Center

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course, to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
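A minimal sketch of the kind of recurrence relation such a course generates for a second-order differential equation, here simple harmonic motion with a = -(k/m)x; the Euler-Cromer form and all parameters below are illustrative assumptions, not the course's actual algorithm:

```python
def simulate_shm(x0=1.0, v0=0.0, k_over_m=1.0, dt=0.01, steps=628):
    # Step the second-order ODE a = -(k/m) x with the simple recurrence
    # v[n+1] = v[n] + a[n]*dt, then x[n+1] = x[n] + v[n+1]*dt
    # (Euler-Cromer: updating v first keeps the oscillation stable).
    x, v = x0, v0
    xs = []
    for _ in range(steps):
        a = -k_over_m * x          # acceleration from current displacement
        v = v + a * dt             # velocity recurrence
        x = x + v * dt             # displacement recurrence (uses new v)
        xs.append(x)
    return xs

xs = simulate_shm()
# after steps*dt = 6.28 s, roughly one period, x returns near its start
```

With these parameters one full period is simulated, so the final displacement comes back close to the initial value of 1.0.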

  2. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.
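A hedged sketch of what such a 'direct' optimization loop looks like, with a toy quadratic surrogate standing in for the OVERFLOW flow solve; the surrogate function, step size, and design variables are all invented for illustration:

```python
def drag(shape_params):
    # Hypothetical surrogate objective with a known minimum at (0.3, -0.1);
    # in the real workflow each evaluation would be a full Navier-Stokes solve.
    a, b = shape_params
    return (a - 0.3) ** 2 + 2.0 * (b + 0.1) ** 2 + 0.02

def fd_gradient(f, x, h=1e-5):
    # Central-difference gradient: each component costs two objective
    # evaluations, which is why 'direct' methods are solver-hungry.
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

x = [0.0, 0.0]                      # initial design variables
for _ in range(200):                # steepest-descent design iterations
    g = fd_gradient(drag, x)
    x = [xi - 0.1 * gi for xi, gi in zip(x, g)]
print([round(v, 3) for v in x])     # converges toward [0.3, -0.1]
```

The finite-difference gradient makes the cost scale with the number of design variables, one reason adjoint methods later displaced direct approaches for large shape problems.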

  3. Advanced Test Reactor -- Testing Capabilities and Plans AND Advanced Test Reactor National Scientific User Facility -- Partnerships and Networks

    SciTech Connect

    Frances M. Marshall

    2008-07-01

    The Advanced Test Reactor (ATR), at the Idaho National Laboratory (INL), is one of the world’s premier test reactors for providing the capability for studying the effects of intense neutron and gamma radiation on reactor materials and fuels. The physical configuration of the ATR, a 4-leaf clover shape, allows the reactor to be operated at different power levels in the corner “lobes” to allow for different testing conditions for multiple simultaneous experiments. The combination of high flux (maximum thermal neutron fluxes of 1E15 neutrons per square centimeter per second and maximum fast [E>1.0 MeV] neutron fluxes of 5E14 neutrons per square centimeter per second) and large test volumes (up to 122 cm long and 12.7 cm diameter) provide unique testing opportunities. For future research, some ATR modifications and enhancements are currently planned. In 2007 the US Department of Energy designated the ATR as a National Scientific User Facility (NSUF) to facilitate greater access to the ATR for material testing research by a broader user community. This paper provides more details on some of the ATR capabilities, key design features, experiments, and plans for the NSUF.

  4. NWChem Meeting on Science Driven Petascale Computing and Capability Development at EMSL

    SciTech Connect

    De Jong, Wibe A.

    2007-02-19

On January 25, and 26, 2007, an NWChem meeting was held that was attended by 65 scientists from 29 institutions including 22 universities and 5 national laboratories. The goals of the meeting were to look at major scientific challenges that could be addressed by computational modeling in environmental molecular sciences, and to identify the associated capability development needs. In addition, insights were sought into petascale computing developments in computational chemistry. During the meeting common themes were identified that will drive the need for the development of new or improved capabilities in NWChem. Crucial areas of development that the development team will be focusing on are (1) modeling of dynamics and kinetics in chemical transformations, (2) modeling of chemistry at interfaces and in the condensed phase, and (3) spanning longer time scales in biological processes modeled with molecular dynamics. Various computational chemistry methodologies were discussed during the meeting, which will provide the basis for the capability developments in the near or long term future of NWChem.

  5. Advanced Computational Techniques in Regional Wave Studies

    DTIC Science & Technology

    1990-01-03

the new GERESS data. The dissertation work emphasized the development and use of advanced computational techniques for studying regional seismic...hand, the possibility of new data sources at regional distances permits using previously ignored signals. Unfortunately, these regional signals will...the Green's function around this new reference point contains the propagation effects, and V is the source volume...

  6. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..; Easter, Richard C; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.
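As a minimal illustration of what a tracer advection step looks like (the project's actual schemes are far more sophisticated; this first-order upwind form and its parameters are assumptions made for the sketch):

```python
def advect_upwind(q, u, dx, dt, steps):
    # First-order upwind update for dq/dt + u*dq/dx = 0 with u > 0
    # on a periodic 1-D grid: q[i] <- q[i] - c*(q[i] - q[i-1]).
    # Python's negative indexing (q[-1]) supplies the periodic boundary.
    c = u * dt / dx  # Courant number; stability requires c <= 1
    for _ in range(steps):
        q = [q[i] - c * (q[i] - q[i - 1]) for i in range(len(q))]
    return q

# square tracer pulse on a 32-cell periodic domain
q0 = [1.0 if 4 <= i < 8 else 0.0 for i in range(32)]
q1 = advect_upwind(q0, u=1.0, dx=1.0, dt=0.5, steps=16)
# the update conserves total tracer mass on the periodic grid,
# one of the properties ESM advection schemes must preserve
```

Conservation and monotonicity, not just speed, are what make schemes like these usable for large numbers of coupled tracer species.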

  7. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  8. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications. To better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, this system was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc. A further goal was to implement at least one application that would benefit from the performance levels that are possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  9. Monitoring of Ebola Virus Makona Evolution through Establishment of Advanced Genomic Capability in Liberia.

    PubMed

    Kugelman, Jeffrey R; Wiley, Michael R; Mate, Suzanne; Ladner, Jason T; Beitzel, Brett; Fakoli, Lawrence; Taweh, Fahn; Prieto, Karla; Diclaro, Joseph W; Minogue, Timothy; Schoepp, Randal J; Schaecher, Kurt E; Pettitt, James; Bateman, Stacey; Fair, Joseph; Kuhn, Jens H; Hensley, Lisa; Park, Daniel J; Sabeti, Pardis C; Sanchez-Lockhart, Mariano; Bolay, Fatorma K; Palacios, Gustavo

    2015-07-01

    To support Liberia's response to the ongoing Ebola virus (EBOV) disease epidemic in Western Africa, we established in-country advanced genomic capabilities to monitor EBOV evolution. Twenty-five EBOV genomes were sequenced at the Liberian Institute for Biomedical Research, which provided an in-depth view of EBOV diversity in Liberia during September 2014-February 2015. These sequences were consistent with a single virus introduction to Liberia; however, shared ancestry with isolates from Mali indicated at least 1 additional instance of movement into or out of Liberia. The pace of change is generally consistent with previous estimates of mutation rate. We observed 23 nonsynonymous mutations and 1 nonsense mutation. Six of these changes are within known binding sites for sequence-based EBOV medical countermeasures; however, the diagnostic and therapeutic impact of EBOV evolution within Liberia appears to be low.

  10. Time-temperature-stress capabilities of composite materials for advanced supersonic technology application, phase 1

    NASA Technical Reports Server (NTRS)

    Kerr, J. R.; Haskins, J. F.

    1980-01-01

    Implementation of metal and resin matrix composites into supersonic vehicle usage is contingent upon accelerating the demonstration of service capacity and design technology. Because of the added material complexity and lack of extensive service data, laboratory replication of the flight service will provide the most rapid method of documenting the airworthiness of advanced composite systems. A program in progress to determine the time temperature stress capabilities of several high temperature composite materials includes thermal aging, environmental aging, fatigue, creep, fracture, and tensile tests as well as real time flight simulation exposure. The program has two parts. The first includes all the material property determinations and aging and simulation exposures up through 10,000 hours. The second continues these tests up to 50,000 cumulative hours. Results are presented of the 10,000 hour phase, which has now been completed.

  11. ASDA - Advanced Suit Design Analyzer computer program

    NASA Technical Reports Server (NTRS)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Improved Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.
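    The kind of transient nodal heat-transfer computation described above can be illustrated with a minimal lumped-node network. The three nodes, capacitances, and conductances below are invented for illustration and are not taken from ASDA or SINDA:

```python
# Sketch of an explicit lumped-node transient heat-transfer update,
# in the spirit of a nodal differencing analyzer. The three-node
# network (skin, suit layer, liquid-cooled garment) and its
# capacitances/conductances are illustrative values, not ASDA data.

def step(T, C, G, dt):
    """Advance node temperatures T one explicit time step.
    C[i]: thermal capacitance of node i (J/K)
    G[(i, j)]: conductance between nodes i and j (W/K)
    """
    dT = [0.0] * len(T)
    for (i, j), g in G.items():
        q = g * (T[j] - T[i])      # heat flow from node j into node i (W)
        dT[i] += q / C[i] * dt
        dT[j] -= q / C[j] * dt
    return [t + d for t, d in zip(T, dT)]

T = [310.0, 295.0, 285.0]          # K: skin, suit layer, coolant garment
C = [3000.0, 500.0, 800.0]         # J/K
G = {(0, 1): 2.0, (1, 2): 5.0}     # W/K
for _ in range(100):
    T = step(T, C, G, dt=1.0)
print([round(t, 1) for t in T])    # temperatures relax toward equilibrium
```

Because each conductor moves the same heat out of one node as into the other, the update conserves total thermal energy, a basic sanity check for any nodal scheme.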

  12. Image processing for the Advanced Radiographic Capability (ARC) at the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Leach, Richard R.; Awwal, Abdul A. S.; Lowe-Webb, Roger; Miller-Kamm, Victoria; Orth, Charles; Roberts, Randy; Wilhelmsen, Karl

    2016-09-01

    The Advanced Radiographic Capability (ARC) at the National Ignition Facility (NIF) is a laser system that employs up to four petawatt (PW) lasers to produce a sequence of short-pulse, kilojoule laser pulses with controllable delays that generate X-rays to backlight high-density inertial confinement fusion (ICF) capsule targets. Multi-frame, hard-X-ray radiography of imploding NIF capsules is a capability critical to the success of NIF's missions. ARC is designed to employ up to eight backlighters with tens-of-picosecond temporal resolution to record the dynamics of the critical phases of ICF shots and produce an X-ray "motion picture" of the compression and ignition of cryogenic deuterium-tritium targets. Additionally, ARC supports a variety of other high-energy-density experiments, including fast-ignition studies on NIF. The automated alignment image analysis algorithms use digital camera sensor images to direct ARC beams onto metal wires on the tens-of-microns scale. This paper describes the ARC automatic alignment sequence throughout the laser chain, from pulse initiation to target, with an emphasis on the image processing algorithms that generate the crucial alignment positions for ARC. The image processing descriptions and flow diagrams detail the alignment control loops throughout the ARC laser chain, beginning in the high-contrast ARC front end (HCAFE), continuing into the ARC main laser area, and ending in the ARC target area.
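    Alignment image processing of this kind typically reduces to locating a feature on a sensor image and computing its offset from a reference position. The sketch below uses a simple weighted-centroid estimate on a synthetic spot; it illustrates the idea only and is not the actual ARC algorithm:

```python
# Illustrative centroid-based position estimate of the kind used in
# automated beam-alignment loops: find a bright spot on a sensor image
# and report its offset from a reference position. The synthetic image
# and reference point are invented values.
import numpy as np

def spot_offset(image, reference, threshold=0.5):
    """Return (dy, dx) from the reference point to the intensity centroid."""
    mask = image > threshold * image.max()   # suppress background pixels
    ys, xs = np.nonzero(mask)
    w = image[ys, xs]                        # intensity-weighted centroid
    cy, cx = np.average(ys, weights=w), np.average(xs, weights=w)
    return cy - reference[0], cx - reference[1]

# Synthetic Gaussian spot centered near (30, 42) on a 64x64 sensor.
y, x = np.mgrid[0:64, 0:64]
img = np.exp(-((y - 30) ** 2 + (x - 42) ** 2) / (2 * 3.0 ** 2))
dy, dx = spot_offset(img, reference=(32, 32))
print(round(dy, 1), round(dx, 1))  # -2.0 10.0
```

In a closed alignment loop, the (dy, dx) offset would drive a mirror or stage correction until the measured position matches the reference.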

  13. An Overview of Advanced Elastomeric Seal Development and Testing Capabilities at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H.

    2014-01-01

    NASA is developing advanced space-rated elastomeric seals to support future space exploration missions to low Earth orbit, the Moon, near Earth asteroids, and other destinations. This includes seals for a new docking system and vehicle hatches. These seals must exhibit extremely low leak rates to ensure that astronauts have sufficient breathable air for extended missions. Seal compression loads must be below prescribed limits so as not to overload the mechanisms that compress them, and seal adhesion forces must be low to allow the sealed interface to be separated when required (e.g., during undocking or hatch opening). NASA Glenn Research Center has developed a number of unique test fixtures to measure the leak rates and compression and adhesion loads of candidate seal designs under simulated thermal, vacuum, and engagement conditions. Tests can be performed on full-scale seals with diameters on the order of 50 in., subscale seals that are about 12 in. in diameter, and smaller specimens such as O-rings. Test conditions include temperatures ranging from -238 to 662 °F (-150 to 350 °C), operational pressure gradients, and seal-on-seal or seal-on-flange mating configurations. Nominal and off-nominal conditions (e.g., incomplete seal compression) can also be simulated. This paper describes the main design features and capabilities of each type of test apparatus and provides an overview of advanced seal development activities at NASA Glenn.
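    The abstract does not specify the measurement method, but leak rates of this kind are commonly quantified with a pressure-decay (or pressure-rise) test, where the rate follows from the ideal-gas relation Q = V * dP/dt. A sketch under that assumption, with invented chamber values:

```python
# Hedged sketch of a pressure-rise leak-rate calculation for a sealed
# test volume. The chamber volume, pressures, and duration below are
# invented example values, not NASA Glenn test data.

def leak_rate(volume_l, p_start_torr, p_end_torr, seconds):
    """Leak rate in torr*L/s from a pressure change over a sealed volume."""
    return volume_l * (p_end_torr - p_start_torr) / seconds

# Example: a 50 L chamber rises from 1.0e-5 to 1.6e-5 torr in 600 s.
q = leak_rate(50.0, 1.0e-5, 1.6e-5, 600.0)
print(f"{q:.1e} torr*L/s")  # 5.0e-07 torr*L/s
```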

  16. Time-temperature-stress capabilities of composite materials for advanced supersonic technology application

    NASA Technical Reports Server (NTRS)

    Kerr, James R.; Haskins, James F.

    1987-01-01

    Advanced composites will play a key role in the development of the technology for the design and fabrication of future supersonic vehicles. However, incorporating the material into vehicle usage is contingent on accelerating the demonstration of service capacity and design technology. Because of the added material complexity and lack of extensive data, laboratory replication of the flight service will provide the most rapid method to document the airworthiness of advanced composite systems. Consequently, a laboratory program was conducted to determine the time-temperature-stress capabilities of several high temperature composites. Tests included were thermal aging, environmental aging, fatigue, creep, fracture, tensile, and real-time flight simulation exposure. The program had two phases. The first included all the material property determinations and aging and simulation exposures up through 10,000 hours. The second continued these tests up to 50,000 cumulative hours. This report presents the results of the Phase 1 baseline and 10,000-hr aging and flight simulation studies, the Phase 2 50,000-hr aging studies, and the Phase 2 flight simulation tests, some of which extended to almost 40,000 hours.

  17. Computational physics and applied mathematics capability review June 8-10, 2010

    SciTech Connect

    Lee, Stephen R

    2010-01-01

    Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability, evaluate the integration of this capability across the Laboratory and within the scientific community, examine the relevance of this capability to the Laboratory's programs, and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has a long and unique history at the Laboratory, starting from the inception of the Laboratory in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations, from general numerical modeling, to coupled multi-physics simulations, to detailed domain science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into the following six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections), as follows. Theme 1: Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions - which are key, historical, and fundamental strengths of the Laboratory. Theme 2: Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution of partial differential equations.
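    As a minimal illustration of the numerical solution of partial differential equations named in Theme 2 (not code from the review itself), an explicit finite-difference step for the 1D heat equation u_t = alpha * u_xx; the grid size, diffusivity, and boundary values are illustrative:

```python
# Explicit (FTCS) finite-difference solution of the 1D heat equation
# with fixed (Dirichlet) boundary values. All parameters are invented
# example values.

def heat_step(u, alpha, dx, dt):
    """One explicit update of the interior points; boundaries held fixed."""
    v = u[:]
    r = alpha * dt / dx ** 2        # must satisfy r <= 0.5 for stability
    for i in range(1, len(u) - 1):
        v[i] = u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
    return v

n, dx = 21, 0.05
u = [0.0] * n
u[0] = 1.0                          # hot left boundary, cold right boundary
for _ in range(400):
    u = heat_step(u, alpha=1e-3, dx=dx, dt=0.5)
print(round(u[n // 2], 3))          # midpoint warms toward the steady state
```

With these values r = 0.2, comfortably inside the explicit scheme's stability limit; the profile relaxes toward the linear steady state between the two fixed boundaries.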

  18. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  19. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.
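    The extract-then-classify workflow described above can be sketched as follows; the signal features and the nearest-centroid backend below are illustrative stand-ins for the FET's actual signal-processing and pattern-recognition components (which could equally be a neural network or fuzzy rulebase):

```python
# Sketch of a feature-extraction-plus-classification workflow: compute
# simple features from raw signals, then classify by nearest centroid
# in feature space. Signals, labels, and features are synthetic examples.
import math

def extract_features(signal):
    """Two simple features: mean level and RMS amplitude."""
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    return (mean, rms)

def train_centroids(samples):
    """samples: list of (signal, label) -> mean feature vector per label."""
    sums, counts = {}, {}
    for signal, label in samples:
        f = extract_features(signal)
        s = sums.setdefault(label, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: tuple(v / counts[lab] for v in s) for lab, s in sums.items()}

def classify(signal, centroids):
    """Assign the label whose centroid is nearest in feature space."""
    f = extract_features(signal)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lab])))

training = [([0.1, 0.0, -0.1, 0.0], "quiet"),
            ([2.0, -2.0, 2.0, -2.0], "loud")]
centroids = train_centroids(training)
print(classify([1.5, -1.6, 1.4, -1.5], centroids))  # loud
```

Careful choice of the feature set, as the abstract notes, matters more than the backend: here the RMS feature alone separates the two classes.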

  20. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  1. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bickel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  2. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  3. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  4. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  5. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  6. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  7. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  8. Advances in computer imaging/applications in facial plastic surgery.

    PubMed

    Papel, I D; Jiannetto, D F

    1999-01-01

    Rapidly progressing computer technology, ever-increasing expectations of patients, and a confusing medicolegal environment require clarification of the role of computer imaging/applications. Advances in computer technology and its applications are reviewed. A brief historical discussion is included for perspective. Improvements in both hardware and software with the advent of digital imaging have allowed great increases in speed and accuracy in patient imaging. This facilitates doctor-patient communication and possibly realistic patient expectations. Patients seeking cosmetic surgery now often expect preoperative imaging. Although society in general has become more litigious, a literature search up to 1998 reveals no lawsuits directly involving computer imaging. It appears that conservative utilization of computer imaging by the facial plastic surgeon may actually reduce liability and promote communication. Recent advances have significantly enhanced the value of computer imaging in the practice of facial plastic surgery. These technological advances in computer imaging appear to contribute a useful technique for the practice of facial plastic surgery. Inclusion of computer imaging should be given serious consideration as an adjunct to clinical practice.

  9. Advanced Simulation and Computing Business Plan

    SciTech Connect

    Rummel, E.

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  10. Direction and Integration of Experimental Ground Test Capabilities and Computational Methods

    NASA Technical Reports Server (NTRS)

    Dunn, Steven C.

    2016-01-01

    This paper groups and summarizes the salient points and findings from two AIAA conference panels targeted at defining the direction, with associated key issues and recommendations, for the integration of experimental ground testing and computational methods. Each panel session utilized rapporteurs to capture comments from both the panel members and the audience. Additionally, a virtual panel of several experts was consulted between the two sessions, and their comments were also captured. The information is organized into three time-based groupings, as well as by subject area. These panel sessions were designed to provide guidance to both researchers/developers and experimental/computational service providers in defining the future of ground testing, which will be inextricably integrated with the advancement of computational tools.

  11. A shock wave capability for the improved Two-Dimensional Kinetics (TDK) computer program

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dang, L. D.

    1984-01-01

    The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket engine performance prediction procedures. The purpose of this contract has been to improve the TDK computer program so that it can be applied to advanced rocket engine designs. In particular, future orbit transfer vehicles (OTV) will require rocket engines that operate at high expansion ratio, i.e., in excess of 200:1. Because only a limited length is available in the space shuttle bay, it is possible that OTV nozzles will be designed with both relatively short length and high expansion ratio. In this case, a shock wave may be present in the flow. The TDK computer program was modified to include the simulation of shock waves in the supersonic nozzle flow field. The shocks induced by the wall contour can produce strong perturbations of the flow, affecting downstream conditions that need to be considered for thrust chamber performance calculations.

  12. Rodent Habitat on ISS: Advances in Capability for Determining Spaceflight Effects on Mammalian Physiology

    NASA Technical Reports Server (NTRS)

    Globus, R. K.; Choi, S.; Gong, C.; Leveson-Gower, D.; Ronca, A.; Taylor, E.; Beegle, J.

    2016-01-01

    Rodent research is an essential tool for advancing biomedical discoveries in the life sciences on Earth and in space. The National Research Council's Decadal Survey (1) emphasized the importance of expanding NASA's life sciences research to perform long-duration rodent experiments on the International Space Station (ISS). To accomplish this objective, new flight hardware, operations, and science capabilities were developed at NASA ARC to support commercial and government-sponsored research. The flight phases of two separate spaceflight missions (Rodent Research-1 and Rodent Research-2) have been completed and new capabilities are in development. The first flight experiment, carrying 20 mice, was launched on Sept 21, 2014 in an unmanned Dragon capsule on SpaceX-4; Rodent Research-1 was dedicated to achieving both NASA validation and CASIS science objectives, while Rodent Research-2 extended the period on orbit to 60 days. Ground-based control groups (housed in flight hardware or standard cages) were maintained in environmental chambers at Kennedy Space Center. Crewmembers previously trained in animal handling transferred mice from the Transporter into Habitats under simultaneous veterinary supervision by video streaming, and all mice were deemed healthy. Health and behavior of all mice on the ISS were monitored by video feed on a daily basis, and post-flight quantitative analyses of behavior were performed. The 10 mice from the RR-1 Validation group (16-wk-old female C57BL/6J) ambulated freely and actively throughout the Habitat, relying heavily on their forelimbs for locomotion. The first on-orbit dissections of mice were performed successfully, and high-quality RNA (RIN values >9) and liver enzyme activities were obtained, validating the quality of sample recovery. Post-flight sample analysis revealed that body weights of FLT animals did not differ from ground controls (GC) housed in the same hardware, or vivarium controls (VIV) housed in standard cages. Organ weights analyzed post

  13. Advanced Cardiac Life Support (ACLS) utilizing Man-Tended Capability (MTC) hardware onboard Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Smith, M.; Barratt, M.; Lloyd, C.

    1992-01-01

    Because of the time and distance involved in returning a patient from space to a definitive medical care facility, the capability for Advanced Cardiac Life Support (ACLS) exists onboard Space Station Freedom. Methods: In order to evaluate the effectiveness of terrestrial ACLS protocols in microgravity, a medical team conducted simulations during parabolic flights onboard the KC-135 aircraft. The hardware planned for use during the MTC phase of the space station was utilized to increase the fidelity of the scenario and to evaluate the prototype equipment. Based on initial KC-135 testing of CPR and ACLS, changes were made to the ventricular fibrillation algorithm in order to accommodate the space environment. Other constraints to the delivery of ACLS onboard the space station include crew size, minimal training, crew deconditioning, and limited supplies and equipment. Results: The delivery of ACLS in microgravity is hindered by the environment, but should be adequate. Factors specific to microgravity were identified for inclusion in the protocol, including immediate restraint of the patient and early intubation to secure the airway. External cardiac compressions of adequate force and frequency were administered using various methods. The more significant limiting factors appear to be crew training, crew size, and limited supplies. Conclusions: Although ACLS is possible in the microgravity environment, future evaluations are necessary to further refine the protocols. Proper patient and medical officer restraint is crucial prior to advanced procedures. Also, emphasis should be placed on early intubation for airway management and drug administration. Preliminary results and further testing will be utilized in the design of medical hardware, determination of crew training, and medical operations for space station and beyond.

  14. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    McCoy, Michel; Archer, Bill; Matzen, M. Keith

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires balanced resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  15. High-performance computational and geostatistical experiments for testing the capabilities of 3-d electrical tomography

    SciTech Connect

    Carle, S. F.; Daily, W. D.; Newmark, R. L.; Ramirez, A.; Tompson, A.

    1999-01-19

    This project explores the feasibility of combining geologic insight, geostatistics, and high-performance computing to analyze the capabilities of 3-D electrical resistance tomography (ERT). Geostatistical methods are used to characterize the spatial variability of geologic facies that control sub-surface variability of permeability and electrical resistivity. Synthetic ERT data sets are generated from geostatistical realizations of alluvial facies architecture. The synthetic data sets enable comparison of the "truth" to inversion results, quantification of the ability to detect particular facies at particular locations, and sensitivity studies on inversion parameters.
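
    The workflow the abstract describes — generate a geostatistical facies realization, forward-map it to resistivity, and score how well an inversion recovers the facies — can be sketched in miniature. Everything below is an illustrative assumption (grid size, resistivity values, and a blur-plus-noise surrogate standing in for a real ERT forward model and inversion), not the study's actual method:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def smooth(field):
        """Crude 3x3 moving average, used here to impose spatial correlation."""
        return sum(np.roll(np.roll(field, i, axis=0), j, axis=1)
                   for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0

    # One "realization" of binary alluvial facies (1 = channel sand, 0 = floodplain).
    facies = (smooth(rng.standard_normal((64, 64))) > 0).astype(int)

    # Forward map from facies to resistivity (ohm-m; values assumed for illustration).
    resistivity = np.where(facies == 1, 50.0, 200.0)

    # Stand-in for ERT inversion: smoothing plus noise mimics its limited resolution.
    recovered = smooth(resistivity) + rng.normal(0.0, 10.0, facies.shape)

    # With the "truth" known, the ability to detect each facies is directly quantifiable.
    detected = (recovered < 125.0).astype(int)
    accuracy = float((detected == facies).mean())
    ```

    Comparing `detected` against the known `facies` cell by cell is exactly what a synthetic study enables; a real field survey has no such ground truth.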

  16. Integrated Analysis Capability pilot computer program. [large space structures and data management

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1981-01-01

    An integrated analysis capability (IAC) computer software package was developed for the design analysis and performance evaluation of large space systems. The IAC aids the user in coupling the required technical disciplines (initially structures, thermal and controls), providing analysis solution paths which reveal critical interactive effects in order to study loads, stability and mission performance. Existing technical software modules, having a wide existing user community, are combined with the interface software to bridge between the different technologies and mathematical modeling techniques. The package is supported by executive, data management and interactive graphics software, with primary development within the superminicomputer environment.

  17. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  18. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs by performing computations using Navier-Stokes equations solution algorithms and permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  19. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  20. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math... at: (301) 903-7486 or by email at: Melea.Baker@science.doe.gov . You must make your request for...

  1. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION... Exascale technical approaches subcommittee Facilities update Report from Applied Math Committee of Visitors...: ( Melea.Baker@science.doe.gov ). You must make your request for an oral statement at least five...

  2. The Federal Government's Role in Advancing Computer Technology

    ERIC Educational Resources Information Center

    Information Hotline, 1978

    1978-01-01

    As part of the Federal Data Processing Reorganization Study submitted by the Science and Technology Team, the Federal Government's role in advancing and diffusing computer technology is discussed. Findings and conclusions assess the state-of-the-art in government and in industry, and five recommendations provide directions for government policy…

  3. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for the design and manufacture of automotive components have increased dramatically with the push to produce automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses: more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  4. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  5. "Head up and eyes out" advances in head mounted displays capabilities

    NASA Astrophysics Data System (ADS)

    Cameron, Alex

    2013-06-01

    There are a host of helmet- and head-mounted displays flooding the marketplace, most of which provide what is essentially a mobile computer display. What sets aviators' HMDs apart is that they provide the user with accurate conformal information embedded in the pilot's real-world view (see-through display). The information presented is intuitive and easy to use because it overlays the real world (a mix of sensor imagery, symbolic information, and synthetic imagery) and enables pilots to stay head up and eyes out, improving their effectiveness, reducing workload, and improving safety. Such systems are an enabling technology in the provision of enhanced Situation Awareness (SA) and reduced user workload in high-intensity situations. Safety is key, so the addition of these HMD functions cannot detract from the aircrew-protection functions of conventional aircrew helmets, which also include life support and audio communications. These capabilities are finding much wider application in new types of compact man-mounted audio/visual products enabled by the emergence of new families of microdisplays, novel optical concepts, and ultra-compact low-power processing solutions. This paper attempts to capture the key drivers and needs for future head-mounted systems for aviation applications.

  6. Accuracy improvement capability of advanced projectile based on course correction fuze concept.

    PubMed

    Elsaadany, Ahmed; Wen-jun, Yi

    2014-01-01

    Improvement in terminal accuracy is an important objective for future artillery projectiles and is often associated with range extension. Various concepts and modifications, such as the course correction fuze, have been proposed to correct the range and drift of artillery projectiles. Course correction fuze concepts could provide an attractive and cost-effective solution for munitions accuracy improvement. In this paper, trajectory correction has been obtained using two kinds of course correction modules: one devoted to range correction (drag ring brake) and one devoted to drift correction (canard-based correction fuze). The course correction modules have been characterized by aerodynamic computations and flight dynamic investigations in order to analyze the effects of the projectile aerodynamic parameters on deflection. The simulation results show that the impact accuracy of a conventional projectile using these course correction modules can be improved. The drag ring brake is found to be highly capable for range correction. Deploying the drag brake in the early stage of the trajectory results in a large range correction, and the correction occasion time can be predefined depending on the required range correction. On the other hand, the canard-based correction fuze is found to have a higher effect on projectile drift by modifying its roll rate. In addition, canard extension induces a high-frequency incidence angle as the canards reciprocate with the roll motion.
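
    The range-correction mechanism described above — extra drag after a chosen deployment time shortens the trajectory, and earlier deployment shortens it more — can be illustrated with a simple point-mass model. The mass, calibre, launch velocity, and drag coefficients below are arbitrary illustrative values, not parameters from the paper:

    ```python
    import math

    def simulated_range(cd_base=0.30, cd_brake=None, t_deploy=5.0, dt=0.01):
        """Flat-fire point-mass trajectory with quadratic drag, Euler-integrated.

        If cd_brake is given, the drag coefficient jumps to it at t_deploy,
        mimicking deployment of a drag ring brake.
        """
        m, d = 45.0, 0.155                    # kg, m (assumed 155 mm shell)
        area = math.pi * (d / 2.0) ** 2
        rho, g = 1.225, 9.81                  # sea-level air density, gravity
        vx, vy = 400.0, 400.0                 # m/s launch components (assumed)
        x = y = t = 0.0
        while y >= 0.0:
            cd = cd_brake if (cd_brake is not None and t >= t_deploy) else cd_base
            v = math.hypot(vx, vy)
            drag = 0.5 * rho * cd * area * v * v / m   # deceleration magnitude
            vx -= drag * (vx / v) * dt
            vy -= (g + drag * (vy / v)) * dt
            x += vx * dt
            y += vy * dt
            t += dt
        return x

    no_brake = simulated_range()
    late_brake = simulated_range(cd_brake=1.2, t_deploy=20.0)
    early_brake = simulated_range(cd_brake=1.2, t_deploy=5.0)
    # Earlier deployment removes more range, as the abstract reports.
    ```

    The earlier the coefficient jump, the more of the flight is flown at high drag, so `early_brake < late_brake < no_brake` — the qualitative behavior the abstract attributes to the drag ring brake.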

  7. Accuracy Improvement Capability of Advanced Projectile Based on Course Correction Fuze Concept

    PubMed Central

    Elsaadany, Ahmed; Wen-jun, Yi

    2014-01-01

    Improvement in terminal accuracy is an important objective for future artillery projectiles and is often associated with range extension. Various concepts and modifications, such as the course correction fuze, have been proposed to correct the range and drift of artillery projectiles. Course correction fuze concepts could provide an attractive and cost-effective solution for munitions accuracy improvement. In this paper, trajectory correction has been obtained using two kinds of course correction modules: one devoted to range correction (drag ring brake) and one devoted to drift correction (canard-based correction fuze). The course correction modules have been characterized by aerodynamic computations and flight dynamic investigations in order to analyze the effects of the projectile aerodynamic parameters on deflection. The simulation results show that the impact accuracy of a conventional projectile using these course correction modules can be improved. The drag ring brake is found to be highly capable for range correction. Deploying the drag brake in the early stage of the trajectory results in a large range correction, and the correction occasion time can be predefined depending on the required range correction. On the other hand, the canard-based correction fuze is found to have a higher effect on projectile drift by modifying its roll rate. In addition, canard extension induces a high-frequency incidence angle as the canards reciprocate with the roll motion. PMID:25097873

  8. Addressing capability computing challenges of high-resolution global climate modelling at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin

    2014-05-01

    During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership-class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized, and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource-intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system capable of a theoretical peak performance of over 27 PFlop/s; it consists of 18,688 compute nodes, with a NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560
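
    The hardware figures quoted in the abstract are internally consistent, as a quick check shows. The wall-clock conversion at the end is an illustrative calculation of my own, not a figure from the abstract:

    ```python
    nodes = 18_688
    cores_per_node = 16        # one 16-core AMD Opteron CPU per node
    gpus_per_node = 1          # one NVIDIA Kepler K20 GPU per node

    total_cores = nodes * cores_per_node   # 299,008, as stated
    total_gpus = nodes * gpus_per_node     # 18,688, as stated

    # Illustrative only: 100 million core hours spread across the full CPU
    # partition would correspond to roughly 334 hours of wall-clock time.
    wall_hours = 100_000_000 / total_cores
    ```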

  9. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems: techniques are being developed to enable spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking: advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning, and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs, and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  10. Session on High Speed Civil Transport Design Capability Using MDO and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Rehder, Joe

    2000-01-01

    Since the inception of CAS in 1992, NASA Langley has been conducting research into applying multidisciplinary optimization (MDO) and high performance computing toward reducing aircraft design cycle time. The focus of this research has been the development of a series of computational frameworks and associated applications that increased in capability, complexity, and performance over time. The culmination of this effort is an automated high-fidelity analysis capability for a high speed civil transport (HSCT) vehicle installed on a network of heterogeneous computers with a computational framework built using Common Object Request Broker Architecture (CORBA) and Java. The main focus of the research in the early years was the development of the Framework for Interdisciplinary Design Optimization (FIDO) and associated HSCT applications. While the FIDO effort was eventually halted, work continued on HSCT applications of ever increasing complexity. The current application, HSCT4.0, employs high fidelity CFD and FEM analysis codes. For each analysis cycle, the vehicle geometry and computational grids are updated using new values for design variables. Processes for aeroelastic trim, loads convergence, displacement transfer, stress and buckling, and performance have been developed. In all, 70 processes are integrated in the analysis framework. Many of the key processes include automatic differentiation capabilities to provide sensitivity information that can be used in optimization. A software engineering process was developed to manage this large project. Defining the interactions among 70 processes turned out to be an enormous, but essential, task. A formal requirements document was prepared that defined data flow among processes and subprocesses. A design document was then developed that translated the requirements into actual software design. A validation program was defined and implemented to ensure that codes integrated into the framework produced the same

  11. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computational algorithms as well as high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much-needed research in numerical boundary conditions for CAA.

  12. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  13. Advancements in Root Growth Measurement Technologies and Observation Capabilities for Container-Grown Plants

    PubMed Central

    Judd, Lesley A.; Jackson, Brian E.; Fonteno, William C.

    2015-01-01

    The study, characterization, observation, and quantification of plant root growth and root systems (Rhizometrics) has been and remains an important area of research in all disciplines of plant science. In the horticultural industry, a large portion of the crops grown annually are grown in pot culture. Root growth is a critical component of overall plant performance during production in containers, and it is therefore important to understand the factors that influence and possibly enhance it. Methods for quantifying root growth have varied over the last several decades, each differing in reliability of measurement and in variation among results. Methods such as root drawings, pin boards, rhizotrons, and minirhizotrons established the ability to measure the roots of field crops and have since been extended to container-grown plants. However, many of the published research methods are monotonous and time-consuming. More recently, computer programs have come into wider use as technology advances and measuring characteristics of root growth becomes easier. These programs are instrumental in analyzing various root growth characteristics, from the diameter and length of individual roots to the branching angle and topological depth of the root architecture. This review delves into the expanding technologies involved in expertly measuring root growth of plants in containers, and the advantages and disadvantages that remain. PMID:27135334

  14. Advancements in Root Growth Measurement Technologies and Observation Capabilities for Container-Grown Plants.

    PubMed

    Judd, Lesley A; Jackson, Brian E; Fonteno, William C

    2015-07-03

    The study, characterization, observation, and quantification of plant root growth and root systems (Rhizometrics) has been and remains an important area of research in all disciplines of plant science. In the horticultural industry, a large portion of the crops grown annually are grown in pot culture. Root growth is a critical component in overall plant performance during production in containers, and therefore it is important to understand the factors that influence and/or possibly enhance it. Quantifying root growth has varied over the last several decades, with each method of quantification differing in measurement reliability and in the variation among results. Methods such as root drawings, pin boards, rhizotrons, and minirhizotrons first made it possible to measure the roots of field crops, and have since been extended to container-grown plants. However, many of the published research methods are tedious and time-consuming. More recently, computer programs have increased in use as technology advances and measuring characteristics of root growth becomes easier. These programs are instrumental in analyzing various root growth characteristics, from the diameter and length of individual roots to the branching angle and topological depth of the root architecture. This review examines the expanding technologies for accurately measuring the root growth of plants in containers, along with the advantages and disadvantages of each.

  15. Advancing the predictive capability for pedestal structure through experiment and modeling

    NASA Astrophysics Data System (ADS)

    Hughes, Jerry

    2012-10-01

    Prospects for predictive capability of the edge pedestal in magnetic fusion devices have been dramatically enhanced due to recent research, which was conducted jointly by the US experimental and theory communities. Studies on the C-Mod, DIII-D and NSTX devices have revealed common features, including an upper limit on pedestal pressure in ELMy H-mode determined by instability to peeling-ballooning modes (PBMs), and pedestal width which scales approximately as βpol^1/2. The width dependence is consistent with a pedestal regulated by kinetic ballooning modes (KBMs). Signatures of KBMs have been actively sought both in experimental fluctuation measurements and in gyrokinetic simulations of the pedestal, with encouraging results. Studies of the temporal evolution of the pedestal during the ELM cycle reveal a tendency for the pressure gradient to saturate in advance of the ELM, with a steady growth in the pedestal width occurring prior to the ELM crash, which further supports a model for KBMs and PBMs working together to set the pedestal structure. Such a model, EPED, reproduces the pedestal height and width to better than 20% accuracy on existing devices over a range of more than 20 in pedestal pressure. Additional transport processes are assessed for their impact on pedestal structure, in particular the relative variation of the temperature and density pedestals due, for example, to differences in edge neutral sources. Such differences are observed in dimensionlessly matched discharges on C-Mod and DIII-D, despite their having similar calculated MHD stability and similar edge fluctuations. In certain high performance discharges, such as EDA H-mode, QH-mode and I-mode, pedestal relaxation is accomplished by continuous edge fluctuations, avoiding peeling-ballooning instabilities and associated ELMs. Progress in understanding these regimes will be reported.
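
    The interplay described above, a kinetic-ballooning (KBM) constraint tying pedestal width to βpol^1/2 combined with a peeling-ballooning (PBM) limit on pedestal height, can be illustrated as a toy fixed-point iteration in the spirit of EPED. All constants and scalings below are invented for illustration; they are not the calibrated EPED model:

```python
import math

# Toy constants -- invented for illustration, NOT calibrated EPED values.
C_KBM = 0.076  # width prefactor in the KBM constraint w = C * beta_pol**0.5

def beta_pol(pressure):
    """Hypothetical map from pedestal pressure (kPa) to poloidal beta."""
    return pressure / 400.0

def pbm_pressure_limit(width):
    """Hypothetical peeling-ballooning pressure limit as a function of width."""
    return 120.0 * width ** 0.9

def solve_pedestal(p_start=10.0, iters=50):
    """Iterate the two constraints to a self-consistent (width, pressure) pair."""
    p = p_start
    w = C_KBM * math.sqrt(beta_pol(p))
    for _ in range(iters):
        w = C_KBM * math.sqrt(beta_pol(p))  # KBM constraint sets the width
        p = pbm_pressure_limit(w)           # PBM constraint sets the height
    return w, p
```

    With the assumed exponents the iteration is a contraction, so the width and height settle to a mutually consistent pedestal, mirroring how EPED intersects the two physics constraints.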

  16. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to the rapid improvement of climate models. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes; that fully utilizes the hardware and software capabilities of new computer architectures; that probes the limits of climate predictability; and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  17. Advanced sensor-computer technology for urban runoff monitoring

    NASA Astrophysics Data System (ADS)

    Yu, Byunggu; Behera, Pradeep K.; Ramirez Rochac, Juan F.

    2011-04-01

    The paper presents the project team's advanced sensor-computer sphere technology for real-time and continuous monitoring of wastewater runoff at the sewer discharge outfalls along the receiving water. This research significantly enhances and extends the previously proposed novel sensor-computer technology. This advanced technology offers new computation models for an innovative use of the sensor-computer sphere comprising accelerometer, programmable in-situ computer, solar power, and wireless communication for real-time and online monitoring of runoff quantity. This innovation can enable more effective planning and decision-making in civil infrastructure, natural environment protection, and water pollution related emergencies. The paper presents the following: (i) the sensor-computer sphere technology; (ii) a significant enhancement to the previously proposed discrete runoff quantity model of this technology; (iii) a new continuous runoff quantity model. Our comparative study on the two distinct models is presented. Based on this study, the paper further investigates the following: (1) energy-, memory-, and communication-efficient use of the technology for runoff monitoring; (2) possible sensor extensions for runoff quality monitoring.
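
    The contrast between a discrete and a continuous runoff quantity model can be sketched roughly as follows. The paper's actual models are not reproduced here; the power-law rating curve and all constants are hypothetical:

```python
def discharge_from_depth(h, c=1.5, k=2.0):
    """Assumed power-law rating curve Q = c * h**k (flow in m^3/s from depth in m)."""
    return c * h ** k

def discrete_runoff_volume(depths, dt):
    """Discrete model: rectangle-rule sum over sampled depth readings (dt in s)."""
    return sum(discharge_from_depth(h) * dt for h in depths)

def continuous_runoff_volume(depths, dt):
    """Continuous model, approximated by trapezoidal integration of the flow signal."""
    q = [discharge_from_depth(h) for h in depths]
    return sum(0.5 * (a + b) * dt for a, b in zip(q, q[1:]))
```

    The discrete model treats each sensor sample as constant over its interval, while the continuous model interpolates between samples, which matters most when runoff changes quickly between readings.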

  18. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    SciTech Connect

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this Implementation Plan (IP) is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  19. Building a computer-aided design capability using a standard time share operating system

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.

    1975-01-01

    The paper describes how an integrated system of engineering computer programs can be built using a standard commercially available operating system. The discussion opens with an outline of the auxiliary functions that an operating system can perform for a team of engineers involved in a large and complex task. An example of a specific integrated system is provided to explain how the standard operating system features can be used to organize the programs into a simple and inexpensive but effective system. Applications to an aircraft structural design study are discussed to illustrate the use of an integrated system as a flexible and efficient engineering tool. The discussion concludes with an engineer's assessment of an operating system's capabilities and desirable improvements.

  20. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  1. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  2. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  3. Fault Injection and Monitoring Capability for a Fault-Tolerant Distributed Computation System

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo; Yates, Amy M.; Malekpour, Mahyar R.

    2010-01-01

    The Configurable Fault-Injection and Monitoring System (CFIMS) is intended for the experimental characterization of effects caused by a variety of adverse conditions on a distributed computation system running flight control applications. A product of research collaboration between NASA Langley Research Center and Old Dominion University, the CFIMS is the main research tool for generating actual fault response data with which to develop and validate analytical performance models and design methodologies for the mitigation of fault effects in distributed flight control systems. Rather than a fixed design solution, the CFIMS is a flexible system that enables the systematic exploration of the problem space and can be adapted to meet the evolving needs of the research. The CFIMS has the capabilities of system-under-test (SUT) functional stimulus generation, fault injection and state monitoring, all of which are supported by a configuration capability for setting up the system as desired for a particular experiment. This report summarizes the work accomplished so far in the development of the CFIMS concept and documents the first design realization.
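
    The fault-injection capability described above can be illustrated with a minimal sketch. This is a generic illustration, not the CFIMS design; the class name and the single-bit-flip fault model are invented for the example:

```python
import random

class FaultInjector:
    """Illustrative message-level fault injector: with probability `rate`,
    flip one randomly chosen bit of a transmitted message and log the fault."""

    def __init__(self, rate, seed=0):
        self.rate = rate
        self.rng = random.Random(seed)   # seeded for repeatable experiments
        self.log = []                    # (byte_index, bit_mask) per injected fault

    def transmit(self, msg):
        if self.rng.random() < self.rate:
            i = self.rng.randrange(len(msg))
            mask = 1 << self.rng.randrange(8)
            corrupted = bytearray(msg)
            corrupted[i] ^= mask         # inject a single-bit fault
            self.log.append((i, mask))
            return bytes(corrupted)
        return msg                       # no fault this time
```

    Logging each injected fault alongside the monitored system state is what allows fault-response data to be correlated with the adverse condition that produced it.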

  4. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer Multichip Module (MCM) developed jointly by JPL and TRW under their respective research programs in a collaborative fashion. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program. Further development of the Mass Memory and the programmable I/O MCM modules will follow. The three building block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved by the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  5. Impact of computer advances on future finite elements computations. [for aircraft and spacecraft design

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.

    1985-01-01

    Research performed over the past 10 years in engineering data base management and parallel computing is discussed, and certain opportunities for research toward the next generation of structural analysis capability are proposed. Particular attention is given to data base management associated with the IPAD project and parallel processing associated with the Finite Element Machine project, both sponsored by NASA, and a near term strategy for a distributed structural analysis capability based on relational data base management software and parallel computers for a future structural analysis system.

  6. Full Scale Advanced Systems Testbed (FAST): Capabilities and Recent Flight Research

    NASA Technical Reports Server (NTRS)

    Miller, Christopher

    2014-01-01

    At the NASA Armstrong Flight Research Center, research is being conducted into flight control technologies that will enable the next generation of air and space vehicles. The Full Scale Advanced Systems Testbed (FAST) aircraft provides a laboratory for flight exploration of these technologies. In recent years, novel but simple adaptive architectures for aircraft and rockets have been researched, along with control technologies for improving aircraft fuel efficiency and control structural interaction. This presentation outlines the FAST capabilities and provides a snapshot of the research accomplishments to date. Flight experimentation allows researchers to substantiate or invalidate their assumptions and intuition about a new technology or innovative approach. Data early in a development cycle are invaluable for determining which technology barriers are real and which ones are imagined, and data for a technology at a low TRL can be used to steer and focus the exploration and fuel rapid advances based on real-world lessons learned. It is important to identify technologies that are mature enough to benefit from flight research data, rather than waiting until all the potential issues are solved before getting any data; sometimes a stagnated technology just needs a little real-world data to get it going. One trick to getting data for low-TRL technologies is finding an environment where it is okay to take risks and occasional failure is an expected outcome; learning how things fail is often as valuable as showing that they work. FAST has been architected to facilitate this type of testing for control system technologies, specifically novel algorithms and sensors, with rapid prototyping and a quick turnaround in a fly-fix-fly paradigm. Sometimes it is easier and cheaper to just go fly it than to analyze the problem to death. The goal is to find and test control technologies that would benefit from flight data and find solutions to the real barriers to innovation.

  7. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  8. Expanded serial communication capability for the transport systems research vehicle laptop computers

    NASA Technical Reports Server (NTRS)

    Easley, Wesley C.

    1991-01-01

    A recent upgrade of the Transport Systems Research Vehicle (TSRV) operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center included installation of a number of Grid 1500 series laptop computers. Each unit is an 80386-based IBM PC clone. RS-232 data busses are needed for TSRV flight research programs, and it has been advantageous to extend the application of the Grids in this area. Use was made of the expansion features of the Grid internal bus to add a user-programmable serial communication channel. Software to allow use of the Grid bus expansion has been written and placed in a Turbo C library for incorporation into applications programs in a transparent manner via function calls. Port setup; interrupt-driven, two-way data transfer; and software flow control are built into the library functions.
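
    The interrupt-driven receive path with software flow control that such a library provides can be sketched generically as follows. This is not the Turbo C library's actual interface; the class, buffer size, and watermark thresholds are assumptions for illustration, with the receive interrupt handler stood in for by an ordinary method:

```python
XON, XOFF = 0x11, 0x13  # standard software flow-control bytes (DC1/DC3)

class SerialChannel:
    """Generic sketch of an interrupt-driven receive buffer with XON/XOFF
    software flow control; on_byte_received() plays the role of the ISR."""

    def __init__(self, capacity=256, hi_water=0.75, lo_water=0.25):
        self.buf = []
        self.capacity = capacity
        self.hi = int(capacity * hi_water)   # send XOFF above this fill level
        self.lo = int(capacity * lo_water)   # send XON once drained below this
        self.sent_xoff = False
        self.outgoing_ctrl = []              # control bytes queued for the peer

    def on_byte_received(self, byte):
        if len(self.buf) < self.capacity:    # drop bytes if the buffer is full
            self.buf.append(byte)
        if len(self.buf) >= self.hi and not self.sent_xoff:
            self.outgoing_ctrl.append(XOFF)  # ask the sender to pause
            self.sent_xoff = True

    def read(self, n):
        data, self.buf = self.buf[:n], self.buf[n:]
        if self.sent_xoff and len(self.buf) <= self.lo:
            self.outgoing_ctrl.append(XON)   # buffer drained; resume the sender
            self.sent_xoff = False
        return data
```

    The two watermarks provide hysteresis so the channel does not thrash between pause and resume as the application drains the buffer.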

  9. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.
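
    To make the fuzzy-set idea concrete: a processing variable such as a firing temperature can be mapped onto linguistic sets through membership functions, and rules combined with min/max operators. The sets, temperature ranges, and the rule below are invented for illustration and are not taken from the NASA Lewis data:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic sets for a sintering temperature in deg C.
def temp_low(t):
    return tri(t, 1200, 1300, 1400)

def temp_high(t):
    return tri(t, 1350, 1450, 1550)

def rule_strength(temp, pressure_ok):
    """Firing strength of the rule:
    IF temperature is high AND pressure is ok THEN density is good."""
    return min(temp_high(temp), pressure_ok)
```

    Unlike a crisp threshold, the membership degree grades smoothly between sets, which is what lets such rules rank how strongly each processing variable contributes to a desired material property.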

  10. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  11. Fourier transform infrared spectrophotometry for thin film monitors: computer and equipment integration for enhanced capabilities

    NASA Astrophysics Data System (ADS)

    Cox, J. Neal; Sedayao, J.; Shergill, Gurmeet S.; Villasol, R.; Haaland, David M.

    1991-03-01

    Fourier transform infrared spectrophotometry (FTIR) is a valuable technique for monitoring thin films used in semiconductor device manufacture. Determinations of the constituent contents in borophosphosilicate (BPSG), phosphosilicate (PSG), silicon oxynitride (SiON:H), and spin-on-glass (SOG) thin films are a few applications. Due to the nature of the technique, FTIR instrumentation is one of the most extensively computer-dependent pieces of equipment likely to be found in a microelectronics plant. In the role of fab monitor or reactor characterization tool, FTIR instruments can rapidly generate large amounts of data. Also, the drive for greater accuracy and tighter precision is leading to the development of increasingly sophisticated data-processing software that taxes the computing abilities of most instrument local data stations. By linking a local FTIR data station to a remote minicomputer, its capabilities are greatly improved. We discuss three classes of enhancement. First, the FTIR in the fab area communicates and interacts in real time with the minicomputer: transferring data segments to it, instructing it to perform sophisticated processing, and returning the results to the operator in the fab. Characterizations of PSG thin films by this approach are discussed. Second, the spectra of large numbers of samples are processed locally; the large database is then transmitted to the minicomputer for study by statistical/graphics software. Results of CVD-reactor spatial profiling experiments for plasma SiON are presented. Third, processing of calibration spectra is performed

  12. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  13. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, and the reduction of duplicate efforts.

  14. NDE of advanced turbine engine components and materials by computed tomography

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Baaklini, George Y.; Klima, Stanley J.

    1991-01-01

    Computed tomography (CT) is an X-ray technique that provides quantitative 3D density information of materials and components and can accurately detail spatial distributions of cracks, voids, and density variations. CT scans of ceramic materials, composites, and engine components were taken and the resulting images will be discussed. Scans were taken with two CT systems with different spatial resolution capabilities. The scans showed internal damage, density variations, and geometrical arrangement of various features in the materials and components. It was concluded that CT can play an important role in the characterization of advanced turbine engine materials and components. Future applications of this technology will be outlined.

  15. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    SciTech Connect

    Lucas, Robert; Ang, James; Bergman, Keren; Borkar, Shekhar; Carlson, William; Carrington, Laura; Chiu, George; Colwell, Robert; Dally, William; Dongarra, Jack; Geist, Al; Haring, Rud; Hittinger, Jeffrey; Hoisie, Adolfy; Klein, Dean; Kogge, Peter; Lethin, Richard; Sarkar, Vivek; Schreiber, Robert; Shalf, John; Sterling, Thomas; Stevens, Rick; Bashor, Jon; Brightwell, Ron; Coteus, Paul; Debenedictus, Erik; Hiller, Jon; Kim, K. H.; Langston, Harper; Murphy, Richard; Webster, Clayton; Wild, Stefan; Grider, Gary; Ross, Rob; Leyffer, Sven; Laros III, James

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  16. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  17. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801
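
    A minimal example of the kind of parallelism the review surveys: an embarrassingly parallel per-read computation (here, GC content, a common genomics metric) mapped across worker processes. This is a generic Python sketch, not code from the article:

```python
from multiprocessing import Pool

def gc_content(read):
    """Fraction of G/C bases in a single sequence read."""
    return (read.count("G") + read.count("C")) / len(read)

def parallel_gc(reads, workers=4):
    """Map the per-read computation over a pool of worker processes."""
    with Pool(processes=workers) as pool:
        return pool.map(gc_content, reads)
```

    On platforms that spawn rather than fork worker processes, the call must be made under an `if __name__ == "__main__":` guard; the same map-over-chunks pattern carries over to cluster and cloud schedulers.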

  18. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic literature review surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments so as to benefit from parallelism techniques and HPC capabilities.

  19. Computation of the tip vortex flowfield for advanced aircraft propellers

    NASA Technical Reports Server (NTRS)

    Tsai, Tommy M.; Dejong, Frederick J.; Levy, Ralph

    1988-01-01

    The tip vortex flowfield plays a significant role in the performance of advanced aircraft propellers. The flowfield in the tip region is complex, three-dimensional and viscous with large secondary velocities. An analysis is presented using an approximate set of equations which contains the physics required by the tip vortex flowfield, but which does not require the resources of the full Navier-Stokes equations. A computer code was developed to predict the tip vortex flowfield of advanced aircraft propellers. A grid generation package was developed to allow specification of a variety of advanced aircraft propeller shapes. Calculations of the tip vortex generation on an SR3 type blade at high Reynolds numbers were made using this code and a parametric study was performed to show the effect of tip thickness on tip vortex intensity. In addition, calculations of the tip vortex generation on a NACA 0012 type blade were made, including the flowfield downstream of the blade trailing edge. Comparison of flowfield calculations with experimental data from an F4 blade was made. A user's manual was also prepared for the computer code (NASA CR-182178).

  20. Advanced Computational Methods for Thermal Radiative Heat Transfer

    SciTech Connect

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.

    2016-10-01

    Participating media radiation (PMR) in weapon safety calculations for abnormal thermal environments is too costly to compute routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.
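
    A common ROM workflow of the kind the abstract describes is proper orthogonal decomposition (POD) with Galerkin projection: build a basis from full-order snapshots, then solve a much smaller projected system. The NumPy sketch below illustrates the generic idea on synthetic data and makes no claim about the authors' actual formulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Snapshot matrix: each column is one full-order solution state
    # (random rank-3 stand-ins for, e.g., radiative intensity fields).
    n_dof, n_snap = 200, 12
    snapshots = rng.standard_normal((n_dof, 3)) @ rng.standard_normal((3, n_snap))

    # POD basis: leading left singular vectors of the snapshot matrix.
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    r = int(np.sum(s > 1e-10 * s[0]))   # numerical rank -> basis size
    basis = U[:, :r]

    # Galerkin projection of a full-order linear problem A x = b.
    A = np.diag(np.linspace(1.0, 2.0, n_dof))
    b = snapshots[:, 0]                 # a right-hand side in the snapshot space
    A_r = basis.T @ A @ basis           # r x r reduced operator
    b_r = basis.T @ b
    x_r = np.linalg.solve(A_r, b_r)     # cheap reduced solve
    x_approx = basis @ x_r              # lift back to the full space
    ```

    The reduced solve costs O(r^3) instead of O(n_dof^3), which is the source of the computational savings the abstract reports; by Galerkin orthogonality the residual of the lifted solution vanishes in the reduced subspace.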

  1. XII Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, Thomas; Carminati, Federico; Werlen, Monique

    November 2008 will be a few months after the official start of LHC, when the highest particle energies ever produced by mankind will be observed by the most complex piece of scientific equipment ever built. LHC will open a new era in physics research and push the frontier of knowledge further. This achievement has been made possible by new technological developments in many fields, but computing is certainly the technology that has made this whole enterprise possible. Accelerator and detector design, construction management, data acquisition, detector monitoring, data analysis, event simulation and theoretical interpretation are all computing-based HEP activities, and they also occur in many other research fields. Computing is everywhere and forms the common link between all involved scientists and engineers. The ACAT workshop series, created back in 1990 as AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), has been covering the tremendous evolution of computing in its most advanced topics, trying to build bridges between computer science and experimental and theoretical physics. Conference web-site: http://acat2008.cern.ch/ Programme and presentations: http://indico.cern.ch/conferenceDisplay.py?confId=34666

  2. High resolution computed tomography of advanced composite and ceramic materials

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Klima, S. J.

    1991-01-01

    Advanced composite and ceramic materials are being developed for use in many new defense and commercial applications. In order to achieve the desired mechanical properties of these materials, the structural elements must be carefully analyzed and engineered. A study was conducted to evaluate the use of high resolution computed tomography (CT) as a macrostructural analysis tool for advanced composite and ceramic materials. Several samples were scanned using a laboratory high resolution CT scanner. Samples were also destructively analyzed at the locations of the scans and the nondestructive and destructive results were compared. The study provides useful information outlining the strengths and limitations of this technique and the prospects for further research in this area.

  3. Improvements in Thermal Protection Sizing Capabilities for TCAT: Conceptual Design for Advanced Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Olds, John R.; Izon, Stephen James

    2002-01-01

    The Thermal Calculation Analysis Tool (TCAT), originally developed in the Space Systems Design Lab at the Georgia Institute of Technology, is a conceptual design tool capable of integrating aeroheating analysis into conceptual reusable launch vehicle design. It provides Thermal Protection System (TPS) unit thicknesses and acreage percentages based on the geometry of the vehicle and a reference trajectory, to be used in calculation of the total cost and weight of the vehicle design. TCAT has proven to be reasonably accurate at calculating the TPS unit weights for in-flight trajectories; however, it does not have the capability of sizing TPS materials above cryogenic fuel tanks for ground hold operations. During ground hold operations, the vehicle is held for a brief period (generally about two hours) during which heat transfer from the TPS materials to the cryogenic fuel occurs. If too much heat is extracted from the TPS material, the surface temperature may fall below the freezing point of water, thereby freezing any condensation that may be present at the surface of the TPS. Condensation or ice on the surface of the vehicle is potentially hazardous to the mission and can also damage the TPS. It is questionable whether the TPS thicknesses provided by the aeroheating analysis would be sufficiently thick to insulate the surface of the TPS from the heat transfer to the fuel. Therefore, a design tool has been developed that is capable of sizing TPS materials at these cryogenic fuel tank locations to augment TCAT's TPS sizing capabilities.
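
    The ground-hold sizing question reduces to a 1-D transient conduction problem: march the temperature profile through the TPS thickness over the two-hour hold and check whether the outer surface stays above freezing. The explicit finite-difference sketch below uses illustrative material properties and boundary conditions, not TCAT's actual models:

    ```python
    import numpy as np

    def ground_hold_surface_temp(thickness, k=0.05, rho=100.0, cp=800.0,
                                 t_tank=90.0, t_air=300.0, h=10.0,
                                 hold_time=7200.0, n=50):
        """Explicit 1-D conduction through a TPS slab during ground hold.

        Inner face is held at the cryogenic tank temperature t_tank [K];
        outer face convects to ambient air at t_air [K] with coefficient h.
        Returns the outer-surface temperature [K] after hold_time seconds.
        All property values are illustrative placeholders.
        """
        dx = thickness / (n - 1)
        alpha = k / (rho * cp)
        dt = 0.4 * dx**2 / alpha          # margin below explicit stability limit
        T = np.full(n, t_air)
        T[0] = t_tank                     # inner face pinned to the tank wall
        for _ in range(int(hold_time / dt)):
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2])
            # convective boundary at the outer surface (cell energy balance)
            Tn[-1] = T[-1] + dt / (rho * cp * dx) * (
                h * (t_air - T[-1]) + k * (T[-2] - T[-1]) / dx)
            T = Tn
        return T[-1]

    surface = ground_hold_surface_temp(thickness=0.05)
    # For these illustrative numbers the 5 cm slab keeps the surface above
    # 273.15 K, i.e. no condensation freezing during the hold.
    ```

    Sweeping `thickness` until the returned surface temperature clears the freezing point is the sizing logic the augmented tool needs at the tank acreage.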

  4. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances from the area of computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, main results of the finished EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  5. Recent Advances in Computed Tomographic Technology: Cardiopulmonary Imaging Applications.

    PubMed

    Tabari, Azadeh; Lo Gullo, Roberto; Murugan, Venkatesh; Otrakji, Alexi; Digumarthy, Subba; Kalra, Mannudeep

    2017-03-01

    Cardiothoracic diseases result in substantial morbidity and mortality. Chest computed tomography (CT) has been an imaging modality of choice for assessing a host of chest diseases, and technologic advances have enabled the emergence of coronary CT angiography as a robust noninvasive test for cardiac imaging. Technologic developments in CT have also enabled the application of dual-energy CT scanning for assessing pulmonary vascular and neoplastic processes. Concerns over increasing radiation dose from CT scanning are being addressed with introduction of more dose-efficient wide-area detector arrays and iterative reconstruction techniques. This review article discusses the technologic innovations in CT and their effect on cardiothoracic applications.

  6. Advanced Computer Science on Internal Ballistics of Solid Rocket Motors

    NASA Astrophysics Data System (ADS)

    Shimada, Toru; Kato, Kazushige; Sekino, Nobuhiro; Tsuboi, Nobuyuki; Seike, Yoshio; Fukunaga, Mihoko; Daimon, Yu; Hasegawa, Hiroshi; Asakawa, Hiroya

    In this paper, we describe the development of a numerical simulation system, which we call “Advanced Computer Science on SRM Internal Ballistics (ACSSIB)”, for the purpose of improving the performance and reliability of solid rocket motors (SRM). The ACSSIB system consists of a casting simulation code for solid propellant slurry, a correlation database relating the local burning rate of cured propellant to local slurry flow characteristics, and a numerical code for the internal ballistics of SRM, as well as relevant hardware. This paper mainly describes the objectives and contents of this R&D and the output of fiscal year 2008.

  7. Advancing the Surveillance Capabilities of the Air Force’s Large-Aperture Telescopes

    DTIC Science & Technology

    2014-03-06

    added to these images to simulate the observed data. Since it is not fundamental, we do not include read noise from the detectors, assuming that... blind restorations for simulated data of a target of brightness mv=+2 as would be acquired with telescopes of 1 m (blue line), 1.6 m (magenta line... aperture diversity and blind deconvolution”, OSA topical meeting on Computational Optical Sensing and Imaging, Computational Imaging through Turbulence

  8. Computational methods of the Advanced Fluid Dynamics Model

    SciTech Connect

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.; Berthier, J.; Maudlin, P.J.; Schmuck, P.; Goutagny, L.; Ichikawa, S.; Ninokata, H.; Luck, L.B.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development.

  9. Advanced E-O test capability for Army Next-Generation Automated Test System (NGATS)

    NASA Astrophysics Data System (ADS)

    Errea, S.; Grigor, J.; King, D. F.; Matis, G.; McHugh, S.; McKechnie, J.; Nehring, B.

    2015-05-01

    The Future E-O (FEO) program was established to develop a flexible, modular, automated test capability as part of the Next Generation Automatic Test System (NGATS) program to support the test and diagnostic needs of currently fielded U.S. Army electro-optical (E-O) devices, as well as being expandable to address the requirements of future Navy, Marine Corps and Air Force E-O systems. Santa Barbara Infrared (SBIR) has designed, fabricated, and delivered three (3) prototype FEO systems for engineering and logistics evaluation prior to anticipated full-scale production beginning in 2016. In addition to presenting a detailed overview of the FEO system hardware design, features and testing capabilities, the integration of SBIR's EO-IR sensor and laser test software package, IRWindows 4™, into FEO to automate test execution, data collection and analysis, and the archiving and reporting of results is also described.

  10. Advancing Cybersecurity Capability Measurement Using the CERT®-RMM Maturity Indicator Level Scale

    DTIC Science & Technology

    2013-11-01

    Institute at permission@sei.cmu.edu. * These restrictions do not apply to U.S. government entities. CERT® and CMMI® are registered marks of Carnegie...Attributes 4 1.3.4 Appraisal and Scoring Methods 5 1.3.5 Improvement Roadmaps 5 2 Introducing the Maturity Indicator Level (MIL) Concept 6 2.1...CERT®-RMM v1.2) utilizes the maturity architecture (levels and descriptions) as provided in the Capability Maturity Model Integration (CMMI

  11. Computational ocean acoustics: Advances in 3D ocean acoustic modeling

    NASA Astrophysics Data System (ADS)

    Schmidt, Henrik; Jensen, Finn B.

    2012-11-01

    The numerical models of ocean acoustic propagation developed in the 1980s are still in widespread use today, and the field of computational ocean acoustics is often considered a mature field. However, the explosive increase in computational power available to the community has created opportunities for modeling phenomena that were earlier beyond reach. Most notably, three-dimensional propagation and scattering problems have been computationally prohibitive, but are now addressed routinely using brute-force numerical approaches such as the Finite Element Method, in particular for target scattering problems, where they are being combined with the traditional wave theory propagation models in hybrid modeling frameworks. Also, recent years have seen the development of hybrid approaches coupling oceanographic circulation models with acoustic propagation models, enabling the forecasting of sonar performance uncertainty in dynamic ocean environments. These and other advances made over the last couple of decades support the notion that the field of computational ocean acoustics is far from mature. [Work supported by the Office of Naval Research, Code 321OA].

  12. FTIR (Fourier Transform Infrared) spectrophotometry for thin film monitors: Computer and equipment integration for enhanced capabilities

    NASA Astrophysics Data System (ADS)

    Cox, J. N.; Sedayao, J.; Shergill, G.; Villasol, R.; Haaland, D. M.

    Fourier transform infrared spectrophotometry (FTIR) is a valuable technique for monitoring thin films used in semiconductor device manufacture. Determinations of the constituent contents in borophosphosilicate (BPSG), phosphosilicate (PSG), silicon oxynitride (SiON:H,OH), and spin-on-glass (SOG) thin films are a few applications. Due to the nature of the technique, FTIR instrumentation is one of the most extensively computer-dependent pieces of equipment that is likely to be found in a microelectronics plant. In the role of fab monitor or reactor characterization tool, FTIR instruments can rapidly generate large amounts of data. By linking a local FTIR data station to a remote minicomputer, its capabilities are greatly improved. We discuss three sources of enhancement. First, the FTIR in the fab area communicates and interacts in real time with the minicomputer: transferring data segments to it, instructing it to perform sophisticated processing, and returning the result to the operator in the fab. Characterizations of PSG thin films by this approach are discussed. Second, the spectra of large numbers of samples are processed locally. The large database is then transmitted to the minicomputer for study by statistical/graphics software. Results of CVD-reactor spatial profiling experiments for plasma SiON are presented. Third, processing of calibration spectra is performed on the minicomputer to optimize the accuracy and precision of a Partial Least Squares analysis model. This model is then transferred to the data station in the fab. The analysis of BPSG thin films is discussed in this regard. The prospects for fully automated at-line monitoring and for real-time, in-situ monitoring will be discussed.
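
    The calibration step described above (fit a Partial Least Squares model offline, then apply it to new spectra at the instrument) can be illustrated with scikit-learn's `PLSRegression` on synthetic two-component mixture spectra; the data below is a hypothetical stand-in for real FTIR calibration spectra, not the paper's dataset:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)

    # Synthetic calibration set: absorbance spectra that are linear mixtures
    # of two pure-component spectra plus a little noise (stand-ins for, e.g.,
    # two constituent contents in BPSG films).
    wavenumbers = 200
    pure = rng.random((2, wavenumbers))
    conc = rng.random((30, 2))                    # known constituent contents
    spectra = conc @ pure + 0.001 * rng.standard_normal((30, wavenumbers))

    # Fit the PLS calibration model (the minicomputer-side step) ...
    pls = PLSRegression(n_components=2)
    pls.fit(spectra, conc)

    # ... then predict constituent content for a new spectrum at the fab.
    new_conc = np.array([[0.3, 0.7]])
    new_spectrum = new_conc @ pure
    pred = pls.predict(new_spectrum)
    ```

    Because the model is just a set of fitted loadings and coefficients, it can be exported to the data station and applied there without rerunning the expensive calibration, which is the workflow the abstract describes.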

  13. FTIR (Fourier transform infrared) spectrophotometry for thin film monitors: Computer and equipment integration for enhanced capabilities

    SciTech Connect

    Cox, J.N.; Sedayao, J.; Shergill, G.; Villasol, R.; Haaland, D.M.

    1990-01-01

    Fourier transform infrared spectrophotometry (FTIR) is a valuable technique for monitoring thin films used in semiconductor device manufacture. Determinations of the constituent contents in borophosphosilicate (BPSG), phosphosilicate (PSG), silicon oxynitride (SiON:H,OH), and spin-on-glass (SOG) thin films are a few applications. Due to the nature of the technique, FTIR instrumentation is one of the most extensively computer-dependent pieces of equipment that is likely to be found in a microelectronics plant. In the role of fab monitor or reactor characterization tool, FTIR instruments can rapidly generate large amounts of data. By linking a local FTIR data station to a remote minicomputer, its capabilities are greatly improved. We discuss three sources of enhancement. First, the FTIR in the fab area communicates and interacts in real time with the minicomputer: transferring data segments to it, instructing it to perform sophisticated processing, and returning the result to the operator in the fab. Characterizations of PSG thin films by this approach are discussed. Second, the spectra of large numbers of samples are processed locally. The large database is then transmitted to the minicomputer for study by statistical/graphics software. Results of CVD-reactor spatial profiling experiments for plasma SiON are presented. Third, processing of calibration spectra is performed on the minicomputer to optimize the accuracy and precision of a Partial Least Squares analysis model. This model is then transferred to the data station in the fab. The analysis of BPSG thin films is discussed in this regard. The prospects for fully automated at-line monitoring and for real-time, in-situ monitoring will be discussed. 10 refs., 4 figs.

  14. Advanced fuel assembly characterization capabilities based on gamma tomography at the Halden boiling water reactor

    SciTech Connect

    Holcombe, S.; Eitrheim, K.; Svaerd, S. J.; Hallstadius, L.; Willman, C.

    2012-07-01

    Characterization of individual fuel rods using gamma spectroscopy is a standard part of the Post Irradiation Examinations performed on experimental fuel at the Halden Boiling Water Reactor. However, due to handling and radiological safety concerns, these measurements are presently carried out only at the end of life of the fuel, and not earlier than several days or weeks after its removal from the reactor core. In order to enhance the fuel characterization capabilities at the Halden facilities, a gamma tomography measurement system is now being constructed, capable of characterizing fuel assemblies on a rod-by-rod basis in a more timely and efficient manner. Gamma tomography for measuring nuclear fuel is based on gamma spectroscopy measurements and tomographic reconstruction techniques. The technique, previously demonstrated on irradiated commercial fuel assemblies, is capable of determining rod-by-rod information without the need to dismantle the fuel. The new gamma tomography system will be stationed close to the Halden reactor in order to limit the need for fuel transport, and it will significantly reduce the time required to perform fuel characterization measurements. Furthermore, it will allow rod-by-rod fuel characterization to occur between irradiation cycles, thus allowing for measurement of experimental fuel repeatedly during its irradiation lifetime. The development of the gamma tomography measurement system is a joint project between the Institute for Energy Technology - OECD Halden Reactor Project, Westinghouse (Sweden), and Uppsala University. (authors)

  15. Computational cardiology: how computer simulations could be used to develop new therapies and advance existing ones

    PubMed Central

    Trayanova, Natalia A.; O'Hara, Thomas; Bayer, Jason D.; Boyle, Patrick M.; McDowell, Kathleen S.; Constantino, Jason; Arevalo, Hermenegild J.; Hu, Yuxuan; Vadakkumpadan, Fijoy

    2012-01-01

    This article reviews the latest developments in computational cardiology. It focuses on the contribution of cardiac modelling to the development of new therapies as well as the advancement of existing ones for cardiac arrhythmias and pump dysfunction. Reviewed are cardiac modelling efforts aimed at advancing and optimizing existent therapies for cardiac disease (defibrillation, ablation of ventricular tachycardia, and cardiac resynchronization therapy) and at suggesting novel treatments, including novel molecular targets, as well as efforts to use cardiac models in stratification of patients likely to benefit from a given therapy, and the use of models in diagnostic procedures. PMID:23104919

  16. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    SciTech Connect

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Typically environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models of different complexity for various types of model analyses, where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and models are associated with uncertainties. The uncertainties are probabilistic (e.g., measurement errors) and non-probabilistic (unknowns, e.g., alternative conceptual models characterizing site conditions). To address all of these issues, we have developed ZEM, an integrated framework for real-time data and model analyses for environmental decision-making. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making, applying advanced decision-analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced optimization methods capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.
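
    The surrogate-modeling idea mentioned above (a cheap support-vector-regression stand-in for an expensive simulation) can be sketched as follows; the toy forward model and scikit-learn `SVR` settings are illustrative and unrelated to ZEM's Julia implementation:

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(2)

    # Stand-in for an expensive forward model (e.g., a transport simulation).
    def forward_model(x):
        return np.sin(3 * x) + 0.5 * x

    # Train a support-vector-regression surrogate on a handful of model runs.
    X_train = rng.uniform(0, 2, size=(40, 1))
    y_train = forward_model(X_train).ravel()
    surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X_train, y_train)

    # The surrogate now answers queries at negligible cost, e.g. inside an
    # uncertainty-quantification or decision-analysis loop.
    X_query = np.linspace(0, 2, 5).reshape(-1, 1)
    approx = surrogate.predict(X_query)
    exact = forward_model(X_query).ravel()
    max_err = float(np.max(np.abs(approx - exact)))
    ```

    Replacing thousands of forward-model evaluations with surrogate predictions is what makes the iterative, data-driven analyses in such a framework affordable.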

  17. An Advanced Neutronic Analysis Toolkit with Inline Monte Carlo capability for BHTR Analysis

    SciTech Connect

    William R. Martin; John C. Lee

    2009-12-30

    Monte Carlo capability has been combined with a production LWR lattice physics code to allow analysis of high temperature gas reactor configurations, accounting for the double heterogeneity due to the TRISO fuel. The Monte Carlo code MCNP5 has been used in conjunction with CPM3, which was the testbench lattice physics code for this project. MCNP5 is used to perform two calculations for the geometry of interest, one with homogenized fuel compacts and the other with heterogeneous fuel compacts, where the TRISO fuel kernels are resolved by MCNP5.

  18. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  19. Recent advances in computational mechanics of the human knee joint.

    PubMed

    Kazemi, M; Dabiri, Y; Li, L P

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  20. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  1. Review and revocation of access privileges distributed through capabilities. [object sharing computer systems design methodology

    NASA Technical Reports Server (NTRS)

    Gligor, V. D.

    1979-01-01

    The problems of review and revocation of access privileges are presented in the context of systems that use capabilities for the long-term distribution of access privileges. The approach to solving these two problems requires that a capability propagation graph be maintained in memory spaces associated with the subjects (e.g., domains or processes) that make copies of the respective capability; the graph remains inaccessible to those subjects, however. Parallel processes of the operating system update the graph as the system runs. It is noted that the most important application of the above mechanisms may prove to be the possibility of implementing a capability-based system in which the capability representation is short.
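
    A minimal sketch of the mechanism, assuming a simple parent-to-children propagation tree kept outside the subjects' reach (an illustration of the idea, not the paper's actual design):

    ```python
    class CapabilitySystem:
        """Toy capability copy tracking with recursive revocation."""

        def __init__(self):
            self._children = {}   # capability id -> ids of copies made from it
            self._valid = set()

        def grant(self, cap):
            self._valid.add(cap)
            self._children.setdefault(cap, [])

        def copy(self, cap, new_cap):
            if cap not in self._valid:
                raise PermissionError("cannot copy a revoked capability")
            self._valid.add(new_cap)
            self._children.setdefault(new_cap, [])
            self._children[cap].append(new_cap)   # record the propagation edge

        def revoke(self, cap):
            """Invalidate cap and, transitively, every copy derived from it."""
            stack = [cap]
            while stack:
                c = stack.pop()
                self._valid.discard(c)
                stack.extend(self._children.get(c, []))

        def check(self, cap):
            return cap in self._valid
    ```

    Revoking a mid-tree capability invalidates its whole subtree of copies while leaving the original grant intact, which is the review/revocation behavior the propagation graph exists to support.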

  2. Improving Interactive Capabilities in Computer-Assisted Instruction. Semi-Annual Technical Report for Six Months Ending July 31, 1973.

    ERIC Educational Resources Information Center

    Collins, Allan M.; And Others

    Developments in three areas relating to interactive capabilities on the SCHOLAR computer-assisted instruction (CAI) system are reported. The first section discusses the implementation of two presentation strategies in SCHOLAR -- the Tutorial mode and the Block-Test mode -- and offers a comparative evaluation of these two modes using high school…

  3. Linear interpolation of four-dimensional tabulated data for computers with single subscripted variable capability

    NASA Technical Reports Server (NTRS)

    Farr, W. R.

    1971-01-01

    Using only a one-dimensional subscripted variable, a FORTRAN computer subprogram was developed to linearly interpolate tabulated data of functions of four or fewer variables. The primary motivation was faster computation.
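
    The technique can be reconstructed as quadrilinear interpolation over a row-major flattened table, addressed through a single subscript as the FORTRAN constraint required; this Python sketch is an illustration, not the NASA subprogram itself:

    ```python
    import bisect

    def interp4(table, axes, point):
        """Quadrilinear interpolation in a 4-D table stored as a flat list.

        `axes` is a list of four ascending coordinate lists; `table` holds
        the function values in row-major order, indexed through a single
        subscript (last axis varies fastest).
        """
        sizes = [len(a) for a in axes]
        stride, s = [], 1
        for n in reversed(sizes):          # build row-major strides
            stride.insert(0, s)
            s *= n

        lo, frac = [], []
        for a, x in zip(axes, point):      # locate the enclosing cell per axis
            i = min(max(bisect.bisect_right(a, x) - 1, 0), len(a) - 2)
            lo.append(i)
            frac.append((x - a[i]) / (a[i + 1] - a[i]))

        result = 0.0
        for corner in range(16):           # the 2^4 corners of the cell
            idx, w = 0, 1.0
            for d in range(4):
                bit = (corner >> d) & 1
                idx += (lo[d] + bit) * stride[d]
                w *= frac[d] if bit else (1.0 - frac[d])
            result += w * table[idx]
        return result
    ```

    Flattening the table into one subscript trades multi-dimensional indexing for a handful of integer multiply-adds, which is the speed motivation the abstract cites.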

  4. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  5. Advanced 0.3-NA EUV lithography capabilities at the ALS

    SciTech Connect

    Naulleau, Patrick; Anderson, Erik; Dean, Kim; Denham, Paul; Goldberg, Kenneth A.; Hoef, Brian; Jackson, Keith

    2005-07-07

    For volume nanoelectronics production using Extreme ultraviolet (EUV) lithography [1] to become a reality around the year 2011, advanced EUV research tools are required today. Microfield exposure tools have played a vital role in the early development of EUV lithography [2-4], concentrating on numerical apertures (NA) of 0.2 and smaller. With EUV expected to enter production at the 32-nm node with NAs of 0.25, these early research tools can no longer provide relevant learning. To overcome this problem, a new generation of microfield exposure tools operating at an NA of 0.3 has been developed [5-8]. Like their predecessors, these tools trade off field size and speed for greatly reduced complexity. One of these tools is implemented at Lawrence Berkeley National Laboratory's Advanced Light Source synchrotron radiation facility. This tool gets around the problem of the intrinsically high coherence of the synchrotron source [9,10] by using an active illuminator scheme [11]. Here we describe recent printing results obtained from the Berkeley EUV exposure tool. Limited by the availability of ultra-high resolution chemically amplified resists, present resolution limits are approximately 32 nm for equal lines and spaces and 27 nm for semi-isolated lines.

  6. Recent Advances in Hydrogen Peroxide Propulsion Test Capability at NASA's Stennis Space Center E-Complex

    NASA Technical Reports Server (NTRS)

    Jacks, Thomas E.; Beisler, Michele

    2003-01-01

    In recent years, the rocket propulsion test capability at NASA's John C. Stennis Space Center's (SSC) E-Complex has been enhanced to include facilitization for hydrogen peroxide (H2O2) based ground testing. In particular, the E-3 test stand has conducted numerous test projects that have been reported in the open literature. These include combustion devices as simple as small-scale catalyst beds, and larger devices such as ablative thrust chambers and a flight-type engine (AR2-3). Consequently, the NASA SSC test engineering and operations knowledge base and infrastructure have grown considerably in order to conduct safe H2O2 test operations with a variety of test articles at the component and engine level. Currently, the E-Complex has a test requirement for a hydrogen peroxide based stage test. This new development, with its unique set of requirements, has motivated the facilitization for hydrogen peroxide propellant use at the E-2 Cell 2 test position in addition to E-3. Since the E-2 Cell 2 test position was not originally designed as a hydrogen peroxide test stand, a facility modernization-improvement project was planned and implemented in FY 2002-03 to enable this vertical engine test stand to accommodate H2O2. This paper discusses the ongoing enhancement of E-Complex ground test capability, specifically at the E-3 stand (Cell 1 and Cell 2) and E-2 Cell 2 stand, which affords current and future customers considerable test flexibility and operability in conducting their peroxide-based rocket R&D efforts.

  7. Advancing Unmanned Aircraft Sensor Collection and Communication Capabilities with Optical Communications

    NASA Astrophysics Data System (ADS)

    Lukaczyk, T.

    2015-12-01

    Unmanned aircraft systems (UAS) are now being used for monitoring climate change over both land and seas. Their uses include monitoring of cloud conditions and atmospheric composition of chemicals and aerosols due to pollution, dust storms, fires, volcanic activity and air-sea fluxes. Studies of carbon flux are important for ecosystem research in both marine and terrestrial environments and can be related to climate change dynamics. Many measurements are becoming more complex as additional sensors become small enough to operate on more widely available small UAS. These include interferometric radars as well as scanning and fan-beam lidar systems which produce data streams even greater than those of high resolution video. These can be used to precisely map surfaces of the earth, ocean or ice features that are important for a variety of earth system studies. As these additional sensor capabilities are added to UAS, the ability to transmit data back to ground or ship monitoring sites is limited by traditional wireless communication protocols. We describe results of tests of optical communication systems that provide significantly greater communication bandwidths for UAS, and discuss the bandwidth and effective range of these systems, as well as their power and weight requirements, both for systems on UAS and for ground-based receiver stations. We justify our additional use of Delay and Disruption Tolerant Networking (DTN) communication protocols with optical communication methods to ensure security and continuity of command and control operations. Finally, we discuss the implications for receiving, geo-referencing, archiving and displaying data streams from sensors communicated via optical communication to better enable real-time anomaly detection and adaptive sampling capabilities using multiple UAS or other unmanned or manned systems.

  8. Unified Instrumentation: Examining the Simultaneous Application of Advanced Measurement Techniques for Increased Wind Tunnel Testing Capability

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A. (Editor); Bartram, Scott M.; Humphreys, William M., Jr.; Jenkins, Luther N.; Jordan, Jeffrey D.; Lee, Joseph W.; Leighty, Bradley D.; Meyers, James F.; South, Bruce W.; Cavone, Angelo A.; Ingram, JoAnne L.

    2002-01-01

    A Unified Instrumentation Test examining the combined application of Pressure Sensitive Paint, Projection Moire Interferometry, Digital Particle Image Velocimetry, Doppler Global Velocimetry, and Acoustic Microphone Array has been conducted at the NASA Langley Research Center. The fundamental purposes of conducting the test were to: (a) identify and solve compatibility issues among the techniques that would inhibit their simultaneous application in a wind tunnel, and (b) demonstrate that simultaneous use of advanced instrumentation techniques is feasible for increasing tunnel efficiency and identifying control surface actuation / aerodynamic reaction phenomena. This paper provides summary descriptions of each measurement technique used during the Unified Instrumentation Test, their implementation for testing in a unified fashion, and example results identifying areas of instrument compatibility and incompatibility. Conclusions are drawn regarding the conditions under which the measurement techniques can be operated simultaneously on a non-interference basis. Finally, areas requiring improvement for successfully applying unified instrumentation in future wind tunnel tests are addressed.

  9. Development of Education Program for Okinawa Model Creative and Capable Engineers in Advanced Welding Technology

    NASA Astrophysics Data System (ADS)

    Manabe, Yukio; Matsue, Junji; Makishi, Takashi; Higa, Yoshikazu; Matsuda, Shoich

    Okinawa National College of Technology proposed “Educational Program for Practically Skilled Engineers in Advanced Welding Technology in Okinawa Style” to the Ministry of Economy, Trade and Industry and was adopted as a 2-year project starting from 2005. This project, designed to fit the regional characteristics of Okinawa, aims to develop the core human resources program that will help reinforce and innovate welding engineering in the manufacturing industries. In 2005, the education program and the original textbook were developed, and in 2006, a proof class was held to confirm the suitability and the effectiveness of the program and the textbook in order to improve the attendees' welding fundamentals and application skills. The results were quite positive. Also, by collaborating with the Japan Welding Society, points scored in this course were authorized as education points toward the IIW international welding engineer qualification.

  10. Trends in Human-Computer Interaction to Support Future Intelligence Analysis Capabilities

    DTIC Science & Technology

    2011-06-01

    Camera-based recognition Camera-based systems are now entering the market. One of the most visible examples is the newly launched Microsoft Kinect...Technology, which provides significant opportunities to better conduct military operations. In order to address the new context of military operations...well as better HCI capabilities to support collaboration and interaction with information. These enhanced capabilities must be provided both for

  11. Lightweight Tactical Client: A Capability-Based Approach to Command Post Computing

    DTIC Science & Technology

    2015-12-01

    definition of a thin client and propose a new term that is not bound to the thin client terminology. This report opens with two definitions of a thin...extracted from these definitions . These capabilities are paired with tactical capabilities incompatible with thin client architecture. A new term that...CONTENTS Page Introduction 1 Thin Client Definitions 1 U.S. Army Definition of Thick and Thin Clients (ref. 3) 1 Industry Definition of Thin

  12. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessment capability is described. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.
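
    NESSUS itself uses fast probability integration methods, but the basic quantity it targets, the probability that a stochastic load exceeds a stochastic resistance, can be illustrated with plain Monte Carlo sampling. The load and resistance distributions below are made up for the sketch, not taken from any NESSUS example.

```python
# Monte Carlo estimate of a component failure probability P(load >= resistance)
# for normally distributed load and resistance (illustrative parameters).
import random

def failure_probability(n=200_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rng.gauss(100.0, 15.0)        # applied stress, MPa
        resistance = rng.gauss(160.0, 20.0)  # material strength, MPa
        if load >= resistance:
            failures += 1
    return failures / n
```

    For these parameters the margin load minus resistance is itself normal with mean -60 and standard deviation 25, so the exact answer is about 0.0082; the sampled estimate should land close to that, which is the kind of cross-check a probabilistic code is validated against.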

  13. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    SciTech Connect

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction. 10 refs., 7 figs.

  14. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  15. Recent Advances with the AMPX Covariance Processing Capabilities in PUFF-IV

    SciTech Connect

    Wiarda, D.; Arbanas, G.; Leal, L.; Dunn, M.E.

    2008-12-15

    The program PUFF-IV is used to process resonance parameter covariance information given in ENDF/B File 32 and point-wise covariance matrices given in ENDF/B File 33 into group-averaged covariance matrices on a user-supplied group structure. For large resonance covariance matrices, found for example in 235U, the execution time of PUFF-IV can be quite long. Recently the code was modified to take advantage of Basic Linear Algebra Subprograms (BLAS) routines for the most time-consuming matrix multiplications. This led to a substantial decrease in execution time. This faster processing capability allowed us to investigate the conversion of File 32 data into File 33 data using a larger number of user-defined groups. While conversion substantially reduces the ENDF/B file size requirements for evaluations with a large number of resonances, a trade-off is made between the number of groups used to represent the resonance parameter covariance as a point-wise covariance matrix and the file size. We are also investigating a hybrid version of the conversion, in which the low-energy part of the File 32 resonance parameter covariance matrix is retained and the correlations with higher energies as well as the high energy part are given in File 33.
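
    The point-wise-to-multigroup step amounts to sandwiching the fine covariance matrix between averaging operators. The sketch below shows the simplest unweighted collapse; PUFF-IV's actual processing uses flux weighting and ENDF-specific bookkeeping, so treat this purely as an illustration of the matrix operation.

```python
# Collapse an n x n point-wise covariance matrix onto a coarse group
# structure by uniform averaging over the points in each group.
def collapse_covariance(cov, groups):
    """cov: n x n covariance matrix (list of lists);
    groups: list of lists of point indices making up each coarse group."""
    g = len(groups)
    out = [[0.0] * g for _ in range(g)]
    for a, ga in enumerate(groups):
        for b, gb in enumerate(groups):
            s = sum(cov[i][j] for i in ga for j in gb)
            out[a][b] = s / (len(ga) * len(gb))   # average over the block
    return out
```

    The trade-off the abstract mentions is visible here: more groups preserve more of the fine correlation structure but make the stored matrix, and the file, quadratically larger.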

  16. Recent Advances with the AMPX Covariance Processing Capabilities in PUFF-IV

    SciTech Connect

    Wiarda, Dorothea; Arbanas, Goran; Leal, Luiz C; Dunn, Michael E

    2008-01-01

    The program PUFF-IV is used to process resonance parameter covariance information given in ENDF/B File 32 and point-wise covariance matrices given in ENDF/B File 33 into group-averaged covariance matrices on a user-supplied group structure. For large resonance covariance matrices, found for example in 235U, the execution time of PUFF-IV can be quite long. Recently the code was modified to take advantage of Basic Linear Algebra Subprograms (BLAS) routines for the most time-consuming matrix multiplications. This led to a substantial decrease in execution time. This faster processing capability allowed us to investigate the conversion of File 32 data into File 33 data using a larger number of user-defined groups. While conversion substantially reduces the ENDF/B file size requirements for evaluations with a large number of resonances, a trade-off is made between the number of groups used to represent the resonance parameter covariance as a point-wise covariance matrix and the file size. We are also investigating a hybrid version of the conversion, in which the low-energy part of the File 32 resonance parameter covariance matrix is retained and the correlations with higher energies as well as the high energy part are given in File 33.

  17. Design Standards for Instructional Computer Programs. Physical Processes in Terrestrial and Aquatic Ecosystems, Computer Programs and Graphics Capabilities.

    ERIC Educational Resources Information Center

    Gales, Larry

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. The report describes design standards for the computer programs. They are designed to be…

  18. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonate vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  19. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life-or-death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20] = 0.83-0.95 and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD in which General Wishes and QoL (but not Specific Wishes) statements remain consistent over time.
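
    The internal-consistency index reported here, KR-20, applies to dichotomous (yes/no) items. The standard formula is shown below on made-up response data, not the study's; the convention of using the population variance of total scores is assumed.

```python
# Kuder-Richardson formula 20: internal consistency of a scale of 0/1 items.
# KR-20 = (k/(k-1)) * (1 - sum(p_i * q_i) / var(total scores))
def kr20(responses):
    """responses: list of subjects, each a list of 0/1 item scores."""
    n = len(responses)
    k = len(responses[0])
    totals = [sum(r) for r in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n   # population variance
    pq = 0.0
    for i in range(k):
        p = sum(r[i] for r in responses) / n             # proportion answering 1
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_t)
```

    Perfectly parallel items (every subject answers all items the same way) yield KR-20 = 1.0, the upper end of the 0.83-0.95 range the study reports.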

  20. The commissioning of the advanced radiographic capability laser system: experimental and modeling results at the main laser output

    NASA Astrophysics Data System (ADS)

    Di Nicola, J. M.; Yang, S. T.; Boley, C. D.; Crane, J. K.; Heebner, J. E.; Spinka, T. M.; Arnold, P.; Barty, C. P. J.; Bowers, M. W.; Budge, T. S.; Christensen, K.; Dawson, J. W.; Erbert, G.; Feigenbaum, E.; Guss, G.; Haefner, C.; Hermann, M. R.; Homoelle, D.; Jarboe, J. A.; Lawson, J. K.; Lowe-Webb, R.; McCandless, K.; McHale, B.; Pelz, L. J.; Pham, P. P.; Prantil, M. A.; Rehak, M. L.; Rever, M. A.; Rushford, M. C.; Sacks, R. A.; Shaw, M.; Smauley, D.; Smith, L. K.; Speck, R.; Tietbohl, G.; Wegner, P. J.; Widmayer, C.

    2015-02-01

    The National Ignition Facility (NIF) at Lawrence Livermore National Laboratory is a first-of-a-kind megajoule-class laser with 192 beams capable of delivering over 1.8 MJ and 500 TW of 351 nm light [1], [2]. It has been commissioned and operated since 2009 to support a wide range of missions including the study of inertial confinement fusion, high energy density physics, material science, and laboratory astrophysics. In order to advance our understanding, and enable short-pulse multi-frame radiographic experiments of dense cores of cold material, the generation of very hard x-rays above 50 keV is necessary. X-rays with such characteristics can be efficiently generated with high intensity laser pulses above 10^17 W/cm² [3]. The Advanced Radiographic Capability (ARC) [4] which is currently being commissioned on the NIF will provide eight, 1 ps to 50 ps, adjustable pulses with up to 1.7 kJ each to create x-ray point sources enabling dynamic, multi-frame x-ray backlighting. This paper will provide an overview of the ARC system and report on the laser performance tests conducted with a stretched-pulse up to the main laser output and their comparison with the results of our laser propagation codes.

  1. Advancement of a 30 kW Solar Electric Propulsion System Capability for NASA Human and Robotic Exploration Missions

    NASA Technical Reports Server (NTRS)

    Smith, Bryan K.; Nazario, Margaret L.; Manzella, David H.

    2012-01-01

    Solar Electric Propulsion has evolved into a demonstrated operational capability performing station keeping for geosynchronous satellites, enabling challenging deep-space science missions, and assisting in the transfer of satellites from an elliptical Geostationary Transfer Orbit (GTO) to a Geostationary Earth Orbit (GEO). Advancing higher power SEP systems will enable numerous future applications for human, robotic, and commercial missions. These missions are enabled by either the increased performance of the SEP system or by the cost reductions when compared to conventional chemical propulsion systems. Higher power SEP systems that provide very high payload for robotic missions also trade favorably for the advancement of human exploration beyond low Earth orbit. Demonstrated reliable systems are required for human space flight, and due to their successful present-day widespread use and inherent high reliability, SEP systems have progressively become a viable entrant into these future human exploration architectures. NASA studies have identified a 30 kW-class SEP capability as the next appropriate evolutionary step, applicable to a wide range of both human and robotic missions. This paper describes the planning options, mission applications, and technology investments for representative 30 kW-class SEP mission concepts under consideration by NASA.

  2. Non-Intrusive Device for Real-Time Circulatory System Assessment with Advanced Signal Processing Capabilities

    NASA Astrophysics Data System (ADS)

    Pinheiro, E.; Postolache, O.; Girão, P.

    2010-01-01

    This paper presents a device that uses three cardiography signals to characterize several important parameters of a subject's circulatory system. Using electrocardiogram, finger photoplethysmogram, and ballistocardiogram, three heart rate estimates are acquired from beat-to-beat time interval extraction. Furthermore, pre-ejection period, pulse transit time (PTT), and pulse arrival time (PAT) are computed, and their long-term evolution is analyzed. The system estimates heart rate variability (HRV) and blood pressure variability (BPV) from the heart rate and PAT time series, to infer the activity of the cardiac autonomic system. The software component of the device evaluates the frequency content of HRV and BPV, and also their fractal dimension and entropy, thus providing a detailed analysis of the time series' regularity and complexity evolution, to allow personalized subject evaluation.
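
    Beat-to-beat interval extraction like the device performs feeds directly into the standard time-domain HRV statistics. The sketch below computes mean heart rate plus the two most common indices, SDNN and RMSSD, from a synthetic RR-interval series; the numbers are illustrative, not from the paper, and the frequency-domain and fractal analyses the device also performs are not shown.

```python
# Time-domain HRV summary from a series of beat-to-beat (RR) intervals.
from math import sqrt

def hrv_stats(rr_ms):
    """Return mean heart rate (bpm), SDNN (ms), and RMSSD (ms)
    for a list of RR intervals given in milliseconds."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    hr = 60_000.0 / mean_rr                                   # beats per minute
    sdnn = sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / n)   # overall variability
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = sqrt(sum(d * d for d in diffs) / len(diffs))      # beat-to-beat variability
    return hr, sdnn, rmssd
```

    The same interval-series treatment applies to the PAT time series for blood pressure variability, which is how the device infers cardiac autonomic activity from two synchronized signals.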

  3. TerraSAR Advancements & Next Generation-Mission Capabilities Supporting GMES/Copernicus

    NASA Astrophysics Data System (ADS)

    Bach, Katja; Schrage, Thomas; Janoth, Jurgen; Tinz, Marek; Thiergan, Christian

    2013-12-01

    This paper addresses the continuous evolution of the TerraSAR-X Mission in the context of Copernicus, previously known as GMES. From first data contracts starting in 2009, the TerraSAR-X GMES Contributing Mission (TSX-GCM) has become closely integrated with ESA's Coordinated Data Access System (CDS). TSX-GCM has continuously been working on improving data access for Copernicus users in response to new requirements on timeliness and data products: The TerraSAR ground station network has been upgraded to include Svalbard as a receiving station, and the product portfolio for TerraSAR-X has been enhanced with two new operational imaging modes, a Staring Spotlight and a Wide ScanSAR Mode. The planned TerraSAR Next Generation (TerraSAR-NG) System guarantees TerraSAR-X data and service continuity and provides advanced very high-resolution products to the user community. A partnership model, “WorldSAR”, is envisioned, where partners can participate through co-investment, subscription, and ownership of additional satellites operated in constellation.

  4. An ALS handbook: A summary of the capabilities and characteristics of the advanced light source

    SciTech Connect

    Not Available

    1989-04-01

    This booklet aims to provide the prospective user of the Advanced Light Source with a concise description of the radiation a researcher might expect at his or her experimental station. The focus is therefore on the characteristics of the light that emerges from insertion devices and bending magnets and on how components of the beam lines further alter the properties of the radiation. The few specifications and operating parameters of the ALS storage ring that are of interest are those that directly determine the radiation characteristics. Sections 4 and 5 are primarily devoted to summary presentations, by means of performance plots and tabular compilations, of radiation characteristics at the ALS--spectral brightness, flux, coherent power, resolution, etc.--assuming a representative set of three undulators and one wiggler and a corresponding set of four beam lines. As a complement to these performance summaries, Section 1 is a general introductory discussion of synchrotron radiation and the ALS, and Section 2 discusses the properties of the stored electron beam that affect the radiation. Section 3 then provides an introduction to the characteristics of synchrotron radiation from bending magnets, wigglers, and undulators. In addition, Section 5 briefly introduces the theory of diffraction-grating and crystal monochromators. As compared with previous editions of this booklet, the performance plots and tabular compilations of the ALS radiation characteristics are now based on conservative engineering designs rather than preliminary physics designs.

  5. Geared rotor dynamic methodologies for advancing prognostic modeling capabilities in rotary-wing transmission systems

    NASA Astrophysics Data System (ADS)

    Stringer, David Blake

    The overarching objective in this research is the development of a robust, physics-based rotor dynamic model of a helicopter drive train as a foundation for prognostic modeling of rotary-wing transmissions. Rotorcraft rely on the integrity of their drive trains for their airworthiness. Drive trains rely on gear technology for their integrity and function. Gears alter the vibration characteristics of a mechanical system and significantly contribute to noise, component fatigue, and personal discomfort prevalent in rotorcraft. This research effort develops methodologies for generating a rotor dynamic model of a rotary-wing transmission based on first principles, through (i) development of a three-dimensional gear-mesh stiffness model for helical and spur gears and integration of this model in a finite element rotor dynamic model, (ii) linear and nonlinear analyses of a geared system for comparison and validation of the gear-mesh model, (iii) development of a modal synthesis technique for potentially providing model reduction and faster analysis capabilities for geared systems, and (iv) extension of the gear-mesh model to bevel and epicyclic configurations. In addition to model construction and validation, faults indigenous to geared systems are presented and discussed. Two faults are selected for analysis and seeded into the transmission model. Diagnostic vibration parameters are presented and used as damage indicators in the analysis. The fault models produce results consistent with damage experienced during experimental testing. The results of this research demonstrate the robustness of the physics-based approach in simulating multiple normal and abnormal conditions. The advantages of this physics-based approach, when combined with contemporary probabilistic and time-series techniques, provide a useful method for improving health monitoring technologies in mechanical systems.
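
    The core idea of a gear-mesh stiffness model, a stiffness that couples two rotors along the line of action, has a closed form in the simplest two-inertia torsional case, which any finite element gear-mesh model such as the one described must reproduce as a limiting check. Parameter names and values below are illustrative, not from the thesis.

```python
# Two inertias (J1, J2) coupled by a gear mesh of stiffness k_mesh acting
# at base radii r1, r2. The stiffness matrix
#   k_mesh * [[r1^2, -r1*r2], [-r1*r2, r2^2]]
# has one zero eigenvalue (rigid-body rotation) and one elastic mode.
from math import sqrt, isclose

def geared_natural_frequency(J1, J2, r1, r2, k_mesh):
    """Elastic torsional natural frequency (rad/s) of the 2-DOF geared pair."""
    return sqrt(k_mesh * (r1 ** 2 / J1 + r2 ** 2 / J2))
```

    Because one mode is rigid, the nonzero eigenvalue of the mass-normalized stiffness matrix equals its trace, k_mesh*(r1²/J1 + r2²/J2), which is the identity the function encodes.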

  6. Using advanced computer vision algorithms on small mobile robots

    NASA Astrophysics Data System (ADS)

    Kogut, G.; Birchmore, F.; Biagtan Pacis, E.; Everett, H. R.

    2006-05-01

    The Technology Transfer project employs a spiral development process to enhance the functionality and autonomy of mobile robot systems in the Joint Robotics Program (JRP) Robotic Systems Pool by converging existing component technologies onto a transition platform for optimization. An example of this approach is the implementation of advanced computer vision algorithms on small mobile robots. We demonstrate the implementation and testing of the following two algorithms useful on mobile robots: 1) object classification using a boosted cascade of classifiers trained with the AdaBoost training algorithm, and 2) human presence detection from a moving platform. Object classification is performed with an AdaBoost training system developed at the University of California, San Diego (UCSD) Computer Vision Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working towards a solution to increase the robustness of this system to perform generic object recognition, this paper demonstrates an extension to this application by detecting soda cans in a cluttered indoor environment. The human presence detection from a moving platform system uses a data fusion algorithm which combines results from a scanning laser and a thermal imager. The system is able to detect the presence of humans while both the humans and the robot are moving simultaneously. In both systems, the two aforementioned algorithms were implemented on embedded hardware and optimized for use in real-time. Test results are shown for a variety of environments.
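
    The boosting loop at the heart of a cascade like the one described fits in a few lines. Below is a toy AdaBoost over 1-D threshold stumps on synthetic data; it illustrates the weighting-and-reweighting mechanism only, and has nothing to do with the UCSD cascade's actual Haar-like features or training set.

```python
# Toy AdaBoost: combine weak threshold "stumps" into a strong classifier
# by repeatedly upweighting the examples the current stump gets wrong.
from math import log, exp

def adaboost_train(xs, ys, rounds=10):
    """xs: 1-D features; ys in {-1,+1}. Returns (threshold, polarity, alpha) stumps."""
    n = len(xs)
    w = [1.0 / n] * n
    stumps = []
    for _ in range(rounds):
        best = None
        for t in xs:                                  # candidate thresholds
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if (pol if x >= t else -pol) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-12)                         # avoid log(0) on perfect stumps
        alpha = 0.5 * log((1 - err) / err)            # stump weight in the ensemble
        stumps.append((t, pol, alpha))
        w = [wi * exp(-alpha * y * (pol if x >= t else -pol))
             for wi, x, y in zip(w, xs, ys)]          # boost the misclassified examples
        s = sum(w)
        w = [wi / s for wi in w]
    return stumps

def adaboost_predict(stumps, x):
    score = sum(alpha * (pol if x >= t else -pol) for t, pol, alpha in stumps)
    return 1 if score >= 0 else -1
```

    A real detection cascade chains many such boosted classifiers of increasing cost so that most image windows are rejected cheaply by the early stages.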

  7. Advances in the computational study of language acquisition.

    PubMed

    Brent, M R

    1996-01-01

    This paper provides a tutorial introduction to computational studies of how children learn their native languages. Its aim is to make recent advances accessible to the broader research community, and to place them in the context of current theoretical issues. The first section locates computational studies and behavioral studies within a common theoretical framework. The next two sections review two papers that appear in this volume: one on learning the meanings of words and one on learning the sounds of words. The following section highlights an idea which emerges independently in these two papers and which I have dubbed autonomous bootstrapping. Classical bootstrapping hypotheses propose that children begin to get a toehold in a particular linguistic domain, such as syntax, by exploiting information from another domain, such as semantics. Autonomous bootstrapping complements the cross-domain acquisition strategies of classical bootstrapping with strategies that apply within a single domain. Autonomous bootstrapping strategies work by representing partial and/or uncertain linguistic knowledge and using it to analyze the input. The next two sections review two more contributions to this special issue: one on learning word meanings via selectional preferences and one on algorithms for setting grammatical parameters. The final section suggests directions for future research.

  8. The Need for Technology Maturity of Any Advanced Capability to Achieve Better Life Cycle Cost (LCC)

    NASA Technical Reports Server (NTRS)

    Robinson, John W.; Levack, Daniel J. H.; Rhodes, Russel E.; Chen, Timothy T.

    2009-01-01

    Programs such as space transportation systems are developed and deployed only rarely, and they have long development schedules and large development and life cycle costs (LCC). They have not historically had their LCC predicted well and have only had an effort to control the DDT&E phase of the programs. One of the factors driving the predictability, and thus control, of the LCC of a program is the maturity of the technologies incorporated in the program. If the technologies incorporated are less mature (as measured by their Technology Readiness Level - TRL), then the LCC not only increases but the degree of increase is difficult to predict. Consequently, new programs avoid incorporating technologies unless they are quite mature, generally TRL greater than or equal to 7 (system prototype demonstrated in a space environment) to allow better predictability of the DDT&E phase costs unless there is no alternative. On the other hand, technology development programs rarely develop technologies beyond TRL 6 (system/subsystem model or prototype demonstrated in a relevant environment). Currently the lack of development funds beyond TRL 6 and the major funding required for full scale development leave little or no funding available to prototype TRL 6 concepts so that hardware would be in the ready mode for safe, reliable and cost effective incorporation. The net effect is that each new program either incorporates little new technology or has longer development schedules and costs, and higher LCC, than planned. This paper presents methods to ensure that advanced technologies are incorporated into future programs while providing a greater accuracy of predicting their LCC. One method is having a dedicated organization to develop X-series vehicles or separate prototypes carried on other vehicles. The question of whether such an organization should be independent of NASA and/or have an independent funding source is discussed. Other methods are also discussed. How to make the

  9. Experimental and computing strategies in advanced material characterization problems

    SciTech Connect

    Bolzon, G.

    2015-10-28

    The mechanical characterization of materials increasingly relies on sophisticated experimental methods that permit the acquisition of large amounts of data while reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools that elaborate data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of laboratory or in situ tests by sophisticated and usually expensive non-linear analyses while, in some situations, reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers, with the aim of fostering further interaction between the engineering and mathematical communities.
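One common decomposition-based reduction of the kind mentioned above is proper orthogonal decomposition (POD): a truncated SVD of a matrix of full-field "snapshots" yields a small basis onto which new measurements can be projected and filtered. The sketch below is a generic illustration with synthetic data, not the author's actual procedure; all names are assumptions.

```python
import numpy as np

def pod_basis(snapshots, tol=1e-8):
    """Reduced basis from the truncated SVD of a snapshot matrix
    (proper orthogonal decomposition); each column is a full-field sample."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))      # number of retained modes
    return U[:, :r]

def project(basis, field):
    """Compress a full field to r modal coefficients and reconstruct it."""
    coeffs = basis.T @ field             # r numbers instead of the full field
    return coeffs, basis @ coeffs

# Synthetic example: 40 "measurements" of a 500-point field spanned by 3 modes.
rng = np.random.default_rng(0)
modes = rng.standard_normal((500, 3))
snapshots = modes @ rng.standard_normal((3, 40))
basis = pod_basis(snapshots)             # recovers a 3-dimensional basis
field = modes @ np.array([1.0, -2.0, 0.5])
coeffs, recon = project(basis, field)
```

Because any field in the span of the snapshots is reproduced exactly from a handful of coefficients, repeated inverse-analysis evaluations can work with the reduced coordinates instead of the full nonlinear model.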

  10. Developing a Diagnosis System of Work-Related Capabilities for Students: A Computer-Assisted Assessment

    ERIC Educational Resources Information Center

    Liao, C. H.; Yang, M. H.; Yang, B. C.

    2013-01-01

    A gap exists between students' employment needs and higher education offerings. Thus, developing the capability to meet the learning needs of students in supporting their future aspirations should be facilitated. To bridge this gap in practice, this study uses multiple methods (i.e., nominal group technique and instructional systems development)…

  11. New European Training Network to Improve Young Scientists' Capabilities in Computational Wave Propagation

    NASA Astrophysics Data System (ADS)

    Igel, Heiner

    2004-07-01

    The European Commission recently funded a Marie-Curie Research Training Network (MCRTN) in the field of computational seismology within the 6th Framework Program. SPICE (Seismic wave Propagation and Imaging in Complex media: a European network) is coordinated by the computational seismology group of the Ludwig-Maximilians-Universität in Munich linking 14 European research institutions in total. The 4-year project will provide funding for 14 Ph.D. students (3-year projects) and 14 postdoctoral positions (2-year projects) within the various fields of computational seismology. These positions have been advertised and are currently being filled.

  12. Advanced entry guidance algorithm with landing footprint computation

    NASA Astrophysics Data System (ADS)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  13. A digital computer propulsion control facility: Description of capabilities and summary of experimental program results

    NASA Technical Reports Server (NTRS)

    Zeller, J. R.; Arpasi, D. J.; Lehtinen, B.

    1976-01-01

    Flight weight digital computers are being used today to carry out many of the propulsion system control functions previously delegated exclusively to hydromechanical controllers. An operational digital computer facility for propulsion control mode studies has been used successfully in several experimental programs. This paper describes the system and some of the results concerned with engine control, inlet control, and inlet engine integrated control. Analytical designs for the digital propulsion control modes include both classical and modern/optimal techniques.

  14. Design and Integration of a Driving Simulator With Eye-Tracking Capabilities in the Computer Assisted Rehabilitation Environment (CAREN)

    DTIC Science & Technology

    2014-07-09

    CAREN Driving Simulation 1 Naval Health Research Center Design and Integration of a Driving Simulator With Eye-Tracking Capabilities in...California 92106-3521 CAREN Driving Simulation 2 INTRODUCTION The Computer Assisted Rehabilitation Environment (CAREN; Motek Medical BV, Amsterdam...activate events, and record information. The ideal driving simulator for NHRC would include a variety of easily modified road courses, and it would

  15. Computer-Aided Detection of Rapid, Overt, Airborne, Reconnaissance Data with the Capability of Removing Oceanic Noises

    DTIC Science & Technology

    2013-12-01

    have been three times more attacks to naval ships using sea mines than all other forms combined. Sea mines have always been viewed upon as underhanded...and unchivalrous, yet they provide a weaker navy the capability to stall and damage a vastly superior navy. Utilizing unmanned sensors to detect sea ...mines is the goal of the navy for the future. Computer-aided detection (CAD) of sea mines is much faster and more consistent than a human operator

  16. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    PROJECT OBJECTIVE: The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications.

    PROJECT TASKS: The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: to engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, to further refine them to both miniaturize and integrate their functionality to increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  17. Department of Defense Use of Commercial Cloud Computing Capabilities and Services

    DTIC Science & Technology

    2015-11-01

    provider. (NIST, 2011) Consumers acquire capabilities such as server time and network space through a web -based control panel or Application Programming...any location via a simple web -based access point. (The Open Group, 2013) A popular example of this characteristic is a consumer’s ability to access... web -based email, such as Gmail and Yahoo, from any device. Administrators can also access and provision cloud resources from outside a specialized

  18. Spatial resolution measurements of the advanced radiographic capability x-ray imaging system at energies relevant to Compton radiography

    NASA Astrophysics Data System (ADS)

    Hall, G. N.; Izumi, N.; Landen, O. L.; Tommasini, R.; Holder, J. P.; Hargrove, D.; Bradley, D. K.; Lumbard, A.; Cruz, J. G.; Piston, K.; Lee, J. J.; Romano, E.; Bell, P. M.; Carpenter, A. C.; Palmer, N. E.; Felker, B.; Rekow, V.; Allen, F. V.

    2016-11-01

    Compton radiography provides a means to measure the integrity, ρR and symmetry of the DT fuel in an inertial confinement fusion implosion near peak compression. Upcoming experiments at the National Ignition Facility will use the ARC (Advanced Radiography Capability) laser to drive backlighter sources for Compton radiography experiments and will use the newly commissioned AXIS (ARC X-ray Imaging System) instrument as the detector. AXIS uses a dual-MCP (micro-channel plate) to provide gating and high DQE at the 40-200 keV x-ray range required for Compton radiography, but introduces many effects that contribute to the spatial resolution. Experiments were performed at energies relevant to Compton radiography to begin characterization of the spatial resolution of the AXIS diagnostic.

  19. Spatial resolution measurements of the advanced radiographic capability x-ray imaging system at energies relevant to Compton radiography.

    PubMed

    Hall, G N; Izumi, N; Landen, O L; Tommasini, R; Holder, J P; Hargrove, D; Bradley, D K; Lumbard, A; Cruz, J G; Piston, K; Lee, J J; Romano, E; Bell, P M; Carpenter, A C; Palmer, N E; Felker, B; Rekow, V; Allen, F V

    2016-11-01

    Compton radiography provides a means to measure the integrity, ρR and symmetry of the DT fuel in an inertial confinement fusion implosion near peak compression. Upcoming experiments at the National Ignition Facility will use the ARC (Advanced Radiography Capability) laser to drive backlighter sources for Compton radiography experiments and will use the newly commissioned AXIS (ARC X-ray Imaging System) instrument as the detector. AXIS uses a dual-MCP (micro-channel plate) to provide gating and high DQE at the 40-200 keV x-ray range required for Compton radiography, but introduces many effects that contribute to the spatial resolution. Experiments were performed at energies relevant to Compton radiography to begin characterization of the spatial resolution of the AXIS diagnostic.

  20. Computational fluid dynamics capability for the solid-fuel ramjet projectile

    NASA Astrophysics Data System (ADS)

    Nusca, Michael J.; Chakravarthy, Sukumar R.; Goldberg, Uriel C.

    1990-06-01

    A computational fluid dynamics solution of the Navier-Stokes equations has been applied to the internal and external flow of inert solid-fuel ramjet projectiles. Computational modeling reveals internal flowfield details not attainable by flight or wind tunnel measurements, thus contributing to the current investigation into the flight performance of solid-fuel ramjet projectiles. The present code employs numerical algorithms termed total variation diminishing (TVD). Computational solutions indicate the importance of several special features of the code, including the zonal grid framework, the TVD scheme, and a recently developed backflow turbulence model. The solutions are compared with results of internal surface pressure measurements. As demonstrated by these comparisons, the use of a backflow turbulence model distinguishes between satisfactory and poor flowfield predictions.
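The TVD property named above can be illustrated on the simplest possible case, linear advection: a slope-limited upwind scheme keeps the total variation of the solution from growing, so no spurious oscillations appear near discontinuities. This is a minimal minmod-limited MUSCL sketch, not the paper's actual solver; all names are illustrative.

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: the less steep slope, zero at extrema."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect_tvd(u, c, steps):
    """Advance u_t + a*u_x = 0 on a periodic grid with a minmod-limited
    MUSCL scheme; c = a*dt/dx is the CFL number, assumed 0 < c <= 1."""
    for _ in range(steps):
        s = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)  # limited cell slopes
        ul = u + 0.5 * (1 - c) * s        # reconstructed value at each right face
        flux = c * ul                     # upwind numerical flux (a > 0)
        u = u - (flux - np.roll(flux, 1)) # conservative update
    return u

def total_variation(u):
    return np.sum(np.abs(np.roll(u, -1) - u))
```

Advecting a square wave with this scheme leaves its total variation bounded by the initial value, which is exactly the guarantee the TVD label promises.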

  1. Parallel computing structures capable of flexible associations and recognition of fuzzy inputs

    NASA Astrophysics Data System (ADS)

    Hogg, T.; Huberman, B. A.

    1985-10-01

    We experimentally show that computing with attractors leads to fast adaptive behavior in which dynamical associations can be made between different inputs which initially produce sharply distinct outputs. We do so by first defining a set of simple local procedures which allow a computing array to change its state in time so as to produce classical Pavlovian conditioning. We then examine the dynamics of coalescence and dissociation of attractors with a number of quantitative experiments. We also show how such arrays exhibit generalization and differentiation of inputs in their behavior.
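Computing with attractors in this sense can be sketched with a small Hopfield-style network (an illustration of the general idea, not the authors' array): Hebbian learning turns stored patterns into attractors, and a "fuzzy" input with flipped bits relaxes to the nearest stored pattern.

```python
import numpy as np

def hopfield_train(patterns):
    """Hebbian weight matrix; each stored row pattern becomes an attractor."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)         # no self-coupling
    return W

def hopfield_recall(W, state, steps=20):
    """Synchronous updates relax a noisy state toward a stored attractor."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break                     # fixed point (attractor) reached
        state = new
    return state

# Two orthogonal 32-bit patterns; corrupt the first in three positions.
p1 = np.array([1] * 16 + [-1] * 16)
p2 = np.tile([1, -1], 16)
W = hopfield_train(np.stack([p1, p2]))
noisy = p1.copy()
noisy[[0, 5, 11]] *= -1              # the "fuzzy" input
```

Recalling from `noisy` recovers `p1` exactly, which is the kind of generalization over imperfect inputs the abstract describes.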

  2. Integrated Computational Materials Engineering of Titanium: Current Capabilities Being Developed Under the Metals Affordability Initiative

    NASA Astrophysics Data System (ADS)

    Glavicic, M. G.; Venkatesh, V.

    2014-07-01

    A technical review of the titanium model development programs currently funded under the Metals Affordability Initiative is presented. Progress of the "Advanced Titanium Alloy Microstructure and Mechanical Property Modeling" and "ICME of Microtexture Evolution and its Effect on Cold Dwell/High/Low Cycle Fatigue Behavior of Dual Phase Titanium Alloys" will be reviewed followed by a discussion of the future modeling needs of the aerospace industry.

  3. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types with specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
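The composition measure described above amounts to binning voxels by Hounsfield Unit windows and reporting the fraction of the muscle volume in each bin. The sketch below illustrates that computation only; the HU windows here are assumed for demonstration and may differ from the ranges used in the reviewed studies.

```python
import numpy as np

# Illustrative HU windows (assumptions, not the published ranges):
HU_WINDOWS = {
    "fat":              (-200, -10),
    "loose_connective": (-9, 40),    # includes atrophic muscle
    "normal_muscle":    (41, 200),
}

def composition(hu_volume, windows=HU_WINDOWS):
    """Fraction of voxels whose HU value falls in each tissue window."""
    return {name: float(np.mean((hu_volume >= lo) & (hu_volume <= hi)))
            for name, (lo, hi) in windows.items()}
```

Applied to a segmented muscle volume, the resulting fractions can be tracked over time alongside the mean HU value to monitor degeneration.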

  4. Alignment mask design and image processing for the Advanced Radiographic Capability (ARC) at the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Leach, Richard R.; Awwal, Abdul; Cohen, Simon; Lowe-Webb, Roger; Roberts, Randy; Salmon, Thad; Smauley, David; Wilhelmsen, Karl

    2015-09-01

    The Advanced Radiographic Capability (ARC) at the National Ignition Facility (NIF) is a laser system that employs up to four petawatt (PW) lasers to produce a sequence of short pulses that generate X-rays which backlight high-density inertial confinement fusion (ICF) targets. ARC is designed to produce multiple, sequential X-ray images by using up to eight backlighters. The images will be used to examine the compression and ignition of a cryogenic deuterium-tritium target with tens-of-picosecond temporal resolution during the critical phases of an ICF shot. Multi-frame, hard-X-ray radiography of imploding NIF capsules is a capability which is critical to the success of NIF's missions. As in the NIF system, ARC requires an optical alignment mask that can be inserted and removed as needed for precise positioning of the beam. Due to ARC's split-beam design, inserting the nominal NIF main laser alignment mask in ARC produced a partial blockage of the mask pattern, so requirements for a new mask design were needed. In this paper we describe the ARC mask requirements, the resulting mask design pattern, and the image analysis algorithms used to detect and identify the beam and reference centers required for ARC alignment.
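Beam- and fiducial-center detection of the kind mentioned above commonly starts from an intensity-weighted centroid of the thresholded image. The sketch below is a generic illustration of that first step, not NIF's production algorithm; the function name and threshold choice are assumptions.

```python
import numpy as np

def beam_center(img, thresh_frac=0.5):
    """Intensity-weighted centroid of pixels above a fractional threshold.
    Returns (row, col) in pixel coordinates."""
    mask = img >= thresh_frac * img.max()        # keep only the bright core
    weights = np.where(mask, img, 0.0)
    total = weights.sum()
    ys, xs = np.indices(img.shape)
    return (ys * weights).sum() / total, (xs * weights).sum() / total
```

On a synthetic Gaussian spot the centroid lands on the spot center to sub-pixel accuracy, which is the precision regime alignment loops typically require.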

  5. Programmer's Guide for FFORM. Physical Processes in Terrestrial and Aquatic Ecosystems, Computer Programs and Graphics Capabilities.

    ERIC Educational Resources Information Center

    Anderson, Lougenia; Gales, Larry

    This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. FFORM is a portable format-free input subroutine package written in ANSI Fortran IV…

  6. Advanced capability RFID system

    DOEpatents

    Gilbert, Ronald W.; Steele, Kerry D.; Anderson, Gordon A.

    2007-09-25

    A radio-frequency transponder device having an antenna circuit configured to receive radio-frequency signals and to return modulated radio-frequency signals via continuous wave backscatter, a modulation circuit coupled to the antenna circuit for generating the modulated radio-frequency signals, and a microprocessor coupled to the antenna circuit and the modulation circuit and configured to receive and extract operating power from the received radio-frequency signals and to monitor inputs on at least one input pin and to generate responsive signals to the modulation circuit for modulating the radio-frequency signals. The microprocessor can be configured to generate output signals on output pins to associated devices for controlling the operation thereof. Electrical energy can be extracted and stored in an optional electrical power storage device.

  7. Investigating the Mobility of Light Autonomous Tracked Vehicles using a High Performance Computing Simulation Capability

    NASA Technical Reports Server (NTRS)

    Negrut, Dan; Mazhar, Hammad; Melanz, Daniel; Lamb, David; Jayakumar, Paramsothy; Letherwood, Michael; Jain, Abhinandan; Quadrelli, Marco

    2012-01-01

    This paper is concerned with the physics-based simulation of light tracked vehicles operating on rough deformable terrain. The focus is on small autonomous vehicles, which weigh less than 100 lb and move on deformable and rough terrain that is feature rich and no longer representable using a continuum approach. A scenario of interest is, for instance, the simulation of a reconnaissance mission for a high mobility lightweight robot where objects such as a boulder or a ditch that could otherwise be considered small for a truck or tank, become major obstacles that can impede the mobility of the light autonomous vehicle and negatively impact the success of its mission. Analyzing and gauging the mobility and performance of these light vehicles is accomplished through a modeling and simulation capability called Chrono::Engine. Chrono::Engine relies on parallel execution on Graphics Processing Unit (GPU) cards.

  8. Recent advances in data assimilation in computational geodynamic models

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik

    2010-05-01

    . The QRV method was most recently introduced in geodynamic modelling (Ismail-Zadeh et al., 2007, 2008; Tantsyrev, 2008; Glisovic et al., 2009). The advances in computational geodynamics and in data assimilation attract an interest of the community dealing with lithosphere, mantle and core dynamics.

  9. Computable General Equilibrium Model Fiscal Year 2013 Capability Development Report - April 2014

    SciTech Connect

    Edwards, Brian Keith; Rivera, Michael K.; Boero, Riccardo

    2014-04-01

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests to examine the properties of the solutions computed by the model. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.
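The general-equilibrium mechanism the report exercises — a sectoral productivity cut whose economy-wide effect is damped by factor reallocation — can be shown with a deliberately tiny two-sector Cobb-Douglas toy. This is purely illustrative and far simpler than NCGEM; all parameter values are assumptions.

```python
def two_sector_equilibrium(A1, A2, L_total=1.0, alpha=0.7):
    """Total output when labor allocates so that marginal products (wages)
    are equal across two sectors with output Y_i = A_i * L_i**alpha."""
    ratio = (A1 / A2) ** (1.0 / (1.0 - alpha))   # L1/L2 at equal wages
    L1 = L_total * ratio / (1.0 + ratio)
    L2 = L_total - L1
    return A1 * L1 ** alpha + A2 * L2 ** alpha

base = two_sector_equilibrium(1.0, 1.0)
shock = two_sector_equilibrium(0.8, 1.0)   # 20% productivity cut in sector 1
print(f"output change: {100 * (shock / base - 1):.1f}%")
```

Because labor migrates toward the unshocked sector, total output falls by less than the 20% productivity cut, mirroring the proportional-but-damped responses reported for the State-level simulations.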

  10. The Design and Transfer of Advanced Command and Control (C2) Computer-Based Systems

    DTIC Science & Technology

    1980-03-31

    TECHNICAL REPORT 80-02 QUARTERLY TECHNICAL REPORT: THE DESIGN AND TRANSFER OF ADVANCED COMMAND AND CONTROL (C 2 ) COMPUTER-BASED SYSTEMS ARPA...The Tasks/Objectives and/or Purposes of the overall project are connected with the design , development, demonstration and transfer of advanced...command and control (C2 ) computer-based systems; this report covers work in the computer-based design and transfer areas only. The Technical Problems thus

  11. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  12. Evaluating the Capability of Information Technology to Prevent Adverse Drug Events: A Computer Simulation Approach

    PubMed Central

    Anderson, James G.; Jay, Stephen J.; Anderson, Marilyn; Hunt, Thaddeus J.

    2002-01-01

    Background: The annual cost of morbidity and mortality due to medication errors in the U.S. has been estimated at $76.6 billion. Information technology implemented systematically has the potential to significantly reduce medication errors that result in adverse drug events (ADEs). Objective: To develop a computer simulation model that can be used to evaluate the effectiveness of information technology applications designed to detect and prevent medication errors that result in adverse drug events. Methods: A computer simulation model was constructed representing the medication delivery system in a hospital. STELLA, a continuous simulation software package, was used to construct the model. Parameters of the model were estimated from a study of prescription errors on two hospital medical/surgical units and used in the baseline simulation. Five prevention strategies were simulated based on information obtained from the literature. Results: The model simulates the four stages of the medication delivery system: prescribing, transcribing, dispensing, and administering drugs. We simulated interventions that have been demonstrated in prior studies to decrease error rates. The results suggest that an integrated medication delivery system can save up to 1,226 days of excess hospitalization and $1.4 million in associated costs annually in a large hospital. The results of the analyses regarding the effects of the interventions on the additional hospital costs associated with ADEs are somewhat sensitive to the distribution of errors in the hospital, more sensitive to the costs of an ADE, and most sensitive to the proportion of medication errors resulting in ADEs. Conclusions: The results suggest that clinical information systems are potentially a cost-effective means of preventing ADEs in hospitals and demonstrate the importance of viewing medication errors from a systems perspective. Prevention efforts that focus on a single stage of the process had limited impact on the
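The structure of such a model — per-stage error rates, stage-specific interception by information technology, and a fraction of uncaught errors becoming ADEs — can be sketched as a deterministic expected-value calculation. All rates below are hypothetical placeholders, not the parameters estimated in the study.

```python
# Hypothetical per-order error rates for the four stages of the medication
# delivery system; illustrative values only.
STAGES = ["prescribing", "transcribing", "dispensing", "administering"]
ERROR_RATE = {"prescribing": 0.039, "transcribing": 0.012,
              "dispensing": 0.011, "administering": 0.038}
P_ADE = 0.07   # assumed fraction of uncaught errors that result in an ADE

def expected_ades(n_orders, intercept):
    """Expected ADEs; `intercept` maps a stage to the probability that the
    information system catches an error made at that stage in time."""
    return sum(n_orders * ERROR_RATE[s] * (1 - intercept.get(s, 0.0)) * P_ADE
               for s in STAGES)

baseline = expected_ades(100_000, intercept={})
with_cpoe = expected_ades(100_000, intercept={"prescribing": 0.55,
                                              "transcribing": 0.85})
```

Comparing the two totals reproduces the qualitative finding above: an intervention confined to the early stages helps, but errors arising downstream (dispensing, administering) pass through untouched, which is why single-stage prevention has limited impact.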

  13. Studies of challenge in lower hybrid current drive capability at high density regime in experimental advanced superconducting tokamak

    NASA Astrophysics Data System (ADS)

    Ding, B. J.; Li, M. H.; Li, Y. C.; Wang, M.; Liu, F. K.; Shan, J. F.; Li, J. G.; Wan, B. N.; Wan

    2017-02-01

    Aiming at a fusion reactor, two issues must be solved for lower hybrid current drive (LHCD), namely good lower hybrid wave (LHW)-plasma coupling and effective current drive at high density. For this goal, efforts have been made to improve LHW-plasma coupling and current drive capability at high density in the Experimental Advanced Superconducting Tokamak (EAST). LHW-plasma coupling is improved by means of local gas puffing, and gas puffing from the electron side is taken as a routine way for EAST to operate with LHCD. Studies of high density experiments suggest that low recycling and high lower hybrid (LH) frequency are preferred for LHCD experiments at high density, consistent with previous results in other machines. With the combination of 2.45 GHz and 4.6 GHz LH waves, a repeatable high confinement mode plasma with maximum density on the order of $10^{19}~\text{m}^{-3}$ was obtained by LHCD in EAST. In addition, in the first stage of LHCD cyclic operation, an alternative candidate for more economical fusion reactors has been demonstrated in EAST and further work will be continued.

  14. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described, as well as user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  15. Recent advances in computational methods for nuclear magnetic resonance data processing.

    PubMed

    Gao, Xin

    2013-02-01

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  16. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.
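    The structural side of the coupled integration described above reduces, per mode, to a damped oscillator q'' + 2*zeta*omega*q' + omega^2*q = f driven by a generalized aerodynamic force. The sketch below shows one such time step with a classical RK4 integrator, holding the force frozen over the step as a simple loose-coupling assumption; this is an illustration of the idea, not ENSAERO's actual scheme.

    ```python
    import math

    def step_mode(q, qd, omega, zeta, force, dt):
        """One RK4 step of a modal equation q'' + 2*zeta*omega*q' + omega**2*q = force.
        The generalized force is held constant over the step (loose coupling)."""
        def deriv(q, qd):
            return qd, force - 2.0 * zeta * omega * qd - omega**2 * q
        k1q, k1v = deriv(q, qd)
        k2q, k2v = deriv(q + 0.5 * dt * k1q, qd + 0.5 * dt * k1v)
        k3q, k3v = deriv(q + 0.5 * dt * k2q, qd + 0.5 * dt * k2v)
        k4q, k4v = deriv(q + dt * k3q, qd + dt * k3v)
        q += dt * (k1q + 2 * k2q + 2 * k3q + k4q) / 6.0
        qd += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        return q, qd

    # Sanity check: undamped, unforced vibration returns to its initial state
    # after exactly one period T = 2*pi/omega.
    omega, dt = 2.0 * math.pi, 0.001
    q, qd = 1.0, 0.0
    for _ in range(1000):  # 1000 steps of 0.001 s = one period
        q, qd = step_mode(q, qd, omega, zeta=0.0, force=0.0, dt=dt)
    ```

    In a coupled aeroelastic solve, `force` would be recomputed each step from the CFD surface pressures projected onto the mode shape.
    
    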

  17. NSRD-11: Computational Capability to Substantiate DOE-HDBK-3010 Data.

    SciTech Connect

    Louie, David; Brown, Alexander; Gelbard, Fred; Bignell, John; Pierce, Flint; Voskuilen, Tyler; Rodriguez, Salvador B.; Dingreville, Remi Philippe Michel; Zepper, Ethan T.; Juan, Pierre-Alexandre; Le, San; Gilkey, Lindsay Noelle

    2016-11-01

    Safety basis analysts throughout the U.S. Department of Energy (DOE) complex rely heavily on the information provided in the DOE Handbook, DOE-HDBK-3010, Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, to determine radionuclide source terms. In calculating source terms, analysts tend to use the DOE Handbook's bounding values on airborne release fractions (ARFs) and respirable fractions (RFs) for various categories of insults (representing potential accident release categories). This is typically due to both time constraints and the avoidance of regulatory critique. Unfortunately, these bounding ARFs/RFs represent extremely conservative values. Moreover, they were derived from very limited small-scale bench/laboratory experiments and/or from engineering judgment. Thus, the basis for the data may not be representative of the actual unique accident conditions and configurations being evaluated. The goal of this research is to develop a more accurate and defensible method to determine bounding values for the DOE Handbook using state-of-the-art multi-physics-based computer codes. This enables us to better understand the fundamental physics and phenomena associated with the types of accidents in the handbook. This year, the research included improvements to the high fidelity codes to model particle resuspension and multi-component evaporation for fire scenarios. We also began to model ceramic fragmentation experiments and to reanalyze the liquid fire and powder release experiments that were done last year. The results show that the added physics better describes the fragmentation phenomena. Thus, this work provides a low-cost method to establish physics-justified safety bounds by taking into account specific geometries and conditions that may not have been previously measured and/or are too costly to perform.
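    The ARF/RF values discussed above feed DOE-HDBK-3010's five-factor source-term formula, ST = MAR x DR x ARF x RF x LPF. A minimal sketch of that calculation follows; the scenario numbers are illustrative placeholders, not values from this study.

    ```python
    def source_term(mar_g, dr, arf, rf, lpf=1.0):
        """Respirable source term (grams) via the DOE-HDBK-3010 five-factor
        formula: material-at-risk x damage ratio x airborne release fraction
        x respirable fraction x leak path factor."""
        return mar_g * dr * arf * rf * lpf

    # Hypothetical liquid-fire scenario: 1 kg of material at risk, all of it
    # involved (DR = 1), with illustrative bounding-style ARF/RF values.
    st = source_term(mar_g=1000.0, dr=1.0, arf=2e-3, rf=1.0)  # grams respirable
    ```

    The conservatism the abstract criticizes enters through `arf` and `rf`: replacing handbook bounding values with scenario-specific values computed by the multi-physics codes directly scales the source term.
    
    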

  18. NSRD-06. Computational Capability to Substantiate DOE-HDBK-3010 Data

    SciTech Connect

    Louie, David L.Y.; Brown, Alexander L.

    2015-12-01

    Safety basis analysts throughout the U.S. Department of Energy (DOE) complex rely heavily on the information provided in the DOE Handbook, DOE-HDBK-3010, Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, to determine source terms. In calculating source terms, analysts tend to use the DOE Handbook's bounding values on airborne release fractions (ARFs) and respirable fractions (RFs) for various categories of insults (representing potential accident release categories). This is typically due to both time constraints and the avoidance of regulatory critique. Unfortunately, these bounding ARFs/RFs represent extremely conservative values. Moreover, they were derived from very limited small-scale table-top and bench/laboratory experiments and/or from engineering judgment. Thus the basis for the data may not be representative of the actual unique accident conditions and configurations being evaluated. The goal of this research is to develop a more accurate method to identify bounding values for the DOE Handbook using state-of-the-art multi-physics-based high performance computer codes. This enables us to better understand the fundamental physics and phenomena associated with the types of accidents for the data described in it. This research has examined two of the DOE Handbook's liquid fire experiments to substantiate the airborne release fraction data. We found that additional physical phenomena (i.e., resuspension) need to be included to derive bounding values. For the specific cases of solid powder under pressurized conditions and mechanical insult conditions, the codes demonstrated that we can simulate the phenomena. This work thus provides a low-cost method to establish physics-justified safety bounds by taking into account specific geometries and conditions that may not have been previously measured and/or are too costly to do so.

  19. OPMILL - MICRO COMPUTER PROGRAMMING ENVIRONMENT FOR CNC MILLING MACHINES THREE AXIS EQUATION PLOTTING CAPABILITIES

    NASA Technical Reports Server (NTRS)

    Ray, R. B.

    1994-01-01

    OPMILL is a computer operating system for a Kearney and Trecker milling machine that provides a fast and easy way to program machine part manufacture with an IBM compatible PC. The program gives the machinist an "equation plotter" feature which plots any set of equations that define axis moves (up to three axes simultaneously) and converts those equations to a machine milling program that will move a cutter along a defined path. Other supported functions include: drill with peck, bolt circle, tap, mill arc, quarter circle, circle, circle 2 pass, frame, frame 2 pass, rotary frame, pocket, loop and repeat, and copy blocks. The system includes a tool manager that can handle up to 25 tools and automatically adjusts tool length for each tool. It will display all tool information and stop the milling machine at the appropriate time. Information for the program is entered via a series of menus and compiled to the Kearney and Trecker format. The program can then be loaded into the milling machine, the tool path graphically displayed, and tool change information or the program in Kearney and Trecker format viewed. The program has a complete file handling utility that allows the user to load the program into memory from the hard disk, save the program to the disk with comments, view directories, merge a program on the disk with one in memory, save a portion of a program in memory, and change directories. OPMILL was developed on an IBM PS/2 running DOS 3.3 with 1 MB of RAM. OPMILL was written for an IBM PC or compatible 8088 or 80286 machine connected via an RS-232 port to a Kearney and Trecker Data Mill 700/C Control milling machine. It requires a "D:" drive (fixed-disk or virtual), a browse or text display utility, and an EGA or better display. Users wishing to modify and recompile the source code will also need Turbo BASIC, Turbo C, and Crescent Software's QuickPak for Turbo BASIC. IBM PC and IBM PS/2 are registered trademarks of International Business Machines. Turbo
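    The "equation plotter" idea above, sampling user-supplied axis equations into a sequence of straight-line cutter moves, can be sketched as follows. The G-code-style `G01` output format is a hypothetical stand-in for illustration; OPMILL itself compiles to the Kearney and Trecker format.

    ```python
    import math

    def equations_to_moves(fx, fy, fz, t0, t1, steps):
        """Sample parametric axis equations x=fx(t), y=fy(t), z=fz(t) and emit
        straight-line move commands approximating the defined path."""
        moves = []
        for i in range(steps + 1):
            t = t0 + (t1 - t0) * i / steps
            moves.append("G01 X%.4f Y%.4f Z%.4f" % (fx(t), fy(t), fz(t)))
        return moves

    # Quarter-circle arc of radius 2 in the XY plane at a constant cut depth,
    # one of the canned shapes OPMILL supports, expressed as equations.
    path = equations_to_moves(lambda t: 2.0 * math.cos(t),
                              lambda t: 2.0 * math.sin(t),
                              lambda t: -0.5,
                              0.0, math.pi / 2, 8)
    ```

    Increasing `steps` trades program length for chord error against the true curve, which is the same trade-off any equation-to-toolpath converter makes.
    
    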

  20. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1994-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economical rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: first, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the dentist who will offer this new technology directly to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities.

  1. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  2. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  3. Soft, curved electrode systems capable of integration on the auricle as a persistent brain–computer interface

    PubMed Central

    Norton, James J. S.; Lee, Dong Sup; Lee, Jung Woo; Lee, Woosik; Kwon, Ohjin; Won, Phillip; Jung, Sung-Young; Cheng, Huanyu; Jeong, Jae-Woong; Akce, Abdullah; Umunna, Stephen; Na, Ilyoun; Kwon, Yong Ho; Wang, Xiao-Qi; Liu, ZhuangJian; Paik, Ungyu; Huang, Yonggang; Bretl, Timothy; Yeo, Woon-Hong; Rogers, John A.

    2015-01-01

    Recent advances in electrodes for noninvasive recording of electroencephalograms expand opportunities for collecting such data for diagnosis of neurological disorders and brain–computer interfaces. Existing technologies, however, cannot be used effectively in continuous, uninterrupted modes for more than a few days due to irritation and irreversible degradation in the electrical and mechanical properties of the skin interface. Here we introduce a soft, foldable collection of electrodes in open, fractal mesh geometries that can mount directly and chronically on the complex surface topology of the auricle and the mastoid, to provide high-fidelity and long-term capture of electroencephalograms in ways that avoid any significant thermal, electrical, or mechanical loading of the skin. Experimental and computational studies establish the fundamental aspects of the bending and stretching mechanics that enable this type of intimate integration on the highly irregular and textured surfaces of the auricle. Cell level tests and thermal imaging studies establish the biocompatibility and wearability of such systems, with examples of high-quality measurements over periods of 2 wk with devices that remain mounted throughout daily activities including vigorous exercise, swimming, sleeping, and bathing. Demonstrations include a text speller with a steady-state visually evoked potential-based brain–computer interface and elicitation of an event-related potential (P300 wave). PMID:25775550

  4. Soft, curved electrode systems capable of integration on the auricle as a persistent brain-computer interface.

    PubMed

    Norton, James J S; Lee, Dong Sup; Lee, Jung Woo; Lee, Woosik; Kwon, Ohjin; Won, Phillip; Jung, Sung-Young; Cheng, Huanyu; Jeong, Jae-Woong; Akce, Abdullah; Umunna, Stephen; Na, Ilyoun; Kwon, Yong Ho; Wang, Xiao-Qi; Liu, ZhuangJian; Paik, Ungyu; Huang, Yonggang; Bretl, Timothy; Yeo, Woon-Hong; Rogers, John A

    2015-03-31

    Recent advances in electrodes for noninvasive recording of electroencephalograms expand opportunities for collecting such data for diagnosis of neurological disorders and brain-computer interfaces. Existing technologies, however, cannot be used effectively in continuous, uninterrupted modes for more than a few days due to irritation and irreversible degradation in the electrical and mechanical properties of the skin interface. Here we introduce a soft, foldable collection of electrodes in open, fractal mesh geometries that can mount directly and chronically on the complex surface topology of the auricle and the mastoid, to provide high-fidelity and long-term capture of electroencephalograms in ways that avoid any significant thermal, electrical, or mechanical loading of the skin. Experimental and computational studies establish the fundamental aspects of the bending and stretching mechanics that enable this type of intimate integration on the highly irregular and textured surfaces of the auricle. Cell level tests and thermal imaging studies establish the biocompatibility and wearability of such systems, with examples of high-quality measurements over periods of 2 wk with devices that remain mounted throughout daily activities including vigorous exercise, swimming, sleeping, and bathing. Demonstrations include a text speller with a steady-state visually evoked potential-based brain-computer interface and elicitation of an event-related potential (P300 wave).
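    The steady-state visually evoked potential speller mentioned above works by detecting which stimulus flicker frequency dominates the recorded EEG. A minimal sketch of that detection step on synthetic data, using the Goertzel algorithm to measure power at each candidate frequency; this illustrates the principle only and is not the authors' processing pipeline.

    ```python
    import math

    def goertzel_power(samples, fs, f):
        """Signal power at frequency f (Hz) via the Goertzel algorithm."""
        coeff = 2.0 * math.cos(2.0 * math.pi * f / fs)
        s1 = s2 = 0.0
        for x in samples:
            s0 = x + coeff * s1 - s2
            s2, s1 = s1, s0
        return s1 * s1 + s2 * s2 - coeff * s1 * s2

    def classify_ssvep(samples, fs, candidates):
        """Return the candidate flicker frequency with the most power."""
        return max(candidates, key=lambda f: goertzel_power(samples, fs, f))

    # Synthetic "EEG": a 12 Hz evoked response buried under slow drift.
    fs = 256
    sig = [math.sin(2 * math.pi * 12 * n / fs) + 0.3 * math.sin(2 * math.pi * 1 * n / fs)
           for n in range(fs)]
    best = classify_ssvep(sig, fs, [8.0, 10.0, 12.0, 15.0])
    ```

    In a speller, each on-screen symbol flickers at its own frequency, so the winning frequency identifies the symbol the user is attending to.
    
    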

  5. Validation and Enhancement of Computational Fluid Dynamics and Heat Transfer Predictive Capabilities for Generation IV Reactor Systems

    SciTech Connect

    Robert E. Spall; Barton Smith; Thomas Hauser

    2008-12-08

    Nationwide, the demand for electricity due to population and industrial growth is on the rise. However, climate change and air quality issues raise serious questions about the wisdom of addressing these shortages through the construction of additional fossil-fueled power plants. In 1997, the President's Committee of Advisors on Science and Technology Energy Research and Development Panel determined that restoring a viable nuclear energy option was essential and that the DOE should implement an R&D effort to address the principal obstacles to achieving this option. This work has addressed the need for improved thermal/fluid analysis capabilities, through the use of computational fluid dynamics, which are necessary to support the design of Generation IV gas-cooled and supercritical water reactors.

  6. A Study into Advanced Guidance Laws Using Computational Methods

    DTIC Science & Technology

    2011-12-01


  7. RECENT ADVANCES IN COMPUTATIONAL MECHANICS FOR CIVIL ENGINEERING

    NASA Astrophysics Data System (ADS)

    Applied Mechanics Committee, Computational Mechanics Subcommittee,

    In order to clarify mechanical phenomena in civil engineering, it is necessary to improve computational theory and techniques in light of the particular characteristics of the objects to be analyzed, and to advance computational mechanics with a focus on practical use. In addition to the analysis of infrastructure, damage prediction for natural disasters such as earthquakes, tsunamis, and floods must reflect the broad ranges in space and time inherent to civil engineering, as well as material properties, so it is important to develop new computational methods suited to the particularities of the field. In this context, this paper reviews research trends in methods of computational mechanics that are noteworthy for resolving complex mechanics problems in civil engineering.

  8. Advances in Domain Mapping of Massively Parallel Scientific Computations

    SciTech Connect

    Leland, Robert W.; Hendrickson, Bruce A.

    2015-10-01

    One of the most important concerns in parallel computing is the proper distribution of workload across processors. For most scientific applications on massively parallel machines, the best approach to this distribution is to employ data parallelism; that is, to break the datastructures supporting a computation into pieces and then to assign those pieces to different processors. Collectively, these partitioning and assignment tasks comprise the domain mapping problem.
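    The simplest instance of the domain mapping problem described above is a contiguous block decomposition of a one-dimensional data structure, assigning each processor a near-equal share. A minimal sketch:

    ```python
    def block_partition(n_items, n_procs):
        """Contiguous block decomposition: assign half-open item ranges to
        processors so that per-processor counts differ by at most one."""
        base, extra = divmod(n_items, n_procs)
        ranges, start = [], 0
        for p in range(n_procs):
            count = base + (1 if p < extra else 0)  # spread the remainder
            ranges.append((start, start + count))
            start += count
        return ranges

    parts = block_partition(10, 4)  # [(0, 3), (3, 6), (6, 8), (8, 10)]
    ```

    Real scientific workloads rarely decompose this cleanly; the partitioning methods surveyed in the report also account for irregular meshes and for minimizing the communication induced along partition boundaries.
    
    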

  9. Parallel high-performance grid computing: capabilities and opportunities of a novel demanding service and business class allowing highest resource efficiency.

    PubMed

    Kepper, Nick; Ettig, Ramona; Dickmann, Frank; Stehr, Rene; Grosveld, Frank G; Wedemann, Gero; Knoch, Tobias A

    2010-01-01

    Especially in the life-science and health-care sectors, IT requirements are immense due to the large and complex systems to be analysed and simulated. Grid infrastructures play a rapidly increasing role here in research, diagnostics, and treatment, since they provide the necessary large-scale resources efficiently. Whereas grids were first used for huge number crunching of trivially parallelizable problems, parallel high-performance computing is increasingly required. Here, we show for the prime example of molecular dynamics simulations how the presence of large grid clusters with very fast network interconnects within grid infrastructures now enables efficient parallel high-performance grid computing, and thus combines the benefits of dedicated supercomputing centres and grid infrastructures. The demands of this service class are the highest, since the user group has very heterogeneous requirements: i) two to many thousands of CPUs, ii) different memory architectures, iii) huge storage capabilities, and iv) fast communication via network interconnects are all needed in different combinations and must be considered in a highly dedicated manner to reach the highest performance efficiency. Beyond this, advanced and dedicated i) interaction with users, ii) management of jobs, iii) accounting, and iv) billing not only combine classic with parallel high-performance grid usage, but more importantly can also increase the efficiency of IT resource providers. Consequently, the mere "yes-we-can" becomes a huge opportunity for sectors such as life science and health care, as well as for grid infrastructures, by reaching a higher level of resource efficiency.

  10. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for computational science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations

  11. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  12. Novel genotype-phenotype associations in human cancers enabled by advanced molecular platforms and computational analysis of whole slide images.

    PubMed

    Cooper, Lee A D; Kong, Jun; Gutman, David A; Dunn, William D; Nalisnik, Michael; Brat, Daniel J

    2015-04-01

    Technological advances in computing, imaging, and genomics have created new opportunities for exploring relationships between histology, molecular events, and clinical outcomes using quantitative methods. Slide scanning devices are now capable of rapidly producing massive digital image archives that capture histological details in high resolution. Commensurate advances in computing and image analysis algorithms enable mining of archives to extract descriptions of histology, ranging from basic human annotations to automatic and precisely quantitative morphometric characterization of hundreds of millions of cells. These imaging capabilities represent a new dimension in tissue-based studies, and when combined with genomic and clinical endpoints, can be used to explore biologic characteristics of the tumor microenvironment and to discover new morphologic biomarkers of genetic alterations and patient outcomes. In this paper, we review developments in quantitative imaging technology and illustrate how image features can be integrated with clinical and genomic data to investigate fundamental problems in cancer. Using motivating examples from the study of glioblastomas (GBMs), we demonstrate how public data from The Cancer Genome Atlas (TCGA) can serve as an open platform to conduct in silico tissue-based studies that integrate existing data resources. We show how these approaches can be used to explore the relation of the tumor microenvironment to genomic alterations and gene expression patterns and to define nuclear morphometric features that are predictive of genetic alterations and clinical outcomes. Challenges, limitations, and emerging opportunities in the area of quantitative imaging and integrative analyses are also discussed.

  13. Advanced flight computing technologies for validation by NASA's new millennium program

    NASA Astrophysics Data System (ADS)

    Alkalai, Leon

    1996-11-01

    The New Millennium Program (NMP) consists of a series of Deep-Space and Earth Orbiting missions that are technology-driven, in contrast to the more traditional science-driven space exploration missions of the past. These flights are designed to validate technologies that will enable a new era of low-cost, highly miniaturized and highly capable spaceborne applications in the new millennium. In addition to the series of flight projects managed by separate flight teams, the NMP technology initiatives are managed by the following six focused technology programs: Microelectronics Systems, Autonomy, Telecommunications, Instrument Technologies and Architectures, In-Situ Instruments and Micro-electromechanical Systems, and Modular and Multifunctional Systems. Each technology program is managed as an Integrated Product Development Team (IPDT) of government, academic, and industry partners. In this paper, we will describe elements of the technology roadmap proposed by the NMP Microelectronics IPDT. Moreover, we will relate the proposed technology roadmap to existing NASA technology development programs, such as the Advanced Flight Computing (AFC) program and the Remote Exploration and Experimentation (REE) program, which constitute part of the on-going NASA technology development pipeline. We will also describe the Microelectronics Systems technologies that have been accepted as part of the first New Millennium Deep-Space One spacecraft, which is an asteroid fly-by mission scheduled for launch in July 1998.

  14. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  15. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  16. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous event and data driven environment. A large grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamical location of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
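    The software voting of redundantly executed tasks mentioned above can be sketched as a simple majority vote over replica results. This is a deliberately simplified illustration, not MAX's actual Byzantine-resilient protocol.

    ```python
    from collections import Counter

    def vote(results):
        """Majority vote over the results of redundant task executions.
        Returns the winning value; raises if no strict majority exists,
        signaling that the replicated task must be treated as failed."""
        value, count = Counter(results).most_common(1)[0]
        if count * 2 <= len(results):
            raise ValueError("no majority among replicas: %r" % (results,))
        return value

    masked = vote([42, 42, 41])  # one faulty replica is out-voted -> 42
    ```

    With triple redundancy, any single faulty execution is masked; a tie or three-way disagreement is surfaced as an error rather than silently propagated.
    
    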

  17. Advanced Computational Aeroacoustics Methods for Fan Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane (Technical Monitor); Tam, Christopher

    2003-01-01

    Direct computation of fan noise is presently not possible. One of the major difficulties is the geometrical complexity of the problem. In the case of fan noise, the blade geometry is critical to the loading on the blade and hence to the intensity of the radiated noise. The precise geometry must be incorporated into the computation. In computational fluid dynamics (CFD), there are two general ways to handle problems with complex geometry. One way is to use unstructured grids. The other is to use body-fitted overset grids. In the overset grid method, accurate data transfer is of utmost importance. For acoustic computation, it is not clear that the currently used data transfer methods are sufficiently accurate so as not to contaminate the very small amplitude acoustic disturbances. In CFD, low order schemes are invariably used in conjunction with unstructured grids. However, low order schemes are known to be numerically dispersive and dissipative. Dispersive and dissipative errors are extremely undesirable for acoustic wave problems. The objective of this project is to develop a high order unstructured grid Dispersion-Relation-Preserving (DRP) scheme that would minimize numerical dispersion and dissipation errors. This report contains the results of the funded portion of the project. A DRP scheme on an unstructured grid has been developed; it is constructed in the wave number space. The characteristics of the scheme can be improved by the inclusion of additional constraints. Stability of the scheme has been investigated, and can be improved by adopting an upwinding strategy.
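    The idea of constructing a scheme in wavenumber space can be illustrated in the simplest structured-grid setting: a symmetric 7-point first-derivative stencil where two coefficients are fixed by 4th-order accuracy and the remaining free coefficient is chosen to minimize the gap between the stencil's effective wavenumber and the exact one. The sketch below does this by brute-force search; the search range and quadrature are illustrative choices, not taken from the report.

    ```python
    import math

    def effective_wavenumber(a, x):
        """kbar*h of a symmetric 7-point stencil with antisymmetric
        coefficients a = (a1, a2, a3), evaluated at x = k*h."""
        return 2.0 * sum(a[j] * math.sin((j + 1) * x) for j in range(3))

    def dispersion_error(a, xmax=math.pi / 2, n=200):
        """Integrated squared error |kbar*h - k*h|^2 over 0 <= k*h <= xmax."""
        h = xmax / n
        return sum((effective_wavenumber(a, i * h) - i * h) ** 2 * h
                   for i in range(n + 1))

    def family(a3):
        """One-parameter family of 4th-order-accurate stencils, from the
        Taylor constraints a1 + 2*a2 + 3*a3 = 1/2 and a1 + 8*a2 + 27*a3 = 0."""
        return (2.0 / 3.0 + 5.0 * a3, -1.0 / 12.0 - 4.0 * a3, a3)

    # Spend the free coefficient on wavenumber fidelity instead of formal order.
    best_a3 = min((i * 0.1 / 2000 for i in range(2001)),
                  key=lambda a3: dispersion_error(family(a3)))
    drp = family(best_a3)
    classic6 = (3.0 / 4.0, -3.0 / 20.0, 1.0 / 60.0)  # standard 6th-order stencil
    ```

    The standard 6th-order stencil is the member of this family with a3 = 1/60, so the optimized coefficients are guaranteed to do at least as well over the chosen wavenumber range; extending this optimization to unstructured grids is precisely what makes the project's scheme nontrivial.
    
    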

  18. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Computers.

    ERIC Educational Resources Information Center

    Ellis, Brenda

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 3-hour introduction to computers. The purpose is to develop the following competencies: (1) orientation to data processing; (2) use of data entry devices; (3) use of computer menus; and (4) entry of data with accuracy and…

  19. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory / university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for the technology development of the concepts and their safety systems.

  20. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  1. Computer vision-based technologies and commercial best practices for the advancement of the motion imagery tradecraft

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Capel, David; Srinivasan, James

    2014-06-01

    Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing Exploitation and Dissemination (PED) cells. Yet now the operational requirements are drastically changing. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look towards a technology application and commercial adoption model which will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting-edge computer vision technology to meet existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services developed from computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources - providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment; employing an extensible framework, leveraging scalable enterprise-wide infrastructure and following commercial best practices.
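
    As a hedged illustration of the multiple-view-geometry machinery behind such georegistration services, the sketch below fits a planar homography between image and ground coordinates with the direct linear transform (DLT). The point data and transform are synthetic, invented for the example; this is not the fielded FMV system described above.

```python
import numpy as np

# Direct linear transform: each correspondence (x, y) -> (u, v) under a
# homography H contributes two linear equations in the entries of H.
def fit_homography(src, dst):
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (last right-singular vector).
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Synthetic check: generate points under a known homography, recover it.
H_true = np.array([[1.2, 0.1, 5.0], [0.0, 0.9, -3.0], [1e-4, 2e-4, 1.0]])
src = np.array([[10, 10], [200, 30], [50, 180], [220, 210], [120, 90]], float)
p = np.hstack([src, np.ones((len(src), 1))]) @ H_true.T
dst = p[:, :2] / p[:, 2:]               # perspective divide
H = fit_homography(src, dst)
print(np.allclose(H, H_true, atol=1e-5))  # True
```

    A production pipeline would add feature detection, robust estimation (e.g. RANSAC) over noisy matches, and full multi-view geometry rather than a single plane.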

  2. Advances and Computational Tools towards Predictable Design in Biological Engineering

    PubMed Central

    2014-01-01

    The design process of complex systems in all the fields of engineering requires a set of quantitatively characterized components and a method to predict the output of systems composed of such elements. This strategy relies on the modularity of the used components or, when part function depends on the specific context, on the prediction of their context-dependent behaviour. Mathematical models usually support the whole process by guiding the selection of parts and by predicting the output of interconnected systems. Such a bottom-up design process cannot be trivially adopted for biological systems engineering, since part function is hard to predict when components are reused in different contexts. This issue and the intrinsic complexity of living systems limit the capability of synthetic biologists to predict the quantitative behaviour of biological systems. The high potential of synthetic biology strongly depends on the capability of mastering this issue. This review discusses the predictability issues of basic biological parts (promoters, ribosome binding sites, coding sequences, transcriptional terminators, and plasmids) when used to engineer simple and complex gene expression systems in Escherichia coli. A comparison between bottom-up and trial-and-error approaches is performed for all the discussed elements, and mathematical models supporting the prediction of part behaviour are illustrated. PMID:25161694

  3. Advances on modelling of ITER scenarios: physics and computational challenges

    NASA Astrophysics Data System (ADS)

    Giruzzi, G.; Garcia, J.; Artaud, J. F.; Basiuk, V.; Decker, J.; Imbeaux, F.; Peysson, Y.; Schneider, M.

    2011-12-01

    Methods and tools for design and modelling of tokamak operation scenarios are discussed with particular application to ITER advanced scenarios. Simulations of hybrid and steady-state scenarios performed with the integrated tokamak modelling suite of codes CRONOS are presented. The advantages of a possible steady-state scenario based on cyclic operations, alternating phases of positive and negative loop voltage, with no magnetic flux consumption on average, are discussed. For regimes in which current alignment is an issue, a general method for scenario design is presented, based on the characteristics of the poloidal current density profile.

  4. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flow fields/plumes; the Optical Plume Anomaly Detection (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical packages designed to utilize information collected by OPAD is known as the Engine Diagnostic Filtering System (EDIFIS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Additionally, efforts are being advanced to hardware-encode components of the EDIFIS in order to address real-time operational requirements for health monitoring and management. This paper addresses the OPAD with its tool suite, and discusses what is considered a natural progression: a concept for migrating OPAD towards detection of high energy particles, including neutrons and gamma rays. The integration of these tools and capabilities will provide NASA with a systematic approach to monitor space vehicle internal and external environments.

  5. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Marshall Clint; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flowfields/plumes. The Optical Plume Anomaly Detector (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical package designed to utilize information collected by OPAD is known as the Engine Diagnostic Filtering System (EDiFiS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Capabilities for real-time processing are being advanced on several fronts, including an effort to hardware encode components of the EDiFiS for health monitoring and management. This paper addresses the OPAD with its tool suites, and discusses what is considered a natural progression: a concept for taking OPAD to the next logical level of high energy physics, incorporating fermion and boson particle analyses in measurement of neutron flux.

  6. Using Advanced Computer Vision Algorithms on Small Mobile Robots

    DTIC Science & Technology

    2006-04-20

    Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working...use in real-time. Test results are shown for a variety of environments. KEYWORDS: robotics, computer vision, car/license plate detection, SIFT...when detecting the make and model of automobiles, SIFT can be used to achieve very high detection rates at the expense of a hefty performance cost when

  7. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools to help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, first an innovative mathematical reformulation of the FTR problem is presented which dramatically improves the computational efficiency of the optimization problem. After having re-formulated the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve the optimization problem. The new formulation and the performance of the NDS solver are benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable and in some cases is shown to outperform the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
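
    At its core, the FTR auction is a constrained welfare-maximization problem. A toy single-category version might look like the sketch below, with hypothetical bid prices and PTDF line sensitivities, and an off-the-shelf LP solver standing in for the paper's NDS approach; a real ISO problem adds contingency cases and the coupled monthly/seasonal/annual categories discussed above.

```python
import numpy as np
from scipy.optimize import linprog

bids = np.array([30.0, 20.0, 10.0])        # $/MW bid prices (hypothetical)
ptdf = np.array([[0.5, 0.3, -0.2],         # line-1 flow sensitivities
                 [-0.1, 0.4, 0.6]])        # line-2 flow sensitivities
limits = np.array([100.0, 80.0])           # line thermal limits, MW

# Maximize bid-weighted welfare subject to |PTDF @ x| <= limits; linprog
# minimizes, so the objective is negated. Awards capped at 150 MW per bid.
res = linprog(c=-bids,
              A_ub=np.vstack([ptdf, -ptdf]),
              b_ub=np.concatenate([limits, limits]),
              bounds=[(0.0, 150.0)] * 3)
print(res.status)  # 0 indicates an optimal solution was found
```

    The exponential constraint growth the paper targets comes from enumerating such flow limits across every monitored element, contingency, and coupled FTR category.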

  8. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect

    Moore, Kevin L. Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  9. Simulation of the Secondary Frequency Control Capability of the Advanced PSH Technology and Its Application to the SMUD System

    SciTech Connect

    Koritarov, Vladimir; Feltes, James; Kazachkov, Yuriy

    2013-11-01

    The Sacramento Municipal Utility District (SMUD), as a typical balancing authority and project team member, was suggested by the Advanced Technology Modeling TFG for testing the models of the advanced pumped-storage hydro technology newly developed in the course of the DOE project and for demonstration of the potential benefits of this technology.

  10. Computational Approaches to Enhance Nanosafety and Advance Nanomedicine

    NASA Astrophysics Data System (ADS)

    Mendoza, Eduardo R.

    With the increasing use of nanoparticles in food processing, filtration/purification and consumer products, as well as the huge potential of their use in nanomedicine, a quantitative understanding of the effects of nanoparticle uptake and transport is needed. We provide examples of novel methods for modeling complex bio-nano interactions which are based on stochastic process algebras. Since model construction presumes sufficient availability of experimental data, recent developments in "nanoinformatics", an emerging discipline analogous to bioinformatics, in building an accessible information infrastructure are subsequently discussed. Both computational areas offer opportunities for Filipinos to engage in collaborative, cutting edge research in this impactful field.

  11. First Responders Guide to Computer Forensics: Advanced Topics

    DTIC Science & Technology

    2005-09-01

    server of the sender , the mail server of the receiver, and the computer that receives the email. Assume that Alice wants to send an email to her friend...pleased to meet you MAIL FROM: alice.price@alphanet.com 250 alice.price@alphanet.com... Sender ok RCPT TO: bob.doe@betanet.com 250 bob.doe...betanet.com... Sender ok DATA 354 Please start mail input From: alice.price@alphanet.com To: bob.doe@betanet.com Subject: Lunch Bob, It was good

  12. Capabilities and Facilities Available at the Advanced Test Reactor to Support Development of the Next Generation Reactors

    SciTech Connect

    S. Blaine Grover; Raymond V. Furstenau

    2005-10-01

    The ATR is one of the world's premier test reactors for performing long term, high flux, and/or large volume irradiation test programs. It is a very versatile facility with a wide variety of experimental test capabilities for providing the environment needed in an irradiation experiment. These capabilities include passive sealed capsule experiments, instrumented and/or temperature-controlled experiments, and pressurized water loop experiment facilities. The Irradiation Test Vehicle (ITV) installed in 1999 enhanced these capabilities by providing a built-in experiment monitoring and control system for instrumented and/or temperature-controlled experiments. This built-in control system significantly reduces the cost of an actively monitored, temperature-controlled experiment by providing the thermocouple connections, temperature control system, and temperature control gas supply and exhaust systems already in place at the irradiation position. Although the ITV in-core hardware was removed from the ATR during the last core replacement, completed in early 2005, it (or a similar facility) could be re-installed for an irradiation program when the need arises. The proposed Gas Test Loop currently being designed for installation in the ATR will provide additional capability for testing not only gas reactor materials and fuels but also, through enhanced fast flux rates, materials and fuels for other next generation reactors, including preliminary testing for fast reactor fuels and materials. This paper discusses the different irradiation capabilities available and the cost-benefit issues related to each capability.

  13. Computational Efforts in Support of Advanced Coal Research

    SciTech Connect

    Suljo Linic

    2006-08-17

    The focus in this project was to employ first principles computational methods to study the underlying molecular elementary processes that govern hydrogen diffusion through Pd membranes as well as the elementary processes that govern the CO- and S-poisoning of these membranes. Our computational methodology integrated a multiscale hierarchical modeling approach, wherein a molecular understanding of the interactions between various species is gained from ab-initio quantum chemical Density Functional Theory (DFT) calculations, while a mesoscopic statistical mechanical model like Kinetic Monte Carlo is employed to predict the key macroscopic membrane properties such as permeability. The key developments are: (1) We have coupled systematically the ab initio calculations with Kinetic Monte Carlo (KMC) simulations to model hydrogen diffusion through the Pd-based membranes. The predicted tracer diffusivity of hydrogen atoms through the bulk of the Pd lattice from KMC simulations is in excellent agreement with experiments. (2) The KMC simulations of dissociative adsorption of H2 over the Pd(111) surface indicate that for thin membranes (less than 10 µm thick), the diffusion of hydrogen from the surface to the first subsurface layer is rate limiting. (3) Sulfur poisons the Pd surface by altering the electronic structure of the Pd atoms in the vicinity of the S atom. The KMC simulations indicate that increasing sulfur coverage drastically reduces the hydrogen coverage on the Pd surface and hence the driving force for diffusion through the membrane.
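
    The DFT-to-KMC pipeline can be caricatured in a few lines: a kinetic Monte Carlo random walk on a cubic lattice with exponentially distributed residence times, from which a tracer diffusivity follows via the Einstein relation. The hop rate and lattice spacing below are illustrative placeholders, not DFT-derived values for Pd.

```python
import math
import random

def kmc_tracer_D(n_walkers=200, n_steps=500, rate=1.0e12, a=2.75e-10, seed=1):
    """Tracer diffusivity averaged over independent KMC walkers."""
    rng = random.Random(seed)
    moves = [(a, 0, 0), (-a, 0, 0), (0, a, 0), (0, -a, 0), (0, 0, a), (0, 0, -a)]
    total_rate = 6.0 * rate          # six equivalent hops with an assumed rate
    r2_sum = t_sum = 0.0
    for _ in range(n_walkers):
        x = y = z = 0.0
        t = 0.0
        for _ in range(n_steps):
            dx, dy, dz = rng.choice(moves)                 # pick one hop
            x += dx; y += dy; z += dz
            # exponential KMC waiting time with mean 1/total_rate
            t += -math.log(1.0 - rng.random()) / total_rate
        r2_sum += x * x + y * y + z * z
        t_sum += t
    return r2_sum / (6.0 * t_sum)    # Einstein relation in three dimensions

D = kmc_tracer_D()
print(D)  # of order a**2 * rate for this uncorrelated walk
```

    In the actual study, the hop rates would come from DFT barriers via transition-state theory, and coverage effects (e.g. sulfur blocking) would modulate the available moves.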

  14. Activities and operations of the Advanced Computing Research Facility, July-October 1986

    SciTech Connect

    Pieper, G.W.

    1986-01-01

    Research activities and operations of the Advanced Computing Research Facility (ACRF) at Argonne National Laboratory are discussed for the period from July 1986 through October 1986. The facility is currently supported by the Department of Energy, and is operated by the Mathematics and Computer Science Division at Argonne. Over the past four-month period, a new commercial multiprocessor, the Intel iPSC-VX/d4 hypercube, was installed. In addition, four other commercial multiprocessors continue to be available for research - an Encore Multimax, a Sequent Balance 21000, an Alliant FX/8, and an Intel iPSC/d5 - as well as a locally designed multiprocessor, the Lemur. These machines are being actively used by scientists at Argonne and throughout the nation in a wide variety of projects concerning computer systems with parallel and vector architectures. A variety of classes, workshops, and seminars have been sponsored to train researchers on computing techniques for the advanced computer systems at the Advanced Computing Research Facility. For example, courses were offered on writing programs for parallel computer systems, and the facility hosted the first annual Alliant users group meeting. A Sequent users group meeting and a two-day workshop on performance evaluation of parallel computers and programs are being organized.

  15. Computing aerodynamic sound using advanced statistical turbulence theories

    NASA Technical Reports Server (NTRS)

    Hecht, A. M.; Teske, M. E.; Bilanin, A. J.

    1981-01-01

    It is noted that the calculation of turbulence-generated aerodynamic sound requires knowledge of the spatial and temporal variation of Q_ij(ξ_k, τ), the two-point, two-time turbulent velocity correlations. A technique is presented to obtain an approximate form of these correlations based on closure of the Reynolds stress equations by modeling of higher order terms. The governing equations for Q_ij are first developed for a general flow. The case of homogeneous, stationary turbulence in a unidirectional constant shear mean flow is then assumed. The required closure form for Q_ij is selected which is capable of qualitatively reproducing experimentally observed behavior. This form contains separation-time-dependent scale factors as parameters and depends explicitly on spatial separation. The approximate forms of Q_ij are used in the differential equations and integral moments are taken over the spatial domain. The velocity correlations are used in the Lighthill theory of aerodynamic sound by assuming normal joint probability.
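
    For a homogeneous direction, such a two-point correlation can be estimated directly from data by averaging products of the field at separated points. The sketch below does this for a synthetic correlated signal (smoothed white noise) standing in for a turbulent velocity record; it illustrates the quantity being modeled, not the closure itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
# Synthetic homogeneous "velocity" record: white noise smoothed by a
# 16-point moving average, which imposes a finite correlation length.
u = np.convolve(rng.standard_normal(n), np.ones(16) / 16, mode="same")
u -= u.mean()

def two_point(u, max_lag):
    # Q(s) = <u(x) u(x+s)>, averaged over the homogeneous direction
    return np.array([np.mean(u[: len(u) - s] * u[s:]) for s in range(max_lag)])

Q = two_point(u, 64)
Q /= Q[0]          # normalize by the variance
print(Q[0])        # 1.0 at zero separation; Q decays beyond the filter width
```

    The closure problem addressed in the report is precisely that such correlations cannot, in general, be computed from first principles for a real flow, so their shape in separation and time delay must be modeled.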

  16. Advances in x-ray computed microtomography at the NSLS

    SciTech Connect

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 × 929 pixel² slice on an R10,000 CPU, more than 8x reduction compared with the Filtered Back-Projection method.
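
    For context, the Filtered Back-Projection baseline that the FFBT was compared against can be sketched in a few lines for parallel-beam geometry: forward-project a phantom, ramp-filter each projection in Fourier space, then smear the filtered profiles back. This toy rotation-based version is illustrative only and far from the production code described above.

```python
import numpy as np
from scipy.ndimage import rotate

n = 64
xx, yy = np.meshgrid(np.arange(n) - n / 2, np.arange(n) - n / 2)
phantom = ((xx**2 + yy**2) < (n / 4) ** 2).astype(float)   # disc phantom

angles = np.linspace(0.0, np.pi, 90, endpoint=False)
# Forward projection (discrete Radon transform): rotate, then sum columns.
sino = np.array([
    rotate(phantom, np.degrees(a), reshape=False, order=1).sum(axis=0)
    for a in angles
])

ramp = np.abs(np.fft.fftfreq(n))                  # |f| ramp filter
filtered = np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * ramp, axis=1))

recon = np.zeros_like(phantom)
for a, proj in zip(angles, filtered):
    # Back-project: replicate the filtered profile along the beam
    # direction, then rotate it back into place and accumulate.
    recon += rotate(np.tile(proj, (n, 1)), -np.degrees(a), reshape=False, order=1)
recon *= np.pi / (2 * len(angles))

corr = np.corrcoef(recon.ravel(), phantom.ravel())[0, 1]
print(round(corr, 2))
```

    The gridding approach behind the FFBT instead resamples the Fourier-domain data onto a regular grid so a 2-D FFT can replace the per-angle back-projection loop, which is where the reported speedup comes from.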

  17. ADVANCES IN X-RAY COMPUTED MICROTOMOGRAPHY AT THE NSLS.

    SciTech Connect

    DOWD,B.A.

    1998-08-07

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented, that is based on a refinement of the "gridding" algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 × 929 pixel² slice on an R10,000 CPU, more than 8x reduction compared with the Filtered Back-Projection method.

  18. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  19. 2014 Annual Report - Argonne Leadership Computing Facility

    SciTech Connect

    Collins, James R.; Papka, Michael E.; Cerny, Beth A.; Coffey, Richard M.

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  20. 2015 Annual Report - Argonne Leadership Computing Facility

    SciTech Connect

    Collins, James R.; Papka, Michael E.; Cerny, Beth A.; Coffey, Richard M.

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  1. Five-Year Implementation Plan For Advanced Separations and Waste Forms Capabilities at the Idaho National Laboratory (FY 2011 to FY 2015)

    SciTech Connect

    Not Listed

    2011-03-01

    DOE-NE separations research is focused today on developing a science-based understanding that builds on historical research and focuses on combining a fundamental understanding of separations and waste forms processes with small-scale experimentation coupled with modeling and simulation. The result of this approach is the development of a predictive capability that supports evaluation of separations and waste forms technologies. The specific suite of technologies explored will depend on and must be integrated with the fuel development effort, as well as an understanding of potential waste form requirements. This five-year implementation plan lays out the specific near-term tactical investments in people, equipment and facilities, and customer capture efforts that will be required over the next five years to quickly and safely bring on line the capabilities needed to support the science-based goals and objectives of INL’s Advanced Separations and Waste Forms RD&D Capabilities Strategic Plan.

  2. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    IMAI, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study: Based on scenarios of great earthquakes along the Nankai trough, this study aims to estimate with high accuracy the run-up and inundation process of tsunami in coastal areas, including rivers. Using a practical tsunami analysis model that accounts for detailed topography, land use, and climate change in a realistic present and expected future environment, we examined the tsunami run-up and inundation process. From these results we estimated the damage due to tsunami and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, we propose contents of disaster risk information to be displayed on a tsunami hazard and risk map for casualty mitigation. 2. Creating a tsunami hazard and risk map: Combining a practical analytical tsunami model (a long-wave approximation) with high-resolution (5 m) topography, including detailed data on shorelines, rivers, buildings and houses, we present an advanced analysis of tsunami inundation that accounts for land use. Based on these inundation results and their analysis, it is possible to draw a tsunami hazard and risk map with information on estimated human casualties, building damage, drifting vehicles, etc. 3. Contents of disaster prevention information: To improve the distribution of hazard, risk, and evacuation information, three steps are necessary. (1) Provide basic information, such as tsunami arrival information, evacuation areas and routes, and the location of tsunami evacuation facilities. (2) Provide additional information, such as the time when inundation starts, past inundation records, the location of facilities holding hazardous materials, and the public facilities and underground areas that require evacuation. (3) Provide information to support disaster response, such as predictions of infrastructure and traffic network damage.

  3. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises a semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence and the evaluation via user-defined feature extractors, as well as methods to assess the object's movement conspicuity. In this fifth part in an annual series at the SPIE conference in Orlando, this paper presents the enhancements over the recent year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods fathom the correlations between image processing and camouflage assessment. A novel algorithm is presented, based on template matching, to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on a camouflage effect in different environments. As the results show, the presented methods contribute a significant benefit in the field of camouflage assessment.
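
    A minimal version of such a template-matching score is the normalized cross-correlation (NCC): a low NCC peak for an object template against the surrounding background indicates structural inconspicuity. The sketch below is a generic NCC implementation on synthetic data, not the CART algorithm itself.

```python
import numpy as np

def ncc(patch, template):
    # Zero-mean normalized cross-correlation of two equally sized patches.
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def match_map(image, template):
    # Slide the template over the image and score every placement.
    th, tw = template.shape
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = ncc(image[i:i + th, j:j + tw], template)
    return out

rng = np.random.default_rng(3)
scene = rng.standard_normal((40, 40))
tmpl = scene[10:18, 20:28].copy()       # cut an exact "target" from the scene
scores = match_map(scene, tmpl)
peak = np.unravel_index(scores.argmax(), scores.shape)
print(tuple(int(i) for i in peak))      # (10, 20): found where it was cut out
```

    A camouflage metric could then compare the peak score at the known target location against the score distribution over the background: the smaller the gap, the less conspicuous the object's structure.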

  4. Block sparse Cholesky algorithms on advanced uniprocessor computers

    SciTech Connect

    Ng, E.G.; Peyton, B.W.

    1991-12-01

    As with many other linear algebra algorithms, devising a portable implementation of sparse Cholesky factorization that performs well on the broad range of computer architectures currently available is a formidable challenge. Even after limiting our attention to machines with only one processor, as we have done in this report, there are still several interesting issues to consider. For dense matrices, it is well known that block factorization algorithms are the best means of achieving this goal. We take this approach for sparse factorization as well. This paper has two primary goals. First, we examine two sparse Cholesky factorization algorithms, the multifrontal method and a blocked left-looking sparse Cholesky method, in a systematic and consistent fashion, both to illustrate the strengths of the blocking techniques in general and to obtain a fair evaluation of the two approaches. Second, we assess the impact of various implementation techniques on time and storage efficiency, paying particularly close attention to the work-storage requirement of the two methods and their variants.
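
    The left-looking blocked organization can be shown on a dense matrix (the report's sparsity and supernode handling are omitted): each block column first receives the updates from all previously factored block columns, then its diagonal block is factored and the panel below it is solved.

```python
import numpy as np

def blocked_cholesky(A, nb=2):
    """Left-looking blocked Cholesky; returns lower-triangular L with A = L L^T."""
    A = A.copy()
    n = A.shape[0]
    for k in range(0, n, nb):
        e = min(k + nb, n)
        # Left-looking update: apply all prior block columns to this one.
        A[k:, k:e] -= A[k:, :k] @ A[k:e, :k].T
        # Factor the diagonal block ...
        A[k:e, k:e] = np.linalg.cholesky(A[k:e, k:e])
        # ... then solve the triangular system for the panel below it.
        A[e:, k:e] = np.linalg.solve(A[k:e, k:e], A[e:, k:e].T).T
    return np.tril(A)

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6.0 * np.eye(6)        # symmetric positive definite test matrix
L = blocked_cholesky(A, nb=2)
print(np.allclose(L @ L.T, A))       # True
```

    The appeal of blocking, here as in the report, is that the dominant update is a matrix-matrix multiply, which keeps the working set in fast memory on cache-based uniprocessors.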

  5. Advances in computed radiography systems and their physical imaging characteristics.

    PubMed

    Cowen, A R; Davies, A G; Kengyelics, S M

    2007-12-01

    Radiological imaging is progressing towards an all-digital future, across the spectrum of medical imaging techniques. Computed radiography (CR) has provided a ready pathway from screen film to digital radiography and a convenient entry point to PACS. This review briefly revisits the principles of modern CR systems and their physical imaging characteristics. Wide dynamic range and digital image enhancement are well-established benefits of CR, which lend themselves to improved image presentation and reduced rates of repeat exposures. However, in its original form CR offered limited scope for reducing the radiation dose per radiographic exposure, compared with screen film. Recent innovations in CR, including the use of dual-sided image readout and channelled storage phosphor have eased these concerns. For example, introduction of these technologies has improved detective quantum efficiency (DQE) by approximately 50 and 100%, respectively, compared with standard CR. As a result CR currently affords greater scope for reducing patient dose, and provides a more substantive challenge to the new solid-state, flat-panel, digital radiography detectors.

  6. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data are presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems), embracing the techniques discussed, has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.
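
    Classical historical weight estimation of the kind described typically fits power-law weight estimating relationships, W = a·P^b, to data from prior vehicles. The sketch below is illustrative of that general approach; the data, parameter names, and fitting form are assumptions, not taken from the WAATS documentation.

```python
import numpy as np

def fit_wer(params, weights):
    """Fit W = a * P**b by least squares in log-log space, the classical
    historical weight-estimating approach (illustrative, not WAATS's code)."""
    b, log_a = np.polyfit(np.log(params), np.log(weights), 1)
    return np.exp(log_a), b

# Hypothetical historical data: driving parameter P vs. component weight W
P = np.array([10.0, 20.0, 40.0, 80.0])
W = np.array([120.0, 205.0, 350.0, 600.0])
a, b = fit_wer(P, W)
print(f"W ~= {a:.1f} * P^{b:.2f}")  # use to predict a new design's component weight
```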

  7. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  8. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column and a Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for studying bubble motion in a constant shear flow field was also developed, and the flow conditions in this device were studied using the PIV method. The concentration and velocity of particles of different sizes near a wall in a duct flow were also measured, using Phase-Doppler anemometry. An Eulerian volume-of-fluid (VOF) computational model for the flow conditions in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed; the model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.
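
    The Lagrangian trajectory part of such an Eulerian-Lagrangian model integrates each bubble's equation of motion through the Eulerian liquid field. A minimal sketch with only Stokes-type drag and buoyancy is shown below; the actual model also includes collisions and other forces, and the response time, time step, and liquid field here are illustrative assumptions.

```python
import numpy as np

def bubble_trajectory(u_liquid, x0, v0, tau=0.01, g=9.81, dt=1e-3, steps=1000):
    """Integrate dv/dt = (u_l(x) - v)/tau + g_buoyancy for one bubble
    (explicit Euler; drag with response time tau; net buoyancy taken as +g
    for a bubble much lighter than the liquid). Illustrative only."""
    x, v = np.array(x0, float), np.array(v0, float)
    path = [x.copy()]
    for _ in range(steps):
        a = (u_liquid(x) - v) / tau + np.array([0.0, g])
        v = v + dt * a
        x = x + dt * v
        path.append(x.copy())
    return np.array(path)

# Quiescent liquid: the bubble relaxes to a terminal rise velocity ~ g * tau
path = bubble_trajectory(lambda x: np.zeros(2), x0=[0, 0], v0=[0, 0])
```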

  9. Advances and perspectives in lung cancer imaging using multidetector row computed tomography.

    PubMed

    Coche, Emmanuel

    2012-10-01

    The introduction of multidetector row computed tomography (CT) into clinical practice has revolutionized many aspects of the clinical work-up. Lung cancer imaging has benefited from various breakthroughs in computing technology, with advances in the field of lung cancer detection, tissue characterization, lung cancer staging and response to therapy. Our paper discusses the problems of radiation, image visualization and CT examination comparison. It also reviews the most significant advances in lung cancer imaging and highlights the emerging clinical applications that use state of the art CT technology in the field of lung cancer diagnosis and follow-up.

  10. Transmutation Performance Analysis for Inert Matrix Fuels in Light Water Reactors and Computational Neutronics Methods Capabilities at INL

    SciTech Connect

    Michael A. Pope; Samuel E. Bays; S. Piet; R. Ferrer; Mehdi Asgari; Benoit Forget

    2009-05-01

    The urgency for addressing repository impacts has grown in the past few years as a result of Spent Nuclear Fuel (SNF) accumulation from commercial nuclear power plants. One path that has been explored by many is to eliminate the transuranic (TRU) inventory from the SNF, thus reducing the need for additional long term repository storage sites. One strategy for achieving this is to burn the separated TRU elements in the currently operating U.S. Light Water Reactor (LWR) fleet. Many studies have explored the viability of this strategy by loading a percentage of LWR cores with TRU in the form of either Mixed Oxide (MOX) fuels or Inert Matrix Fuels (IMF). A task was undertaken at INL to establish specific technical capabilities to perform neutronics analyses in order to further assess several key issues related to the viability of thermal recycling. The initial computational study reported here is focused on direct thermal recycling of IMF fuels in a heterogeneous Pressurized Water Reactor (PWR) bundle design containing Plutonium, Neptunium, Americium, and Curium (IMF-PuNpAmCm) in a multi-pass strategy using legacy 5 year cooled LWR SNF. In addition to this initial high-priority analysis, three other alternate analyses with different TRU vectors in IMF pins were performed. These analyses provide comparison of direct thermal recycling of PuNpAmCmCf, PuNpAm, PuNp, and Pu. The results of this infinite lattice assembly-wise study using SCALE 5.1 indicate that it may be feasible to recycle TRU in this manner using an otherwise typical PWR assembly without violating peaking factor limits.

  11. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  12. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    NASA Technical Reports Server (NTRS)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience only ran on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to be able to access. The current code works in the Unity game engine which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project was unable to be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.

  13. Computational Modeling and High Performance Computing in Advanced Materials Processing, Synthesis, and Design

    DTIC Science & Technology

    2014-12-07

    Research efforts in this project focused on the synergistic coupling of computational material science and the mechanics of hybrid and lightweight polymeric composite structures, including atomistic modeling of polymer nanocomposite systems.

  14. Data Collection Capabilities of a New Non-Invasive Monitoring System for Patients with Advanced Multiple Sclerosis

    PubMed Central

    Arias, Diego E.; Pino, Esteban J.; Aqueveque, Pablo; Curtis, Dorothy W.

    2013-01-01

    This paper reports on a data collection study in a clinical environment to evaluate a new non-invasive monitoring system for people with advanced Multiple Sclerosis (MS) who use powered wheelchairs. The proposed system can acquire respiration and heart activity from ballistocardiogram (BCG) signals, seat and back pressure changes, wheelchair tilt angle, ambient temperature and relative humidity. The data was collected at The Boston Home (TBH), a specialized care residence for adults with advanced MS. The collected data will be used to design algorithms to generate alarms and recommendations for residents and caregivers. These alarms and recommendations will be related to vital signs, low mobility problems and heat exposure. We present different cases where it is possible to illustrate the type of information acquired by our system and the possible alarms we will generate. PMID:24551323

  15. Data collection capabilities of a new non-invasive monitoring system for patients with advanced multiple sclerosis.

    PubMed

    Arias, Diego E; Pino, Esteban J; Aqueveque, Pablo; Curtis, Dorothy W

    2013-01-01

    This paper reports on a data collection study in a clinical environment to evaluate a new non-invasive monitoring system for people with advanced Multiple Sclerosis (MS) who use powered wheelchairs. The proposed system can acquire respiration and heart activity from ballistocardiogram (BCG) signals, seat and back pressure changes, wheelchair tilt angle, ambient temperature and relative humidity. The data was collected at The Boston Home (TBH), a specialized care residence for adults with advanced MS. The collected data will be used to design algorithms to generate alarms and recommendations for residents and caregivers. These alarms and recommendations will be related to vital signs, low mobility problems and heat exposure. We present different cases where it is possible to illustrate the type of information acquired by our system and the possible alarms we will generate.

  16. Volumes to learn: advancing therapeutics with innovative computed tomography image data analysis.

    PubMed

    Maitland, Michael L

    2010-09-15

    Semi-automated methods for calculating tumor volumes from computed tomography images are a new tool for advancing the development of cancer therapeutics. Volumetric measurements, relying on already widely available standard clinical imaging techniques, could shorten the observation intervals needed to identify cohorts of patients sensitive or resistant to treatment.
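
    At its core, a volumetric measurement of this kind segments the lesion and sums voxel volumes. The toy sketch below uses a fixed intensity threshold on a synthetic volume purely for illustration; clinical semi-automated tools use far more sophisticated segmentation, and all names and values here are assumptions.

```python
import numpy as np

def tumor_volume_mm3(ct, mask_threshold, voxel_dims_mm):
    """Toy volumetric measurement: count voxels above an intensity threshold
    and multiply by the per-voxel volume. Illustrative only."""
    voxel_mm3 = float(np.prod(voxel_dims_mm))
    return int(np.count_nonzero(ct > mask_threshold)) * voxel_mm3

# Synthetic 3-D "CT" with a bright 10x10x10-voxel lesion
ct = np.zeros((64, 64, 64))
ct[20:30, 20:30, 20:30] = 100.0
vol = tumor_volume_mm3(ct, mask_threshold=50, voxel_dims_mm=(0.7, 0.7, 1.0))
# 1000 voxels x 0.49 mm^3 per voxel, i.e. about 490 mm^3
```

    Tracking this scalar over successive scans is what allows shorter observation intervals than a single-diameter (RECIST-style) measurement.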

  17. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    EPA Science Inventory

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust, and that can help ...

  18. Response to House Joint Resolution No. 118 [To Advance Computer-Assisted Instruction].

    ERIC Educational Resources Information Center

    Virginia State General Assembly, Richmond.

    This response by the Virginia Department of Education to House Joint Resolution No. 118 of the General Assembly of Virginia, which requested the Department of Education to study initiatives to advance computer-assisted instruction, is based on input from state and national task forces and on a 1986 survey of 80 Virginia school divisions. The…

  19. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  20. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    ERIC Educational Resources Information Center

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety…

  1. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines used in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  2. A first attempt to bring computational biology into advanced high school biology classrooms.

    PubMed

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S

    2011-10-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology element, teaching genetic evolution, into advanced biology classes at two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach this alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.
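
    A natural computational unit for teaching genetic evolution is a simulation of allele-frequency change under drift. The Wright-Fisher sketch below is the kind of exercise such a curriculum might use; it is an editorial illustration, not the authors' actual teaching materials.

```python
import random

def wright_fisher(pop_size, p0, generations, seed=42):
    """Simulate allele frequency under pure genetic drift: each generation,
    2N gene copies are drawn (binomially) from the previous frequency."""
    rng = random.Random(seed)
    p, freqs = p0, [p0]
    for _ in range(generations):
        copies = sum(rng.random() < p for _ in range(2 * pop_size))
        p = copies / (2 * pop_size)
        freqs.append(p)
    return freqs

freqs = wright_fisher(pop_size=50, p0=0.5, generations=200)
# In a finite population, drift eventually fixes or loses the allele
```

    Students can vary `pop_size` to see that drift acts faster in small populations, a point that pencil-and-paper treatments convey less vividly.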

  3. Advanced concepts report on the detection of xenon with a miniature whole air sampler capable of extended operating times

    SciTech Connect

    Motes, B.G.; McManus, G.J.; Bird, S.K.; Fernandez, S.J.

    1993-07-01

    Many monitoring activities require the collection of whole air samples over an extended time interval without loss or concentration of any atmospheric constituents. Described is the development and laboratory testing of a whole air sampler capable of collecting a 100 liter sample over a period of 0.63 days. The sampler has an empty weight of 7.79 kg and an overall size of 20.8 cm × 20.8 cm × 66.1 cm. The conceptual design for the development of smaller, higher-performance whole air samplers is also reported.

  4. COLLABORATIVE RESEARCH: TOWARDS ADVANCED UNDERSTANDING AND PREDICTIVE CAPABILITY OF CLIMATE CHANGE IN THE ARCTIC USING A HIGH-RESOLUTION REGIONAL ARCTIC CLIMATE SYSTEM MODEL

    SciTech Connect

    Gutowski, William J.

    2013-02-07

    The motivation for this project was to advance the science of climate change and prediction in the Arctic region. Its primary goals were to (i) develop a state-of-the-art Regional Arctic Climate system Model (RACM) including high-resolution atmosphere, land, ocean, sea ice and land hydrology components and (ii) to perform extended numerical experiments using high performance computers to minimize uncertainties and fundamentally improve current predictions of climate change in the northern polar regions. These goals were realized first through evaluation studies of climate system components via one-way coupling experiments. Simulations were then used to examine the effects of advancements in climate component systems on their representation of main physics, time-mean fields and to understand variability signals at scales over many years. As such this research directly addressed some of the major science objectives of the BER Climate Change Research Division (CCRD) regarding the advancement of long-term climate prediction.

  5. Putting Integrated Systems Health Management Capabilities to Work: Development of an Advanced Caution and Warning System for Next-Generation Crewed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Mccann, Robert S.; Spirkovska, Lilly; Smith, Irene

    2013-01-01

    Integrated System Health Management (ISHM) technologies have advanced to the point where they can provide significant automated assistance with real-time fault detection, diagnosis, guided troubleshooting, and failure consequence assessment. To exploit these capabilities in actual operational environments, however, ISHM information must be integrated into operational concepts and associated information displays in ways that enable human operators to process and understand the ISHM system information rapidly and effectively. In this paper, we explore these design issues in the context of an advanced caution and warning system (ACAWS) for next-generation crewed spacecraft missions. User interface concepts for depicting failure diagnoses, failure effects, redundancy loss, "what-if" failure analysis scenarios, and resolution of ambiguity groups are discussed and illustrated.

  6. Investing American Recovery and Reinvestment Act Funds to Advance Capability, Reliability, and Performance in NASA Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Sydnor, George H.

    2010-01-01

    The National Aeronautics and Space Administration's (NASA) Aeronautics Test Program (ATP) is implementing five significant ground-based test facility projects across the nation with funding provided by the American Recovery and Reinvestment Act (ARRA). The projects were selected as the best candidates within the constraints of the ARRA and the strategic plan of ATP. They are a combination of much-needed large-scale maintenance, reliability, and system upgrades plus the creation of new test beds for upcoming research programs. The projects are: 1.) Re-activation of a large compressor to provide a second source for compressed air and vacuum to the Unitary Plan Wind Tunnel at the Ames Research Center (ARC), 2.) Addition of high-altitude ice crystal generation at the Glenn Research Center Propulsion Systems Laboratory Test Cell 3, 3.) New refrigeration system and tunnel heat exchanger for the Icing Research Tunnel at the Glenn Research Center, 4.) Technical viability improvements for the National Transonic Facility at the Langley Research Center, and 5.) Modifications to conduct Environmentally Responsible Aviation and Rotorcraft research at the 14 x 22 Subsonic Tunnel at Langley Research Center. The selection rationale, problem statement, and technical solution summary for each project are given here. The benefits and challenges of the ARRA-funded projects are discussed. Indirectly, this opportunity provides the advantage of developing experience in NASA's workforce in large projects and maintaining corporate knowledge in that very unique capability. It is envisioned that the improved facilities will attract a larger user base, and that capabilities needed for current and future research efforts will offer revenue growth and future operational stability. Several of the chosen projects will maximize wind tunnel reliability and maintainability by using newer, proven technologies in place of older and obsolete equipment and processes. The projects will meet NASA's goal of

  7. Lightweighting Automotive Materials for Increased Fuel Efficiency and Delivering Advanced Modeling and Simulation Capabilities to U.S. Manufacturers

    SciTech Connect

    Hale, Steve

    2013-09-11

    The National Center for Manufacturing Sciences (NCMS) worked with the U.S. Department of Energy (DOE), National Energy Technology Laboratory (NETL), to bring together research and development (R&D) collaborations to develop and accelerate the knowledgebase and infrastructure for lightweighting materials and manufacturing processes for use in structural applications in the automotive sector. The purpose/importance of this DOE program: • 2016 CAFÉ standards. • Adoption by the automotive industry of lightweighting material concepts in the manufacture of production vehicles. • Development and manufacture of advanced research tools for modeling and simulation (M&S) applications to reduce manufacturing and material costs. • U.S. competitiveness, helping drive the development and manufacture of the next generation of materials. NCMS established a focused portfolio of applied R&D projects utilizing lightweighting materials for manufacture into automotive structures and components. Areas targeted in this program: • Functionality of new lightweighting materials to meet present safety requirements. • Manufacturability using new lightweighting materials. • Cost reduction for the development and use of new lightweighting materials. The automotive industry's future continuously evolves through innovation, and lightweight materials are key to achieving a new era of lighter, more efficient vehicles. Lightweight materials are among the technical advances needed to achieve fuel/energy efficiency and reduce carbon dioxide (CO2) emissions: • Establish design criteria methodology to identify the best materials for lightweighting. • Employ state-of-the-art design tools for optimum material development for specific applications. • Match new manufacturing technology to production volume. • Address new process variability with new production-ready processes.

  8. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. 
We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  9. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone.
Ab initio techniques will also

  10. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone.
Ab initio techniques will also

  11. Advanced earth observation spacecraft computer-aided design software: Technical, user and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Krauze, L. D.

    1983-01-01

    NASA's IDEAS computer system is a tool for interactive preliminary design and analysis of Large Space Systems (LSS). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.

  12. Computation of Loads on the McDonnell Douglas Advanced Bearingless Rotor

    NASA Technical Reports Server (NTRS)

    Nguyen, Khanh; Lauzon, Dan; Anand, Vaidyanathan

    1994-01-01

    Computed results from UMARC and DART analyses are compared with the blade bending moments and vibratory hub loads data obtained from a full-scale wind tunnel test of the McDonnell Douglas five-bladed advanced bearingless rotor. The 5 per-rev vibratory hub loads data are corrected using results from a dynamic calibration of the rotor balance. Agreement between UMARC-computed blade bending moments and the measured data at different flight conditions is poor to fair, while DART results are fair to good. Using its free wake module, UMARC adequately computes the 5P vibratory hub loads for this rotor, capturing both their magnitude and their variation with forward speed. DART employs a uniform inflow wake model and does not adequately compute the 5P vibratory hub loads for this rotor.
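    The 5P loads discussed above are the five-per-rev harmonic of the hub load time history (the blade-passage frequency of a five-bladed rotor). As a rough illustration of how such a harmonic can be extracted from sampled data, the following is a minimal sketch; the function name and the synthetic test signal are illustrative and are not part of UMARC or DART:

```python
import numpy as np

def per_rev_harmonic(load_history, samples_per_rev, harmonic=5):
    """Amplitude and phase of the n/rev harmonic of a hub-load time
    history sampled uniformly in rotor azimuth."""
    n_rev = len(load_history) // samples_per_rev
    # Use an integer number of revolutions so per-rev harmonics fall
    # exactly on FFT bins (no leakage).
    x = np.asarray(load_history[: n_rev * samples_per_rev], dtype=float)
    spectrum = np.fft.rfft(x) / len(x)   # normalized one-sided spectrum
    k = harmonic * n_rev                 # the n/rev line falls on this bin
    return 2.0 * np.abs(spectrum[k]), np.angle(spectrum[k])

# Synthetic check: a pure 5/rev load of amplitude 3.0 over 8 revolutions
azimuth = np.linspace(0.0, 2.0 * np.pi * 8, 8 * 256, endpoint=False)
signal = 3.0 * np.cos(5.0 * azimuth)
amp, _ = per_rev_harmonic(signal, samples_per_rev=256)
```

    On the synthetic signal the recovered amplitude matches the 3.0 that was put in; on real balance data, windowing and the dynamic calibration mentioned above would also enter.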

  13. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), whose motto this year was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research; Data Analysis - Algorithms and Tools; and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  14. INL Initial Input to the Mission Need for Advanced Post-Irradiation Examination Capability A Non-Major System Acquisition Project

    SciTech Connect

    Vince Tonc

    2010-04-01

    Consolidated and comprehensive post-irradiation examination (PIE) capabilities will enable the science and engineering understanding needed to develop the innovative nuclear fuels and materials that are critical to the success of the U.S. Department of Energy’s (DOE) Office of Nuclear Energy (NE) programs. Existing PIE capabilities at DOE Laboratories, universities, and in the private sector are widely distributed, largely antiquated, and insufficient to support the long-range mission needs. In addition, DOE’s aging nuclear infrastructure was not designed to accommodate modern, state-of-the-art equipment and instrumentation. Currently, the U.S. does not have the capability to make use of state-of-the-art technology in a remote, hot cell environment to characterize irradiated fuels and materials on the micro, nano, and atomic scale. This “advanced PIE capability” to make use of state-of-the-art scientific instruments in a consolidated nuclear operating environment will enable comprehensive characterization and investigation that is essential for effectively implementing the nuclear fuels and materials development programs in support of achieving the U.S. DOE-NE Mission.

  15. Functional Assessment for Human-Computer Interaction: A Method for Quantifying Physical Functional Capabilities for Information Technology Users

    ERIC Educational Resources Information Center

    Price, Kathleen J.

    2011-01-01

    The use of information technology is a vital part of everyday life, but for a person with functional impairments, technology interaction may be difficult at best. Information technology is commonly designed to meet the needs of a theoretical "normal" user. However, there is no such thing as a "normal" user. A user's capabilities will vary over…

  16. Cluster Computing for Embedded/Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  17. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for the detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual energy CT (DECT), spectral CT and CT-based molecular imaging. By harnessing these advances in technology, cardiac CT has progressed beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288
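    At its core, the dual energy CT mentioned above acquires attenuation at two tube energies and decomposes each measurement into two basis materials, which reduces to a small linear solve per voxel. A minimal sketch with illustrative (not clinically calibrated) attenuation coefficients:

```python
import numpy as np

# Two-material decomposition from dual-energy CT as a 2x2 linear solve.
# mu[energy][material]: linear attenuation coefficients in 1/cm.
# All numeric values here are illustrative, not clinical calibration data.
mu = np.array([[0.28, 0.45],    # low-kVp:  water-like, calcium-like basis
               [0.20, 0.25]])   # high-kVp: water-like, calcium-like basis

# Attenuation measured at the low and high tube voltages for one voxel
measured = np.array([0.331, 0.215])

# Solve mu @ fractions = measured for the basis-material volume fractions
fractions = np.linalg.solve(mu, measured)
```

    With these illustrative numbers the voxel decomposes into about 70% of the first basis material and 30% of the second; spectral (photon-counting) CT generalizes the same idea to more than two energy bins.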

  18. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    PubMed

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging techniques remain today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.

  19. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.
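    Differentiating normal from degraded configurations, as described above, comes down to computing discriminative features from the vibratory sensor signals. A minimal, hypothetical sketch of one such feature, band-limited RMS computed via the FFT; the frequencies, band, and synthetic signals are illustrative and are not values from the project:

```python
import numpy as np

def band_rms(signal, fs, f_lo, f_hi):
    """RMS of a vibration signal within a frequency band, via the FFT.
    A shift of energy into a band can flag a degraded configuration."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    # Parseval relation for the one-sided spectrum (factor 2 for the
    # doubled non-DC bins; adequate here as a relative indicator).
    power = np.sum(np.abs(spectrum[mask]) ** 2) * 2.0 / len(signal) ** 2
    return np.sqrt(power)

fs = 10_000.0                                       # sample rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
normal = np.sin(2 * np.pi * 120 * t)                      # baseline tone
degraded = normal + 0.5 * np.sin(2 * np.pi * 3_000 * t)   # added HF content
indicator_n = band_rms(normal, fs, 2_000, 4_000)
indicator_d = band_rms(degraded, fs, 2_000, 4_000)
```

    The degraded signal shows a clearly elevated indicator in the 2-4 kHz band while the normal one does not; the report's caveat applies here too: such a feature can flag a change but not by itself identify the level or type of degradation.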

  20. Computer-assisted virtual planning and surgical template fabrication for frontoorbital advancement.

    PubMed

    Soleman, Jehuda; Thieringer, Florian; Beinemann, Joerg; Kunz, Christoph; Guzman, Raphael

    2015-05-01

    OBJECT The authors describe a novel technique using computer-assisted design (CAD) and computer-assisted manufacturing (CAM) for the fabrication of individualized 3D printed surgical templates for frontoorbital advancement surgery. METHODS Two patients underwent frontoorbital advancement surgery for unilateral coronal synostosis. Virtual surgical planning (SurgiCase-CMF, version 5.0, Materialise) was done by virtual mirroring techniques and superposition of an age-matched normative 3D pediatric skull model. Based on these measurements, surgical templates were fabricated using a 3D printer. Bifrontal craniotomy and the osteotomies for the orbital bandeau were performed based on the sterilized 3D templates. The remodeling was then done by placing the bone plates within the negative 3D templates and fixing them using absorbable poly-dl-lactic acid plates and screws. RESULTS Both patients exhibited a satisfying head shape postoperatively and at follow-up. No surgery-related complications occurred. The cutting and positioning of the 3D surgical templates proved to be accurate, easy to use, reproducible, and efficient. CONCLUSIONS Computer-assisted virtual planning and 3D template fabrication for frontoorbital advancement surgery leads to reconstructions based on standardized measurements, precludes subjective remodeling, and seems to be overall safe and feasible. A larger series of patients with long-term follow-up is needed for further evaluation of this novel technique.

  1. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibit, recent advances in CT imaging technique and in 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  2. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  3. TRAC-PF1: an advanced best-estimate computer program for pressurized water reactor analysis

    SciTech Connect

    Liles, D.R.; Mahaffy, J.H.

    1984-02-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos National Laboratory to provide advanced best-estimate predictions of postulated accidents in light water reactors. The TRAC-PF1 program provides this capability for pressurized water reactors and for many thermal-hydraulic experimental facilities. The code features either a one-dimensional or a three-dimensional treatment of the pressure vessel and its associated internals; a two-phase, two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field; flow-regime-dependent constitutive equation treatment; optional reflood tracking capability for both bottom flood and falling-film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. This report describes the thermal-hydraulic models and the numerical solution methods used in the code. Detailed programming and user information also are provided.

  4. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multi-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution.
In addition, RIACS collaborates with NASA scientists to apply information technology research to

  5. Advances in I/O, Speedup, and Universality on Colossus, an Unconventional Computer

    NASA Astrophysics Data System (ADS)

    Wells, Benjamin

    Colossus, the first electronic digital (and very unconventional) computer, was not a stored-program general purpose computer in the modern sense, although there are printed claims to the contrary. At least one of these asserts Colossus was a Turing machine. Certainly, an appropriate Turing machine can simulate the operation of Colossus. That is hardly an argument for generality of computation. But this is: a universal Turing machine could have been implemented on a clustering of the ten Colossus machines installed at Bletchley Park, England, by the end of WWII in 1945. Along with the presentation of this result, several improvements in input, output, and speed, within the hardware capability and specification of Colossus are discussed.

  6. Designing single board computers for space using the most advanced processor and mitigation technologies

    NASA Astrophysics Data System (ADS)

    Longden, L.; Thibodeau, C.; Hiliman, R.; Layton, P.; Dowd, M.

    2002-12-01

    As high-end computing becomes more of a necessity in space, there currently exists a large gap between what is available to satellite manufacturers and the state of the commercial processor industry. As a result, Maxwell Technologies has developed a Super Computer for Space that utilizes the latest commercial Silicon-on-Insulator PowerPC processors and state-of-the-art memory modules to achieve space-qualified performance that is 10 to 1000 times that of current technology. In addition, Maxwell's Super Computer for Space (SCS750) SBC is capable of executing up to 1800+ million instructions per second (MIPS) while guaranteeing an upset rate for the entire board of less than 1 every 1000 years. Presented is a brief synopsis of the design approach, radiation mitigation techniques, and radiation test results for Maxwell's next generation SBC.
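    The headline numbers above can be put in perspective with some back-of-envelope arithmetic. The 1800 MIPS figure and the sub-1-per-1000-years board upset bound come from the abstract; the 15-year mission length used below is an illustrative assumption:

```python
# Back-of-envelope arithmetic behind the quoted SCS750 figures.
mips = 1800                                   # quoted peak throughput
seconds_per_year = 3600 * 24 * 365
instructions_per_year = mips * 1e6 * seconds_per_year   # ~5.7e16 per year

upsets_per_year = 1.0 / 1000.0                # quoted board-level upset bound
mission_years = 15                            # illustrative mission length
expected_upsets = upsets_per_year * mission_years       # ~0.015 per mission

speedup_low, speedup_high = 10, 1000          # claimed range vs. prior SBCs
```

    In other words, the guarantee amounts to well under one expected board-level upset over a long mission, despite tens of quadrillions of instructions executed per year.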

  7. Computer-Assisted Instruction in the Context of the Advanced Instructional System: Authoring Support Software. Final Report.

    ERIC Educational Resources Information Center

    Montgomery, Ann D.; Judd, Wilson A.

    This report details the design, development, and implementation of computer software to support the cost-effective production of computer assisted instruction (CAI) within the context of the Advanced Instructional System (AIS) located at Lowry Air Force Base. The report supplements the computer managed Air Force technical training that is…

  8. Meeting the Needs of CALS Students for Computing Capabilities. Final Report of the Ad Hoc Committee on College of Agriculture and Life Sciences Student Computing Competencies.

    ERIC Educational Resources Information Center

    Monk, David; And Others

    The Ad Hoc Committee on the Cornell University (New York) College of Agriculture and Life Sciences (CALS) Student Computing Competencies was appointed in the fall of 1995 to determine (1) what all CALS undergraduate students should know about computing and related technologies; (2) how the college can make it possible for students to develop these…

  9. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The overall objective of this project is to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e., nuclear gauge densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for measurements of the cross-sectional distribution and radial profiles of solids/voidage holdup along the bed height, the spout diameter, and the fountain height) and radioactive particle tracking (RPT) (for measurements of the 3D solids flow field, velocity, turbulence parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamics (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design, and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance.
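    Scale-up by hydrodynamic similarity, as pursued above, amounts to matching dimensionless groups between the laboratory and full-scale coaters. A sketch of a few groups commonly used in spouted-bed scaling studies; the chosen group set, the symbols, and the example values are illustrative and are not the project's validated scaling factors:

```python
def spouted_bed_groups(d_p, D_c, rho_s, rho_g, mu_g, U, g=9.81):
    """Common dimensionless groups used in spouted-bed scaling studies.

    d_p: particle diameter (m), D_c: column diameter (m),
    rho_s/rho_g: solid/gas densities (kg/m^3),
    mu_g: gas viscosity (Pa*s), U: superficial gas velocity (m/s).
    """
    return {
        "Re_p": rho_g * U * d_p / mu_g,   # particle Reynolds number
        "Fr": U ** 2 / (g * d_p),         # Froude number
        "rho_s/rho_g": rho_s / rho_g,     # solid-to-gas density ratio
        "D/d_p": D_c / d_p,               # column-to-particle size ratio
    }

# Example: 1 mm particles in a 0.15 m column, air at ambient conditions
groups = spouted_bed_groups(d_p=1e-3, D_c=0.15, rho_s=1600.0,
                            rho_g=1.2, mu_g=1.8e-5, U=1.0)
```

    Two beds are taken as hydrodynamically similar when such groups match; the project's measurements are precisely what tests whether a given group set is sufficient for TRISO coaters.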
Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  10. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source software, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  11. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  12. A Concept for the Inclusion of Analytical and Computational Capability in Existing Systems for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Clinton; Cooper, Anita E.; Powers, W. T.

    2005-01-01

    For approximately two decades, efforts have been sponsored by NASA's Marshall Space Flight Center to make possible high-speed, automated classification and quantification of constituent materials in various harsh environments. MSFC, along with the Air Force/Arnold Engineering Development Center, has led the work, developing and implementing systems that employ principles of emission and absorption spectroscopy to monitor molecular and atomic particulates in the gas plasma of rocket engine flow fields. One such system identifies species and quantifies mass loss rates in H2/O2 rocket plumes. Other gases have been examined, and the physics of their detection under numerous conditions was made a part of the knowledge base for the MSFC/USAF team. Additionally, efforts are underway to encode components of the data analysis tools in hardware in order to address real-time operational requirements for health monitoring and management. NASA has a significant investment in these systems, warranting a spiral approach that meshes current tools and experience with technological advancements. This paper addresses current systems - the Optical Plume Anomaly Detector (OPAD) and the Engine Diagnostic Filtering System (EDIFIS) - and discusses what is considered a natural progression: a concept for migrating them toward detection of high-energy particles, including neutrons and gamma rays. The proposal outlines system development to date, basic concepts for future advancements, and recommendations for accomplishing them.
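    At its simplest, the species identification that OPAD-class systems perform is a comparison of observed emission-line wavelengths against reference line lists. A toy sketch of that matching step; the line table, tolerance, and observed wavelengths are illustrative round numbers, not the MSFC spectral database:

```python
# Toy species identification by matching observed emission lines (nm)
# against a small reference table. Values are illustrative only.
REFERENCE_LINES = {
    "OH": [306.4, 309.0],
    "Na": [589.0, 589.6],
    "K":  [766.5, 769.9],
}

def identify(observed, tol=0.5):
    """Return species having at least one reference line within
    tol nanometers of any observed line."""
    hits = set()
    for species, lines in REFERENCE_LINES.items():
        for ref in lines:
            if any(abs(obs - ref) <= tol for obs in observed):
                hits.add(species)
    return sorted(hits)

found = identify([589.1, 766.4])
```

    Quantifying mass loss rates, as the plume systems do, additionally requires calibrated line intensities; the sketch covers only the identification step.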

  13. Advances and trends in the development of computational models for tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Tanner, J. A.

    1985-01-01

    Status and some recent developments of computational models for tires are summarized. Discussion focuses on a number of aspects of tire modeling and analysis including: tire materials and their characterization; evolution of tire models; characteristics of effective finite element models for analyzing tires; analysis needs for tires; and impact of the advances made in finite element technology, computational algorithms, and new computing systems on tire modeling and analysis. An initial set of benchmark problems has been proposed in concert with the U.S. tire industry. Extensive sets of experimental data will be collected for these problems and used for evaluating and validating different tire models. Also, the new Aircraft Landing Dynamics Facility (ALDF) at NASA Langley Research Center is described.

  14. User's Guide for Subroutine FFORM. Physical Processes in Terrestrial and Aquatic Ecosystems, Computer Programs and Graphics Capabilities.

    ERIC Educational Resources Information Center

    Gales, Larry; Anderson, Lougenia

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. FFORM is a portable format-free input subroutine package which simplifies the input of values…

  15. User's Guide for Subroutine PLOT3D. Physical Processes in Terrestrial and Aquatic Ecosystems, Computer Programs and Graphics Capabilities.

    ERIC Educational Resources Information Center

    Gales, Larry

    This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. PLOT3D is a subroutine package which generates a variety of three-dimensional hidden…

  16. Programmer's Guide for Subroutine PLOT3D. Physical Processes in Terrestrial and Aquatic Ecosystems, Computer Programs and Graphics Capabilities.

    ERIC Educational Resources Information Center

    Gales, Larry

    This module is part of a series designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. PLOT3D is a subroutine package which generates a variety of three-dimensional hidden…

  17. User's Guide for Subroutine PRNT3D. Physical Processes in Terrestrial and Aquatic Ecosystems, Computer Programs and Graphics Capabilities.

    ERIC Educational Resources Information Center

    Gales, Larry

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. PRNT3D is a subroutine package which generates a variety of printer plot displays. The displays…

  18. Programmer's Guide for Subroutine PRNT3D. Physical Processes in Terrestrial and Aquatic Ecosystems, Computer Programs and Graphics Capabilities.

    ERIC Educational Resources Information Center

    Gales, Larry

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. PRNT3D is a subroutine package which generates a variety of printer plot displays. The displays…

  19. Regional Arctic System Model (RASM): A Tool to Address the U.S. Priorities and Advance Capabilities for Arctic Climate Modeling and Prediction

    NASA Astrophysics Data System (ADS)

    Maslowski, W.; Roberts, A.; Cassano, J. J.; Gutowski, W. J., Jr.; Nijssen, B.; Osinski, R.; Zeng, X.; Brunke, M.; Duvivier, A.; Hamman, J.; Hossainzadeh, S.; Hughes, M.; Seefeldt, M. W.

    2015-12-01

    The Arctic is undergoing some of the most coordinated rapid climatic changes currently occurring anywhere on Earth, including the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with, and feedbacks between atmosphere, ocean, and land. While historical reconstructions from Earth System Models (ESMs) are in broad agreement with these changes, the rate of change in ESMs generally remains outpaced by observations. Reasons for this include a combination of coarse resolution, inadequate parameterizations, under-represented processes, and limited knowledge of physical interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) in addressing some of these ESM limitations in simulating observed variability and trends in arctic surface climate. RASM is a high-resolution, pan-Arctic coupled climate model with the sea ice and ocean model components configured at an eddy-permitting resolution of 1/12° and the atmosphere and land hydrology model components at 50 km resolution, all coupled at 20-minute intervals. RASM is an example of a limited-area, process-resolving, fully coupled ESM which, due to the constraints from boundary conditions, facilitates detailed comparisons with observational statistics that are not possible with ESMs. The overall goal of RASM is to address key requirements published in the Navy Arctic Roadmap: 2014-2030 and in the Implementation Plan for the National Strategy for the Arctic Region, regarding the need for advanced modeling capabilities for operational forecasting and strategic climate predictions through 2030. The main science objectives of RASM are to advance understanding and model representation of critical physical processes and feedbacks of importance to sea ice thickness and area distribution.
RASM results are presented to quantify relative contributions by (i) resolved processes and feedbacks as well as (ii) sensitivity to space dependent sub-grid parameterizations to better

  20. Graphical Visualization of Human Exploration Capabilities

    NASA Technical Reports Server (NTRS)

    Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex

    2016-01-01

    NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. 
The paper concludes with a description
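
    The data-handling pipeline described above (extract records from a repository, reduce them through a tiered sort, then plot a subset according to user inputs) can be sketched roughly as follows; the record fields, tier names, and readiness values are invented for illustration and are not the actual SCOREboard schema.

```python
from collections import defaultdict

def tiered_reduce(records, tiers):
    """Group capability records by successive tier keys (e.g. mission, then system)."""
    grouped = defaultdict(list)
    for rec in records:
        key = tuple(rec[t] for t in tiers)
        grouped[key].append(rec)
    return dict(grouped)

def select_roadmap(grouped, mission):
    """Pick the subset of tiered groups relevant to one user-requested mission."""
    return {k: v for k, v in grouped.items() if k[0] == mission}

# Hypothetical capability records (fields invented for illustration)
records = [
    {"mission": "Mars", "system": "habitat", "capability": "ECLSS", "readiness": 4},
    {"mission": "Mars", "system": "propulsion", "capability": "SEP", "readiness": 5},
    {"mission": "Moon", "system": "habitat", "capability": "ECLSS", "readiness": 6},
]

grouped = tiered_reduce(records, ("mission", "system"))
roadmap = select_roadmap(grouped, "Mars")
```

    The same grouped structure could then be handed to a plotting routine; the flexibility to re-sort by different tier orders is what lets one repository serve many roadmap views.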

  1. Analysis of the confluence of three patterns using the Centering and Pointing System (CAPS) images for the Advanced Radiographic Capability (ARC) at the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Leach, Richard R.; Awwal, Abdul; Bliss, Erlan; Roberts, Randy; Rushford, Michael; Wilhelmsen, Karl; Zobrist, Thomas

    2014-09-01

    The Advanced Radiographic Capability (ARC) at the National Ignition Facility (NIF) is a laser system that employs up to four petawatt (PW) lasers to produce a sequence of short pulses that generate X-rays which backlight high-density inertial confinement fusion (ICF) targets. Employing up to eight backlighters, ARC can produce an X-ray "motion picture" to diagnose the compression and ignition of a cryogenic deuterium-tritium target with tens-of-picosecond temporal resolution during the critical phases of an ICF shot. Multi-frame, hard-X-ray radiography of imploding NIF capsules is a capability critical to the success of NIF's missions. The function of the Centering and Pointing System (CAPS) in ARC is to provide superimposed near-field and far-field images on a common optical path. The images are then analyzed to extract beam centering and pointing data for the control system. The images contain the confluence of pointing, centering, and reference patterns. The patterns may have uneven illumination, particularly when the laser is misaligned. In addition, the simultaneous appearance of three reference patterns may be coincidental, possibly masking one or more of the patterns. Image analysis algorithms have been developed to determine the centering and pointing position of ARC from these images. In this paper we describe the image analysis algorithms used to detect and identify the centers of these patterns. Results are provided, illustrating how well the process meets system requirements.
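
    The abstract does not spell out the CAPS algorithms; a minimal sketch of one common building block for this kind of task, an intensity-weighted centroid that locates a bright pattern's center in a frame, might look like the following (the threshold and synthetic image are illustrative, not taken from CAPS).

```python
import numpy as np

def pattern_center(image, threshold=0.5):
    """Estimate a beam-pattern center as the intensity-weighted centroid of
    pixels above a relative threshold. A crude stand-in for the more involved
    detection logic a real alignment system would need under uneven illumination."""
    img = image.astype(float)
    mask = img >= threshold * img.max()   # keep only the bright core
    ys, xs = np.nonzero(mask)
    w = img[ys, xs]
    return (np.average(ys, weights=w), np.average(xs, weights=w))

# Synthetic far-field spot centered at (30, 40) on a 64x64 frame
yy, xx = np.mgrid[0:64, 0:64]
spot = np.exp(-((yy - 30) ** 2 + (xx - 40) ** 2) / (2 * 3.0 ** 2))
cy, cx = pattern_center(spot)
```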

  2. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    SciTech Connect

    Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport code it is paired with and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code, and it can be used with other codes such as PHITS, FLUKA, and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known scheduling and load-balancing problems found in the original implementations of the parallel computing functionality in MARS15 and PHITS, and it can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
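
    The checkpoint/restart idea can be illustrated with a toy Monte Carlo history loop; the file name, checkpoint contents, and interval below are assumptions for the sketch, not the framework's actual format.

```python
import os
import pickle
import random

CKPT = "transport.ckpt"  # hypothetical checkpoint file name

def run_histories(n_total, tally=0.0, start=0, rng_state=None, every=1000):
    """Toy Monte Carlo loop with periodic checkpointing: a killed job can
    resume from the last saved history instead of starting over."""
    rng = random.Random(12345)
    if rng_state is not None:
        rng.setstate(rng_state)          # resume the random stream exactly
    for i in range(start, n_total):
        tally += rng.random()            # stand-in for one particle history
        if (i + 1) % every == 0:
            with open(CKPT, "wb") as f:
                pickle.dump({"next": i + 1, "tally": tally,
                             "rng_state": rng.getstate()}, f)
    return tally / n_total

def resume(n_total):
    """Restart from the checkpoint file if one exists, else start fresh."""
    if os.path.exists(CKPT):
        with open(CKPT, "rb") as f:
            st = pickle.load(f)
        return run_histories(n_total, st["tally"], st["next"], st["rng_state"])
    return run_histories(n_total)
```

    Because the saved state includes the random-number generator, a resumed run reproduces exactly the result of an uninterrupted one.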

  3. An expanded framework for the advanced computational testing and simulation toolkit

    SciTech Connect

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes, and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability and further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  4. Computational fluid dynamic study on obstructive sleep apnea syndrome treated with maxillomandibular advancement.

    PubMed

    Yu, Chung-Chih; Hsiao, Hung-Da; Lee, Lung-Cheng; Yao, Chih-Min; Chen, Ning-Hung; Wang, Chau-Jan; Chen, Yu-Ray

    2009-03-01

    Maxillomandibular advancement is one of the treatments available for obstructive sleep apnea (OSA). The influence of this surgery on the upper airway and its mechanism are not fully understood. The present research simulates the flow fields of the narrowed upper airways of 2 patients with OSA treated with maxillomandibular advancement. The geometry of the upper airway was reconstructed from computed tomographic images taken before and after surgery. The resulting three-dimensional surface model was rendered for measurement and computational fluid dynamics simulation. Patients showed clinical improvement 6 months after surgery. The cross-sectional area of the narrowest part of the upper airway was increased in all dimensions. The simulated results showed a less constricted upper airway, with less velocity change and a decreased pressure gradient across the whole conduit during passage of air. Less breathing effort is therefore expected to achieve equivalent ventilation with the postoperative airway. This study demonstrates the potential of computational fluid dynamics to provide information for understanding the pathogenesis of OSA and the effects of its treatment.
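
    A crude one-dimensional illustration of why widening the narrowest cross-section reduces both the velocity change and the pressure gradient (the study itself used full 3-D CFD): by continuity v = Q/A, and by Bernoulli the inviscid pressure drop into a constriction scales with the velocity increase. The flow rate and areas below are invented round numbers, not patient data.

```python
RHO = 1.2          # air density, kg/m^3
Q = 4e-4           # volumetric flow, m^3/s (roughly 24 L/min)

def throat_velocity(area_m2):
    """Continuity: mean velocity through a cross-section of given area."""
    return Q / area_m2

def pressure_drop(a_inlet, a_throat):
    """Inviscid Bernoulli pressure drop from inlet to the narrowest section."""
    v_in = throat_velocity(a_inlet)
    v_th = throat_velocity(a_throat)
    return 0.5 * RHO * (v_th ** 2 - v_in ** 2)

a_inlet = 2.0e-4                              # m^2
pre_op = pressure_drop(a_inlet, 0.3e-4)       # narrow pre-surgery throat
post_op = pressure_drop(a_inlet, 0.8e-4)      # widened post-surgery throat
```

    Enlarging the throat area lowers the throat velocity and therefore the pressure drop, consistent with the reduced breathing effort reported after surgery.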

  5. Present capabilities and future requirements for computer-aided geometric modeling in the design and manufacture of gas turbine

    NASA Technical Reports Server (NTRS)

    Caille, E.; Propen, M.; Hoffman, A.

    1984-01-01

    Gas turbine engine design requires the ability to rapidly develop complex structures which are subject to severe thermal and mechanical operating loads. As in all facets of the aerospace industry, engine designs are constantly driving towards increased performance, higher temperatures, higher speeds, and lower weight. The ability to address such requirements in a relatively short time frame has resulted in a major thrust towards integrated design/analysis/manufacturing systems. These computer-driven graphics systems represent a unique challenge, with major payback opportunities if properly conceived, implemented, and applied.

  6. Computational aerodynamics and design

    NASA Technical Reports Server (NTRS)

    Ballhaus, W. F., Jr.

    1982-01-01

    The role of computational aerodynamics in design is reviewed with attention given to the design process; the proper role of computations; the importance of calibration, interpretation, and verification; the usefulness of a given computational capability; and the marketing of new codes. Examples of computational aerodynamics in design are given with particular emphasis on the Highly Maneuverable Aircraft Technology. Finally, future prospects are noted, with consideration given to the role of advanced computers, advances in numerical solution techniques, turbulence models, complex geometries, and computational design procedures. Previously announced in STAR as N82-33348

  7. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  8. Advanced Imaging of Athletes: Added Value of Coronary Computed Tomography and Cardiac Magnetic Resonance Imaging.

    PubMed

    Martinez, Matthew W

    2015-07-01

    Cardiac magnetic resonance imaging and cardiac computed tomographic angiography have become important parts of the armamentarium for noninvasive diagnosis of cardiovascular disease. Emerging technologies have produced faster imaging, lower radiation dose, improved spatial and temporal resolution, as well as a wealth of prognostic data to support usage. Investigating true pathologic disease as well as distinguishing normal from potentially dangerous is now increasingly more routine for the cardiologist in practice. This article investigates how advanced imaging technologies can assist the clinician when evaluating all athletes for pathologic disease that may put them at risk.

  9. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    SciTech Connect

    Hey, Tony; Agarwal, Deborah; Borgman, Christine; Cartaro, Concetta; Crivelli, Silvia; Van Dam, Kerstin Kleese; Luce, Richard; Arjun, Shankar; Trefethen, Anne; Wade, Alex; Williams, Dean

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy's Office of Scientific and Technical Information (OSTI), beginning with an assessment of the quality and effectiveness of OSTI's recent and current products and services, and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services, and other materials. This report summarizes the subcommittee's initial findings and recommendations.

  10. Cardiovascular proteomics in the era of big data: experimental and computational advances.

    PubMed

    Lam, Maggie P Y; Lau, Edward; Ng, Dominic C M; Wang, Ding; Ping, Peipei

    2016-01-01

    Proteomics plays an increasingly important role in our quest to understand cardiovascular biology. Fueled by analytical and computational advances in the past decade, proteomics applications can now go beyond merely inventorying protein species, and address sophisticated questions on cardiac physiology. The advent of massive mass spectrometry datasets has in turn led to increasing intersection between proteomics and big data science. Here we review new frontiers in technological developments and their applications to cardiovascular medicine. The impact of big data science on cardiovascular proteomics investigations and translation to medicine is highlighted.

  11. Computational Models of Exercise on the Advanced Resistance Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Newby, Nate; Caldwell, Erin; Scott-Pandorf, Melissa; Peters,Brian; Fincke, Renita; DeWitt, John; Poutz-Snyder, Lori

    2011-01-01

    Muscle and bone loss remain a concern for crew returning from space flight. The advanced resistance exercise device (ARED) is used for on-orbit resistance exercise to help mitigate these losses. However, characterization of how the ARED loads the body in microgravity has yet to be determined. Computational models allow us to analyze ARED exercise in both 1G and 0G environments. To this end, biomechanical models of the squat, single-leg squat, and deadlift exercise on the ARED have been developed to further investigate bone and muscle forces resulting from the exercises.

  12. Future Computer Requirements for Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  13. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. The investigation is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  14. High performance computing and communications: Advancing the frontiers of information technology

    SciTech Connect

    1997-12-31

    This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.

  15. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    SciTech Connect

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  16. Study to define an approach for developing a computer-based system capable of automatic, unattended assembly/disassembly of spacecraft, phase 1

    NASA Technical Reports Server (NTRS)

    Nevins, J. L.; Defazio, T. L.; Seltzer, D. S.; Whitney, D. E.

    1981-01-01

    The initial set of requirements for additional studies necessary to implement a space-borne, computer-based work system capable of achieving assembly, disassembly, repair, or maintenance in space were developed. The specific functions required of a work system to perform repair and maintenance were discussed. Tasks and relevant technologies were identified and delineated. The interaction of spacecraft design and technology options, including a consideration of the strategic issues of repair versus retrieval-replacement or destruction by removal were considered along with the design tradeoffs for accomplishing each of the options. A concept system design and its accompanying experiment or test plan were discussed.

  17. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  18. Multi-scale 3D X-ray Imaging Capabilities at the Advanced Photon Source - Current status and future direction (Invited)

    NASA Astrophysics Data System (ADS)

    DeCarlo, F.; Xiao, X.; Khan, F.; Glowacki, A.; Schwarz, N.; Jacobsen, C.

    2013-12-01

    In x-ray computed μ-tomography (μ-XCT), a thin scintillator screen is coupled to a visible-light lens and camera system to obtain micrometer-scale transmission imaging of specimens as large as a few millimeters. Recent advances in detector technology allow these images to be collected at unprecedented frame rates. For a high x-ray flux density synchrotron facility like the Advanced Photon Source (APS), the detector exposure time ranges from hundreds of milliseconds to hundreds of picoseconds, making it possible to acquire a full 3D micrometer-resolution dataset in less than one second. The micron resolution limit of parallel x-ray beam projection systems can be overcome by Transmission X-ray Microscopes (TXM), in which part of the image magnification is done in the x-ray regime using x-ray optics such as capillary condensers and Fresnel zone plates. These systems, when installed on a synchrotron x-ray source, can generate 2D images with up to 20 nm resolution at second-scale exposure times and collect a full 3D nano-resolution dataset in a few minutes. The μ-XCT and TXM systems available at the x-ray imaging beamlines of the APS are routinely used in materials science and geoscience applications, where high-resolution and fast 3D imaging are instrumental in extracting in situ four-dimensional dynamic information. In this presentation we describe the computational challenges associated with μ-XCT and TXM systems and present the framework and infrastructure developed at the APS to allow for routine multi-scale data integration between the two systems.

  20. A fission matrix based validation protocol for computed power distributions in the advanced test reactor

    SciTech Connect

    Nielsen, J. W.; Nigg, D. W.; LaPorta, A. W.

    2013-07-01

    The Idaho National Laboratory (INL) has been engaged in a significant multi-year effort to modernize the computational reactor physics tools and validation procedures used to support operations of the Advanced Test Reactor (ATR) and its companion critical facility (ATRC). Several new protocols for validation of computed neutron flux distributions and spectra as well as for validation of computed fission power distributions, based on new experiments and well-recognized least-squares statistical analysis techniques, have been under development. In the case of power distributions, estimates of the a priori ATR-specific fuel element-to-element fission power correlation and covariance matrices are required for validation analysis. A practical method for generating these matrices using the element-to-element fission matrix is presented, along with a high-order scheme for estimating the underlying fission matrix itself. The proposed methodology is illustrated using the MCNP5 neutron transport code for the required neutronics calculations. The general approach is readily adaptable for implementation using any multidimensional stochastic or deterministic transport code that offers the required level of spatial, angular, and energy resolution in the computed solution for the neutron flux and fission source. (authors)
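
    A minimal sketch of the fission matrix idea: if F[i, j] is the number of next-generation fissions produced in element i per fission in element j, power iteration on F yields the multiplication factor (the dominant eigenvalue) and the element-wise fission source (the corresponding eigenvector). The 3-element matrix below is invented for illustration and is far smaller than an ATR-scale matrix.

```python
import numpy as np

def fission_source(F, tol=1e-12, max_iter=10000):
    """Power iteration on a fission matrix F: returns (k_eff, source), where
    source is the element-wise fission distribution normalized to sum to 1."""
    n = F.shape[0]
    s = np.ones(n) / n        # flat initial guess for the fission source
    k = 1.0
    for _ in range(max_iter):
        fs = F @ s            # next-generation fission production
        k_new = fs.sum()      # generation-to-generation multiplication
        s_new = fs / k_new
        if abs(k_new - k) < tol and np.allclose(s_new, s):
            return k_new, s_new
        k, s = k_new, s_new
    return k, s

# Invented 3-element fission matrix: entry (i, j) = fissions born in
# element i per fission in element j (center element most productive)
F = np.array([[0.40, 0.15, 0.05],
              [0.20, 0.50, 0.20],
              [0.05, 0.15, 0.40]])
k_eff, source = fission_source(F)
```

    In the validation protocol described above, a matrix like this (estimated from transport calculations) would feed the element-to-element power correlation and covariance estimates rather than being the end product itself.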

  1. Sensing with Advanced Computing Technology: Fin Field-Effect Transistors with High-k Gate Stack on Bulk Silicon.

    PubMed

    Rigante, Sara; Scarbolo, Paolo; Wipf, Mathias; Stoop, Ralph L; Bedner, Kristine; Buitrago, Elizabeth; Bazigos, Antonios; Bouvet, Didier; Calame, Michel; Schönenberger, Christian; Ionescu, Adrian M

    2015-05-26

    Field-effect transistors (FETs) form an established technology for sensing applications. However, recent advancements and use of high-performance multigate metal-oxide semiconductor FETs (double-gate, FinFET, trigate, gate-all-around) in computing technology, instead of bulk MOSFETs, raise new opportunities and questions about the most suitable device architectures for sensing integrated circuits. In this work, we propose pH and ion sensors exploiting FinFETs fabricated on bulk silicon by a fully CMOS compatible approach, as an alternative to the widely investigated silicon nanowires on silicon-on-insulator substrates. We also provide an analytical insight of the concept of sensitivity for the electronic integration of sensors. N-channel fully depleted FinFETs with critical dimensions on the order of 20 nm and HfO2 as a high-k gate insulator have been developed and characterized, showing excellent electrical properties, subthreshold swing, SS ∼ 70 mV/dec, and on-to-off current ratio, Ion/Ioff ∼ 10^6, at room temperature. The same FinFET architecture is validated as a highly sensitive, stable, and reproducible pH sensor. An intrinsic sensitivity close to the Nernst limit, S = 57 mV/pH, is achieved. The pH response in terms of output current reaches Sout = 60%. Long-term measurements have been performed over 4.5 days with a resulting drift in time δVth/δt = 0.10 mV/h. Finally, we show the capability to reproduce experimental data with an extended three-dimensional commercial finite element analysis simulator, in both dry and wet environments, which is useful for future advanced sensor design and optimization.
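
    The "Nernst limit" that the reported S = 57 mV/pH approaches follows from S_max = (kT/q)·ln 10, about 59 mV per pH unit at room temperature. A quick check of that ceiling:

```python
import math

def nernst_limit_mV_per_pH(temp_kelvin):
    """Ideal (Nernstian) pH sensitivity ceiling for an ISFET-style
    sensor: (kT/q) * ln(10), returned in millivolts per pH unit."""
    k_B = 1.380649e-23    # Boltzmann constant, J/K
    q = 1.602176634e-19   # elementary charge, C
    return 1e3 * (k_B * temp_kelvin / q) * math.log(10)

s_room = nernst_limit_mV_per_pH(298.15)   # at 25 °C
```

    At 298.15 K this evaluates to roughly 59.2 mV/pH, so the measured 57 mV/pH is within a few percent of the thermodynamic maximum.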

  2. Comparison of computing capability and information system abilities of state hospitals owned by Ministry of Labor and Social Security and Ministry of Health.

    PubMed

    Tengilimoğlu, Dilaver; Celik, Yusuf; Ulgü, Mahir

    2006-08-01

    The main purpose of this study is to give readers an idea of the scale and importance of the computing and information problems that hospital managers and policy makers will face after the hospitals of the Ministry of Labor and Social Security (MoLSS) and the Ministry of Health (MoH) in Turkey are merged under a single structure, by comparing the current computing capabilities of the hospitals owned by the two ministries. The data used in this study were obtained from 729 hospitals belonging to both ministries by means of a data collection tool. The results indicate considerable differences between the hospitals owned by the two ministries in terms of human resources and information systems. Hospital managers and decision makers who base their decisions on data produced by the current hospital information system (HIS) would likely face serious difficulties after the merger of MoH and MoLSS hospitals in Turkey. It is also possible to claim that the level and adequacy of computing abilities and devices do not allow the managers of public hospitals to use computer technology effectively in their information management practices. Lack of technical information, an undeveloped information culture, inappropriate management styles, and inexperience are the main reasons why HIS does not run properly and effectively in Turkish hospitals.

  3. Advanced computational sensors technology: testing and evaluation in visible, SWIR, and LWIR imaging

    NASA Astrophysics Data System (ADS)

    Rizk, Charbel G.; Wilson, John P.; Pouliquen, Philippe

    2015-05-01

    The Advanced Computational Sensors Team at the Johns Hopkins University Applied Physics Laboratory and the Johns Hopkins University Department of Electrical and Computer Engineering has been developing advanced readout integrated circuit (ROIC) technology for more than 10 years with a particular focus on the key challenges of dynamic range, sampling rate, system interface and bandwidth, and detector materials or band dependencies. Because the pixel array offers parallel sampling by default, the team successfully demonstrated that adding smarts in the pixel and the chip can increase performance significantly. Each pixel becomes a smart sensor and can operate independently in collecting, processing, and sharing data. In addition, building on the digital circuit revolution, the effective well size can be increased by orders of magnitude within the same pixel pitch over analog designs. This research has yielded an innovative class of a system-on-chip concept: the Flexible Readout and Integration Sensor (FRIS) architecture. All key parameters are programmable and/or can be adjusted dynamically, and this architecture can potentially be sensor and application agnostic. This paper reports on the testing and evaluation of one prototype that can support either detector polarity and includes sample results with visible, short-wavelength infrared (SWIR), and long-wavelength infrared (LWIR) imaging.

  4. Advances in physiologic lung assessment via electron beam computed tomography (EBCT)

    NASA Astrophysics Data System (ADS)

    Hoffman, Eric A.

    1999-09-01

    Lung function has been evaluated in both health and disease states by techniques, such as pulmonary function tests, which generally study aggregate function. These decades-old modalities have yielded a valuable understanding of global physiologic and pathophysiologic structure-to-function relationships. However, such approaches have reached their limits. They cannot meet the current and anticipated needs of new surgical and pharmaceutical treatments. 4-D CT can provide insights into regional lung function (ventilation and blood flow) and thus can provide information at an early stage of disease when intervention will have the greatest impact. Lung CT over the last decade has helped with further defining anatomic features in disease, but has lagged behind advances on the cellular and molecular front largely because of the failure to account for functional correlates to structural pathology. Commercially available CT scanners are now capable of volumetric data acquisition in a breath-hold and capable of multi-level slice acquisitions of the heart and lungs with a per-slice scan aperture of 50-300 msec, allowing for regional blood flow measurements. Static, volumetric imaging of the lung is inadequate in that much of lung pathology is a dynamic phenomenon and, thus, is only detectable if the lung is imaged as air and blood are flowing. This paper reviews the methodologies and early physiologic findings associated with our measures of lung tissue properties coupled with regional ventilation and perfusion.

  5. Application of advanced grid generation techniques for flow field computations about complex configurations

    NASA Technical Reports Server (NTRS)

    Kathong, Monchai; Tiwari, Surendra N.

    1988-01-01

    In the computation of flowfields about complex configurations, it is very difficult to construct a boundary-fitted coordinate system. An alternative approach is to use several grids at once, each of which is generated independently. This procedure is called the multiple grids or zonal grids approach; its applications are investigated. The method is conservative, ensuring conservation of fluxes at grid interfaces. The Euler equations are solved numerically on such grids for various configurations. The numerical scheme used is the finite-volume technique with a three-stage Runge-Kutta time integration. The code is vectorized and programmed to run on the CDC VPS-32 computer. Steady state solutions of the Euler equations are presented and discussed. The solutions include: low-speed flow over a sphere, high-speed flow over a slender body, supersonic flow through a duct, and supersonic internal/external flow interaction for an aircraft configuration at various angles of attack. The results demonstrate that the multiple grids approach along with the conservative interfacing is capable of computing the flows about complex configurations where the use of a single grid system is not possible.
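
    The core ingredients named above, a conservative finite-volume flux difference advanced with a three-stage Runge-Kutta scheme, can be sketched in one dimension. This is a generic illustration, not the VPS-32 code: the stage coefficients 1/3, 1/2, 1 are one common Jameson-style choice, and the test problem is linear advection rather than the Euler equations.

```python
import numpy as np

def upwind_flux(u, a=1.0):
    """First-order upwind numerical flux through each cell's right face (a > 0)."""
    return a * u

def rk3_step(u, dt, dx, a=1.0):
    """One three-stage Runge-Kutta step (stage coefficients 1/3, 1/2, 1)
    for the conservative update du/dt = -(F_{i+1/2} - F_{i-1/2}) / dx."""
    u0 = u.copy()
    for alpha in (1.0 / 3.0, 0.5, 1.0):
        f = upwind_flux(u, a)
        dudt = -(f - np.roll(f, 1)) / dx   # conservative flux difference, periodic BCs
        u = u0 + alpha * dt * dudt
    return u

# advect a Gaussian pulse once around a periodic unit domain
n = 100
dx = 1.0 / n
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.exp(-200.0 * (x - 0.5) ** 2)
mass0 = u.sum() * dx                        # total "mass" before
dt = 0.5 * dx                               # CFL number 0.5
for _ in range(int(round(1.0 / dt))):
    u = rk3_step(u, dt, dx)
print("mass drift:", abs(u.sum() * dx - mass0))
```

    Because the update subtracts the same flux that the neighboring cell receives, total mass is conserved to machine precision, which is the property that conservative interfacing extends to the boundaries between independently generated grids.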

  6. Fluid/Structure Interaction Computational Investigation of Blast-Wave Mitigation Efficacy of the Advanced Combat Helmet

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Bell, W. C.; Pandurangan, B.; Glomski, P. S.

    2011-08-01

    To combat the problem of traumatic brain injury (TBI), a signature injury of the current military conflicts, there is an urgent need to design head protection systems with superior blast/ballistic impact mitigation capabilities. Toward that end, the blast impact mitigation performance of an advanced combat helmet (ACH) head protection system equipped with polyurea suspension pads and subjected to two different blast peak pressure loadings has been investigated computationally. A fairly detailed (Lagrangian) finite-element model of a helmet/skull/brain assembly is first constructed and placed into an Eulerian air domain through which a single planar blast wave propagates. A combined Eulerian/Lagrangian transient nonlinear dynamics computational fluid/solid interaction analysis is next conducted in order to assess the extent of reduction in intra-cranial shock-wave ingress (responsible for TBI). This was done by comparing temporal evolutions of intra-cranial normal and shear stresses for the cases of an unprotected head and the helmet-protected head and by correlating these quantities with the three most common types of mild traumatic brain injury (mTBI), i.e., axonal damage, contusion, and subdural hemorrhage. The results obtained show that the ACH provides some level of protection against all investigated types of mTBI and that the level of protection increases somewhat with an increase in blast peak pressure. In order to rationalize the aforementioned findings, a shockwave propagation/reflection analysis is carried out for the unprotected head and helmet-protected head cases. The analysis qualitatively corroborated the results pertaining to the blast-mitigation efficacy of an ACH, but also suggested that there are additional shockwave energy dissipation phenomena which play an important role in the mechanical response of the unprotected/protected head to blast impact.

  7. Advanced Simulation and Computing: A Summary Report to the Director's Review

    SciTech Connect

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, has been reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called ''Advanced Simulation and Computing.'' Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management. That appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and has identified expected documentation to be included in the ''Assessment File''.

  8. FY 2009 Annual Report of Joule Software Metric SC GG 3.1/2.5.2, Improve Computational Science Capabilities

    SciTech Connect

    Kothe, Douglas B; Roche, Kenneth J; Kendall, Ricky A

    2010-01-01

    The Joule Software Metric for Computational Effectiveness is established by Public Authorizations PL 95-91, Department of Energy Organization Act, and PL 103-62, Government Performance and Results Act. The U.S. Office of Management and Budget (OMB) oversees the preparation and administration of the President's budget; evaluates the effectiveness of agency programs, policies, and procedures; assesses competing funding demands across agencies; and sets the funding priorities for the federal government. The OMB has the power of audit and exercises this right annually for each federal agency. According to the Government Performance and Results Act of 1993 (GPRA), federal agencies are required to develop three planning and performance documents: (1) a Strategic Plan: a broad, 3-year outlook; (2) an Annual Performance Plan: a focused, 1-year outlook of annual goals and objectives that is reflected in the annual budget request (What results can the agency deliver as part of its public funding?); and (3) a Performance and Accountability Report: an annual report that details the previous fiscal year performance (What results did the agency produce in return for its public funding?). OMB uses its Performance Assessment Rating Tool (PART) to perform evaluations. PART has seven worksheets for seven types of agency functions. The function of Research and Development (R&D) programs is included. R&D programs are assessed on the following criteria: Does the R&D program perform a clear role? Has the program set valid long term and annual goals? Is the program well managed? Is the program achieving the results set forth in its GPRA documents? In Fiscal Year (FY) 2003, the Department of Energy Office of Science (DOE SC-1) worked directly with OMB to come to a consensus on an appropriate set of performance measures consistent with PART requirements. The scientific performance expectations of these requirements reach the scope of work conducted at the DOE national laboratories. The Joule system

  9. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    PubMed Central

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

    Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused review of the recent literature on finite element methods and optimization for VAD simulations. As VAD designs typically fall into two categories, pulsatile and continuous flow devices, we separately address the computational challenges of both types of designs and their interaction with the circulatory system, with three representative case studies. In particular, we focus on recent advancements in finite element methodology that have increased the fidelity of VAD simulations. We outline key challenges, which extend to the incorporation of biological response such as thrombosis and hemolysis, as well as shape optimization methods and challenges in computational methodology. PMID:24449607

  10. Advances in the design of a thermomechanical analyzer for fibers. II. Computer facilities and software

    NASA Astrophysics Data System (ADS)

    Noui, L.; Hearle, J. W. S.

    1995-06-01

    PC-based software for the full control of the flexible thermomechanical analyzer (FTMA) for yarns and fibers is described. The software permits a flexible procedure to control three essential parameters of the FTMA, namely tension, twist, and temperature. The computer program allows data acquisition at a programmable rate of up to 62.5 ksamples/s, on-line data display, and on-line data storage. Up to eight channels can be monitored. A circular buffer was used to store an unlimited amount of data. For FTMA applications, data were calibrated in terms of newtons for tension, degrees Celsius for temperature, and newton-meters for torque, and can be saved in three different formats, ASCII, LOTUS, or binary. The software is user-friendly, making use of a graphical user interface for motor control and data display. The software is also capable of controlling thermomechanical tests at constant force.
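
    The circular-buffer scheme that lets acquisition run indefinitely in fixed memory can be sketched as follows (a generic illustration, not the FTMA code): once the buffer is full, each new sample overwrites the oldest one.

```python
class CircularBuffer:
    """Fixed-capacity ring buffer: old samples are overwritten when full,
    so acquisition can run indefinitely without reallocating memory."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = [None] * capacity
        self.head = 0     # next write position
        self.count = 0    # number of valid samples stored

    def push(self, sample):
        self.data[self.head] = sample
        self.head = (self.head + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def snapshot(self):
        """Return the stored samples in arrival order, oldest first."""
        if self.count < self.capacity:       # not yet wrapped
            return self.data[:self.count]
        return self.data[self.head:] + self.data[:self.head]

buf = CircularBuffer(4)
for s in range(6):           # write 6 samples into a 4-slot buffer
    buf.push(s)
print(buf.snapshot())        # → [2, 3, 4, 5], the 4 most recent samples
```

    A real acquisition loop would push hardware samples at the sampling rate and take snapshots for the on-line display and storage paths.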

  11. Experimental and Computational Study on the Cusp-DEC and TWDEC for Advanced Fueled Fusion

    SciTech Connect

    Tomita, Y.; Yasaka, Y.; Takeno, H.; Ishikawa, M.; Nemoto, T.

    2005-01-15

    Experimental and computational results of direct energy converters (DECs) for advanced fueled fusion such as D-³He are presented. Kinetic energy of the thermal component of the end loss plasma is converted to electricity by using the Cusp DEC. Proof-of-principle experiments on a single slanted cusp have been carried out and demonstrated the viability of the configuration. To improve the separation of electrons from ions, numerical simulation shows that a Helmholtz magnetic configuration with a uniform magnetic field is more effective than the cusp configuration. The fusion-produced high-energy ions, such as the 15 MeV protons in D-³He fueled fusion, can pass through the Cusp DEC without disturbing their orbits and enter a traveling-wave direct energy converter (TWDEC). Small-scale experiments have shown the effectiveness of the TWDEC, and numerical simulation on optimization of the electrode spacing in the decelerator yields conversion efficiencies of up to 60%.

  12. Development of an Advanced Computational Model for OMCVD of Indium Nitride

    NASA Technical Reports Server (NTRS)

    Cardelino, Carlos A.; Moore, Craig E.; Cardelino, Beatriz H.; Zhou, Ning; Lowry, Sam; Krishnan, Anantha; Frazier, Donald O.; Bachmann, Klaus J.

    1999-01-01

    An advanced computational model is being developed to predict the formation of indium nitride (InN) film from the reaction of trimethylindium (In(CH3)3) with ammonia (NH3). The components are introduced into the reactor in the gas phase within a background of molecular nitrogen (N2). Organometallic chemical vapor deposition occurs on a heated sapphire surface. The model simulates heat and mass transport with gas and surface chemistry under steady state and pulsed conditions. The development and validation of an accurate model for the interactions between the diffusion of gas phase species and surface kinetics is essential to enable the regulation of the process in order to produce a low-defect material. The validation of the model will be performed in concert with a NASA-North Carolina State University project.

  13. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.
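
    As a hedged illustration of the uncertainty-quantification idea described above (the cost model and its parameters below are invented stand-ins, not FOQUS internals): uncertainty in basic-data submodel parameters can be propagated through a process model by sampling, yielding a distribution, and hence an uncertainty level, for the process-level result.

```python
import random
import statistics

def capture_cost(k, dH):
    """Hypothetical stand-in for a detailed process model: cost of capture
    as a function of a kinetic rate constant k and reaction enthalpy dH."""
    return 50.0 + 200.0 / k + 0.1 * abs(dH)

random.seed(42)
# propagate 1-sigma uncertainty in the submodel parameters via Monte Carlo
costs = [capture_cost(random.gauss(5.0, 0.5), random.gauss(-80.0, 8.0))
         for _ in range(10_000)]
mean = statistics.fmean(costs)
spread = statistics.stdev(costs)
print(f"cost = {mean:.1f} +/- {spread:.1f}")
```

    Frameworks like FOQUS wrap this loop with derivative-free optimizers and more efficient sampling designs, but the propagation principle is the same.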

  14. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    SciTech Connect

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  15. Advanced Computational Modeling of Vapor Deposition in a High-pressure Reactor

    NASA Technical Reports Server (NTRS)

    Cardelino, Beatriz H.; Moore, Craig E.; McCall, Sonya D.; Cardelino, Carlos A.; Dietz, Nikolaus; Bachmann, Klaus

    2004-01-01

    In search of novel approaches to produce new materials for electro-optic technologies, advances have been achieved in the development of computer models for vapor deposition reactors in space. Numerical simulations are invaluable tools for costly and difficult processes, such as those experiments designed for high pressures and microgravity conditions. Indium nitride is a candidate compound for high-speed lasers and photodiodes for optical communication systems, as well as for semiconductor lasers operating into the blue and ultraviolet regions. But InN and other nitride compounds exhibit significant thermal decomposition at their optimum growth temperatures. In addition, epitaxy at lower temperatures and subatmospheric pressures incorporates indium droplets into the InN films. However, surface stabilization data indicate that InN could be grown at 900 K at high nitrogen pressures, and microgravity could provide laminar flow conditions. Numerical models for chemical vapor deposition have been developed, coupling complex chemical kinetics with fluid dynamic properties.

  17. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

    The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including the use of multidimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and validate the multidimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multidimensional numerical model, which resulted in a net heat input of 240.3 W. The computational methodology resulted in a value of net heat input that was 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.
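
    The quoted 1.7 percent agreement follows directly from the two net heat input values reported above:

```python
measured, predicted = 244.4, 240.3              # net heat input, W
pct_low = (measured - predicted) / measured * 100
print(f"prediction is {pct_low:.1f}% below the measured value")
```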

  18. Current Advances in the Computational Simulation of the Formation of Low-Mass Stars

    SciTech Connect

    Klein, R I; Inutsuka, S; Padoan, P; Tomisaka, K

    2005-10-24

    Developing a theory of low-mass star formation (≈0.1 to 3 M☉) remains one of the most elusive and important goals of theoretical astrophysics. The star-formation process is the outcome of the complex dynamics of interstellar gas involving non-linear interactions of turbulence, gravity, magnetic field and radiation. The evolution of protostellar condensations, from the moment they are assembled by turbulent flows to the time they reach stellar densities, spans an enormous range of scales, resulting in a major computational challenge for simulations. Since the previous Protostars and Planets conference, dramatic advances in the development of new numerical algorithmic techniques have been successfully implemented on large-scale parallel supercomputers. Among such techniques, Adaptive Mesh Refinement and Smoothed Particle Hydrodynamics have provided frameworks to simulate the process of low-mass star formation with a very large dynamic range. It is now feasible to explore the turbulent fragmentation of molecular clouds and the gravitational collapse of cores into stars self-consistently within the same calculation. The increased sophistication of these powerful methods comes with substantial caveats associated with the use of the techniques and the interpretation of the numerical results. In this review, we examine what has been accomplished in the field and present a critique of both numerical methods and scientific results. We stress that computational simulations should obey the available observational constraints and demonstrate numerical convergence. Failing this, results of large-scale simulations do not advance our understanding of low-mass star formation.

  19. Advanced computations of multi-physics, multi-scale effects in beam dynamics

    SciTech Connect

    Amundson, J.F.; Macridin, A.; Spentzouris, P.; Stern, E.G.; /Fermilab

    2009-01-01

    Current state-of-the-art beam dynamics simulations include multiple physical effects and multiple physical length and/or time scales. We present recent developments in Synergia2, an accelerator modeling framework designed for multi-physics, multi-scale simulations. We summarize several recent results in multi-physics beam dynamics, including simulations of three Fermilab accelerators: the Tevatron, the Main Injector and the Debuncher. Early accelerator simulations focused on single-particle dynamics. To a first approximation, the forces on the particles in an accelerator beam are dominated by the external fields due to magnets, RF cavities, etc., so the single-particle dynamics are the leading physical effects. Detailed simulations of accelerators must include collective effects such as the space-charge repulsion of the beam particles, the effects of wake fields in the beam pipe walls and beam-beam interactions in colliders. These simulations require the sort of massively parallel computers that have only become available in recent times. We give an overview of the accelerator framework Synergia2, which was designed to take advantage of the capabilities of modern computational resources and enable simulations of multiple physical effects. We also summarize some recent results utilizing Synergia2 and BeamBeam3d, a tool specialized for beam-beam simulations.

  20. Identifying human disease genes: advances in molecular genetics and computational approaches.

    PubMed

    Bakhtiar, S M; Ali, A; Baig, S M; Barh, D; Miyoshi, A; Azevedo, V

    2014-07-04

    The human genome project is one of the significant achievements that have provided detailed insight into our genetic legacy. During the last two decades, biomedical investigations have gathered a considerable body of evidence by detecting more than 2000 disease genes. Despite the imperative advances in the genetic understanding of various diseases, the pathogenesis of many others remains obscure. With recent advances, the laborious methodologies used to identify DNA variations are replaced by direct sequencing of genomic DNA to detect genetic changes. The ability to perform such studies depends equally on the development of high-throughput and economical genotyping methods. Currently, for essentially every disease whose origin is still unknown, genetic approaches are available, either pedigree-dependent or pedigree-independent, with the capacity to elucidate fundamental disease mechanisms. Computer algorithms and programs for linkage analysis have formed the foundation for many disease gene detection projects; similarly, databases of clinical findings have been widely used to support diagnostic decisions in dysmorphology and general human disease. For every disease type, genome sequence variations, particularly single nucleotide polymorphisms, are mapped by comparing the genetic makeup of case and control groups. Methods that predict the effects of polymorphisms on protein stability are useful for the identification of possible disease associations, whereas structural effects can be assessed using methods to predict stability changes in proteins using sequence and/or structural information.
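
    A minimal sketch of the case-versus-control allele-frequency comparison described above (toy counts and a normal-approximation test, not taken from the paper): an association signal appears when an allele is significantly more frequent in cases than in controls.

```python
from math import erf, sqrt

def allele_association(case_alt, case_total, ctrl_alt, ctrl_total):
    """Two-proportion z-test for an allele's frequency in cases vs. controls.
    Returns the z statistic and two-sided p-value (normal approximation)."""
    p1 = case_alt / case_total
    p2 = ctrl_alt / ctrl_total
    pooled = (case_alt + ctrl_alt) / (case_total + ctrl_total)
    se = sqrt(pooled * (1 - pooled) * (1 / case_total + 1 / ctrl_total))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# toy counts: alt allele seen on 300/1000 case chromosomes vs 200/1000 controls
z, p = allele_association(300, 1000, 200, 1000)
print(f"z = {z:.2f}, p = {p:.2g}")
```

    Genome-wide studies run a test of this kind at each SNP and then correct the significance threshold for the millions of tests performed.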

  1. Advanced Computed Tomography Inspection System (ACTIS): An overview of the technology and its application

    NASA Technical Reports Server (NTRS)

    Hediger, Lisa H.

    1991-01-01

    The Advanced Computed Tomography Inspection System (ACTIS) was developed by NASA Marshall to support solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through technology utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been shown, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing. Smaller systems, based on ACTIS technology, are becoming increasingly available. This technology has much to offer small businesses and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in this technology.

  2. Advanced computed tomography inspection system (ACTIS): an overview of the technology and its applications

    NASA Astrophysics Data System (ADS)

    Beshears, Ronald D.; Hediger, Lisa H.

    1994-10-01

    The Advanced Computed Tomography Inspection System (ACTIS) was developed by the Marshall Space Flight Center to support in-house solid propulsion test programs. ACTIS represents a significant advance in state-of-the-art inspection systems. Its flexibility and superior technical performance have made ACTIS very popular, both within and outside the aerospace community. Through Technology Utilization efforts, ACTIS has been applied to inspection problems in commercial aerospace, lumber, automotive, and nuclear waste disposal industries. ACTIS has even been used to inspect items of historical interest. ACTIS has consistently produced valuable results, providing information which was unattainable through conventional inspection methods. Although many successes have already been demonstrated, the full potential of ACTIS has not yet been realized. It is currently being applied in the commercial aerospace industry by Boeing Aerospace Company. Smaller systems, based on ACTIS technology, are becoming increasingly available. This technology has much to offer small businesses and industry, especially in identifying design and process problems early in the product development cycle to prevent defects. Several options are available to businesses interested in pursuing this technology.

  3. Computational studies of horizontal axis wind turbines in high wind speed condition using advanced turbulence models

    NASA Astrophysics Data System (ADS)

    Benjanirat, Sarun

    Next generation horizontal-axis wind turbines (HAWTs) will operate at very high wind speeds. Existing engineering approaches for modeling the flow phenomena are based on blade element theory, and cannot adequately account for 3-D separated, unsteady flow effects. Therefore, researchers around the world are beginning to model these flows using first principles-based computational fluid dynamics (CFD) approaches. In this study, an existing first principles-based Navier-Stokes approach is being enhanced to model HAWTs at high wind speeds. The enhancements include improved grid topology, implicit time-marching algorithms, and advanced turbulence models. The advanced turbulence models include the Spalart-Allmaras one-equation model and the k-epsilon, k-ω, and Shear Stress Transport (k-ω SST) models. These models are also integrated with detached eddy simulation (DES) models. Results are presented for a range of wind speeds for the National Renewable Energy Laboratory Phase VI rotor configuration, tested at NASA Ames Research Center. Grid sensitivity studies are also presented. Additionally, effects of existing transition models on the predictions are assessed. Data presented include power/torque production, radial distribution of normal and tangential pressure forces, root bending moments, and surface pressure fields. Good agreement was obtained between the predictions and experiments for most of the conditions, particularly with the Spalart-Allmaras-DES model.
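    As a hedged illustration of what the two-equation models above compute, the standard k-epsilon model obtains a turbulent (eddy) viscosity from the turbulent kinetic energy k and its dissipation rate epsilon; the input values below are hypothetical, not from the thesis.

```python
# Standard k-epsilon model constant (the commonly used value).
C_MU = 0.09

def eddy_viscosity(k: float, eps: float) -> float:
    """Kinematic eddy viscosity nu_t = C_mu * k^2 / eps."""
    if eps <= 0.0:
        raise ValueError("dissipation rate must be positive")
    return C_MU * k * k / eps

# Hypothetical turbulence quantities (m^2/s^2 and m^2/s^3):
print(eddy_viscosity(1.0, 0.09))  # -> 1.0
```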

  4. Spectral computed tomography in advanced gastric cancer: Can iodine concentration non-invasively assess angiogenesis?

    PubMed Central

    Chen, Xiao-Hua; Ren, Ke; Liang, Pan; Chai, Ya-Ru; Chen, Kui-Sheng; Gao, Jian-Bo

    2017-01-01

    AIM To investigate the correlation of iodine concentration (IC) generated by spectral computed tomography (CT) with micro-vessel density (MVD) and vascular endothelial growth factor (VEGF) expression in patients with advanced gastric carcinoma (GC). METHODS Thirty-four advanced GC patients underwent abdominal enhanced CT in the gemstone spectral imaging mode. The IC of the primary lesion in the arterial phase (AP) and venous phase (VP) was measured, and was then normalized against that in the aorta to provide the normalized IC (nIC). MVD and VEGF were detected by immunohistochemical assays, using CD34 and VEGF-A antibodies, respectively. Correlations of nIC with MVD, VEGF, and clinical-pathological features were analyzed. RESULTS Both nICs correlated linearly with MVD and were higher in the primary lesion site than in the normal control site, but were not correlated with VEGF expression. After stratification by clinical-pathological subtype, nIC-AP showed a statistically significant correlation with MVD, particularly in the group with tumors at stage T4, without nodular involvement, of a mixed Lauren type, with the tumor located at the antrum site, and occurring in female individuals. nIC-VP showed a positive correlation with MVD in the group with tumors at stage T4 and above, with nodular involvement, poor differentiation, location at the pylorus site, a mixed and diffuse Lauren subtype, and occurrence in male individuals. nIC-AP and nIC-VP showed significant differences in terms of histological differentiation and Lauren subtype. CONCLUSION The IC detected by spectral CT correlated with the MVD. nIC-AP and nIC-VP can reflect angiogenesis in different pathological subgroups of advanced GC. PMID:28321168
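    The normalization step described in the methods (lesion IC divided by aortic IC from the same phase) can be sketched as follows; the measurement values are hypothetical.

```python
def normalized_ic(ic_lesion: float, ic_aorta: float) -> float:
    """Normalized iodine concentration: lesion IC relative to aortic IC."""
    if ic_aorta <= 0.0:
        raise ValueError("aortic iodine concentration must be positive")
    return ic_lesion / ic_aorta

# Hypothetical arterial-phase iodine concentrations (mg/mL):
print(round(normalized_ic(2.4, 12.0), 3))  # -> 0.2
```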

  5. Influence of setback and advancement osseous genioplasty on facial outcome: A computer-simulated study.

    PubMed

    Möhlhenrich, Stephan Christian; Heussen, Nicole; Kamal, Mohammad; Peters, Florian; Fritz, Ulrike; Hölzle, Frank; Modabber, Ali

    2015-12-01

    The aim of this virtual study was to investigate the influence of angular deviation and displacement distance on the overlying soft tissue during chin genioplasty. Computed tomography data from 21 patients were read using ProPlan CMF software. Twelve simulated genioplasties were performed per patient with variable osteotomy angles and displacement distances. Soft-tissue deformations and cephalometric analyses were compared, and changes in the anterior and inferior soft tissue of the chin, along with the resultant lower facial third area, were determined. The maximum average soft-tissue changes after a 10-mm advancement were 4.19 (SD 0.84) mm anteriorly and -1.55 (SD 0.96) mm inferiorly; after a 10-mm setback, deviations of -4.63 (SD 0.56) mm anteriorly and 0.75 (SD 1.16) mm inferiorly were found. The anterior soft tissue showed a statistically significant change with bony displacement in both directions, independent of osteotomy angle (p < 0.001); only after a 10-mm advancement with an angle of -5° were significant differences in the inferior soft tissue noted (p = 0.0055). The average area of the total lower third of the face was 24,807.80 (SD 4,091.72) mm(2), of which up to 62.75% was influenced. Advancement genioplasty leads to greater changes in the overlying soft tissue, whereas the affected area is larger after setback displacement. The ratio between soft and hard tissue movements largely depends on the displacement distance.

  6. Investigation of Facsimile Camera-spectrometer Capability in the 1.0 to 2.7 Micron Spectral Range. [using computer techniques

    NASA Technical Reports Server (NTRS)

    Kelly, W. L., IV

    1975-01-01

    The capability of the facsimile camera augmented with a filter-spectrometer to provide scientifically valuable information in the 1.0 to 2.7 micron spectral range was investigated for a future planetary lander mission to Mars. A computer model was used to evaluate tradeoffs between signal-to-noise ratio, spatial and spectral resolution, and the number of spectral channels. Spectral absorption features resulting from water and chemical variations found in pyroxenes were used to represent scientific information of interest to biologists and geologists. Expected output data from a filter-spectrometer are illustrated, indicating that important information pertaining to water content and chemical composition can be obtained using six to eight spectral channels with 0.3 degree spatial resolution.

  7. Advanced display object selection methods for enhancing user-computer productivity

    NASA Technical Reports Server (NTRS)

    Osga, Glenn A.

    1993-01-01

    The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphic user-interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUIs allow user selection by: (1) moving a cursor with a pointing device such as a mouse, trackball, joystick, or touchscreen; and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, scroll bars, etc. used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with the various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as notebook computers. Considering the number of work-hours spent pointing and clicking across all styles of available graphic user-interfaces, the cost/benefit of applying this method is substantial, with the potential for increasing productivity across thousands of users and applications.
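    The abstract does not reproduce the selection algorithm itself. As a hedged illustration of the kind of theoretical basis commonly used for pointing performance, Fitts' law predicts movement time from target distance and width; the regression constants a and b below are hypothetical device-specific values, not from the paper.

```python
import math

def fitts_movement_time(distance: float, width: float,
                        a: float = 0.1, b: float = 0.15) -> float:
    """Predicted pointing time via Fitts' law: MT = a + b * log2(2D/W)."""
    index_of_difficulty = math.log2(2.0 * distance / width)  # in bits
    return a + b * index_of_difficulty

# A target 8 cm away and 2 cm wide has an index of difficulty of 3 bits:
print(round(fitts_movement_time(8.0, 2.0), 2))  # -> 0.55
```

    Methods that effectively enlarge targets or shorten cursor travel lower the index of difficulty, which is one way a selection algorithm can yield measurable performance gains.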

  8. Building highly available control system applications with Advanced Telecom Computing Architecture and open standards

    NASA Astrophysics Data System (ADS)

    Kazakov, Artem; Furukawa, Kazuro

    2010-11-01

    Requirements for modern and future control systems for large projects like the International Linear Collider demand high availability for control system components. Recently, the telecom industry introduced an open hardware specification, the Advanced Telecom Computing Architecture (ATCA), aimed at improved reliability, availability, and serviceability. Since its first market appearance in 2004, the ATCA platform has shown tremendous growth, proved to be stable, and is well represented by a number of vendors; it is now an industry standard for highly available systems. Complementing the hardware, the Service Availability Forum (SAF), a consortium of leading communications and computing companies, describes the interaction between hardware and software. SAF defines a set of specifications, such as the Hardware Platform Interface and the Application Interface Specification, that provide an extensive description of highly available systems, services, and their interfaces. Originally intended for telecom applications, these specifications can be used for accelerator controls software as well. This study describes the benefits of using these specifications and their possible adoption for accelerator control systems. It is demonstrated how the EPICS Redundant IOC was extended using the Hardware Platform Interface specification, which made it possible to exploit the benefits of the ATCA platform.

  9. Development of Experimental and Computational Aeroacoustic Tools for Advanced Liner Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Nark, Douglas N.; Parrott, Tony L.; Gerhold, Carl H.; Brown, Martha C.

    2006-01-01

    Acoustic liners in aircraft engine nacelles suppress radiated noise. Therefore, as air travel increases, increasingly sophisticated tools are needed to maximize noise suppression. During the last 30 years, NASA has invested significant effort in development of experimental and computational acoustic liner evaluation tools. The Curved Duct Test Rig is a 152-mm by 381-mm curved duct that supports liner evaluation at Mach numbers up to 0.3 and source SPLs up to 140 dB, in the presence of user-selected modes. The Grazing Flow Impedance Tube is a 51-mm by 63-mm duct currently being fabricated to operate at Mach numbers up to 0.6 with source SPLs up to at least 140 dB, and will replace the existing 51-mm by 51-mm duct. Together, these test rigs allow evaluation of advanced acoustic liners over a range of conditions representative of those observed in aircraft engine nacelles. Data acquired with these test ducts are processed using three aeroacoustic propagation codes. Two are based on finite element solutions to convected Helmholtz and linearized Euler equations. The third is based on a parabolic approximation to the convected Helmholtz equation. The current status of these computational tools and their associated usage with the Langley test rigs is provided.

  10. Computational fluid dynamics in the design and analysis of thermal processes: a review of recent advances.

    PubMed

    Norton, Tomás; Tiwari, Brijesh; Sun, Da Wen

    2013-01-01

    The design of thermal processes in the food industry has undergone great developments in the last two decades due to the availability of cheap computer power alongside advanced modelling techniques such as computational fluid dynamics (CFD). CFD uses numerical algorithms to solve the non-linear partial differential equations of fluid mechanics and heat transfer so that the complex mechanisms that govern many food-processing systems can be resolved. In thermal processing applications, CFD can be used to build three-dimensional models that are both spatially and temporally representative of a physical system to produce solutions with high levels of physical realism without the heavy costs associated with experimental analyses. Therefore, CFD is playing an ever-growing role in the optimization of conventional thermal processes as well as in the development of new ones in the food industry. This paper discusses the fundamental aspects involved in developing CFD solutions and forms a state-of-the-art review of various CFD applications in conventional as well as novel thermal processes. The challenges facing CFD modellers of thermal processes are also discussed. From this review it is evident that present-day CFD software, with its rich tapestries of mathematical physics, numerical methods and visualization techniques, is currently recognized as a formidable and pervasive technology which can permit comprehensive analyses of thermal processing.
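    As a minimal illustration of the numerical solution of heat-transfer PDEs that such CFD tools build on, the sketch below integrates the 1-D transient heat conduction equation with an explicit finite-difference scheme; the geometry, boundary temperatures, and step sizes are hypothetical.

```python
import numpy as np

# 1-D rod discretized into nx nodes; both ends held at 100 C, interior starts at 0 C.
nx, nsteps = 11, 2000
r = 0.25  # Fourier number alpha*dt/dx^2; r <= 0.5 keeps the explicit scheme stable
T = np.zeros(nx)
T[0] = T[-1] = 100.0

for _ in range(nsteps):
    # Explicit update: T_i <- T_i + r * (T_{i+1} - 2*T_i + T_{i-1})
    T[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])

# With fixed equal end temperatures and no source, the steady state is uniform:
print(round(T[nx // 2], 1))  # -> 100.0
```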

  11. The New MCNP6 Depletion Capability

    SciTech Connect

    Fensin, Michael Lorne; James, Michael R.; Hendricks, John S.; Goorley, John T.

    2012-06-19

    The first MCNP based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology.

  12. The new MCNP6 depletion capability

    SciTech Connect

    Fensin, M. L.; James, M. R.; Hendricks, J. S.; Goorley, J. T.

    2012-07-01

    The first MCNP based in-line Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology. (authors)

  13. Capability Disillusionment

    DTIC Science & Technology

    2011-08-01

    Defense AT&L: July–August 2011. Cochrane is an operations research analyst and has worked for the past 6 years at the... unsupported by either academic investigation or practical utility. The definition of "capability" in the literature suggests that capabilities are

  14. Current advances in molecular, biochemical, and computational modeling analysis of microalgal triacylglycerol biosynthesis.

    PubMed

    Lenka, Sangram K; Carbonaro, Nicole; Park, Rudolph; Miller, Stephen M; Thorpe, Ian; Li, Yantao

    2016-01-01

    Triacylglycerols (TAGs) are highly reduced energy storage molecules ideal for biodiesel production. Microalgal TAG biosynthesis has been studied extensively in recent years, both at the molecular level and systems level through experimental studies and computational modeling. However, discussions of the strategies and products of the experimental and modeling approaches are rarely integrated and summarized together in a way that promotes collaboration among modelers and biologists in this field. In this review, we outline advances toward understanding the cellular and molecular factors regulating TAG biosynthesis in unicellular microalgae with an emphasis on recent studies on rate-limiting steps in fatty acid and TAG synthesis, while also highlighting new insights obtained from the integration of multi-omics datasets with mathematical models. Computational methodologies such as kinetic modeling, metabolic flux analysis, and new variants of flux balance analysis are explained in detail. We discuss how these methods have been used to simulate algae growth and lipid metabolism in response to changing culture conditions and how they have been used in conjunction with experimental validations. Since emerging evidence indicates that TAG synthesis in microalgae operates through coordinated crosstalk between multiple pathways in diverse subcellular destinations including the endoplasmic reticulum and plastids, we discuss new experimental studies and models that incorporate these findings for discovering key regulatory checkpoints. Finally, we describe tools for genetic manipulation of microalgae and their potential for future rational algal strain design. This comprehensive review explores the potential synergistic impact of pathway analysis, computational approaches, and molecular genetic manipulation strategies on improving TAG production in microalgae.
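    Flux balance analysis, one of the computational methodologies discussed above, poses growth or product prediction as a linear program: maximize a target flux subject to steady-state mass balance S·v = 0 and flux bounds. The three-reaction toy network below is entirely hypothetical and only sketches the formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1 uptakes metabolite A, R2 converts A -> B, R3 drains B ("biomass").
S = np.array([
    [1.0, -1.0,  0.0],   # steady-state mass balance for A
    [0.0,  1.0, -1.0],   # steady-state mass balance for B
])
bounds = [(0, 10), (0, 100), (0, 100)]   # uptake flux R1 capped at 10 units

# linprog minimizes, so negate the biomass flux (R3) to maximize it.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
print(res.x[2])  # optimal biomass flux, limited by the uptake bound (10)
```

    Real FBA models have thousands of reactions, but the structure is identical: the stoichiometric matrix, the bounds (where omics data and culture conditions enter), and the objective.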

  15. TRANS_MU computer code for computation of transmutant formation kinetics in advanced structural materials for fusion reactors

    NASA Astrophysics Data System (ADS)

    Markina, Natalya V.; Shimansky, Gregory A.

    A method of controlling the systematic error in transmutation computations is described for the class of problems in which exactly one parent and one residual nucleus are considered in each nuclear transformation channel. A discrete-logical algorithm is presented for reducing the matrix of the differential equation system to block-triangular form. A computing procedure is developed that determines a strict estimate of the computational error for each value of the computation results, for the above-named class of transmutation problems with some additional restrictions on the complexity of the nuclear transformation scheme. The TRANS_MU computer code, which implements this procedure, has a number of advantages over analogous approaches. Besides the quantitative control of systematic and computational errors, an important feature of TRANS_MU is the calculation of the contribution of each considered reaction to transmutant accumulation and gas production. The application of the TRANS_MU computer code is illustrated using copper alloys as an example, both when planning irradiation experiments with fusion reactor material specimens in fission reactors and when processing the experimental results.
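    When each channel has exactly one parent and one residual nucleus, ordering the nuclides so that the rate matrix is (block-)triangular lets the kinetics be solved chain by chain. The sketch below is not TRANS_MU itself; it solves a hypothetical two-step decay chain with a triangular rate matrix and checks it against the analytic Bateman solution.

```python
import numpy as np
from scipy.linalg import expm

# Two-step chain P -> D -> S (stable); rate matrix is lower triangular after ordering.
l1, l2 = 0.10, 0.05                      # hypothetical decay constants [1/s]
A = np.array([[-l1, 0.0, 0.0],
              [ l1, -l2, 0.0],
              [0.0,  l2, 0.0]])
N0 = np.array([1.0, 0.0, 0.0])           # start with pure parent
t = 10.0
N = expm(A * t) @ N0                     # matrix-exponential solution of dN/dt = A N

# Cross-check the daughter against the analytic Bateman solution:
bateman_d = l1 / (l2 - l1) * (np.exp(-l1 * t) - np.exp(-l2 * t))
print(np.isclose(N[1], bateman_d), np.isclose(N.sum(), 1.0))  # -> True True
```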

  16. Utilizing Computer and Multimedia Technology in Generating Choreography for the Advanced Dance Student at the High School Level.

    ERIC Educational Resources Information Center

    Griffin, Irma Amado

    This study describes a pilot program utilizing various multimedia computer programs on a MacQuadra 840 AV. The target group consisted of six advanced dance students who participated in the pilot program within the dance curriculum by creating a database of dance movement using video and still photography. The students combined desktop publishing,…

  17. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  18. Computer experiments on periodic systems identification using rotor blade transient flapping-torsion responses at high advance ratio

    NASA Technical Reports Server (NTRS)

    Hohenemser, K. H.; Prelewicz, D. A.

    1974-01-01

    Systems identification methods have recently been applied to rotorcraft to estimate stability derivatives from transient flight control response data. While these applications assumed a linear constant coefficient representation of the rotorcraft, the computer experiments described in this paper used transient responses in flap-bending and torsion of a rotor blade at high advance ratio which is a rapidly time varying periodic system.

  19. Capability of a regional climate model to simulate climate variables requested for water balance computation: a case study over northeastern France

    NASA Astrophysics Data System (ADS)

    Boulard, Damien; Castel, Thierry; Camberlin, Pierre; Sergent, Anne-Sophie; Bréda, Nathalie; Badeau, Vincent; Rossi, Aurélien; Pohl, Benjamin

    2016-05-01

    This paper documents the capability of the ARW/WRF regional climate model to regionalize near-surface atmospheric variables at high resolution (8 km) over Burgundy (northeastern France) from daily to interannual timescales. To that purpose, a 20-year continuous simulation (1989-2008) was carried out. The WRF model driven by ERA-Interim reanalyses was compared to in situ observations and a mesoscale atmospheric analysis system (SAFRAN) for five near-surface variables: precipitation, air temperature, wind speed, relative humidity and solar radiation, the last four variables being used for the calculation of potential evapotranspiration (ET0). Results show a significant improvement upon ERA-Interim. This is due to the model's good skill in reproducing the spatial distribution of all weather variables, in spite of a slight over-estimation of precipitation amounts, mostly during the summer convective season, and of wind speed during winter. As compared to the Météo-France observations, WRF also improves upon the SAFRAN analyses, which partly fail at showing realistic spatial distributions for wind speed, relative humidity and solar radiation—the latter being strongly underestimated. The SAFRAN ET0 is thus highly underestimated too; WRF ET0 is in better agreement with observations. In order to evaluate WRF's capability to simulate a reliable ET0, the water balance of thirty Douglas-fir stands was computed using a process-based model. Three soil water deficit indexes, corresponding to the sum of the daily deviations between the relative extractable water and a critical value of 40% below which low soil water content affects tree growth, were calculated using the nearest weather station, SAFRAN analysis weather data, or merged observation and WRF weather variables. Correlations between Douglas-fir growth and the three estimated soil water deficit indexes show similar results. These results showed through the ET0 estimation and the relation between mean annual SWDI
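    The soil water deficit index described above sums the daily deviations of relative extractable water below the 0.4 critical threshold; a minimal sketch with hypothetical daily values follows.

```python
import numpy as np

REW_CRIT = 0.4  # critical relative extractable water below which tree growth is affected

def soil_water_deficit_index(rew_daily):
    """Sum of daily deficits below the critical threshold (days with REW >= 0.4 add 0)."""
    rew = np.asarray(rew_daily, dtype=float)
    return float(np.sum(np.maximum(0.0, REW_CRIT - rew)))

# Hypothetical five-day series of relative extractable water (dimensionless, 0-1):
print(round(soil_water_deficit_index([0.8, 0.5, 0.3, 0.2, 0.4]), 3))  # -> 0.3
```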

  20. IAC - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a data base, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 
6) Special purpose modules are included, such as MIMIC (Model

  1. Evaluation of Computed Tomography of Mock Uranium Fuel Rods at the Advanced Photon Source

    DOE PAGES

    Hunter, James F.; Brown, Donald William; Okuniewski, Maria

    2015-06-01

    This study discusses a multi-year effort to evaluate the utility of computed tomography at the Advanced Photon Source (APS) as a tool for non-destructive evaluation of uranium based fuel rods. The majority of the data presented is on mock material made with depleted uranium which mimics the x-ray attenuation characteristics of fuel rods while allowing for simpler handling. A range of data is presented including full thickness (5 mm diameter) fuel rodlets, reduced thickness (1.8 mm) sintering test samples, and pre/post irradiation samples (< 1 mm thick). These data were taken on both a white beam (bending magnet) beamline and a high energy, monochromatic beamline. These data show the utility of a synchrotron-type source in the evaluation of manufacturing defects (pre-irradiation) and lay out the case for in situ CT of fuel pellet sintering. Finally, data are shown from small post-irradiation samples and a case is made for post-irradiation CT of larger samples.
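    The usefulness of the depleted-uranium mock material rests on matching x-ray attenuation, which for a monochromatic beam follows the Beer-Lambert law; the attenuation coefficient and thickness below are hypothetical, not values from the study.

```python
import math

def transmitted_intensity(i0: float, mu: float, thickness: float) -> float:
    """Beer-Lambert attenuation for a monochromatic beam: I = I0 * exp(-mu * x)."""
    return i0 * math.exp(-mu * thickness)

# Hypothetical linear attenuation coefficient (1/mm) over a 2 mm path:
print(round(transmitted_intensity(1.0, 0.5, 2.0), 4))  # -> 0.3679
```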

  2. Effect of surgical mandibular advancement on pharyngeal airway dimensions: a three-dimensional computed tomography study.

    PubMed

    Kochar, G D; Chakranarayan, A; Kohli, S; Kohli, V S; Khanna, V; Jayan, B; Chopra, S S; Verma, M

    2016-05-01

    The aim of this study was to quantify the changes in pharyngeal airway space (PAS) in patients with a skeletal class II malocclusion managed by bilateral sagittal split ramus osteotomy for mandibular advancement, using three-dimensional (3D) registration. The sample comprised 16 patients (mean age 21.69±2.80 years). Preoperative (T0) and postoperative (T1) computed tomography scans were recorded. Linear, cross-sectional area (CSA), and volumetric parameters of the velopharynx, oropharynx, and hypopharynx were evaluated. Parameters were compared with paired samples t-tests. Highly significant changes in dimension were measured in both sagittal and transverse planes (P<0.001). CSA measurements increased significantly between T0 and T1 (P<0.001). A significant increase in PAS volume was found at T1 compared with T0 (P<0.001). The changes in PAS were quantified using 3D reconstruction. Along the sagittal and transverse planes, the greatest increase was seen in the oropharynx (12.16% and 11.50%, respectively), followed by hypopharynx (11.00% and 9.07%) and velopharynx (8.97% and 6.73%). CSA increased by 41.69%, 34.56%, and 28.81% in the oropharynx, hypopharynx, and velopharynx, respectively. The volumetric increase was greatest in the oropharynx (49.79%) and least in the velopharynx (38.92%). These established quantifications may act as a useful guide for clinicians in the field of dental sleep medicine.

  3. Recent advances in computational fluid dynamics relevant to the modelling of pesticide flow on leaf surfaces.

    PubMed

    Glass, C Richard; Walters, Keith F A; Gaskell, Philip H; Lee, Yeaw C; Thompson, Harvey M; Emerson, David R; Gu, Xiao-Jun

    2010-01-01

    Increasing societal and governmental concern about the worldwide use of chemical pesticides is now providing strong drivers towards maximising the efficiency of pesticide utilisation and the development of alternative control techniques. There is growing recognition that the ultimate goal of achieving efficient and sustainable pesticide usage will require greater understanding of the fluid mechanical mechanisms governing the delivery to, and spreading of, pesticide droplets on target surfaces such as leaves. This has led to increasing use of computational fluid dynamics (CFD) as an important component of efficient process design with regard to pesticide delivery to the leaf surface. This perspective highlights recent advances in CFD methods for droplet spreading and film flows, which have the potential to provide accurate, predictive models for pesticide flow on leaf surfaces, and which can take account of each of the key influences of surface topography and chemistry, initial spray deposition conditions, evaporation and multiple droplet spreading interactions. The mathematical framework of these CFD methods is described briefly, and a series of new flow simulation results relevant to pesticide flows over foliage is provided. The potential benefits of employing CFD for practical process design are also discussed briefly.

  4. Evaluation of Computed Tomography of Mock Uranium Fuel Rods at the Advanced Photon Source

    SciTech Connect

    Hunter, James F.; Brown, Donald William; Okuniewski, Maria

    2015-06-01

    This study discusses a multi-year effort to evaluate the utility of computed tomography at the Advanced Photon Source (APS) as a tool for non-destructive evaluation of uranium based fuel rods. The majority of the data presented is on mock material made with depleted uranium which mimics the x-ray attenuation characteristics of fuel rods while allowing for simpler handling. A range of data is presented including full thickness (5 mm diameter) fuel rodlets, reduced thickness (1.8 mm) sintering test samples, and pre/post irradiation samples (< 1 mm thick). These data were taken on both a white beam (bending magnet) beamline and a high energy, monochromatic beamline. These data show the utility of a synchrotron-type source in the evaluation of manufacturing defects (pre-irradiation) and lay out the case for in situ CT of fuel pellet sintering. Finally, data are shown from small post-irradiation samples and a case is made for post-irradiation CT of larger samples.

  5. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    SciTech Connect

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily-repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  6. Advanced imaging findings and computer-assisted surgery of suspected synovial chondromatosis in the temporomandibular joint.

    PubMed

    Hohlweg-Majert, Bettina; Metzger, Marc C; Böhm, Joachim; Muecke, Thomas; Schulze, Dirk

    2008-11-01

    Synovial chondromatosis of the joint occurs mainly in teenagers and young adults. Only 3% of these neoplasms are located in the head and neck region, so synovial chondromatosis of the temporomandibular joint is a very rare disorder, and histological confirmation is required to establish a working differential diagnosis. In this case series, the outcomes of histological investigation and imaging techniques are compared. Based on clinical symptoms, five cases of suspected synovial chondromatosis of the temporomandibular joint are presented. In each of the subjects, the diagnosis was confirmed by histology. Specific imaging features for each case are described, and the tomography images were compared with the histological findings. All patients demonstrated preauricular swelling, dental midline deviation, and limited mouth opening. Computer-assisted surgery was performed. Histology disclosed synovial chondromatosis of the temporomandibular joint in four cases; the other case was found to be a developmental disorder of the tympanic bone. The diagnosis of synovial chondromatosis of the temporomandibular joint can only be based on histology: clinical symptoms are too general, and the available imaging techniques show only nonspecific tumorous destruction, infiltration, and/or residual calcified bodies, which appear only in advanced cases. A rare developmental disorder of the tympanic bone, persistence of the foramen of Huschke, has to be differentiated.

  7. Advances in automated deception detection in text-based computer-mediated communication

    NASA Astrophysics Data System (ADS)

    Adkins, Mark; Twitchell, Douglas P.; Burgoon, Judee K.; Nunamaker, Jay F., Jr.

    2004-08-01

    The Internet has provided criminals, terrorists, spies, and other threats to national security a means of communication. At the same time it also provides for the possibility of detecting and tracking their deceptive communication. Recent advances in natural language processing, machine learning and deception research have created an environment where automated and semi-automated deception detection of text-based computer-mediated communication (CMC, e.g. email, chat, instant messaging) is a reachable goal. This paper reviews two methods for discriminating between deceptive and non-deceptive messages in CMC. First, Document Feature Mining uses document features or cues in CMC messages combined with machine learning techniques to classify messages according to their deceptive potential. The method, which is most useful in asynchronous applications, also allows for the visualization of potential deception cues in CMC messages. Second, Speech Act Profiling, a method for quantifying and visualizing synchronous CMC, has shown promise in aiding deception detection. The methods may be combined and are intended to be a part of a suite of tools for automating deception detection.
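    The Document Feature Mining approach can be loosely sketched as mapping each message to a vector of linguistic cues and then applying a learned classifier. The cue set and nearest-centroid learner below are illustrative simplifications under stated assumptions, not the authors' actual feature list or model:

    ```python
    import numpy as np

    # Toy cue lexicons; real deception research uses much richer cue inventories.
    FIRST_PERSON = {"i", "me", "my", "mine", "we", "us", "our"}
    MODAL = {"may", "might", "could", "would", "should", "must", "maybe", "possibly"}

    def cue_features(message):
        """Map a message to a small vector of deception-related cues:
        quantity, complexity, self-reference rate, and hedging rate."""
        words = message.lower().split()
        n = max(len(words), 1)
        return np.array([
            len(words),                                 # quantity of words
            sum(len(w) for w in words) / n,             # mean word length
            sum(w in FIRST_PERSON for w in words) / n,  # self-reference rate
            sum(w in MODAL for w in words) / n,         # hedging rate
        ])

    def train_centroids(messages, labels):
        """Compute one mean cue vector (centroid) per class label."""
        X = np.array([cue_features(m) for m in messages])
        y = np.array(labels)
        return {c: X[y == c].mean(axis=0) for c in set(labels)}

    def classify(message, centroids):
        """Assign the class whose centroid is nearest in cue space."""
        f = cue_features(message)
        return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))
    ```

    In practice a supervised learner such as a decision tree or support-vector machine would replace the nearest-centroid rule, but the pipeline shape (cue extraction, training, classification) is the same.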

  8. Advanced practice registered nurse usability testing of a tailored computer-mediated health communication program.

    PubMed

    Lin, Carolyn A; Neafsey, Patricia J; Anderson, Elizabeth

    2010-01-01

    This study tested the usability of a touch-screen-enabled Personal Education Program with advanced practice RNs. The Personal Education Program is designed to enhance medication adherence and reduce adverse self-medication behaviors in older adults with hypertension. An iterative research process was used, which involved the use of (1) pretrial focus groups to guide the design of system information architecture, (2) two different cycles of think-aloud trials to test the software interface, and (3) post-trial focus groups to gather feedback on the think-aloud studies. Results from this iterative usability-testing process were used to systematically modify and improve the three Personal Education Program prototype versions: the pilot, prototype 1, and prototype 2. Findings contrasting the two separate think-aloud trials showed that APRN users rated the Personal Education Program system usability, system information, and system-use satisfaction at a moderately high level between trials. In addition, errors using the interface were reduced by 76%, and the interface time was reduced by 18.5% between the two trials. The usability-testing processes used in this study ensured an interface design adapted to APRNs' needs and preferences to allow them to effectively use the computer-mediated health-communication technology in a clinical setting.

  9. Study of flutter related computational procedures for minimum weight structural sizing of advanced aircraft

    NASA Technical Reports Server (NTRS)

    Oconnell, R. F.; Hassig, H. J.; Radovcich, N. A.

    1976-01-01

    Results of a study of the development of flutter modules applicable to automated structural design of advanced aircraft configurations, such as a supersonic transport, are presented. Automated structural design is restricted to automated sizing of the elements of a given structural model. It includes a flutter optimization procedure; i.e., a procedure for arriving at a structure with minimum mass for satisfying flutter constraints. Methods of solving the flutter equation and computing the generalized aerodynamic force coefficients in the repetitive analysis environment of a flutter optimization procedure are studied, and recommended approaches are presented. Five approaches to flutter optimization are explained in detail and compared. An approach to flutter optimization incorporating some of the methods discussed is presented. Problems related to flutter optimization in a realistic design environment are discussed and an integrated approach to the entire flutter task is presented. Recommendations for further investigations are made. Results of numerical evaluations, applying the five methods of flutter optimization to the same design task, are presented.

  10. Computers in manufacturing.

    PubMed

    Hudson, C A

    1982-02-12

    Computers are now widely used in product design and in automation of selected areas in factories. Within the next decade, the use of computers in the entire spectrum of manufacturing applications, from computer-aided design to computer-aided manufacturing and robotics, is expected to be practical and economically justified. Such widespread use of computers on the factory floor awaits further advances in computer capabilities, the emergence of systems that are adaptive to the workplace, and the development of interfaces to link islands of automation and to allow effective user communications.

  11. Observations on computational methodologies for use in large-scale, gradient-based, multidisciplinary design incorporating advanced CFD codes

    NASA Technical Reports Server (NTRS)

    Newman, P. A.; Hou, G. J.-W.; Jones, H. E.; Taylor, A. C., III; Korivi, V. M.

    1992-01-01

    This paper briefly outlines how a combination of various computational methodologies could reduce the enormous computational costs envisioned in using advanced CFD codes in gradient-based, optimized multidisciplinary design (MdD) procedures. The implications of these MdD requirements for advanced CFD codes are somewhat different from those imposed by single-discipline design. A means of satisfying these MdD requirements for gradient information is presented which appears to permit: (1) some leeway in the CFD solution algorithms which can be used; (2) an extension to 3-D problems; and (3) straightforward use of other computational methodologies. Many of these observations have previously been discussed as possibilities for doing parts of the problem more efficiently; the contribution here is observing how they fit together in a mutually beneficial way.

  12. Capability Extension to the Turbine Off-Design Computer Program AXOD With Applications to the Highly Loaded Fan-Drive Turbines

    NASA Technical Reports Server (NTRS)

    Chen, Shu-cheng S.

    2011-01-01

    The axial-flow turbine off-design computer program AXOD has been upgraded to include the outlet guide vane (OGV) among its acceptable turbine configurations. The mathematical bases and the techniques used for the code implementation are described and discussed at length in this paper. This extended capability is verified and validated with two cases of highly loaded fan-drive turbines, designed and tested in the V/STOL Program of NASA. The first case is a 4 1/2-stage turbine with an average stage loading factor of 4.66, designed by Pratt & Whitney Aircraft. The second case is a 3 1/2-stage turbine with an average loading factor of 4.0, designed in-house by the NASA Lewis Research Center (now the NASA Glenn Research Center). Both cases were experimentally tested in the turbine facility located at the Glenn Research Center. The processes conducted in these studies are described in detail in this paper, and the results are presented and discussed in comparison with the experimental data; the AXOD results and the experimental data are in excellent agreement.

  13. Final report for "High performance computing for advanced national electric power grid modeling and integration of solar generation resources", LDRD Project No. 149016.

    SciTech Connect

    Reno, Matthew J.; Riehm, Andrew Charles; Hoekstra, Robert John; Munoz-Ramirez, Karina; Stamp, Jason Edwin; Phillips, Laurence R.; Adams, Brian M.; Russo, Thomas V.; Oldfield, Ron A.; McLendon, William Clarence, III; Nelson, Jeffrey Scott; Hansen, Clifford W.; Richardson, Bryan T.; Stein, Joshua S.; Schoenwald, David Alan; Wolfenbarger, Paul R.

    2011-02-01

    Design and operation of the electric power grid (EPG) rely heavily on computational models. High-fidelity, full-order models are used to study transient phenomena on only a small part of the network. Reduced-order dynamic and power-flow models are used when analyses involving thousands of nodes are required, because of the computational demands of simulating large numbers of nodes. The level of complexity of the future EPG will increase dramatically due to large-scale deployment of variable renewable generation, active load and distributed generation resources, adaptive protection and control systems, and price-responsive demand. High-fidelity modeling of this future grid will require significant advances in coupled, multi-scale tools and their use on high-performance computing (HPC) platforms. This LDRD report demonstrates SNL's capability to apply HPC resources to three tasks: (1) high-fidelity, large-scale modeling of power system dynamics; (2) statistical assessment of grid security via Monte-Carlo simulations of cyber attacks; and (3) development of models to predict variability of solar resources at locations where little or no ground-based measurement is available.
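    The statistical security-assessment task can be loosely sketched as Monte-Carlo sampling of random line outages followed by a connectivity check. The topology, outage model, and function names below are illustrative assumptions, not the report's actual models:

    ```python
    import random

    def connected(n_buses, lines):
        """Union-find check that all buses form a single electrical island."""
        parent = list(range(n_buses))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x
        for a, b in lines:
            parent[find(a)] = find(b)
        return len({find(b) for b in range(n_buses)}) == 1

    def islanding_probability(n_buses, lines, p_outage, trials=5000, seed=1):
        """Monte-Carlo estimate of the probability that independent random
        line outages (a crude stand-in for a modeled attack) split the grid."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(trials):
            surviving = [ln for ln in lines if rng.random() > p_outage]
            if not connected(n_buses, surviving):
                failures += 1
        return failures / trials

    # Four buses connected in a ring; each line independently lost with p = 0.3.
    ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
    estimate = islanding_probability(4, ring, 0.3)
    ```

    A serious study would replace the independent-outage model with correlated, attack-driven failure scenarios and replace the connectivity check with power-flow feasibility, but the sampling loop has the same shape.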

  14. Recent Advances and Issues in Computers. Oryx Frontiers of Science Series.

    ERIC Educational Resources Information Center

    Gay, Martin K.

    Discussing recent issues in computer science, this book contains 11 chapters covering: (1) developments that have the potential for changing the way computers operate, including microprocessors, mass storage systems, and computing environments; (2) the national computational grid for high-bandwidth, high-speed collaboration among scientists, and…

  15. Space Logistics: Launch Capabilities

    NASA Technical Reports Server (NTRS)

    Furnas, Randall B.

    1989-01-01

    The current maximum launch capability for the United States is shown, along with the predicted Earth-to-orbit requirements. Contrasting the two indicates a strong national need for a major increase in Earth-to-orbit lift capability. Approximate weights for planned payloads are shown. NASA is studying the following options to meet the need for a new heavy-lift capability by the mid-to-late 1990s: (1) Shuttle-C for the near term (including growth versions); and (2) the Advanced Launch System (ALS) for the long term. The current baseline two-engine Shuttle-C has a 15 x 82 ft payload bay and an expected lift capability of 82,000 lb to Low Earth Orbit. Several options are being considered which have expanded-diameter payload bays, and a three-engine Shuttle-C with an expected lift of 145,000 lb to LEO is being evaluated as well. The Advanced Launch System (ALS) is a potential joint development between the Air Force and NASA. This program is focused on long-term launch requirements, specifically beyond the year 2000. The basic approach is to develop a family of vehicles with the same high reliability as the Shuttle system, yet offering a much greater lift capability at a greatly reduced cost (per pound of payload). The ALS unmanned family of vehicles will provide a low-end lift capability equivalent to Titan IV, and a high-end lift capability greater than the Soviet Energia, if requirements for such a high-end vehicle are defined. In conclusion, the planning of the next-generation space telescope should not be constrained to the current launch vehicles; new vehicle designs will be driven by the needs of anticipated heavy users.

  16. Remote Controlled Orbiter Capability

    NASA Technical Reports Server (NTRS)

    Garske, Michael; delaTorre, Rafael

    2007-01-01

    The Remote Control Orbiter (RCO) capability allows a Space Shuttle Orbiter to perform an unmanned re-entry and landing. This low-cost capability employs existing and newly added functions to perform key activities typically performed by flight crews and controllers during manned re-entries. During an RCO landing attempt, these functions are triggered by automation resident in the on-board computers or uplinked commands from flight controllers on the ground. In order to properly route certain commands to the appropriate hardware, an In-Flight Maintenance (IFM) cable was developed. Currently, the RCO capability is reserved for the scenario where a safe return of the crew from orbit may not be possible. The flight crew would remain in orbit and await a rescue mission. After the crew is rescued, the RCO capability would be used on the unmanned Orbiter in an attempt to salvage this national asset.

  17. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote public interest in these areas. These educational tools can benefit from new advanced computer-animation software and 3D technologies, which make such documentaries even more attractive. However, special care must be taken to guarantee that the information they contain is serious and objective; in this sense, additional value is given when the footage is produced by the researchers themselves. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been created entirely by means of advanced computer-animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are presented here.

  18. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    SciTech Connect

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  19. Prediction of helicopter rotor discrete frequency noise: A computer program incorporating realistic blade motions and advanced acoustic formulation

    NASA Technical Reports Server (NTRS)

    Brentner, K. S.

    1986-01-01

    A computer program has been developed at the Langley Research Center to predict the discrete frequency noise of conventional and advanced helicopter rotors. The program, called WOPWOP, uses the most advanced subsonic formulation of Farassat that is less sensitive to errors and is valid for nearly all helicopter rotor geometries and flight conditions. A brief derivation of the acoustic formulation is presented along with a discussion of the numerical implementation of the formulation. The computer program uses realistic helicopter blade motion and aerodynamic loadings, input by the user, for noise calculation in the time domain. A detailed definition of all the input variables, default values, and output data is included. A comparison with experimental data shows good agreement between prediction and experiment; however, accurate aerodynamic loading is needed.
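    Time-domain acoustic formulations of this kind evaluate source quantities at the retarded (emission) time. A minimal sketch of that solve, assuming a prescribed source trajectory and plain fixed-point iteration (not WOPWOP's actual numerics), is:

    ```python
    import math

    def retarded_time(t_obs, x_obs, source_pos, c=340.0, tol=1e-10, max_iter=200):
        """Solve the retarded-time equation  tau = t_obs - |x_obs - y(tau)| / c
        by fixed-point iteration; the iteration converges for subsonic
        source motion.

        source_pos -- callable tau -> (x, y, z) giving the source position."""
        tau = t_obs
        for _ in range(max_iter):
            r = math.dist(x_obs, source_pos(tau))  # observer-to-source distance
            new_tau = t_obs - r / c
            if abs(new_tau - tau) < tol:
                return new_tau
            tau = new_tau
        return tau
    ```

    For a stationary source the iteration reduces immediately to subtracting the fixed propagation delay; for a moving blade element the distance is re-evaluated at each candidate emission time until the equation is self-consistent.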

  20. Proceedings of the topical meeting on advances in human factors research on man/computer interactions

    SciTech Connect

    Not Available

    1990-01-01

    This book discusses the following topics: expert systems and knowledge engineering (I); verification and validation of software; methods for modeling human/computer performance; man/computer interaction problems in producing procedures (1-2); progress and problems with automation (1-2); experience with electronic presentation of procedures (2); intelligent displays and monitors; modeling the user/computer interface; and computer-based human decision-making aids.