Science.gov

Sample records for advanced computational analysis

  1. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  2. Transonic wing analysis using advanced computational methods

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.
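
    For orientation, the steady full potential equation that such codes solve can be stated in conservative form, with density recovered from the isentropic relation (a standard formulation, not quoted from the paper itself):

    ```latex
    % Steady full potential equation (conservative form); the density follows
    % from the isentropic relation once the velocity potential Phi is known.
    \nabla \cdot (\rho \nabla \Phi) = 0, \qquad
    \frac{\rho}{\rho_\infty} = \left[ 1 + \frac{\gamma - 1}{2} M_\infty^2
        \left( 1 - \frac{|\nabla \Phi|^2}{U_\infty^2} \right) \right]^{1/(\gamma - 1)}
    ```

    The transonic small-disturbance technique approximates this equation for thin configurations, which is the accuracy trade-off the paper evaluates.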

  3. Advances in Computer-Based Autoantibodies Analysis

    NASA Astrophysics Data System (ADS)

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  4. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  5. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents several advances in the computational stability analysis of composite aerospace structures. For stringer-stiffened panels, the main results of the completed EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  6. Computer modeling for advanced life support system analysis.

    PubMed

    Drysdale, A

    1997-01-01

    This article discusses the equivalent mass approach to advanced life support system analysis, describes a computer model developed to use this approach, and presents early results from modeling the NASA JSC BioPlex. The model is built using an object-oriented approach and G2, a commercially available modeling package. Cost factor equivalencies are given for the Volosin scenarios. Plant data from NASA KSC and Utah State University (USU) are used, together with configuration data from the BioPlex design effort. Initial results focus on the importance of obtaining high plant productivity with a flight-like configuration. PMID:11540448
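
    The equivalent mass approach reduces each life support option to a single mass-like figure of merit. A common general form (often called equivalent system mass; the specific factors used in this model come from the Volosin scenarios and are not reproduced here):

    ```latex
    % Equivalent mass: true mass plus volume, power, cooling, and crew-time
    % penalties, each converted through a mission-specific equivalency factor.
    ESM = M + V \cdot eq_V + P \cdot eq_P + C \cdot eq_C + T_{crew} \cdot eq_{CT}
    ```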

  7. Whole-genome CNV analysis: advances in computational approaches

    PubMed Central

    Pirooznia, Mehdi; Goes, Fernando S.; Zandi, Peter P.

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development. PMID:25918519
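
    Most read-depth CNV callers share the same skeleton: count reads in fixed windows, normalize against an expected baseline, and flag or segment outlier windows. A deliberately minimal sketch (toy thresholds and data; not the behavior of any specific tool reviewed):

    ```python
    import numpy as np

    def call_cnvs(depth, baseline=None, gain=1.5, loss=0.5):
        """Naive read-depth CNV caller: flag windows whose normalized depth
        suggests a copy-number gain or loss. Real tools additionally correct
        for GC content and mappability, then segment adjacent windows."""
        depth = np.asarray(depth, dtype=float)
        baseline = np.median(depth) if baseline is None else baseline
        ratio = depth / baseline  # ~1.0 expected for copy-neutral windows
        calls = []
        for i, r in enumerate(ratio):
            if r >= gain:
                calls.append((i, "gain", round(r, 2)))
            elif r <= loss:
                calls.append((i, "loss", round(r, 2)))
        return calls

    # Toy example: windows 3-4 carry a duplication, window 7 a deletion.
    print(call_cnvs([30, 31, 29, 55, 58, 30, 28, 12, 30, 31]))
    ```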

  8. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
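
    The quantity all of these methods target is the probability that a limit state function g(X) falls at or below zero. Brute-force Monte Carlo makes the definition concrete (an illustrative baseline with assumed load/resistance distributions; the advanced methods surveyed exist precisely to avoid this sampling cost):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def failure_probability(g, sample, n=100_000):
        """Estimate P[g(X) <= 0] by direct Monte Carlo sampling."""
        x = sample(n)
        return float(np.mean(g(x) <= 0.0))

    # Toy limit state: resistance R minus load S (both normal, assumed values).
    def g(x):
        return x[:, 0] - x[:, 1]

    def sample(n):
        r = rng.normal(100.0, 10.0, n)  # resistance
        s = rng.normal(60.0, 15.0, n)   # load
        return np.column_stack([r, s])

    print(failure_probability(g, sample))  # P[R - S <= 0], roughly 0.013 here
    ```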

  9. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
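
    At its core, the composition analysis described reduces to classifying voxels by Hounsfield Unit range and reporting fractions of the segmented muscle volume. A schematic sketch (the HU cut-offs below are placeholders for illustration, not the values used in the reviewed studies):

    ```python
    import numpy as np

    # Placeholder HU ranges, for illustration only.
    TISSUE_RANGES = {
        "fat":              (-200, -10),
        "loose_connective": (-9, 40),
        "normal_muscle":    (41, 100),
    }

    def composition(hu_volume):
        """Fraction of voxels falling in each HU-defined tissue class."""
        hu = np.asarray(hu_volume).ravel()
        return {name: float(np.mean((hu >= lo) & (hu <= hi)))
                for name, (lo, hi) in TISSUE_RANGES.items()}

    # Toy volume of random HU values standing in for a segmented muscle.
    print(composition(np.random.default_rng(1).integers(-250, 150, 10_000)))
    ```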

  10. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  11. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  12. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described, as well as user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  13. Proceedings: Workshop on advanced mathematics and computer science for power systems analysis

    SciTech Connect

    Esselman, W.H.; Iveson, R.H.

    1991-08-01

    The Mathematics and Computer Workshop on Power System Analysis was held February 21-22, 1989, in Palo Alto, California. The workshop was the first in a series sponsored by EPRI's Office of Exploratory Research as part of its effort to develop ways in which recent advances in mathematics and computer science can be applied to the problems of the electric utility industry. The purpose of this workshop was to identify research objectives in the field of advanced computational algorithms needed for the application of advanced parallel processing architecture to problems of power system control and operation. Approximately 35 participants heard six presentations on power flow problems, transient stability, power system control, electromagnetic transients, user-machine interfaces, and database management. In the discussions that followed, participants identified five areas warranting further investigation: system load flow analysis, transient power and voltage analysis, structural instability and bifurcation, control systems design, and proximity to instability. 63 refs.

  14. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data is presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems), embracing the techniques discussed, has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.
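
    Historical weight estimation of this kind typically rests on power-law regressions of component weight against vehicle sizing parameters. A generic illustration of fitting such a relation (form, data, and coefficients hypothetical, not taken from WAATS):

    ```python
    import numpy as np

    def fit_weight_relation(x, w):
        """Fit W = a * x^b by least squares in log space, the usual form
        of a historical weight-estimating relationship."""
        b, log_a = np.polyfit(np.log(x), np.log(w), 1)
        return np.exp(log_a), b

    # Toy data: wing weight vs. gross weight for a few past vehicles (lb).
    gross = np.array([20e3, 45e3, 80e3, 150e3])
    wing = np.array([2.1e3, 4.4e3, 7.5e3, 13.2e3])
    a, b = fit_weight_relation(gross, wing)
    print(f"W_wing ~= {a:.3g} * W_gross^{b:.2f}")
    ```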

  15. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  16. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year's motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  17. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013) which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source software, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for the enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  18. Interfaces for Advanced Computing.

    ERIC Educational Resources Information Center

    Foley, James D.

    1987-01-01

    Discusses the coming generation of supercomputers that will have the power to make elaborate "artificial realities" that facilitate user-computer communication. Illustrates these technological advancements with examples of the use of head-mounted monitors which are connected to position and orientation sensors, and gloves that track finger and…

  19. Current advances in molecular, biochemical, and computational modeling analysis of microalgal triacylglycerol biosynthesis.

    PubMed

    Lenka, Sangram K; Carbonaro, Nicole; Park, Rudolph; Miller, Stephen M; Thorpe, Ian; Li, Yantao

    2016-01-01

    Triacylglycerols (TAGs) are highly reduced energy storage molecules ideal for biodiesel production. Microalgal TAG biosynthesis has been studied extensively in recent years, both at the molecular level and systems level through experimental studies and computational modeling. However, discussions of the strategies and products of the experimental and modeling approaches are rarely integrated and summarized together in a way that promotes collaboration among modelers and biologists in this field. In this review, we outline advances toward understanding the cellular and molecular factors regulating TAG biosynthesis in unicellular microalgae with an emphasis on recent studies on rate-limiting steps in fatty acid and TAG synthesis, while also highlighting new insights obtained from the integration of multi-omics datasets with mathematical models. Computational methodologies such as kinetic modeling, metabolic flux analysis, and new variants of flux balance analysis are explained in detail. We discuss how these methods have been used to simulate algae growth and lipid metabolism in response to changing culture conditions and how they have been used in conjunction with experimental validations. Since emerging evidence indicates that TAG synthesis in microalgae operates through coordinated crosstalk between multiple pathways in diverse subcellular destinations including the endoplasmic reticulum and plastids, we discuss new experimental studies and models that incorporate these findings for discovering key regulatory checkpoints. Finally, we describe tools for genetic manipulation of microalgae and their potential for future rational algal strain design. This comprehensive review explores the potential synergistic impact of pathway analysis, computational approaches, and molecular genetic manipulation strategies on improving TAG production in microalgae. PMID:27321475
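
    Of the computational methodologies listed, flux balance analysis has the most compact formal statement: a linear program over the stoichiometric matrix S, maximizing an objective (for TAG studies, typically lipid or biomass production) subject to steady-state mass balance and flux bounds. The standard formulation, given here for orientation:

    ```latex
    % Flux balance analysis: v is the vector of reaction fluxes, S the
    % stoichiometric matrix, c the objective weights (e.g. TAG synthesis).
    \max_{\mathbf{v}} \; \mathbf{c}^{T} \mathbf{v}
    \quad \text{subject to} \quad S\,\mathbf{v} = \mathbf{0}, \qquad
    v_i^{\min} \le v_i \le v_i^{\max}
    ```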

  20. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design, prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  1. Recent advances in computational aerodynamics

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh K.; Desse, Jerry E.

    1991-04-01

    The current state of the art in computational aerodynamics is described. Recent advances in the discretization of surface geometry, grid generation, and flow simulation algorithms have led to flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics is emerging as a crucial enabling technology for the development and design of flight vehicles. Examples illustrating the current capability for the prediction of aircraft, launch vehicle and helicopter flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  2. Advances in Computational Astrophysics

    SciTech Connect

    Calder, Alan C.; Kouzes, Richard T.

    2009-03-01

    I was invited to be the guest editor for a special issue of Computing in Science and Engineering along with a colleague from Stony Brook. This is the guest editors' introduction to a special issue of Computing in Science and Engineering. Alan and I have written this introduction and have been the editors for the 4 papers to be published in this special edition.

  3. Structural analysis of advanced polymeric foams by means of high resolution X-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Nacucchi, M.; De Pascalis, F.; Scatto, M.; Capodieci, L.; Albertoni, R.

    2016-06-01

    Advanced polymeric foams with enhanced thermal insulation and mechanical properties are used in a wide range of industrial applications. The properties of a foam strongly depend upon its cell structure. Traditionally, their microstructure has been studied using 2D imaging systems based on optical or electron microscopy, with the obvious disadvantage that only the surface of the sample can be analysed. To overcome this shortcoming, the adoption of X-ray micro-tomography imaging is here suggested to allow for a complete 3D, non-destructive analysis of advanced polymeric foams. Unlike in metallic foams, the resolution of the reconstructed structural features is hampered by the low contrast in the images due to weak X-ray absorption in the polymer. In this work an advanced methodology based on high-resolution and low-contrast techniques is used to perform quantitative analyses on both closed- and open-cell foams. Local structural features of individual cells such as equivalent diameter, sphericity, anisotropy and orientation are statistically evaluated. In addition, thickness and length of the struts are determined, underlining the key role played by the achieved resolution. In perspective, the quantitative description of these structural features will be used to evaluate the results of in situ mechanical and thermal tests on foam samples.
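
    Two of the per-cell descriptors named here have standard definitions; assuming the usual conventions (not stated explicitly in the paper), for a segmented cell of volume V and surface area A:

    ```latex
    % Volume-equivalent diameter and (Wadell) sphericity of a cell;
    % Psi = 1 for a perfect sphere and decreases with irregularity.
    d_{eq} = \left( \frac{6V}{\pi} \right)^{1/3}, \qquad
    \Psi = \frac{\pi^{1/3}\,(6V)^{2/3}}{A} \in (0, 1]
    ```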

  4. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  5. TRAC-PF1: an advanced best-estimate computer program for pressurized water reactor analysis

    SciTech Connect

    Liles, D.R.; Mahaffy, J.H.

    1984-02-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos National Laboratory to provide advanced best-estimate predictions of postulated accidents in light water reactors. The TRAC-PF1 program provides this capability for pressurized water reactors and for many thermal-hydraulic experimental facilities. The code features either a one-dimensional or a three-dimensional treatment of the pressure vessel and its associated internals; a two-phase, two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field; flow-regime-dependent constitutive equation treatment; optional reflood tracking capability for both bottom flood and falling-film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. This report describes the thermal-hydraulic models and the numerical solution methods used in the code. Detailed programming and user information also are provided.
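
    The two-phase, two-fluid model referred to solves separate conservation equations for each phase; the mass balances, in their standard form (a textbook statement, not TRAC-PF1's exact equation set, which also carries a noncondensable gas field):

    ```latex
    % Two-fluid mass conservation: alpha_k is the phase volume fraction and
    % Gamma_k the interphase mass transfer, equal and opposite between
    % liquid (l) and vapor (g).
    \frac{\partial}{\partial t}\!\left(\alpha_k \rho_k\right)
    + \nabla \cdot \left(\alpha_k \rho_k \mathbf{u}_k\right) = \Gamma_k,
    \qquad k \in \{l, g\}, \qquad \Gamma_l = -\Gamma_g
    ```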

  6. Advanced foam computer model helps in the design and analysis of underbalanced drilling

    SciTech Connect

    Liu, G.; Medley, G.H. Jr.

    1996-09-01

    Foam generated by mixing gas and liquid for underbalanced drilling has unique rheological characteristics, making it very difficult to accurately predict the pressure profile. A new mechanistic model attempts to overcome many of the problems associated with existing foam flow analyses. Varying Fanning friction factors, rather than assumed constants, are calculated along the flow path. A user-friendly PC computer program was developed to numerically solve the mechanical energy balance equation for compressible foam flow, taking into account influxes of gas, liquid, and oil from formations. The pressure profile, foam quality, density, and cuttings transport are predicted by the model. A Sensitivity Analysis Window allows the user to quickly optimize the hydraulics program by selecting the best combination of injection pressure, back pressure, and gas/liquid injection rates. This new model handles inclined and horizontal wellbores, and provides handy engineering and design tools for underbalanced drilling, wellbore clean-out, and other foam operations.
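
    The numerical core described, integrating the mechanical energy balance along the flow path while re-evaluating the Fanning friction factor locally, can be sketched as a simple pressure-marching loop (illustrative only; the density, velocity, and friction callables are stand-ins for the program's foam rheology correlations):

    ```python
    def pressure_profile(p_surface, depth, n_steps, density, velocity,
                         fanning_f, diameter, g=9.81):
        """March dp/dz = rho*g + 2*f*rho*v^2/D down the flow path, with
        rho, v, and f re-evaluated from local pressure at every step
        (compressible foam). density/velocity/fanning_f are user-supplied
        correlation functions."""
        dz = depth / n_steps
        p, profile = p_surface, [p_surface]
        for i in range(n_steps):
            z = i * dz
            rho = density(p, z)    # foam density from quality and pressure
            v = velocity(p, z)     # local mixture velocity
            f = fanning_f(rho, v)  # varying Fanning friction factor
            p += (rho * g + 2.0 * f * rho * v * v / diameter) * dz
            profile.append(p)
        return profile
    ```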

  7. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    NASA Technical Reports Server (NTRS)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience only ran on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA s educational site for anyone to be able to access. The current code works in the Unity game engine which does have cross platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project was unable to be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.

  8. Advanced multi-dimensional deterministic transport computational capability for safety analysis of pebble-bed reactors

    NASA Astrophysics Data System (ADS)

    Tyobeka, Bismark Mzubanzi

    A coupled neutron transport thermal-hydraulics code system with both diffusion and transport theory capabilities is presented. At the heart of the coupled code is a powerful neutronics solver, based on a neutron transport theory approach, powered by the time-dependent extension of the well known DORT code, DORT-TD. DORT-TD uses a fully implicit time integration scheme and is coupled via a general interface to the thermal-hydraulics code THERMIX-DIREKT, an HTR-specific two dimensional core thermal-hydraulics code. Feedback is accounted for by interpolating multigroup cross sections from pre-generated libraries which are structured for user specified discrete sets of thermal-hydraulic parameters e.g. fuel and moderator temperatures. The coupled code system is applied to two HTGR designs, the PBMR 400MW and the PBMR 268MW. Steady-state and several design basis transients are modeled in an effort to assess the adequacy of neutron diffusion theory as against the more accurate but computationally expensive neutron transport theory. It turns out that there are small but significant differences between the results of the two theories. It is concluded that diffusion theory can be used with a higher degree of confidence in the PBMR as long as more than two energy groups are used, and that the results must be checked against a lower-order transport solution, especially for safety analysis purposes. The end product of this thesis is a high fidelity, state-of-the-art computer code system, with multiple capabilities to analyze all PBMR safety-related transients in an accurate and efficient manner.
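
    The feedback coupling described, interpolating pre-generated multigroup cross sections at the current thermal-hydraulic state, is essentially a structured table lookup. A bilinear sketch over assumed fuel/moderator temperature axes (hypothetical data layout, not DORT-TD's actual library format):

    ```python
    import numpy as np

    def interp_xs(xs_table, t_fuel_axis, t_mod_axis, t_fuel, t_mod):
        """Bilinear interpolation of a multigroup cross-section set tabulated
        on a (fuel temperature) x (moderator temperature) grid.
        xs_table has shape (n_tfuel, n_tmod, n_groups)."""
        i = np.clip(np.searchsorted(t_fuel_axis, t_fuel) - 1,
                    0, len(t_fuel_axis) - 2)
        j = np.clip(np.searchsorted(t_mod_axis, t_mod) - 1,
                    0, len(t_mod_axis) - 2)
        u = (t_fuel - t_fuel_axis[i]) / (t_fuel_axis[i + 1] - t_fuel_axis[i])
        w = (t_mod - t_mod_axis[j]) / (t_mod_axis[j + 1] - t_mod_axis[j])
        return ((1 - u) * (1 - w) * xs_table[i, j]
                + u * (1 - w) * xs_table[i + 1, j]
                + (1 - u) * w * xs_table[i, j + 1]
                + u * w * xs_table[i + 1, j + 1])
    ```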

  9. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis computer program user's manual

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The computer program user's manual for the ADPACAPES (Advanced Ducted Propfan Analysis Code-Average Passage Engine Simulation) program is included. The objective of the computer program is development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes meeting the requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. The efficiency of the solution procedure was shown to be the same as the original analysis.
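
    The four-stage Runge-Kutta time marching mentioned is commonly written, for a finite-volume residual R(W) acting on the cell-average state W, in the low-storage form below (a standard scheme with typical coefficients; the manual's exact values are not given here):

    ```latex
    % Low-storage four-stage Runge-Kutta update; typical coefficients are
    % alpha = (1/4, 1/3, 1/2, 1).
    W^{(0)} = W^{n}, \qquad
    W^{(k)} = W^{(0)} - \alpha_k\,\Delta t\,R\!\left(W^{(k-1)}\right),
    \quad k = 1,\dots,4, \qquad
    W^{n+1} = W^{(4)}
    ```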

  10. Novel genotype-phenotype associations in human cancers enabled by advanced molecular platforms and computational analysis of whole slide images

    PubMed Central

    Cooper, Lee A.D.; Kong, Jun; Gutman, David A.; Dunn, William D.; Nalisnik, Michael; Brat, Daniel J.

    2014-01-01

    Technological advances in computing, imaging and genomics have created new opportunities for exploring relationships between histology, molecular events and clinical outcomes using quantitative methods. Slide scanning devices are now capable of rapidly producing massive digital image archives that capture histological details in high-resolution. Commensurate advances in computing and image analysis algorithms enable mining of archives to extract descriptions of histology, ranging from basic human annotations to automatic and precisely quantitative morphometric characterization of hundreds of millions of cells. These imaging capabilities represent a new dimension in tissue-based studies, and when combined with genomic and clinical endpoints, can be used to explore biologic characteristics of the tumor microenvironment and to discover new morphologic biomarkers of genetic alterations and patient outcomes. In this paper we review developments in quantitative imaging technology and illustrate how image features can be integrated with clinical and genomic data to investigate fundamental problems in cancer. Using motivating examples from the study of glioblastomas (GBMs), we demonstrate how public data from The Cancer Genome Atlas (TCGA) can serve as an open platform to conduct in silico tissue based studies that integrate existing data resources. We show how these approaches can be used to explore the relation of the tumor microenvironment to genomic alterations and gene expression patterns and to define nuclear morphometric features that are predictive of genetic alterations and clinical outcomes. Challenges, limitations and emerging opportunities in the area of quantitative imaging and integrative analyses are also discussed. PMID:25599536

  11. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

    This report documents a special study to define a 32-bit radiation hardened, SEU tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets that AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  12. Computational engine structural analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Johns, R. H.

    1986-01-01

    A significant research activity at the NASA Lewis Research Center is the computational simulation of complex multidisciplinary engine structural problems. This simulation is performed using computational engine structural analysis (CESA) which consists of integrated multidisciplinary computer codes in conjunction with computer post-processing for problem-specific application. A variety of the computational simulations of specific cases are described in some detail in this paper. These case studies include: (1) aeroelastic behavior of bladed rotors, (2) high velocity impact of fan blades, (3) blade-loss transient response, (4) rotor/stator/squeeze-film/bearing interaction, (5) blade-fragment/rotor-burst containment, and (6) structural behavior of advanced swept turboprops. These representative case studies are selected to demonstrate the breath of the problems analyzed and the role of the computer including post-processing and graphical display of voluminous output data.

  13. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

    SciTech Connect

    Saffer, Shelley I.

    2014-12-01

    This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievements of the goals, and resulting research made possible by this award.

  14. Development and validation of burnup dependent computational schemes for the analysis of assemblies with advanced lattice codes

    NASA Astrophysics Data System (ADS)

    Ramamoorthy, Karthikeyan

    The main aim of this research is the development and validation of computational schemes for advanced lattice codes. The advanced lattice code which forms the primary part of this research is "DRAGON Version4". The code has unique features like self-shielding calculation with capabilities to represent distributed and mutual resonance shielding effects, leakage models with space-dependent isotropic or anisotropic streaming effect, availability of the method of characteristics (MOC), burnup calculation with reaction-detailed energy production, etc. Qualified reactor physics codes are essential for the study of all existing and envisaged designs of nuclear reactors. Any new design would require a thorough analysis of all the safety parameters and burnup dependent behaviour. Any reactor physics calculation requires the estimation of neutron fluxes in various regions of the problem domain. The calculation goes through several levels before the desired solution is obtained. Each level of the lattice calculation has its own significance, and any compromise at any step will lead to a poor final result. The various levels include choice of nuclear data library and energy group boundaries into which the multigroup library is cast; self-shielding of nuclear data depending on the heterogeneous geometry and composition; tracking of geometry, keeping error in volume and surface to an acceptable minimum; generation of regionwise and groupwise collision probabilities or MOC-related information and their subsequent normalization; solution of the transport equation using the previously generated groupwise information and obtaining the fluxes and reaction rates in various regions of the lattice; depletion of fuel and of other materials based on normalization with constant power or constant flux. Of the above-mentioned levels, the present research will mainly focus on two aspects, namely self-shielding and depletion. The behaviour of the system is determined by composition of resonant
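
    The depletion level singled out here amounts to integrating Bateman-type balance equations between successive flux solutions. A minimal sketch for a linear chain with one-group rates (illustrative only; DRAGON's solver and normalization options are far more general):

    ```python
    import numpy as np

    def deplete(n0, lam, sigma_a, phi, dt, substeps=1000):
        """Explicit integration of dN_i/dt = -(lambda_i + sigma_i*phi)*N_i
        + lambda_{i-1}*N_{i-1} for a simple decay/capture chain at
        constant flux phi (capture treated as pure loss)."""
        n = np.asarray(n0, dtype=float).copy()
        h = dt / substeps
        for _ in range(substeps):
            loss = (lam + sigma_a * phi) * n
            gain = np.zeros_like(n)
            gain[1:] = lam[:-1] * n[:-1]  # parent decay feeds the daughter
            n += h * (gain - loss)
        return n

    # Two-nuclide toy chain: cross sections in cm^2, flux in n/(cm^2 s).
    print(deplete(n0=[1.0, 0.0], lam=np.array([1e-5, 2e-6]),
                  sigma_a=np.array([5e-24, 1e-24]), phi=1e14, dt=3600.0))
    ```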

  15. Integrating Remote Sensing Data, Hybrid-Cloud Computing, and Event Notifications for Advanced Rapid Imaging & Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Fielding, E. J.; Agram, P.; Manipon, G.; Stough, T. M.; Simons, M.; Rosen, P. A.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.

    2013-12-01

    Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR) and Continuous Global Positioning System (CGPS) are now important elements in our toolset for monitoring earthquake-generating faults, volcanic eruptions, hurricane damage, landslides, reservoir subsidence, and other natural and man-made hazards. Geodetic imaging's unique ability to capture surface deformation with high spatial and temporal resolution has revolutionized both earthquake science and volcanology. Continuous monitoring of surface deformation and surface change before, during, and after natural hazards improves decision-making from better forecasts, increased situational awareness, and more informed recovery. However, analyses of InSAR and GPS data sets are currently handcrafted following events and are not generated rapidly and reliably enough for use in operational response to natural disasters. Additionally, the sheer data volumes needed to handle a continuous stream of InSAR data sets also present a bottleneck. It has been estimated that continuous processing of InSAR coverage of California alone over 3 years would reach PB-scale data volumes. Our Advanced Rapid Imaging and Analysis for Monitoring Hazards (ARIA-MH) science data system enables both science and decision-making communities to monitor areas of interest with derived geodetic data products via seamless data preparation, processing, discovery, and access. We will present our findings on the use of hybrid-cloud computing to improve the timely processing and delivery of geodetic data products, integrating event notifications from USGS to improve the timely processing for response, as well as providing browse results for quick looks with other tools for integrative analysis.

  16. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Workshop on Mathematics for the Analysis, Simulation, and Optimization of Complex Systems Report from ASCR..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of Energy... Department of Energy on scientific priorities within the field of advanced scientific computing...

  17. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.
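
    The triplex-to-duplex-to-simplex behavior can be illustrated with a toy majority voter that demotes a channel only when it disagrees persistently, so a transient fault is forgiven (a conceptual sketch of the general TMR idea, not the ARCS design itself):

    ```python
    def vote(outputs, active, fault_counts, tolerance=1e-6, demote_after=3):
        """Vote the active channels and demote any channel that disagrees
        with the consensus for demote_after consecutive frames."""
        values = [outputs[ch] for ch in active]
        consensus = sorted(values)[len(values) // 2]  # median = majority of 3
        for ch in list(active):
            if abs(outputs[ch] - consensus) > tolerance:
                fault_counts[ch] += 1           # possibly just a transient
                if fault_counts[ch] >= demote_after and len(active) > 1:
                    active.remove(ch)           # triplex -> duplex -> simplex
            else:
                fault_counts[ch] = 0            # redundancy recovery
        return consensus, active

    # Channel "b" is stuck; after three frames it is voted out.
    active, counts = ["a", "b", "c"], {"a": 0, "b": 0, "c": 0}
    for _ in range(3):
        result, active = vote({"a": 1.0, "b": 9.9, "c": 1.0}, active, counts)
    print(result, active)  # 1.0 ['a', 'c']
    ```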

  18. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  19. Computing and monitoring potential of public spaces by shading analysis using 3D LiDAR data and advanced image analysis

    NASA Astrophysics Data System (ADS)

    Zwolinski, A.; Jarzemski, M.

    2015-04-01

    The paper regards the specific context of public spaces in the "shadow" of tall buildings located in European cities. The majority of tall buildings in European cities were built in the last 15 years. Tall buildings appear mainly in city centres, directly at important public spaces that form a viable environment for inhabitants, with a variety of public functions (open spaces, green areas, recreation places, shops, services, etc.). All these amenities and services are under the direct impact of extensive shading coming from the tall buildings. The paper focuses on the analysis and representation of the impact of shading from tall buildings on various public spaces in cities using 3D city models. The computer environment of 3D city models in the CityGML standard uses 3D LiDAR data as one of the data types for the definition of 3D cities. The structure of CityGML allows analytic applications using existing computer tools, as well as the development of new techniques to estimate the extent of shading coming from high-rises and affecting life in public spaces. These measurable shading parameters at specific times are crucial for the proper functioning, viability and attractiveness of public spaces; ultimately, they are extremely important for the location of tall buildings at main public spaces in cities. The paper explores the impact of shading from tall buildings in different spatial contexts, using CityGML models based on core LiDAR data to support controlled urban development in the sense of viable public spaces. The article was prepared within the research project 2TaLL: Application of 3D Virtual City Models in Urban Analyses of Tall Buildings, realized as a part of the Polish-Norway Grants.
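
    The shading computation itself reduces to marching from each ground cell toward the sun across a height grid derived from the LiDAR data. A compact sketch under assumed conventions (square grid in metres, sun given by azimuth clockwise from north and altitude above the horizon; not the 2TaLL project's actual tooling):

    ```python
    import math
    import numpy as np

    def shadow_mask(height, cell_size, sun_azimuth_deg, sun_altitude_deg,
                    max_dist=500.0):
        """True where a cell lies in shadow: march toward the sun and test
        whether any building/terrain rises above the solar ray."""
        az = math.radians(sun_azimuth_deg)
        tan_alt = math.tan(math.radians(sun_altitude_deg))
        dx, dy = math.sin(az), math.cos(az)  # unit step toward the sun
        rows, cols = height.shape
        shaded = np.zeros_like(height, dtype=bool)
        for r in range(rows):
            for c in range(cols):
                d = cell_size
                while d < max_dist:
                    rr = int(round(r - dy * d / cell_size))  # north = -row
                    cc = int(round(c + dx * d / cell_size))
                    if not (0 <= rr < rows and 0 <= cc < cols):
                        break
                    if height[rr, cc] > height[r, c] + d * tan_alt:
                        shaded[r, c] = True
                        break
                    d += cell_size
        return shaded
    ```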

  1. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1995-04-01

    Advanced mathematical techniques and computer simulation play a major role in providing enhanced understanding of conventional and advanced materials processing operations. Development and application of mathematical models and computer simulation techniques can provide a quantitative understanding of materials processes and will minimize the need for expensive and time consuming trial- and error-based product development. As computer simulations and materials databases grow in complexity, high performance computing and simulation are expected to play a key role in supporting the improvements required in advanced material syntheses and processing by lessening the dependence on expensive prototyping and re-tooling. Many of these numerical models are highly compute-intensive. It is not unusual for an analysis to require several hours of computational time on current supercomputers despite the simplicity of the models being studied. For example, to accurately simulate the heat transfer in a 1-m³ block using a simple computational method requires 10¹² arithmetic operations per second of simulated time. For a computer to do the simulation in real time would require a sustained computation rate 1000 times faster than that achievable by current supercomputers. Massively parallel computer systems, which combine several thousand processors able to operate concurrently on a problem, are expected to provide orders of magnitude increase in performance. This paper briefly describes advanced computational research in materials processing at ORNL. Continued development of computational techniques and algorithms utilizing the massively parallel computers will allow the simulation of conventional and advanced materials processes in sufficient generality.
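
    A back-of-envelope count shows where an estimate of that order can come from (our own illustration under assumed resolutions, not the author's derivation): resolving the block at 1 mm gives 10^9 cells, an explicit scheme stable at that spacing can require on the order of 10^3 time steps per simulated second, and each cell update costs a handful of operations:

    ```latex
    10^{9}\ \text{cells} \times 10^{3}\ \text{steps/s}
    \times O(1)\ \text{ops/cell-step} \;\approx\; 10^{12}\ \text{ops}
    \ \text{per simulated second}
    ```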

  2. Advancing manufacturing through computational chemistry

    SciTech Connect

    Noid, D.W.; Sumpter, B.G.; Tuzun, R.E.

    1995-12-31

    The capabilities of nanotechnology and computational chemistry are reaching a point of convergence. New computer hardware and novel computational methods have created opportunities to test proposed nanometer-scale devices, investigate molecular manufacturing and model and predict properties of new materials. Experimental methods are also beginning to provide new capabilities that make the possibility of manufacturing various devices with atomic precision tangible. In this paper, we will discuss some of the novel computational methods we have used in molecular dynamics simulations of polymer processes, neural network predictions of new materials, and simulations of proposed nano-bearings and fluid dynamics in nano-sized devices.

  3. Computational Fluid Dynamic Analysis of the Posterior Airway Space After Maxillomandibular Advancement For Obstructive Sleep Apnea Syndrome

    PubMed Central

    Sittitavornwong, Somsak; Waite, Peter D.; Shih, Alan M.; Cheng, Gary C.; Koomullil, Roy; Ito, Yasushi; Cure, Joel K; Harding, Susan M.; Litaker, Mark

    2013-01-01

    Purpose: To evaluate the soft tissue change of the upper airway after maxillomandibular advancement (MMA) by computational fluid dynamics (CFD). Materials and Methods: Eight OSAS patients who required MMA were recruited into this study. All participants had pre- and post-operative computed tomography (CT) and underwent MMA by a single oral and maxillofacial surgeon. Upper airway CT data sets for these 8 participants were used to create high-fidelity 3-D numerical models for CFD. The 3-D models were simulated and analyzed to study how changes in airway anatomy affect the pressure effort required for normal breathing. Airway dimensions, skeletal changes, Apnea-Hypopnea Index (AHI), and pressure efforts of pre- and post-operative 3-D models were compared and correlations interpreted. Results: After MMA, laminar and turbulent air flow was significantly decreased at every level of the airway. The cross-sectional areas at the soft palate and tongue base were significantly increased. Conclusions: This study shows that MMA increases airway dimensions by increasing the occipital base (Base)-pogonion (Pg) distance. An increase in the Base-Pg distance showed a significant correlation with AHI improvement and a decreased pressure effort of the upper airway. Decreasing the pressure effort reduces the breathing workload, which improves the condition of OSAS. PMID:23642544
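
    The reported link between airway cross-section and pressure effort reflects the strong inverse dependence of viscous pressure drop on duct radius. The Hagen-Poiseuille relation below is only a toy stand-in for the paper's full CFD models, included to show the direction and rough magnitude of the effect; the flow rate, segment length, and pre/post radii are illustrative values, not patient data.

        import math

        def poiseuille_dp(flow_m3s, radius_m, length_m, mu=1.8e-5):
            # Pressure drop (Pa) for laminar flow in a straight tube;
            # mu is the dynamic viscosity of air. A crude airway surrogate.
            return 8.0 * mu * length_m * flow_m3s / (math.pi * radius_m ** 4)

        q, seg = 5e-4, 0.12        # ~0.5 L/s inspiratory flow, 12 cm segment
        for r_mm in (4.0, 6.0):    # notional pre- vs post-MMA effective radius
            print(f"r = {r_mm} mm -> dP = {poiseuille_dp(q, r_mm / 1000.0, seg):.2f} Pa")
        # Widening the radius from 4 to 6 mm cuts the pressure drop ~5x ((6/4)^4).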

  4. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  5. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  6. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  7. Aerodynamic Analyses Requiring Advanced Computers, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  8. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computer and mobile workstation technology are covered: background, applications, high-end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  9. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  10. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  11. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments.

  12. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    SciTech Connect

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems

  13. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing and Modernization Program (HPCMP) and the NASA Advanced Supercomputing Division (NAS), a study was conducted to assess the role of supercomputers in the computational aeroelasticity of aerospace vehicles. The study is mostly based on the responses to a web-based questionnaire that was designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  14. Advanced Computed-Tomography Inspection System

    NASA Technical Reports Server (NTRS)

    Harris, Lowell D.; Gupta, Nand K.; Smith, Charles R.; Bernardi, Richard T.; Moore, John F.; Hediger, Lisa

    1993-01-01

    The Advanced Computed Tomography Inspection System (ACTIS) is a computed-tomography x-ray apparatus that reveals the internal structures of objects in a wide range of sizes and materials. Three x-ray sources and an adjustable scan geometry give the system unprecedented versatility. The gantry contains translation and rotation mechanisms for scanning the x-ray beam through the object inspected. The distance between the source and detector towers is varied to suit the object. The system has been used in such diverse applications as the development of new materials, refinement of manufacturing processes, and inspection of components.

  15. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  16. Proceedings of the NATO-Advanced Study Institute on Computer Aided Analysis of Rigid and Flexible Mechanical Systems. Volume 1: Main lectures

    NASA Astrophysics Data System (ADS)

    Pereira, Manuel S.; Ambrosio, Jorge A. C.

    1993-07-01

    During the last few years, major scientific progress has been achieved in fields related to the computer aided analysis of multibody systems. In view of this progress and recent developments in computer hardware and general purpose software, there is a need to assess the current state of the art and results from different schools of thought, with the objective of focusing trends in future research. Going back to 1983, when an important NATO-NSF-ARO Advanced Study Institute on Computer Aided Analysis and Optimization of Mechanical Systems was held at the University of Iowa, one may notice that less than 10 years ago the state of the art was mainly dwelling on rigid body dynamics. The interest in the dynamic simulation of mechanical systems has steadily increased in recent years, coming mainly from the aerospace and automotive industries. The development of multibody system analysis formulations has more recently been motivated by the need to include several features such as: real-time simulation capabilities, highly non-linear control devices, workspace path planning, active control of machine flexibilities, and reliability and accuracy in the analysis results. The need for accurate and efficient analysis tools for the design of large and lightweight mechanical systems has drawn many research groups to the challenging problem of flexible systems, with an increasing interaction with finite element methodologies. Basic approaches to mechanical system dynamic analysis have recently been presented in several new textbooks. These publications demonstrate that both recursive and absolute methods still have their proponents for resolving the redundancy encountered in most mechanical systems.

  17. Advanced networks and computing in healthcare

    PubMed Central

    Ackerman, Michael

    2011-01-01

    As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877

  18. Advanced networks and computing in healthcare.

    PubMed

    Ackerman, Michael; Locatis, Craig

    2011-01-01

    As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877

  19. Advanced Algebra and Trigonometry: Supplemental Computer Units.

    ERIC Educational Resources Information Center

    Dotseth, Karen

    A set of computer-oriented, supplemental activities is offered which can be used with a course in advanced algebra and trigonometry. The activities involve use of the BASIC programming language; it is assumed that the teacher is familiar with programming in BASIC. Students will learn some BASIC; however, the intent is not to develop proficient…

  20. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  1. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short in providing real-time information that is predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.
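
    "Faster than real time" here means that the wall-clock cost of integrating the system dynamics is a small fraction of the interval being simulated. The sketch below makes that concrete for a classical single-machine swing equation; the model, parameters, and forward-Euler integrator are a generic textbook illustration, not the paper's algorithms or platform.

        import math, time

        # Classical swing equation: M*dw/dt = Pm - Pmax*sin(delta) - D*w
        M, D, Pm, Pmax = 0.1, 0.05, 0.8, 1.5
        delta, w = math.asin(Pm / Pmax), 0.0   # start at the equilibrium angle
        dt, horizon = 1e-3, 10.0               # 1 ms steps, 10 s of system time

        t0 = time.perf_counter()
        for _ in range(int(horizon / dt)):
            dw = (Pm - Pmax * math.sin(delta) - D * w) / M
            delta, w = delta + dt * w, w + dt * dw
        elapsed = time.perf_counter() - t0
        print(f"simulated {horizon:.0f} s in {elapsed:.3f} s "
              f"({horizon / elapsed:.0f}x faster than real time)")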

  2. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis. Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Delaney, Robert A.; Adamczyk, John J.; Miller, Christopher J.; Arnone, Andrea; Swanson, Charles

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This report is intended to serve as a computer program user's manual for the ADPAC-AOACR codes developed under Task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.
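
    For readers unfamiliar with the solution technique named here, the toy solver below applies the same two ingredients -- a four-stage Runge-Kutta time march and added numerical dissipation -- to 1-D linear advection on a periodic finite-volume grid. It is a drastically simplified sketch of the scheme's structure, not ADPAC code; the grid size, CFL number, and dissipation coefficient are arbitrary choices.

        import numpy as np

        n, a, cfl = 200, 1.0, 0.5
        dx = 1.0 / n
        dt = cfl * dx / a
        x = (np.arange(n) + 0.5) * dx
        u = np.exp(-200 * (x - 0.3) ** 2)           # initial pulse

        def residual(u):
            # Central difference for du/dt = -a*du/dx on a periodic grid,
            # plus a small second-difference term as numerical dissipation.
            conv = -a * (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
            diss = 0.02 * a / dx * (np.roll(u, -1) - 2 * u + np.roll(u, 1))
            return conv + diss

        for _ in range(400):                        # march to t = 1 (one period)
            u0 = u.copy()
            for alpha in (0.25, 1 / 3, 0.5, 1.0):   # four-stage Runge-Kutta
                u = u0 + alpha * dt * residual(u)

        print("pulse peak after one period:", round(float(u.max()), 3))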

  3. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 2: Unsteady ducted propfan analysis computer program users manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Delaney, Robert A.; Bettner, James L.

    1991-01-01

    The primary objective of this study was the development of a time-dependent three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict unsteady compressible transonic flows about ducted and unducted propfan propulsion systems at angle of attack. The computer codes resulting from this study are referred to as Advanced Ducted Propfan Analysis Codes (ADPAC). This report is intended to serve as a computer program user's manual for the ADPAC developed under Task 2 of NASA Contract NAS3-25270, Unsteady Ducted Propfan Analysis. Aerodynamic calculations were based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. A time-accurate implicit residual smoothing operator was utilized for unsteady flow predictions. For unducted propfans, a single H-type grid was used to discretize each blade passage of the complete propeller. For ducted propfans, a coupled system of five grid blocks utilizing an embedded C-grid about the cowl leading edge was used to discretize each blade passage. Grid systems were generated by a combined algebraic/elliptic algorithm developed specifically for ducted propfans. Numerical calculations were compared with experimental data for both ducted and unducted propfan flows. The solution scheme demonstrated efficiency and accuracy comparable with other schemes of this class.

  4. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for the design and manufacture of automotive components have increased dramatically with the goal of producing automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses: more three-dimensional analyses, larger model sizes, and routine consideration of transient and non-linear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  5. Advanced PFBC transient analysis

    SciTech Connect

    White, J.S.; Bonk, D.L.

    1997-05-01

    Transient modeling and analysis of advanced Pressurized Fluidized Bed Combustion (PFBC) systems is a research area that is currently under investigation by the US Department of Energy's Federal Energy Technology Center (FETC). The objective of the effort is to identify key operating parameters that affect plant performance and then quantify the basic response of major sub-systems to changes in operating conditions. PC-TRAX{trademark}, a commercially available dynamic software program, was chosen and applied in this modeling and analysis effort. This paper describes the development of a series of TRAX-based transient models of advanced PFBC power plants. These power plants burn coal or other suitable fuel in a PFBC, and the high temperature flue gas supports low-Btu fuel gas or natural gas combustion in a gas turbine topping combustor. When it is utilized, the low-Btu fuel gas is produced in a bubbling bed carbonizer. High temperature, high pressure combustion products exiting the topping combustor are expanded in a modified gas turbine to generate electrical power. Waste heat from the system is used to raise and superheat steam for a reheat steam turbine bottoming cycle that generates additional electrical power. Basic control/instrumentation models were developed in PC-TRAX and used to investigate off-design plant performance. System performance for various transient conditions and control philosophies was studied.

  6. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  7. Leveraging On-premise and Public Cloud Computing to Enable Advanced Rapid Imaging & Analysis for Monitoring Hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Moore, A. W.; Fielding, E. J.; Agram, P.; Manipon, G.; Simons, M.; Rosen, P. A.; Stough, T. M.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.

    2013-12-01

    Many of the fundamental processes underlying hazards such as earthquakes and volcanoes are poorly understood. Hazard systems are difficult to replicate in lab environments, and so we need to observe them in 'natural laboratories'. The global coverage offered by satellite-based SAR missions and rapidly expanding GPS networks can provide orders of magnitude more observations. These combined geodetic data products will enable greater understanding of processes leading up to, during, and after natural and man-made disasters. However, a science data system is needed that can efficiently monitor & analyze the voluminous data, and provide users the tools to access the data products. In the interpretation process from observations to decision-making, data from observations are first used to improve the understanding of the physical processes, which then leads to more informed knowledge. However, the need for handling high data volumes and processing expertise are often bottlenecks to providing the data product streams needed for improved decision-making. To help address lower latency and high data volume needs for monitoring and response to globally distributed hazards, we leveraged a hybrid-cloud computing approach that utilizes a seamless mixture of an on-premise Eucalyptus-based cloud computing environment with public Amazon Web Services (AWS) cloud computing resources. We will present some findings on the automation of geodetic processing, use of hybrid-cloud computing to address on-premise resource constraint issues, scalability issues in processing latency and data movement, as well as data discovery, access, and integration for other tools for location analytics.

  8. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.
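
    The roughness step can be pictured as measuring how far the tracked vessel edge deviates from a smoothed version of itself. The function below shows one plausible formulation (RMS deviation from a moving-average baseline); the abstract does not specify the system's actual metric, so the baseline choice and window size are assumptions for illustration.

        import numpy as np

        def edge_roughness(edge, window=15):
            # RMS deviation of a vessel-wall profile (one edge position per
            # scan line) from its moving-average baseline.
            kernel = np.ones(window) / window
            baseline = np.convolve(edge, kernel, mode="same")
            core = slice(window, -window)  # ignore convolution edge artifacts
            return float(np.sqrt(np.mean((edge[core] - baseline[core]) ** 2)))

        rng = np.random.default_rng(0)
        smooth_wall = 50 + 2 * np.sin(np.linspace(0, 4 * np.pi, 400))
        diseased_wall = smooth_wall + rng.normal(0, 1.5, 400)  # irregular lumen
        print(edge_roughness(smooth_wall), edge_roughness(diseased_wall))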

  9. Recent advances in computational actinoid chemistry.

    PubMed

    Wang, Dongqi; van Gunsteren, Wilfred F; Chai, Zhifang

    2012-09-01

    We briefly review advances in computational actinoid (An) chemistry during the past ten years in regard to two issues: the geometrical and electronic structures, and reactions. The former addresses the An-O, An-C, and M-An (M is a metal atom including An) bonds in the actinoid molecular systems, including actinoid oxo and oxide species, actinoid-carbenoid, dinuclear and diatomic systems, and the latter the hydration and ligand exchange, the disproportionation, the oxidation, the reduction of uranyl, hydroamination, and the photolysis of uranium azide. Concerning their relevance to the electronic structures and reactions of actinoids and their importance in the development of an advanced nuclear fuel cycle, we also mentioned the work on actinoid carbides and nitrides, which have been proposed to be candidates of the next generation of nuclear fuel, and the oxidation of PuO(x), which is important to understand the speciation of actinoids in the environment, followed by a brief discussion on the urgent need for a heavier involvement of computational actinoid chemistry in developing advanced reprocessing protocols of spent nuclear fuel. The paper is concluded with an outlook. PMID:22777520

  10. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  11. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  12. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  13. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  14. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  15. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  16. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  17. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding the complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics, as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer based simulations and avoid costly experiments.

  18. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.

  19. Advanced 3-D analysis, client-server systems, and cloud computing-Integration of cardiovascular imaging data into clinical workflows of transcatheter aortic valve replacement.

    PubMed

    Schoenhagen, Paul; Zimmermann, Mathis; Falkner, Juergen

    2013-06-01

    Degenerative aortic stenosis is highly prevalent in the aging populations of industrialized countries and is associated with poor prognosis. Surgical valve replacement has been the only established treatment with documented improvement of long-term outcome. However, many of the older patients with aortic stenosis (AS) are high-risk or ineligible for surgery. For these patients, transcatheter aortic valve replacement (TAVR) has emerged as a treatment alternative. The TAVR procedure is characterized by a lack of visualization of the operative field. Therefore, pre- and intra-procedural imaging is critical for patient selection, pre-procedural planning, and intra-operative decision-making. Incremental to conventional angiography and 2-D echocardiography, multidetector computed tomography (CT) has assumed an important role before TAVR. The analysis of 3-D CT data requires extensive post-processing during direct interaction with the dataset, using advanced analysis software. Organization and storage of the data according to complex clinical workflows and sharing of image information have become a critical part of these novel treatment approaches. Optimally, the data are integrated into a comprehensive image data file accessible to multiple groups of practitioners across the hospital. This creates new challenges for data management requiring a complex IT infrastructure, spanning across multiple locations, but is increasingly achieved with client-server solutions and private cloud technology. This article describes the challenges and opportunities created by the increased amount of patient-specific imaging data in the context of TAVR. PMID:24282750

  20. Neutron analysis of spent fuel storage installation using parallel computing and advance discrete ordinates and Monte Carlo techniques.

    PubMed

    Shedlock, Daniel; Haghighat, Alireza

    2005-01-01

    In the United States, the Nuclear Waste Policy Act of 1982 mandated centralised storage of spent nuclear fuel by 1988. However, the Yucca Mountain project is currently scheduled to start accepting spent nuclear fuel in 2010. Since many nuclear power plants were only designed for ~10 y of spent fuel pool storage, > 35 plants have been forced into alternate means of spent fuel storage. In order to continue operation and make room in spent fuel pools, nuclear generators are turning towards independent spent fuel storage installations (ISFSIs). Typical vertical concrete ISFSIs are ~6.1 m high and 3.3 m in diameter. The inherently large system, and the presence of thick concrete shields result in difficulties for both Monte Carlo (MC) and discrete ordinates (SN) calculations. MC calculations require significant variance reduction and multiple runs to obtain a detailed dose distribution. SN models need a large number of spatial meshes to accurately model the geometry and high quadrature orders to reduce ray effects, therefore, requiring significant amounts of computer memory and time. The use of various differencing schemes is needed to account for radial heterogeneity in material cross sections and densities. Two P3, S12, discrete ordinate, PENTRAN (parallel environment neutral-particle TRANsport) models were analysed and different MC models compared. A multigroup MCNP model was developed for direct comparison to the SN models. The biased A3MCNP (automated adjoint accelerated MCNP) and unbiased (MCNP) continuous energy MC models were developed to assess the adequacy of the CASK multigroup (22 neutron, 18 gamma) cross sections. The PENTRAN SN results are in close agreement (5%) with the multigroup MC results; however, they differ by ~20-30% from the continuous-energy MC predictions. This large difference can be attributed to the expected difference between multigroup and continuous energy cross sections, and the fact that the CASK library is based on the old ENDF

  1. Computer vision in microstructural analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high-school and beginning-college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.
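
    A minimal version of the image-analysis step the experiment describes is a grey-level threshold followed by an area-fraction count: in a two-phase micrograph, the fraction of pixels darker than the threshold estimates the area (and, by stereology, volume) fraction of the darker phase. The synthetic image below stands in for a real grabbed frame; the threshold and toy data are assumptions for illustration.

        import numpy as np

        def phase_fraction(image, threshold):
            # Fraction of pixels at or below `threshold`: a basic area-fraction
            # estimate for the darker phase of a two-phase microstructure.
            return float(np.mean(image <= threshold))

        rng = np.random.default_rng(1)
        # Synthetic 8-bit micrograph: ~30% dark second phase in a bright matrix.
        matrix = rng.normal(180, 10, size=(256, 256))
        second_phase = rng.normal(60, 10, size=(256, 256))
        mask = rng.random((256, 256)) < 0.3
        micrograph = np.where(mask, second_phase, matrix).clip(0, 255)

        print(f"dark-phase area fraction: {phase_fraction(micrograph, 120):.2f}")  # ~0.30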

  2. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  3. Advanced Scientific Computing Research Network Requirements

    SciTech Connect

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  4. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  5. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis: the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with. PMID:27090952
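
    As a concrete instance of the feature-extraction stage described above: once a tracker yields per-frame centroid positions, per-frame speed and turning rate are simple finite differences. The snippet below is a generic illustration of that step, not code from the review.

        import numpy as np

        def trajectory_features(xy, fps):
            # Per-frame speed and turning rate from tracked centroids.
            # xy: (n_frames, 2) positions; fps: frames per second.
            v = np.diff(xy, axis=0) * fps                 # velocity, units/s
            speed = np.linalg.norm(v, axis=1)
            heading = np.arctan2(v[:, 1], v[:, 0])
            turn = np.diff(np.unwrap(heading)) * fps      # rad/s, signed
            return speed, turn

        t = np.linspace(0, 2 * np.pi, 120)
        circle = np.column_stack([np.cos(t), np.sin(t)])  # animal walking a circle
        speed, turn = trajectory_features(circle, fps=30)
        print(speed.mean(), turn.mean())  # near-constant speed, steady turning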

  6. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment and the optimization of complex systems.

  7. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  8. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  9. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... final report, Advanced Networking update Status from Computer Science COV Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR awards Data-intensive Science... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science....

  10. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

    The power grid is becoming far more complex as a result of the grid evolution meeting an information revolution. Due to the penetration of smart grid technologies, the grid is evolving at an unprecedented speed, and the information infrastructure is fundamentally improved with a large number of smart meters and sensors that produce several orders of magnitude larger amounts of data. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies in developing the algorithms and tools for the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.
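
    One small instance of the "pull data in, perform analysis, put information out in real time" problem is streaming screening of measurement feeds. The sliding-window monitor below is a generic sketch of that pattern on a single sensor stream (the window size, threshold, and frequency readings are invented for illustration); it is not an algorithm from the chapter, where the hard part is scale and parallelism rather than the per-stream test.

        from collections import deque

        class StreamMonitor:
            # Flag readings far from the recent rolling mean: a toy model of
            # real-time screening on a high-rate sensor or PMU feed.
            def __init__(self, window=30, n_sigma=4.0):
                self.buf = deque(maxlen=window)
                self.n_sigma = n_sigma

            def update(self, x):
                if len(self.buf) == self.buf.maxlen:
                    m = sum(self.buf) / len(self.buf)
                    var = sum((v - m) ** 2 for v in self.buf) / len(self.buf)
                    if var > 0 and abs(x - m) > self.n_sigma * var ** 0.5:
                        print(f"anomaly: {x:.2f} vs rolling mean {m:.2f}")
                self.buf.append(x)

        readings = [60.0 + 0.01 * (i % 3) for i in range(100)] + [59.3]
        monitor = StreamMonitor()
        for f in readings:
            monitor.update(f)  # only the final frequency dip is flagged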

  11. Application of advanced electronics to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Carney, P. C.

    1980-01-01

    Advancements in hardware and software technology are summarized with specific emphasis on spacecraft computer capabilities. Available state of the art technology is reviewed and candidate architectures are defined.

  12. Recent advances in transonic computational aeroelasticity

    NASA Technical Reports Server (NTRS)

    Batina, John T.; Bennett, Robert M.; Seidel, David A.; Cunningham, Herbert J.; Bland, Samuel R.

    1988-01-01

    A transonic unsteady aerodynamic and aeroelasticity code called CAP-TSD was developed for application to realistic aircraft configurations. The code permits the calculation of steady and unsteady flows about complete aircraft configurations for aeroelastic analysis in the flutter critical transonic speed range. The CAP-TSD code uses a time accurate approximate factorization algorithm for solution of the unsteady transonic small disturbance potential equation. An overview is given of the CAP-TSD code development effort and results are presented which demonstrate various capabilities of the code. Calculations are presented for several configurations including the General Dynamics 1/9 scale F-16 aircraft model and the ONERA M6 wing. Results from a flutter analysis of a 45 deg sweptback wing are also presented and agree well with the experimental data. Descriptions are presented of the CAP-TSD code and algorithm details along with results and comparisons which demonstrate these recent developments in transonic computational aeroelasticity.

  13. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... Update from Committee of Visitors for Computer Science activities Facilities update including early science efforts ] Early Career technical talks Recompetition results for Scientific Discovery through.../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy....

  14. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management in general tries to organize and make available important know-how, whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes, and by reducing time-to-market in Research & Development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Therefore collaborative computing provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA Ames Research Center (ARC).

  15. Advanced Economic Analysis

    NASA Technical Reports Server (NTRS)

    Greenberg, Marc W.; Laing, William

    2013-01-01

    An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from the status quo ... an EA is the crux of decision-support.

  16. Advanced flight computers for planetary exploration

    NASA Astrophysics Data System (ADS)

    Stephenson, R. Rhoads

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between ground commercial computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real time, embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  17. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between ground commercial computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real time, embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  18. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  19. High resolution computed tomography of advanced composite and ceramic materials

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Klima, S. J.

    1991-01-01

    Advanced composite and ceramic materials are being developed for use in many new defense and commercial applications. In order to achieve the desired mechanical properties of these materials, the structural elements must be carefully analyzed and engineered. A study was conducted to evaluate the use of high resolution computed tomography (CT) as a macrostructural analysis tool for advanced composite and ceramic materials. Several samples were scanned using a laboratory high resolution CT scanner. Samples were also destructively analyzed at the locations of the scans and the nondestructive and destructive results were compared. The study provides useful information outlining the strengths and limitations of this technique and the prospects for further research in this area.

  20. Computation of the tip vortex flowfield for advanced aircraft propellers

    NASA Technical Reports Server (NTRS)

    Tsai, Tommy M.; Dejong, Frederick J.; Levy, Ralph

    1988-01-01

    The tip vortex flowfield plays a significant role in the performance of advanced aircraft propellers. The flowfield in the tip region is complex, three-dimensional and viscous with large secondary velocities. An analysis is presented using an approximate set of equations which contains the physics required by the tip vortex flowfield, but which does not require the resources of the full Navier-Stokes equations. A computer code was developed to predict the tip vortex flowfield of advanced aircraft propellers. A grid generation package was developed to allow specification of a variety of advanced aircraft propeller shapes. Calculations of the tip vortex generation on an SR3 type blade at high Reynolds numbers were made using this code and a parametric study was performed to show the effect of tip thickness on tip vortex intensity. In addition, calculations of the tip vortex generation on a NACA 0012 type blade were made, including the flowfield downstream of the blade trailing edge. Comparison of flowfield calculations with experimental data from an F4 blade was made. A user's manual was also prepared for the computer code (NASA CR-182178).

  1. Some Recent Advances in Computer Graphics.

    ERIC Educational Resources Information Center

    Whitted, Turner

    1982-01-01

    General principles of computer graphics are reviewed, including discussions of display hardware, geometric modeling, algorithms, and applications in science, computer-aided design, flight training, communications, business, art, and entertainment. (JN)

  2. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  3. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  4. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  5. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

    Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlements, thefts, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.

  6. A computational image analysis glossary for biologists.

    PubMed

    Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M

    2012-09-01

    Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies. PMID:22872081

  7. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING

    EPA Science Inventory

    The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...

  8. Advanced ERS design using computer simulation

    SciTech Connect

    Melhem, G.A.

    1995-12-31

    There are two schools of thought regarding pressure relief design: shortcut/simplified methods and detailed methods. The shortcut/simplified methods are mostly applicable to non-reactive systems. These methods use direct scale-up techniques to obtain a vent size. Little useful information can be obtained about reaction data such as onset temperatures, activation energy, decomposition stoichiometry, etc. In addition, this approach does not readily provide the ability to perform what-if and sensitivity analysis or data that can be used for post-release mitigation design. The detailed approach advocates a more fundamental approach to pressure relief design, especially for reactive systems. First, the reaction chemistry is qualified using small-scale experiments, and then this data is coupled with fluid dynamics to design the emergency relief system. In addition to vent sizing information, this approach provides insights into process modification and refinement as well as the establishment of a safe operating envelope. This approach provides necessary flow data for vent containment design (if required), structural support, etc. This approach also allows the direct evaluation of design sensitivity to variables such as temperature, pressure, composition, fill level, etc. on vent sizing, while the shortcut approach requires an additional experiment per what-if scenario. This approach meets DIERS technology requirements for two-phase flow and vapor/liquid disengagement and exceeds them in many key areas for reacting systems, such as stoichiometry estimation for decomposition reactions, non-ideal solution effects, continuing reactions in piping and vent containment systems, etc. This paper provides an overview of our proposed equation-of-state-based modeling approach and its computer code implementation. Numerous examples and model validations are also described. 42 refs., 23 figs., 9 tabs.

  9. Application of advanced computational technology to propulsion CFD

    NASA Astrophysics Data System (ADS)

    Szuch, John R.

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.

  10. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of ash in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
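
    The Newton-Raphson solves mentioned above are not shown in the abstract; the following is a minimal illustrative sketch of the iteration applied to a toy energy balance f(T) = 0 (the enthalpy fit and heat input are invented for the example, not taken from the EERC tools):

    ```python
    def newton_raphson(f, df, x0, tol=1e-8, max_iter=50):
        """Generic Newton-Raphson iteration: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
        x = x0
        for _ in range(max_iter):
            step = f(x) / df(x)
            x -= step
            if abs(step) < tol:
                return x
        raise RuntimeError("did not converge")

    # Toy energy balance: sensible heat of products equals heat released.
    # h(T) = a*T + b*T**2 (made-up enthalpy fit), Q = fixed heat input.
    a, b, Q = 1.2, 3.0e-4, 2500.0
    f  = lambda T: a * T + b * T**2 - Q
    df = lambda T: a + 2 * b * T
    print(f"balance temperature ~ {newton_raphson(f, df, 1000.0):.1f}")
    ```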

  11. Computing Algorithms for Nuffield Advanced Physics.

    ERIC Educational Resources Information Center

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
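
    The Nuffield recurrences themselves are not reproduced in the record; as a hedged illustration of the kind of recurrence relation used to solve a second-order differential equation, here is an explicit Euler update for simple harmonic motion (constants and step size are arbitrary):

    ```python
    # Illustrative recurrence for a second-order ODE, x'' = -(k/m) * x,
    # rewritten as two first-order updates (explicit Euler).
    k, m = 1.0, 1.0       # spring constant and mass (arbitrary units)
    dt = 0.01             # time step
    x, v = 1.0, 0.0       # initial displacement and velocity

    for step in range(1000):
        a = -(k / m) * x  # acceleration from the current displacement
        x = x + v * dt    # x_{n+1} = x_n + v_n * dt
        v = v + a * dt    # v_{n+1} = v_n + a_n * dt

    print(f"x(t=10) ~ {x:.4f}")
    ```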

  12. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  13. Advanced Crew Personal Support Computer (CPSC) task

    NASA Technical Reports Server (NTRS)

    Muratore, Debra

    1991-01-01

    The topics are presented in viewgraph form and include: background; objectives of task; benefits to the Space Station Freedom (SSF) Program; technical approach; baseline integration; and growth and evolution options. The objective is to: (1) introduce new computer technology into the SSF Program; (2) augment core computer capabilities to meet additional mission requirements; (3) minimize risk in upgrading technology; and (4) provide a low cost way to enhance crew and ground operations support.

  14. Frontiers of research in advanced computations

    SciTech Connect

    1996-07-01

    The principal mission of the Institute for Scientific Computing Research is to foster interactions among LLNL researchers, universities, and industry on selected topics in scientific computing. In the area of computational physics, the Institute has developed a new algorithm, GaPH, to help scientists understand the chemistry of turbulent and driven plasmas or gases at far less cost than other methods. New low-frequency electromagnetic models better describe the plasma etching and deposition characteristics of a computer chip in the making. A new method for modeling realistic curved boundaries within an orthogonal mesh is resulting in a better understanding of the physics associated with such boundaries and much quicker solutions. All these capabilities are being developed for massively parallel implementation, which is an ongoing focus of Institute researchers. Other groups within the Institute are developing novel computational methods to address a range of other problems. Examples include feature detection and motion recognition by computer, improved monitoring of blood oxygen levels, and entirely new models of human joint mechanics and prosthetic devices.

  15. Advances in computing, and their impact on scientific computing.

    PubMed

    Giles, Mike

    2002-01-01

    This paper begins by discussing the developments and trends in computer hardware, starting with the basic components (microprocessors, memory, disks, system interconnect, networking and visualization) before looking at complete systems (death of vector supercomputing, slow demise of large shared-memory systems, rapid growth in very large clusters of PCs). It then considers the software side, the relative maturity of shared-memory (OpenMP) and distributed-memory (MPI) programming environments, and new developments in 'grid computing'. Finally, it touches on the increasing importance of software packages in scientific computing, and the increased importance and difficulty of introducing good software engineering practices into very large academic software development projects. PMID:12539947

  16. Advances in total scattering analysis

    SciTech Connect

    Proffen, Thomas E; Kim, Hyunjeong

    2008-01-01

    In recent years the analysis of the total scattering pattern has become an invaluable tool to study disordered crystalline and nanocrystalline materials. Traditional crystallographic structure determination is based on Bragg intensities and yields the long range average atomic structure. By including diffuse scattering into the analysis, the local and medium range atomic structure can be unravelled. Here we give an overview of recent experimental advances using X-rays as well as neutrons, and of current trends in the modelling of total scattering data.
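
    The record does not state the underlying transform; total scattering analysis conventionally obtains the reduced pair distribution function G(r) from the structure factor S(Q), and the sketch below evaluates that standard integral numerically with a synthetic, purely illustrative S(Q):

    ```python
    import numpy as np

    # Reduced pair distribution function (standard total-scattering form):
    #   G(r) = (2/pi) * integral of Q [S(Q) - 1] sin(Q r) dQ
    Q = np.linspace(1e-3, 25.0, 4000)          # momentum transfer grid (1/Angstrom)
    dQ = Q[1] - Q[0]
    S = 1.0 + 0.5 * np.exp(-(Q - 3.0) ** 2)    # made-up structure factor, illustration only
    r = np.linspace(0.5, 10.0, 200)            # real-space distances (Angstrom)

    FQ = Q * (S - 1.0)                          # reduced structure function F(Q)
    G = (2.0 / np.pi) * (np.sin(np.outer(r, Q)) * FQ).sum(axis=1) * dQ
    print(G[:5])                                # first few G(r) values
    ```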

  17. Recent advances in optical computing in Japan

    NASA Astrophysics Data System (ADS)

    Ishihara, Satoshi

    The results of recent Japanese research in optical and hybrid computer systems and components are summarized and illustrated with drawings and diagrams, and the organizational structure of the research efforts is outlined. Topics addressed include optical logic devices, spatial light modulators, two-dimensional lasers, optical bistable devices, device theory, optically controlled array processing, an optical bus for a multiprocessor system, real-time multiple-matrix-product processing, optical numerical processing, optical parallel-array logic systems, optical associative memory, and neural-network computation. Consideration is given to the roles of the Optical Computer Group of the Japan Society of Applied Physics, industry, and government (through the universities and Ministry of Education and through the Ministry of International Trade and Industry).

  18. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  19. Space data systems: Advanced flight computers

    NASA Technical Reports Server (NTRS)

    Benz, Harry F.

    1991-01-01

    The technical objectives are to develop high-performance, space-qualifiable, onboard computing, storage, and networking technologies. The topics are presented in viewgraph form and include the following: technology challenges; state-of-the-art assessment; program description; relationship to external programs; and cooperation and coordination effort.

  20. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was threefold: to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  1. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  2. Development of an interactive computer program for advance care planning

    PubMed Central

    Green, Michael J.; Levi, Benjamin H.

    2013-01-01

    Objective: To describe the development of an innovative, multimedia decision aid for advance care planning. Background: Advance care planning is an important way for people to articulate their wishes for medical care when they are not able to speak for themselves. Living wills and other types of advance directives are the most commonly used tools for advance care planning, but have been criticized for being vague, difficult to interpret, and inconsistent with individuals’ core beliefs and values. Results: We developed a multimedia, computer-based decision aid for advance care planning (‘Making Your Wishes Known: Planning Your Medical Future’) to overcome many of the limitations of standard advance directive forms. This computer program guides individuals through the process of advance care planning, and unlike standard advance directives, provides tailored education, values clarification exercises, and a decision-making tool that translates an individual’s values and preferences into a specific medical plan that can be implemented by a health-care team. Pilot testing with 50 adult volunteers recruited from an outpatient primary care clinic showed high levels of satisfaction with the program. Further pilot testing with 34 cancer patients indicated that the program was perceived to be highly accurate at representing patients’ wishes. Conclusions: This paper describes the development of an innovative decision aid for advance care planning that was designed to overcome common problems with standard advance directives. Preliminary testing suggests that it is acceptable to users and is accurate. PMID:18823445

  3. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  4. Advances in Computationally Modeling Human Oral Bioavailability

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2015-01-01

    Although significant progress has been made in experimental high throughput screening (HTS) of ADME (absorption, distribution, metabolism, excretion) and pharmacokinetic properties, the ADME and Toxicity (ADME-Tox) in silico modeling is still indispensable in drug discovery as it can guide us to wisely select drug candidates prior to expensive ADME screenings and clinical trials. Compared to other ADME-Tox properties, human oral bioavailability (HOBA) is particularly important but extremely difficult to predict. In this paper, the advances in human oral bioavailability modeling will be reviewed. Moreover, our insights into how to construct more accurate and reliable HOBA QSAR and classification models will also be discussed. PMID:25582307

  5. Advances in computationally modeling human oral bioavailability.

    PubMed

    Wang, Junmei; Hou, Tingjun

    2015-06-23

    Although significant progress has been made in experimental high throughput screening (HTS) of ADME (absorption, distribution, metabolism, excretion) and pharmacokinetic properties, the ADME and Toxicity (ADME-Tox) in silico modeling is still indispensable in drug discovery as it can guide us to wisely select drug candidates prior to expensive ADME screenings and clinical trials. Compared to other ADME-Tox properties, human oral bioavailability (HOBA) is particularly important but extremely difficult to predict. In this paper, the advances in human oral bioavailability modeling will be reviewed. Moreover, our insights into how to construct more accurate and reliable HOBA QSAR and classification models will also be discussed. PMID:25582307

  6. ASDA - Advanced Suit Design Analyzer computer program

    NASA Technical Reports Server (NTRS)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low-pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Improved Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  7. Advances in computational fluid dynamics solvers for modern computing environments

    NASA Astrophysics Data System (ADS)

    Hertenstein, Daniel; Humphrey, John R.; Paolini, Aaron L.; Kelmelis, Eric J.

    2013-05-01

    EM Photonics has been investigating the application of massively multicore processors to a key problem area: Computational Fluid Dynamics (CFD). While the capabilities of CFD solvers have continually increased and improved to support features such as moving bodies and adjoint-based mesh adaptation, the software architecture has often lagged behind. This has led to poor scaling as core counts reach the tens of thousands. In the modern High Performance Computing (HPC) world, clusters with hundreds of thousands of cores are becoming the standard. In addition, accelerator devices such as NVIDIA GPUs and Intel Xeon Phi are being installed in many new systems. It is important for CFD solvers to take advantage of the new hardware as the computations involved are well suited for the massively multicore architecture. In our work, we demonstrate that new features in NVIDIA GPUs are able to empower existing CFD solvers by example using AVUS, a CFD solver developed by the Air Force Research Laboratory (AFRL) and the Volcanic Ash Advisory Center (VAAC). The effort has resulted in increased performance and scalability without sacrificing accuracy. There are many well-known codes in the CFD space that can benefit from this work, such as FUN3D, OVERFLOW, and TetrUSS. Such codes are widely used in the commercial, government, and defense sectors.

  8. Advanced computational techniques for hypersonic propulsion

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.

    1989-01-01

    Computational Fluid Dynamics (CFD) has played a major role in the resurgence of hypersonic flight, on the premise that numerical methods will allow performance of simulations at conditions for which no ground test capability exists. Validation of CFD methods is being established using the experimental data base available, which is below Mach 8. It is important, however, to realize the limitations involved in the extrapolation process as well as the deficiencies that exist in numerical methods at the present time. Current features of CFD codes are examined for application to propulsion system components. The shortcomings in simulation and modeling are identified and discussed.

  9. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  10. TRAC-PF1/MOD1: an advanced best-estimate computer program for pressurized water reactor thermal-hydraulic analysis

    SciTech Connect

    Liles, D.R.; Mahaffy, J.H.

    1986-07-01

    The Los Alamos National Laboratory is developing the Transient Reactor Analysis Code (TRAC) to provide advanced best-estimate predictions of postulated accidents in light-water reactors. The TRAC-PF1/MOD1 program provides this capability for pressurized water reactors and for many thermal-hydraulic test facilities. The code features either a one- or a three-dimensional treatment of the pressure vessel and its associated internals, a two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field and solute tracking, flow-regime-dependent constitutive equation treatment, optional reflood tracking capability for bottom-flood and falling-film quench fronts, and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The stability-enhancing two-step (SETS) numerical algorithm is used in the one-dimensional hydrodynamics and permits this portion of the fluid dynamics to violate the material Courant condition. This technique permits large time steps and, hence, reduced running time for slow transients.
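
    As a point of reference for the SETS discussion above, the material Courant condition that explicit one-dimensional hydrodynamics would otherwise have to satisfy can be computed as below (cell sizes and velocities are invented; SETS itself, being semi-implicit, may step past this limit):

    ```python
    # Material Courant limit that a fully explicit scheme must respect:
    #   dt <= min_i( dx_i / |u_i| )
    # A stability-enhancing two-step (SETS) method may exceed this limit;
    # the sketch only shows what the limit is.
    def material_courant_dt(dx, u, cfl=0.9):
        return cfl * min(dxi / abs(ui) for dxi, ui in zip(dx, u) if ui != 0.0)

    dx = [0.1, 0.1, 0.2]      # cell sizes (m), illustrative
    u  = [2.0, -5.0, 1.0]     # fluid velocities (m/s), illustrative
    print(f"explicit dt limit ~ {material_courant_dt(dx, u):.4f} s")
    ```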

  11. Sensitivity analysis in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1984-01-01

    Information on sensitivity analysis in computational aerodynamics is given in outline, graphical, and chart form. The prediction accuracy of the MCAERO program, a perturbation analysis method, is discussed. A procedure for calculating the perturbation matrix, baseline wing paneling for perturbation analysis test cases, and applications of an inviscid sensitivity matrix are among the topics covered.
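
    The outline does not reproduce the MCAERO perturbation procedure; a generic finite-difference sensitivity (perturbation) matrix, of the kind such analyses build, can be sketched as follows (the response function is a stand-in, not the MCAERO model):

    ```python
    import numpy as np

    def sensitivity_matrix(f, x0, eps=1e-6):
        """Forward-difference sensitivity matrix: S[i, j] = d f_i / d x_j."""
        f0 = np.asarray(f(x0))
        S = np.zeros((f0.size, x0.size))
        for j in range(x0.size):
            xp = x0.copy()
            xp[j] += eps                      # perturb one design variable
            S[:, j] = (np.asarray(f(xp)) - f0) / eps
        return S

    # Stand-in 'aerodynamic' response: two outputs of three design variables.
    f = lambda x: np.array([x[0]**2 + x[1], np.sin(x[2]) * x[0]])
    print(sensitivity_matrix(f, np.array([1.0, 2.0, 0.5])))
    ```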

  12. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  13. X ray computed tomography for failure analysis

    NASA Astrophysics Data System (ADS)

    Bossi, Richard H.; Crews, Alan R.; Georgeson, Gary E.

    1992-08-01

    Under a preliminary testing task assignment of the Advanced Development of X-Ray Computed Tomography Application program, computed tomography (CT) has been studied for its potential as a tool to assist in failure analysis investigations. CT provides three-dimensional spatial distribution of material that can be used to assess internal configurations and material conditions nondestructively. This capability has been used in failure analysis studies to determine the position of internal components and their operation. CT is particularly advantageous on complex systems, composite failure studies, and testing under operational or environmental conditions. CT plays an important role in reducing the time and effort of a failure analysis investigation. Aircraft manufacturing or logistical facilities perform failure analysis operations routinely and could be expected to reduce schedules, reduce costs and/or improve evaluation on about 10 to 30 percent of the problems they investigate by using CT.

  14. Computer analysis of railcar vibrations

    NASA Technical Reports Server (NTRS)

    Vlaminck, R. R.

    1975-01-01

    Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effect on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation is presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle is compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.
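
    The report's vehicle models are not given in the abstract; a single-degree-of-freedom suspension excited by a sinusoidal rail irregularity is a minimal stand-in for the kind of ride-quality calculation described (all parameters are invented for illustration):

    ```python
    import numpy as np

    # One-degree-of-freedom suspension model:
    #   m x'' + c (x' - y') + k (x - y) = 0,  y(t) = sinusoidal rail input.
    m, k, c = 10000.0, 4.0e5, 2.0e4     # car mass (kg), stiffness (N/m), damping (N*s/m)
    w = 2 * np.pi * 2.0                  # 2 Hz excitation frequency
    dt, x, v = 1e-3, 0.0, 0.0

    acc_peak = 0.0
    for i in range(20000):               # 20 s of simulated time
        t = i * dt
        y  = 0.005 * np.sin(w * t)       # 5 mm rail irregularity
        yd = 0.005 * w * np.cos(w * t)
        a = (-c * (v - yd) - k * (x - y)) / m
        v += a * dt                      # semi-implicit Euler update
        x += v * dt
        acc_peak = max(acc_peak, abs(a))
    print(f"peak car-body acceleration ~ {acc_peak:.3f} m/s^2")
    ```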

  15. Use of advanced computers for aerodynamic flow simulation

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Ballhaus, W. F.

    1980-01-01

    The current and projected use of advanced computers for large-scale aerodynamic flow simulation applied to engineering design and research is discussed. The design use of mature codes run on conventional, serial computers is compared with the fluid research use of new codes run on parallel and vector computers. The role of flow simulations in design is illustrated by the application of a three dimensional, inviscid, transonic code to the Sabreliner 60 wing redesign. Research computations that include a more complete description of the fluid physics by use of Reynolds averaged Navier-Stokes and large-eddy simulation formulations are also presented. Results of studies for a numerical aerodynamic simulation facility are used to project the feasibility of design applications employing these more advanced three dimensional viscous flow simulations.

  16. Advanced Placement Computer Science with Pascal. Volume 2. Experimental Edition.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents 100 lessons for an advanced placement course on programming in Pascal. Some of the topics covered include arrays, sorting, strings, sets, records, computers in society, files, stacks, queues, linked lists, binary trees, searching, hashing, and chaining. Performance objectives, vocabulary, motivation, aim,…

  17. The Federal Government's Role in Advancing Computer Technology

    ERIC Educational Resources Information Center

    Information Hotline, 1978

    1978-01-01

    As part of the Federal Data Processing Reorganization Study submitted by the Science and Technology Team, the Federal Government's role in advancing and diffusing computer technology is discussed. Findings and conclusions assess the state-of-the-art in government and in industry, and five recommendations provide directions for government policy…

  18. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  19. Advances in reversed field pinch theory and computation

    SciTech Connect

    Schnack, D.D.; Ho, Y.L.; Carreras, B.A.; Sidikman, K.; Craddock, G.G.; Mattor, N.; Nebel, R.A.; Prager, S.C.; Terry, P.W.; Zita, E.J.

    1992-12-31

    Advances in theory and computations related to the reversed field pinch (RFP) are presented. These are: (1) the effect of the dynamo on thermal transport; (2) a theory of ion heating due to dynamo fluctuations; (3) studies of active and passive feedback schemes for controlling dynamo fluctuations; and (4) an analytic model for coupled g-mode and rippling turbulence in the RFP edge.

  20. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computation algorithms, as well as high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much-needed research in numerical boundary conditions for CAA.
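
    None of the reviewed boundary treatments are reproduced in the abstract; as a hedged toy example of an external boundary that lets waves leave the domain, consider first-order upwind advection, whose one-sided stencil acts as a simple non-reflecting outflow condition (grid, speed, and pulse are arbitrary; production CAA schemes use far higher-order radiation conditions):

    ```python
    import numpy as np

    # 1D advection u_t + c u_x = 0. The last point is updated with the same
    # one-sided (upwind) difference as the interior, so the wave exits the
    # right boundary instead of reflecting.
    c, dx, dt, n = 1.0, 0.01, 0.005, 200
    x = np.linspace(0.0, (n - 1) * dx, n)
    u = np.exp(-((x - 0.5) ** 2) / 0.005)     # Gaussian pulse

    for _ in range(400):                       # long enough for the pulse to exit
        un = u.copy()
        u[1:] = un[1:] - c * dt / dx * (un[1:] - un[:-1])
        u[0] = 0.0                             # quiescent inflow
    print(f"max residual amplitude: {u.max():.2e}")
    ```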

  1. Potential applications of computational fluid dynamics to biofluid analysis

    NASA Technical Reports Server (NTRS)

    Kwak, D.; Chang, J. L. C.; Rogers, S. E.; Rosenfeld, M.

    1988-01-01

    Computational fluid dynamics has developed to the stage where it has become an indispensable part of aerospace research and design. In view of advances made in aerospace applications, the computational approach can be used for biofluid mechanics research. Several flow simulation methods developed for aerospace problems are briefly discussed for potential applications to biofluids, especially to blood flow analysis.

  2. Computer design and analysis of vacuum systems

    SciTech Connect

    Santeler, D.J.

    1987-07-01

    A computer program has been developed for an IBM compatible personal computer to assist in the design and analysis of vacuum systems. The program has a selection of 12 major schematics with several thousand minor variants incorporating diffusion, turbomolecular, cryogenic, ion, mechanical, and sorption pumps as well as circular tubes, bends, valves, traps, and purge gas connections. The gas throughput versus the inlet pressure of the pump is presented on a log-log graphical display. The conductance of each series component is sequentially added to the graph to obtain the net system behavior Q(P). The component conductances may be calculated either from the inlet area and the transmission probability or from the tube length and the diameter. The gas-flow calculations are valid for orifices, short tubes, and long tubes throughout the entire pressure range from molecular through viscous to choked and nonchoked exit flows. The roughing-pump and high-vacuum-pump characteristic curves are numerically integrated to provide a graphical presentation of the system pumpdown. Outgassing data for different materials is then combined to produce a graph of the net system "outgassing pressure." Computer routines are provided for differentiating a real pumpdown curve for system analysis. The computer program is included with the American Vacuum Society course, "Advanced Vacuum System Design and Analysis," or it may be purchased from Process Applications, Inc.
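
    The conductance bookkeeping the program performs can be illustrated with the standard series-combination formulas (component values below are invented, not taken from the program):

    ```python
    # Series combination of molecular-flow conductances, as used when each
    # component is "sequentially added" to find net system behavior:
    #   1/C_total = sum_i 1/C_i,   1/S_eff = 1/S_pump + 1/C_total
    C = [120.0, 45.0, 300.0]          # component conductances (L/s), illustrative
    S_pump = 500.0                    # pump speed at its inlet (L/s), illustrative

    C_total = 1.0 / sum(1.0 / c for c in C)
    S_eff = 1.0 / (1.0 / S_pump + 1.0 / C_total)
    print(f"net conductance {C_total:.1f} L/s, effective speed {S_eff:.1f} L/s")
    ```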

  3. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  4. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper shows advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots, and security applications. The cost of the measurement system is extremely high; therefore, a simulation tool was designed. The simulation gives an opportunity to execute algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in a crowd of people [1]. The Axis-Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.

  5. Advanced computer architecture specification for automated weld systems

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.

  6. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  7. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.

  8. Thermal analysis simulation for a spin-motor used in the advanced main combustion chamber vacuum plasma spray project using the SINDA computer program

    NASA Technical Reports Server (NTRS)

    Mcdonald, Gary H.

    1990-01-01

    One of the many design challenges of this project is predicting the thermal effects due to the environment inside the vacuum chamber on the turntable and spin motor spindle assembly. The objective of the study is to model the spin motor using the computer program Systems Improved Numerical Differencing Analyzer (SINDA). By formulating the appropriate input information concerning the motor's geometry, coolant flow path, material composition, and bearing and motor winding characteristics, SINDA should predict temperatures at various predefined nodes. From these temperatures, one can hopefully predict whether the coolant flow rate is sufficient and whether certain mechanical elements such as bearings, O-ring seals, or motor windings will exceed maximum design temperatures.
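
    SINDA's input format is not shown in the record; the lumped-parameter network this class of program solves has the generic form C_i dT_i/dt = sum_j G_ij (T_j - T_i) + q_i, which the following minimal sketch integrates explicitly for three invented nodes standing in for winding, bearing, and coolant:

    ```python
    import numpy as np

    # Minimal explicit lumped-node thermal network; all values invented.
    G = np.array([[0.0, 2.0, 0.5],
                  [2.0, 0.0, 1.0],
                  [0.5, 1.0, 0.0]])      # conductances between nodes (W/K)
    Cap = np.array([50.0, 80.0, 500.0])  # thermal capacitances (J/K)
    q = np.array([30.0, 0.0, -30.0])     # heat load and coolant sink (W)
    T = np.array([300.0, 300.0, 290.0])  # initial temperatures (K)
    dt = 0.5

    for _ in range(10000):               # march to near steady state
        # G @ T - rowsum(G)*T  ==  sum_j G_ij (T_j - T_i)
        dT = (G @ T - G.sum(axis=1) * T + q) / Cap
        T = T + dT * dt
    print(T)                             # predicted node temperatures (K)
    ```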

  9. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  10. Advanced materials: Information and analysis needs

    SciTech Connect

    Curlee, T.R.; Das, S.; Lee, R.; Trumble, D.

    1990-09-01

    This report presents the findings of a study to identify the types of information and analysis that are needed for advanced materials. The project was sponsored by the US Bureau of Mines (BOM). It includes a conceptual description of information needs for advanced materials and the development and implementation of a questionnaire on the same subject. This report identifies twelve fundamental differences between advanced and traditional materials and discusses the implications of these differences for data and analysis needs. Advanced and traditional materials differ significantly in terms of physical and chemical properties. Advanced material properties can be customized more easily. The production of advanced materials may differ from traditional materials in terms of inputs, the importance of by-products, the importance of different processing steps (especially fabrication), and scale economies. The potential for change in advanced materials characteristics and markets is greater and is derived from the marriage of radically different materials and processes. In addition to the conceptual study, a questionnaire was developed and implemented to assess the opinions of people who are likely users of BOM information on advanced materials. The results of the questionnaire, which was sent to about 1000 people, generally confirm the propositions set forth in the conceptual part of the study. The results also provide data on the categories of advanced materials and the types of information that are of greatest interest to potential users. 32 refs., 1 fig., 12 tabs.

  11. Computational mechanics - Advances and trends; Proceedings of the Session - Future directions of Computational Mechanics of the ASME Winter Annual Meeting, Anaheim, CA, Dec. 7-12, 1986

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Editor)

    1986-01-01

    The papers contained in this volume provide an overview of the advances made in a number of aspects of computational mechanics, identify some of the anticipated industry needs in this area, discuss the opportunities provided by new hardware and parallel algorithms, and outline some of the current government programs in computational mechanics. Papers are included on advances and trends in parallel algorithms, supercomputers for engineering analysis, material modeling in nonlinear finite-element analysis, the Navier-Stokes computer, and future finite-element software systems.

  12. Advanced Technology Lifecycle Analysis System (ATLAS)

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Mankins, John C.

    2004-01-01

    Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts into system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprised of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric cost model to determine the costs. Also, the integrator estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is

  13. Advances in Electromagnetic Modelling through High Performance Computing

    SciTech Connect

    Ko, K.; Folwell, N.; Ge, L.; Guetz, A.; Lee, L.; Li, Z.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.; Xiao, L.; /SLAC

    2006-03-29

    Under the DOE SciDAC project on Accelerator Science and Technology, a suite of electromagnetic codes has been under development at SLAC that are based on unstructured grids for higher accuracy, and use parallel processing to enable large-scale simulation. The new modeling capability is supported by SciDAC collaborations on meshing, solvers, refinement, optimization and visualization. These advances in computational science are described and the application of the parallel eigensolver Omega3P to the cavity design for the International Linear Collider is discussed.

  14. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
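
    The PERL script itself is not listed in the record; the following Python re-sketch of the same idea (walk a tree to a user-chosen depth, skip directories, and write per-file ownership and timestamp fields to a delimited text file for spreadsheet import) is an assumption-laden illustration, not the actual tool:

    ```python
    import csv
    import os
    import sys

    def collect(root, out_path, max_depth):
        """Walk `root` up to `max_depth` levels and write one row per file
        (path, owner uid, size, ctime, atime) for spreadsheet import."""
        base = root.rstrip(os.sep).count(os.sep)
        with open(out_path, "w", newline="") as out:
            w = csv.writer(out, delimiter="\t")
            w.writerow(["path", "uid", "size", "ctime", "atime"])
            for dirpath, dirnames, filenames in os.walk(root):
                if dirpath.count(os.sep) - base >= max_depth:
                    dirnames[:] = []          # stop descending past the depth limit
                for name in filenames:        # directories are intentionally skipped
                    p = os.path.join(dirpath, name)
                    st = os.stat(p, follow_symlinks=False)
                    w.writerow([p, st.st_uid, st.st_size, st.st_ctime, st.st_atime])

    if __name__ == "__main__":
        collect(sys.argv[1], sys.argv[2], int(sys.argv[3]))
    ```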

  15. Computational ocean acoustics: Advances in 3D ocean acoustic modeling

    NASA Astrophysics Data System (ADS)

    Schmidt, Henrik; Jensen, Finn B.

    2012-11-01

    The numerical models of ocean acoustic propagation developed in the 1980s are still in widespread use today, and the field of computational ocean acoustics is often considered a mature field. However, the explosive increase in computational power available to the community has created opportunities for modeling phenomena that were earlier beyond reach. Most notably, three-dimensional propagation and scattering problems have been computationally prohibitive, but are now addressed routinely using brute-force numerical approaches such as the Finite Element Method, in particular for target scattering problems, where they are combined with traditional wave-theory propagation models in hybrid modeling frameworks. Also, recent years have seen the development of hybrid approaches coupling oceanographic circulation models with acoustic propagation models, enabling the forecasting of sonar performance uncertainty in dynamic ocean environments. These and other advances made over the last couple of decades support the notion that the field of computational ocean acoustics is far from mature. [Work supported by the Office of Naval Research, Code 321OA].

  16. (Advanced materials, robotics, and advanced computers for use in nuclear power plants)

    SciTech Connect

    White, J.D.

    1989-11-17

    The aim of the IAEA Technical Committee Workshop was to provide an opportunity to exchange information on the status of advances in technologies such as improved materials, robotics, and advanced computers already used or expected to be used in the design of nuclear power plants, and to review possible applications of advanced technologies in future reactor designs. Papers were given in these areas by Belgium, France, Mexico, Canada, Russia, India, and the United States. Notably absent from this meeting were Japan, Germany, Italy, Spain, the United Kingdom, and the Scandinavian countries -- all of whom are working in the areas of interest to this meeting. Most of the workshop discussion, however, was focused on advanced controls (including the human-machine interface and software development and testing) and electronic descriptions of power plants. Verification and validation of design was also a topic of considerable discussion. The traveler was surprised at the progress made in 3-D electronic images of nuclear power plants and automatic updating of these images to reflect as-built conditions. Canadian plants and one Mexican plant have used photogrammetry to update electronic drawings automatically. The Canadians have also started attaching other electronic databases to the electronic drawings. These databases include parts information and maintenance work. The traveler observed that the Advanced Controls Program is better balanced and more forward looking than the other nuclear controls R&D activities described. The French participants made this observation in the meeting and expressed interest in collaborative work in this area.

  17. Advances in computed tomography evaluation of skull base diseases.

    PubMed

    Prevedello, Luciano M

    2014-10-01

    Introduction Computed tomography (CT) is a key component in the evaluation of skull base diseases. With its ability to clearly delineate the osseous anatomy, CT can provide not only important clues to diagnosis but also key information for surgical planning. Objectives The purpose of this article is to describe some of the main CT imaging features that contribute to the diagnosis of skull base tumors, review recent knowledge related to bony manifestations of these conditions, and summarize recent technological advances in CT that contribute to image quality and improved diagnosis. Data Synthesis Recent advances in CT technology allow fine-detailed evaluation of the bony anatomy using submillimetric sections. Dual-energy CT material decomposition capabilities allow clear separation between contrast material, bone, and soft tissues, with many clinical applications in the skull base. Dual-energy technology also has the ability to decrease image degradation from metallic hardware, using techniques that can result in similar or even decreased radiation dose to patients. Conclusions CT is very useful in the evaluation of skull base diseases, and recent technological advances can increase disease conspicuity, resulting in improved diagnostic capabilities and enhanced surgical planning. PMID:25992136

  18. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.
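
    As a hedged illustration of one classic multivariate technique such reviews cover (the toy Gaussian data and all numbers are invented), a Fisher linear discriminant can pull a small signal out of a much larger background:

      import numpy as np

      rng = np.random.default_rng(0)
      sig = rng.normal([1.0, 1.0], 0.8, size=(500, 2))     # rare signal events
      bkg = rng.normal([0.0, 0.0], 1.0, size=(50000, 2))   # dominant background

      # Fisher direction w ~ Sw^-1 (mu_sig - mu_bkg)
      sw = np.cov(sig.T) + np.cov(bkg.T)
      w = np.linalg.solve(sw, sig.mean(0) - bkg.mean(0))

      t_sig, t_bkg = sig @ w, bkg @ w
      cut = np.quantile(t_bkg, 0.999)          # accept only 0.1% of background
      print("signal efficiency at 0.1% background:", (t_sig > cut).mean())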

  19. Advanced Power System Analysis Capabilities

    NASA Technical Reports Server (NTRS)

    1997-01-01

    As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.
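
    A toy sketch of two of the factors highlighted above, not the SPACE code itself: array output scaled by the cosine of the Sun incidence angle and by a shadowing fraction from station structure (all values invented).

      import math

      def array_power(p_rated_kw, sun_angle_deg, shadow_fraction, eclipse=False):
          if eclipse:
              return 0.0
          pointing = max(0.0, math.cos(math.radians(sun_angle_deg)))
          return p_rated_kw * pointing * (1.0 - shadow_fraction)

      # ~23.1 kW for an off-pointed, partly shaded 30 kW wing
      print(array_power(30.0, 25.0, 0.15))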

  20. Advances and trends in the development of computational models for tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Tanner, J. A.

    1985-01-01

    Status and some recent developments of computational models for tires are summarized. Discussion focuses on a number of aspects of tire modeling and analysis including: tire materials and their characterization; evolution of tire models; characteristics of effective finite element models for analyzing tires; analysis needs for tires; and impact of the advances made in finite element technology, computational algorithms, and new computing systems on tire modeling and analysis. An initial set of benchmark problems has been proposed in concert with the U.S. tire industry. Extensive sets of experimental data will be collected for these problems and used for evaluating and validating different tire models. Also, the new Aircraft Landing Dynamics Facility (ALDF) at NASA Langley Research Center is described.

  1. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  2. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  3. NASA Applications for Computational Electromagnetic Analysis

    NASA Technical Reports Server (NTRS)

    Lewis, Catherine C.; Trout, Dawn H.; Krome, Mark E.; Perry, Thomas A.

    2011-01-01

    Computational Electromagnetic Software is used by NASA to analyze the compatibility of systems too large or too complex for testing. Recent advances in software packages and computer capabilities have made it possible to determine the effects of a transmitter inside a launch vehicle fairing, better analyze environmental threats, and perform on-orbit replacements with assured electromagnetic compatibility.

  4. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
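
    For instance, importance sampling estimates a rare failure probability by sampling from a shifted density and reweighting; a minimal sketch for a standard normal tail (toy numbers, not from the paper):

      import numpy as np

      rng = np.random.default_rng(1)
      n, shift = 100000, 4.0
      y = rng.normal(shift, 1.0, n)                  # biased sampling near the failure region
      weights = np.exp(-shift * y + 0.5 * shift**2)  # density ratio N(0,1) / N(shift,1)
      est = np.mean((y > 4.0) * weights)
      print(est)  # close to the exact tail probability, about 3.17e-5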

  5. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    Electromyographic (EMG) signals are bio-signals collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979
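
    A minimal sketch of one building block such hybrids combine, an SVM on windowed EMG features; the feature set and toy signals below are illustrative assumptions, not from the review:

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(2)

      def features(sig):
          return [np.abs(sig).mean(),                  # mean absolute value
                  np.sqrt((sig**2).mean()),            # RMS amplitude
                  (np.diff(np.sign(sig)) != 0).sum()]  # zero crossings

      rest  = [features(rng.normal(0, 0.1, 256)) for _ in range(100)]
      grasp = [features(rng.normal(0, 0.5, 256)) for _ in range(100)]
      X = np.array(rest + grasp)
      y = np.array([0] * 100 + [1] * 100)

      clf = SVC(kernel="rbf").fit(X, y)
      print(clf.predict([features(rng.normal(0, 0.5, 256))]))  # expect class 1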

  6. A Petaflops Era Computing Analysis

    NASA Technical Reports Server (NTRS)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10(exp 15) floating point operations per second) computing. This study was performed during 1996 and should be considered as the first step in an on-going effort. The analysis concludes that a petaflop system is technically feasible, but not with today's state of the art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflop performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflop systems at about 2010. Several years before that date, it is projected that chip feature sizes will reach the known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflop systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  7. Infrastructure Systems for Advanced Computing in E-science applications

    NASA Astrophysics Data System (ADS)

    Terzo, Olivier

    2013-04-01

    In the e-science field there are growing needs for computing infrastructure that is more dynamic and customizable, with an on-demand model of use that follows the exact request in terms of resources and storage capacity. The integration of grid and cloud infrastructure solutions makes it possible to offer services that adapt availability by scaling resources up and down. The main challenge for e-science domains will be to implement infrastructure solutions for scientific computing that adapt dynamically to the demand for computing resources, with a strong emphasis on optimizing the use of those resources to reduce investment costs. Instrumentation, data volumes, algorithms, and analysis all add complexity for applications that require high processing power and storage for a limited time and that often exceed the computational resources available to the majority of laboratories and research units in an organization. Very often it is necessary to adapt, or even rethink, tools and algorithms, and to consolidate existing applications through a phase of reverse engineering, in order to adapt them to deployment on a cloud infrastructure. For example, in areas such as rainfall monitoring, meteorological analysis, hydrometeorology, climatology, bioinformatics, next-generation sequencing, computational electromagnetics, and radio occultation, the complexity of the analysis raises several issues, such as processing time, scheduling of processing tasks, storage of results, and multi-user environments. For these reasons, it is necessary to rethink how e-science applications are written so that they are already adapted to exploit the potential of cloud computing services through the IaaS, PaaS, and SaaS layers. Another important focus is on creating and using hybrid infrastructures, typically a federation between private and public clouds; in this way, when all resources owned by the organization are in use, it will be easy with a federate

  8. Computer analysis of cardiovascular parameters.

    PubMed

    Mass, H J; Gean, J T; Gwirtz, P A

    1987-01-01

    A computer program is described for the analysis of several cardiovascular parameters frequently measured or derived in the chronically instrumented dog model. Data are stored on magnetic tape and are subsequently analyzed with the Apple IIe microcomputer equipped with the ADALAB (Interactive Microware, Inc.) analog-to-digital converter. Not limited to the chronically instrumented animal model, the program is capable of analyzing left ventricular pressure, three channels of regional myocardial segment length, coronary flow velocity as measured by the Doppler ultrasonic flow technique, and two channels of systemic arterial pressure. Derived data include: left ventricular dP/dtmax, left ventricular pressure-heart rate product, left ventricular ejection time, tension time index; percent segment length shortening and velocity of shortening, dL/dt(s)max, regional stroke work and power, duration of systole and diastole; mean coronary flow velocity, peak diastolic and systolic flow velocity, and true mean systemic arterial pressure. PMID:3581809
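
    Two of the derived parameters named above can be computed from a sampled left-ventricular pressure trace roughly as follows (the synthetic waveform, sampling rate, and heart rate are assumptions, not the paper's code):

      import numpy as np

      fs = 250.0                                         # sampling rate, Hz (assumed)
      t = np.arange(0, 1.0, 1 / fs)
      lvp = 10 + 110 * np.clip(np.sin(2 * np.pi * 1.5 * t), 0, None)  # toy LV pressure, mmHg

      dpdt_max = np.max(np.gradient(lvp, 1 / fs))        # LV dP/dt max, mmHg/s
      heart_rate = 90.0                                  # beats/min, assumed measured
      rate_pressure = lvp.max() * heart_rate             # pressure-heart rate product

      print(round(dpdt_max), round(rate_pressure))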

  9. Inference on arthropod demographic parameters: computational advances using R.

    PubMed

    Maia, Aline De Holanda Nunes; Pazianotto, Ricardo Antonio De Almeida; Luiz, Alfredo José Barreto; Marinho-Prado, Jeanne Scardini; Pervez, Ahmad

    2014-02-01

    We developed a computer program for life table analysis using the open source, free software programming environment R. It is useful for quantifying chronic nonlethal effects of treatments on arthropod populations by summarizing information on their survival and fertility in key population parameters referred to as fertility life table parameters. Statistical inference on fertility life table parameters is not trivial because it requires the use of computationally intensive methods for variance estimation. Our code presents some advantages with respect to a previous program developed in the Statistical Analysis System: additional multiple comparison tests were incorporated for the analysis of qualitative factors; a module for regression analysis was implemented, thus allowing analysis of quantitative factors such as temperature or agrochemical doses; and availability to users is guaranteed, since the program was developed in an open source, free software programming environment. To illustrate the descriptive and inferential analysis implemented in lifetable.R, we present and discuss two examples: 1) a study quantifying the influence of the proteinase inhibitor berenil on the eucalyptus defoliator Thyrinteina arnobia (Stoll) and 2) a study investigating the influence of temperature on demographic parameters of a predaceous ladybird, Hippodamia variegata (Goeze). PMID:24665730
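
    A sketch of the kind of computation such a program automates, written here in Python rather than R and using an invented toy cohort: the intrinsic rate of increase from survival and fecundity schedules, with a jackknife standard error as the computationally intensive variance step.

      import numpy as np

      age = np.arange(5)                       # age classes (e.g., days)
      rng = np.random.default_rng(3)
      # per-female fecundity records for a toy cohort (rows = females)
      fec = rng.poisson([0.0, 2.0, 3.0, 2.0, 1.0], size=(20, 5)).astype(float)
      alive = np.ones_like(fec)                # toy cohort: all females survive

      def intrinsic_rate(fec, alive):
          lx = alive.mean(axis=0)              # survival schedule l(x)
          mx = fec.mean(axis=0)                # fecundity schedule m(x)
          r0 = np.sum(lx * mx)                 # net reproductive rate R0
          tgen = np.sum(age * lx * mx) / r0    # mean generation time T
          return np.log(r0) / tgen             # approximate intrinsic rate r

      r_full = intrinsic_rate(fec, alive)
      # jackknife: recompute r leaving one female out, then the usual SE formula
      loo = np.array([intrinsic_rate(np.delete(fec, i, 0), np.delete(alive, i, 0))
                      for i in range(len(fec))])
      n = len(fec)
      se = np.sqrt((n - 1) / n * np.sum((loo - loo.mean())**2))
      print(r_full, se)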

  10. Issues affecting advanced passive light-water reactor safety analysis

    SciTech Connect

    Beelman, R.J.; Fletcher, C.D.; Modro, S.M.

    1992-08-01

    Next generation commercial reactor designs emphasize enhanced safety through improved safety system reliability and performance by means of system simplification and reliance on immutable natural forces for system operation. Simulating the performance of these safety systems will be central to analytical safety evaluation of advanced passive reactor designs. Yet the characteristically small driving forces of these safety systems pose challenging computational problems to current thermal-hydraulic systems analysis codes. Additionally, the safety systems generally interact closely with one another, requiring accurate, integrated simulation of the nuclear steam supply system, engineered safeguards and containment. Furthermore, numerical safety analysis of these advanced passive reactor designs will necessitate simulation of long-duration, slowly-developing transients compared with current reactor designs. The composite effects of small computational inaccuracies on induced system interactions and perturbations over long periods may well lead to predicted results which are significantly different than would otherwise be expected or might actually occur. Comparisons between the engineered safety features of competing US advanced light water reactor designs and analogous present day reactor designs are examined relative to the adequacy of existing thermal-hydraulic safety codes in predicting the mechanisms of passive safety. Areas where existing codes might require modification, extension or assessment relative to passive safety designs are identified. Conclusions concerning the applicability of these codes to advanced passive light water reactor safety analysis are presented.

  12. From Archives to Analysis: Computers Redefine the AAVSO Mission

    NASA Astrophysics Data System (ADS)

    Foster, Grant

    1992-10-01

    Recent advances in computer technology and program development promise great progress for the AAVSO in the near future. Not only have we improved our methods of archiving and evaluating variable star observations, but we will also soon be ready to enter the field of substantial in-house data analysis.

  13. SciDAC Advances and Applications in Computational Beam Dynamics

    SciTech Connect

    Ryne, R.; Abell, D.; Adelmann, A.; Amundson, J.; Bohn, C.; Cary, J.; Colella, P.; Dechow, D.; Decyk, V.; Dragt, A.; Gerber, R.; Habib, S.; Higdon, D.; Katsouleas, T.; Ma, K.-L.; McCorquodale, P.; Mihalcea, D.; Mitchell, C.; Mori, W.; Mottershead, C.T.; Neri, F.; Pogorelov, I.; Qiang, J.; Samulyak, R.; Serafini, D.; Shalf, J.; Siegerist, C.; Spentzouris, P.; Stoltz, P.; Terzic, B.; Venturini, M.; Walstrom, P.

    2005-06-26

    SciDAC has had a major impact on computational beam dynamics and the design of particle accelerators. Particle accelerators--which account for half of the facilities in the DOE Office of Science Facilities for the Future of Science 20 Year Outlook--are crucial for US scientific, industrial, and economic competitiveness. Thanks to SciDAC, accelerator design calculations that were once thought impossible are now carried out routinely, and new challenging and important calculations are within reach. SciDAC accelerator modeling codes are being used to get the most science out of existing facilities, to produce optimal designs for future facilities, and to explore advanced accelerator concepts that may hold the key to qualitatively new ways of accelerating charged particle beams. In this poster we present highlights from the SciDAC Accelerator Science and Technology (AST) project Beam Dynamics focus area in regard to algorithm development, software development, and applications.

  14. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  15. Optical design and characterization of an advanced computational imaging system

    NASA Astrophysics Data System (ADS)

    Shepard, R. Hamilton; Fernandez-Cull, Christy; Raskar, Ramesh; Shi, Boxin; Barsi, Christopher; Zhao, Hang

    2014-09-01

    We describe an advanced computational imaging system with an optical architecture that enables simultaneous and dynamic pupil-plane and image-plane coding accommodating several task-specific applications. We assess the optical requirement trades associated with custom and commercial-off-the-shelf (COTS) optics and converge on the development of two low-cost and robust COTS testbeds. The first is a coded-aperture programmable pixel imager employing a digital micromirror device (DMD) for image plane per-pixel oversampling and spatial super-resolution experiments. The second is a simultaneous pupil-encoded and time-encoded imager employing a DMD for pupil apodization or a deformable mirror for wavefront coding experiments. These two testbeds are built to leverage two MIT Lincoln Laboratory focal plane arrays - an orthogonal transfer CCD with non-uniform pixel sampling and on-chip dithering and a digital readout integrated circuit (DROIC) with advanced on-chip per-pixel processing capabilities. This paper discusses the derivation of optical component requirements, optical design metrics, and performance analyses for the two testbeds built.

  16. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson Formula 20 [KR-20]=0.83-0.95, and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830

  18. Personal Computer Transport Analysis Program

    NASA Technical Reports Server (NTRS)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air-scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
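
    A minimal sketch of the solution-vector idea (the component names and the toy heat-pickup model are invented, not PCTAP's): components ordered by inlet dependency, each updated once per time step as fluid state is handed downstream.

      class Component:
          def __init__(self, name, heat_gain, upstream=None):
              self.name, self.heat_gain, self.upstream = name, heat_gain, upstream
              self.outlet_temp = 20.0                   # degC, initial condition

          def outlet(self, dt):
              inlet = self.upstream.outlet_temp if self.upstream else 20.0
              self.outlet_temp = inlet + self.heat_gain * dt   # toy heat pickup per step

      tube = Component("tube", 0.5)
      cold_plate = Component("cold_plate", 4.0, upstream=tube)
      hx = Component("heat_exchanger", -3.5, upstream=cold_plate)
      solution_vector = [tube, cold_plate, hx]  # ordered by inlet dependency

      for _ in range(3):                        # march in time, updating in list order
          for comp in solution_vector:
              comp.outlet(dt=1.0)
      print([(c.name, round(c.outlet_temp, 1)) for c in solution_vector])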

  19. Impact of computer advances on future finite elements computations. [for aircraft and spacecraft design

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.

    1985-01-01

    Research performed over the past 10 years in engineering data base management and parallel computing is discussed, and certain opportunities for research toward the next generation of structural analysis capability are proposed. Particular attention is given to data base management associated with the IPAD project and parallel processing associated with the Finite Element Machine project, both sponsored by NASA, and a near term strategy for a distributed structural analysis capability based on relational data base management software and parallel computers for a future structural analysis system.

  20. Computer graphics in aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Cozzolongo, J. V.

    1984-01-01

    The use of computer graphics and its application to aerodynamic analyses on a routine basis is outlined. The mathematical modelling of the aircraft geometries and the shading technique implemented are discussed. Examples of computer graphics used to display aerodynamic flow field data and aircraft geometries are shown. A future need in computer graphics for aerodynamic analyses is addressed.

  1. Advanced Placement: Model Policy Components. Policy Analysis

    ERIC Educational Resources Information Center

    Zinth, Jennifer

    2016-01-01

    Advanced Placement (AP), launched in 1955 by the College Board as a program to offer gifted high school students the opportunity to complete entry-level college coursework, has since expanded to encourage a broader array of students to tackle challenging content. This Education Commission of the States Policy Analysis identifies key components of…

  2. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    NASA Astrophysics Data System (ADS)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust, and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics, where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers, and preconditioners. Making progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high-level language for describing the weak forms of coupled systems of equations and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application-neutral options system that provides both human- and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients, and solver options. Custom compiled applications are
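
    To give a flavor of the high-level weak-form description FEniCS provides (a minimal Poisson example in the legacy dolfin API; TerraFERMA itself assembles such forms from its options file rather than from a script like this):

      from dolfin import *   # requires the legacy FEniCS (dolfin) package

      mesh = UnitSquareMesh(32, 32)
      V = FunctionSpace(mesh, "P", 1)
      u, v = TrialFunction(V), TestFunction(V)
      f = Constant(1.0)

      a = dot(grad(u), grad(v)) * dx      # weak form of -div(grad(u)) = f
      L = f * v * dx
      bc = DirichletBC(V, Constant(0.0), "on_boundary")

      u_h = Function(V)
      solve(a == L, u_h, bc)              # PETSc solvers do the work underneath
      print(u_h.vector().max())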

  3. Advances in computer technology: impact on the practice of medicine.

    PubMed

    Groth-Vasselli, B; Singh, K; Farnsworth, P N

    1995-01-01

    Advances in computer technology provide a wide range of applications which are revolutionizing the practice of medicine. The development of new software for the office creates a web of communication among physicians, staff members, health care facilities and associated agencies. This provides the physician with the prospect of a paperless office. At the other end of the spectrum, the development of 3D work stations and software based on computational chemistry permits visualization of protein molecules involved in disease. Computer assisted molecular modeling has been used to construct working 3D models of lens alpha-crystallin. The 3D structure of alpha-crystallin is basic to our understanding of the molecular mechanisms involved in lens fiber cell maturation, stabilization of the inner nuclear region, the maintenance of lens transparency and cataractogenesis. The major component of the high molecular weight aggregates that occur during cataractogenesis is alpha-crystallin subunits. Subunits of alpha-crystallin occur in other tissues of the body. In the central nervous system accumulation of these subunits in the form of dense inclusion bodies occurs in pathological conditions such as Alzheimer's disease, Huntington's disease, multiple sclerosis and toxoplasmosis (Iwaki, Wisniewski et al., 1992), as well as neoplasms of astrocyte origin (Iwaki, Iwaki, et al., 1991). Also cardiac ischemia is associated with an increased alpha B synthesis (Chiesi, Longoni et al., 1990). On a more global level, the molecular structure of alpha-crystallin may provide information pertaining to the function of small heat shock proteins, hsp, in maintaining cell stability under the stress of disease. PMID:8721907

  4. Activities and operations of Argonne's Advanced Computing Research Facility: February 1990 through April 1991

    SciTech Connect

    Pieper, G.W.

    1991-05-01

    This report reviews the activities and operations of the Advanced Computing Research Facility (ACRF) from February 1990 through April 1991. The ACRF is operated by the Mathematics and Computer Science Division at Argonne National Laboratory. The facility's principal objective is to foster research in parallel computing. Toward this objective, the ACRF operates experimental advanced computers, supports investigations in parallel computing, and sponsors technology transfer efforts to industry and academia. 5 refs., 1 fig.

  5. Grid computing in image analysis

    PubMed Central

    2011-01-01

    Diagnostic surgical pathology or tissue-based diagnosis still remains the most reliable and specific diagnostic medical procedure. The development of whole slide scanners permits the creation of virtual slides and work on so-called virtual microscopes. In addition to interactive work on virtual slides, approaches have been reported that introduce automated virtual microscopy, which is composed of several tools focusing on quite different tasks. These include evaluation of image quality and image standardization, analysis of potentially useful thresholds for object detection and identification (segmentation), dynamic segmentation procedures, adjustable magnification to optimize feature extraction, and texture analysis including image transformation and evaluation of elementary primitives. Grid technology seems to possess all features needed to efficiently target and control the specific tasks of image information and detection in order to obtain a detailed and accurate diagnosis. Grid technology is based upon so-called nodes that are linked together and share certain communication rules in using open standards. Their number and functionality can vary according to the needs of a specific user at a given point in time. When implementing automated virtual microscopy with Grid technology, all five different Grid functions have to be taken into account, namely 1) computation services, 2) data services, 3) application services, 4) information services, and 5) knowledge services. Although all mandatory tools of automated virtual microscopy can be implemented in a closed or standardized open system, Grid technology offers a new dimension to acquire, detect, classify, and distribute medical image information, and to assure quality in tissue-based diagnosis. PMID:21516880

  6. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  7. Advanced reliability method for fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Wirsching, P. H.

    1984-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) may become extremely difficult or very inefficient. This study suggests using a simple and easily constructed second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
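
    A sketch of the two-stage idea under stated assumptions: the limit state below is an invented stand-in for the expensive strain analysis, and plain Monte Carlo on the fitted surrogate stands in for the Rackwitz-Fiessler fast probability integration.

      import numpy as np

      def limit_state(x):                  # stand-in for a costly computer analysis
          return 3.0 - x[0]**2 - 0.5 * x[1]

      rng = np.random.default_rng(4)
      pts = rng.normal(0.0, 1.0, (30, 2))  # selected points near the design point
      basis = lambda x: [1.0, x[0], x[1], x[0]**2, x[1]**2, x[0] * x[1]]
      A = np.array([basis(p) for p in pts])
      g = np.array([limit_state(p) for p in pts])
      coef, *_ = np.linalg.lstsq(A, g, rcond=None)   # second-degree polynomial fit

      mc = rng.normal(0.0, 1.0, (200000, 2))         # cheap evaluations on the surrogate
      G = np.column_stack([np.ones(len(mc)), mc[:, 0], mc[:, 1],
                           mc[:, 0]**2, mc[:, 1]**2, mc[:, 0] * mc[:, 1]]) @ coef
      print("estimated probability of failure:", (G < 0).mean())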

  8. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing PROJECT OBJECTIVE The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: To engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, to further refine them to both miniaturize and integrate their functionality to increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  9. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.
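
    A sketch of the signal-processing step such monitoring relies on (synthetic accelerometer trace; the band limits and threshold are invented): extract the spectral energy in a band where a degraded component would ring and compare it to a baseline.

      import numpy as np

      fs = 5000.0
      t = np.arange(0, 2.0, 1 / fs)
      signal = (np.sin(2 * np.pi * 120 * t)
                + 0.05 * np.random.default_rng(5).normal(size=t.size))

      spec = np.abs(np.fft.rfft(signal))**2
      freqs = np.fft.rfftfreq(signal.size, 1 / fs)
      band = (freqs > 100) & (freqs < 150)        # band where a worn part would ring
      band_energy = spec[band].sum() / spec.sum()

      print("degraded" if band_energy > 0.5 else "normal", round(band_energy, 3))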

  10. Recent advances in morphological cell image analysis.

    PubMed

    Chen, Shengyong; Zhao, Mingzhu; Wu, Guang; Yao, Chunyan; Zhang, Jianwei

    2012-01-01

    This paper summarizes the recent advances in image processing methods for morphological cell analysis. The topic of morphological analysis has received much attention with the increasing demands in both bioinformatics and biomedical applications. Among the many factors that affect the diagnosis of a disease, morphological cell analysis and statistics have made great contributions to diagnostic results. Morphological cell analysis addresses cellular shape, cellular regularity, classification, statistics, diagnosis, and so forth. In the last 20 years, about 1000 publications have reported the use of morphological cell analysis in biomedical research. Relevant solutions encompass a rather wide application area, such as cell clump segmentation, morphological characteristic extraction, 3D reconstruction, abnormal cell identification, and statistical analysis. These reports are summarized in this paper to enable easy referral to suitable methods for practical solutions. Representative contributions and future research trends are also addressed. PMID:22272215
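
    As a short illustration of two steps the survey covers, segmentation and per-cell shape statistics, using scikit-image on a synthetic frame (the image and the global-threshold choice are toy assumptions):

      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      rng = np.random.default_rng(6)
      img = rng.normal(0.1, 0.02, (128, 128))
      img[30:50, 30:50] += 0.5          # two bright toy "cells"
      img[80:95, 70:100] += 0.5

      mask = img > threshold_otsu(img)  # global-threshold segmentation
      for region in regionprops(label(mask)):
          # simple per-cell morphological features
          print(region.area, round(region.eccentricity, 2))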

  11. Computer-Based Linguistic Analysis.

    ERIC Educational Resources Information Center

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  12. Bimolecular dynamics by computer analysis

    SciTech Connect

    Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.

    1984-01-01

    As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well developed study of the hydrogen bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.

  13. Activities and operations of the Advanced Computing Research Facility, October 1986-October 1987

    SciTech Connect

    Pieper, G.W.

    1987-01-01

    This paper contains a description of the work being carried out at the advanced computing research facility at Argonne National Laboratory. Topics covered are upgrading of computers, networking changes, algorithms, parallel programming, programming languages, and user training. (LSP)

  14. Advanced electric field computation for RF sheaths prediction with TOPICA

    NASA Astrophysics Data System (ADS)

    Milanesio, Daniele; Maggiora, Riccardo

    2012-10-01

    The design of an Ion Cyclotron (IC) launcher is not only driven by its coupling properties, but also by its capability of maintaining low parallel electric fields in front of it, in order to provide good power transfer to the plasma and to reduce impurity production. However, since the antenna performance cannot be verified before operations start, advanced numerical simulation tools are the only alternative for carrying out a proper antenna design. With this in mind, it should be clear that the adoption of a code such as TOPICA [1], able to precisely take into account a realistic antenna geometry and an accurate plasma description, is extremely important to achieve these goals. Because of recently introduced features that allow computing the electric field distribution everywhere inside the antenna enclosure and in the plasma column, the TOPICA code is well suited to understanding which elements may have a non-negligible impact on the antenna design and to suggesting further optimizations to mitigate RF potentials. The present work documents the evaluation of the electric field map from actual antennas, like the Tore Supra Q5 and the JET A2 launchers, and the foreseen ITER IC antenna. [1] D. Milanesio et al., Nucl. Fusion 49, 115019 (2009).

  15. Analysis and design of advanced composite bonded joints

    NASA Technical Reports Server (NTRS)

    Hart-Smith, L. J.

    1974-01-01

    Advances in the analysis of adhesive-bonded joints are presented with particular emphasis on advanced composite structures. The joints analyzed are of double-lap, single-lap, scarf, stepped-lap and tapered-lap configurations. Tensile, compressive, and in-plane shear loads are covered. In addition to the usual geometric variables, the theory accounts for the strength increases attributable to adhesive plasticity (in terms of the elastic-plastic adhesive model) and the joint strength reductions imposed by imbalances between the adherends. The solutions are largely closed-form analytical results, employing iterative solutions on a digital computer for the more complicated joint configurations. In assessing the joint efficiency, three potential failure modes are considered. These are adherend failure outside the joint, adhesive failure in shear, and adherend interlaminar tension failure (or adhesive failure in peel). Each mode is governed by a distinct mathematical analysis and each prevails throughout different ranges of geometric sizes and proportions.

  16. Activities and operations of the Advanced Computing Research Facility, January 1989--January 1990

    SciTech Connect

    Pieper, G.W.

    1990-02-01

    This report reviews the activities and operations of the Advanced Computing Research Facility (ACRF) for the period January 1, 1989, through January 31, 1990. The ACRF is operated by the Mathematics and Computer Science Division at Argonne National Laboratory. The facility's principal objective is to foster research in parallel computing. Toward this objective, the ACRF continues to operate experimental advanced computers and to sponsor new technology transfer efforts and new research projects. 4 refs., 8 figs.

  17. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to

  18. Advanced Fuel Cycle Economic Sensitivity Analysis

    SciTech Connect

    David Shropshire; Kent Williams; J.D. Smith; Brent Boore

    2006-12-01

    A fuel cycle economic analysis was performed on four fuel cycles to provide a baseline for initial cost comparison, using the Gen IV Economic Modeling Work Group G4 ECON spreadsheet model, Decision Programming Language software, the 2006 Advanced Fuel Cycle Cost Basis report, industry cost data, international papers, and the nuclear power cost studies from MIT, Harvard, and the University of Chicago. The analysis developed and compared the fuel cycle cost component of the total cost of energy for a wide range of fuel cycles including: once through, thermal with fast recycle, continuous fast recycle, and thermal recycle.

  19. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  20. Large-scale computations in analysis of structures

    SciTech Connect

    McCallen, D.B.; Goudreau, G.L.

    1993-09-01

    Computer hardware and numerical analysis algorithms have progressed to a point where many engineering organizations and universities can perform nonlinear analyses on a routine basis. Though much remains to be done in terms of advancing nonlinear analysis techniques and characterizing nonlinear material constitutive behavior, the technology exists today to perform useful nonlinear analysis for many structural systems. In the current paper, a survey of nonlinear analysis technologies developed and employed for many years on programmatic defense work at the Lawrence Livermore National Laboratory is provided, and ongoing nonlinear numerical simulation projects relevant to the civil engineering field are described.

  1. CFD Based Computations of Flexible Helicopter Blades for Stability Analysis

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2011-01-01

    As a collaborative effort among government aerospace research laboratories, an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations over the linear lifting-line theory that uses displacements from a beam model. Data transfers required at every revolution are managed through a Unix-based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter, and deviations of the computed data from flight data are evaluated. Fourier analysis post-processing suitable for aeroelastic stability computations is performed.
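
    The Fourier post-processing step named above amounts to extracting per-revolution harmonics from a periodic airload history. The sketch below is a hypothetical stand-in (the actual OVERFLOW scripts, file formats, and flight-data comparisons differ); the synthetic load signal and harmonic counts are illustrative only.

    import numpy as np

    # Extract per-rev harmonics from one revolution of a periodic blade airload.
    n_azimuth = 360                       # samples per revolution (assumed)
    psi = np.linspace(0.0, 2*np.pi, n_azimuth, endpoint=False)

    # Synthetic normal-force history: mean + 1/rev + 4/rev content plus noise.
    load = 1.0 + 0.3*np.cos(psi) + 0.1*np.sin(4*psi) + 0.01*np.random.randn(n_azimuth)

    spectrum = np.fft.rfft(load) / n_azimuth
    for k in range(6):
        amp = (1 if k == 0 else 2) * abs(spectrum[k])   # fold negative frequencies
        print(f"{k}/rev harmonic amplitude: {amp:.3f}")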

  2. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  3. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.

  4. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2001-10-01

    In the second year of the project, the Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column is further developed. The approach uses an Eulerian analysis of liquid flows in the bubble column, and makes use of the Lagrangian trajectory analysis for the bubbles and particle motions. An experimental setup for studying a two-dimensional bubble column is also developed. The operation of the bubble column is being tested and diagnostic methodology for quantitative measurements is being developed. An Eulerian computational model for the flow condition in the two-dimensional bubble column is also being developed. The liquid and bubble motions are being analyzed and the results are being compared with measurements from the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures are also being studied. Further progress was also made in developing a thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion. The balance laws are obtained and the constitutive laws are being developed. Progress was also made in measuring concentration and velocity of particles of different sizes near a wall in a duct flow. The technique of Phase-Doppler anemometry was used in these studies. The general objective of this project is to provide the needed fundamental understanding of three-phase slurry reactors in Fischer-Tropsch (F-T) liquid fuel synthesis. The other main goal is to develop a computational capability for predicting the transport and processing of three-phase coal slurries. The specific objectives are: (1) To develop a thermodynamically consistent rate-dependent anisotropic model for multiphase slurry flows with and without chemical reaction for application to coal liquefaction. Also establish the

  5. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to the fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.

  6. Advanced Power Plant Development and Analysis Methodologies

    SciTech Connect

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  7. Temporality Matters: Advancing a Method for Analyzing Problem-Solving Processes in a Computer-Supported Collaborative Environment

    ERIC Educational Resources Information Center

    Kapur, Manu

    2011-01-01

    This paper argues for a need to develop methods for examining temporal patterns in computer-supported collaborative learning (CSCL) groups. It advances one such quantitative method--Lag-sequential Analysis (LsA)--and instantiates it in a study of problem-solving interactions of collaborative groups in an online, synchronous environment. LsA…

  8. Environmental studies: Mathematical, computational, and statistical analysis

    SciTech Connect

    Wheeler, M.F.

    1996-12-31

    The Summer Program on Mathematical, Computational, and Statistical Analyses in Environmental Studies held 6--31 July 1992 was designed to provide a much needed interdisciplinary forum for joint exploration of recent advances in the formulation and application of (A) environmental models, (B) environmental data and data assimilation, (C) stochastic modeling and optimization, and (D) global climate modeling. These four conceptual frameworks provided common themes among a broad spectrum of specific technical topics at this workshop. The program brought forth a mix of physical concepts and processes such as chemical kinetics, atmospheric dynamics, cloud physics and dynamics, flow in porous media, remote sensing, climate statistics, stochastic processes, parameter identification, model performance evaluation, aerosol physics and chemistry, and data sampling, together with mathematical concepts in stiff differential systems, advective-diffusive-reactive PDEs, inverse scattering theory, time series analysis, particle dynamics, stochastic equations, optimal control, and others. Nineteen papers are presented in this volume. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  9. Life-cycle cost analysis of advanced design mixer pump

    SciTech Connect

    Hall, M.N., Westinghouse Hanford

    1996-07-23

    This analysis provides cost justification for the Advanced Design Mixer Pump program based on the cost benefit to the Hanford Site of 4 mixer pump systems defined in terms of the life-cycle cost. A computer model is used to estimate the total number of service hours necessary for each mixer pump to operate over the 20-year retrieval sequence period for single-shell tank waste. This study also considered the double-shell tank waste retrieved prior to the single-shell tank waste, which is considered the initial retrieval.

  10. Computer applications for engineering/structural analysis

    NASA Astrophysics Data System (ADS)

    Zaslawsky, M.; Samaddar, S. K.

    1991-10-01

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequence of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconducting supercollider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  11. Computer applications for engineering/structural analysis

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-01-01

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconducting supercollider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  12. Advanced Certification Program for Computer Graphic Specialists. Final Performance Report.

    ERIC Educational Resources Information Center

    Parkland Coll., Champaign, IL.

    A pioneer program in computer graphics was implemented at Parkland College (Illinois) to meet the demand for specialized technicians to visualize data generated on high performance computers. In summer 1989, 23 students were accepted into the pilot program. Courses included C programming, calculus and analytic geometry, computer graphics, and…

  13. Important Advances in Technology and Unique Applications to Cardiovascular Computed Tomography

    PubMed Central

    Chaikriangkrai, Kongkiat; Choi, Su Yeon; Nabi, Faisal; Chang, Su Min

    2014-01-01

    For the past decade, multidetector cardiac computed tomography and its main application, coronary computed tomography angiography, have been established as a noninvasive technique for anatomical assessment of coronary arteries. This new era of coronary artery evaluation by coronary computed tomography angiography has arisen from the rapid advancement in computed tomography technology, which has led to massive diagnostic and prognostic clinical studies in various patient populations. This article gives a brief overview of current multidetector cardiac computed tomography systems, developing cardiac computed tomography technologies in both hardware and software fields, innovative radiation exposure reduction measures, multidetector cardiac computed tomography functional studies, and their newer clinical applications beyond coronary computed tomography angiography. PMID:25574342

  14. Important advances in technology and unique applications to cardiovascular computed tomography.

    PubMed

    Chaikriangkrai, Kongkiat; Choi, Su Yeon; Nabi, Faisal; Chang, Su Min

    2014-01-01

    For the past decade, multidetector cardiac computed tomography and its main application, coronary computed tomography angiography, have been established as a noninvasive technique for anatomical assessment of coronary arteries. This new era of coronary artery evaluation by coronary computed tomography angiography has arisen from the rapid advancement in computed tomography technology, which has led to massive diagnostic and prognostic clinical studies in various patient populations. This article gives a brief overview of current multidetector cardiac computed tomography systems, developing cardiac computed tomography technologies in both hardware and software fields, innovative radiation exposure reduction measures, multidetector cardiac computed tomography functional studies, and their newer clinical applications beyond coronary computed tomography angiography. PMID:25574342

  15. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  16. IUE Data Analysis Software for Personal Computers

    NASA Technical Reports Server (NTRS)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton, P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  17. Computer Programming in a Spatial Analysis Course.

    ERIC Educational Resources Information Center

    Gesler, Wilbert; Kaplan, Abram

    1993-01-01

    Contends that students in spatial analysis courses generally are familiar with computer use and programs but lack basic computer programming skills. Describes four exercises in which students learn programming using BASIC and dBASE. Asserts that programming exercises help students clarify concepts, understand the rationale behind calculations, use…

  18. Discourse Analysis of Teaching Computing Online

    ERIC Educational Resources Information Center

    Bower, Matt

    2009-01-01

    This paper analyses the teaching and learning of computing in a Web-conferencing environment. A discourse analysis of three introductory programming learning episodes is presented to demonstrate issues and effects that arise when teaching computing using such an approach. The subject of discussion, the interactive nature of discussion and any…

  19. A Computer Language for ECG Contour Analysis

    PubMed Central

    McConnochie, John W.

    1982-01-01

    The purpose of this paper is to demonstrate constructively that criteria for ECG contour analysis can be interpreted directly by a computer. Thereby, the programming task is greatly reduced. Direct interpretation is achieved by the creation of a computer language that is well-suited for the expression of such criteria. Further development of the language is planned.

  20. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column, and made use of the Lagrangian trajectory analysis for the bubbles and particle motions. The bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentration and velocity of particles of different sizes near a wall in a duct flow were also measured. The technique of Phase-Doppler anemometry was used in these studies. An Eulerian volume of fluid (VOF) computational model for the flow condition in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied. The simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.
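
    The Eulerian-Lagrangian idea described above can be illustrated with a minimal sketch: one coal-slurry particle carried by a prescribed (frozen) liquid velocity field, with Stokes drag and net gravity/buoyancy. All values are illustrative; the collisions, turbulence, and VOF coupling of the full model are not reproduced.

    import numpy as np

    rho_l, rho_p = 1000.0, 1400.0     # liquid / particle density, kg/m^3
    mu_l = 1.0e-3                     # liquid viscosity, Pa s
    d = 100e-6                        # particle diameter, m
    g = np.array([0.0, -9.81])
    V = np.pi * d**3 / 6.0
    m_p = rho_p * V

    def liquid_velocity(pos):
        # Simple recirculating cell standing in for the Eulerian liquid solution.
        x, y = pos
        return 0.05 * np.array([np.sin(np.pi*x)*np.cos(np.pi*y),
                                -np.cos(np.pi*x)*np.sin(np.pi*y)])

    pos, vel = np.array([0.5, 0.9]), np.zeros(2)
    dt = 1.0e-4
    for step in range(20000):                                  # 2 s of physical time
        drag = 3.0*np.pi*mu_l*d*(liquid_velocity(pos) - vel)   # Stokes drag force
        gravity = (rho_p - rho_l) * V * g                      # weight minus buoyancy
        vel = vel + dt * (drag + gravity) / m_p
        pos = pos + dt * vel
    print("particle position after 2 s:", pos)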

  1. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
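
    The mean-based, first-order method that the AMV method improves upon can be sketched in a few lines. In the sketch below, the response function g and the input distributions are hypothetical stand-ins for an implicit finite-element response; the full AMV correction step is not reproduced.

    import numpy as np

    def g(x):
        # Hypothetical response, e.g. a displacement depending on load x[0] and stiffness x[1].
        return x[0]**2 / x[1]

    mu = np.array([10.0, 200.0])       # input means
    sigma = np.array([1.0, 20.0])      # input standard deviations

    # First-order Taylor expansion of g about the mean, with a finite-difference
    # gradient mimicking the case where g is only available through numerical analysis.
    eps = 1e-6
    grad = np.array([(g(mu + eps*np.eye(2)[i]) - g(mu)) / eps for i in range(2)])
    mean_z = g(mu)
    std_z = np.sqrt(np.sum((grad * sigma)**2))
    print(f"mean-value estimate: mean={mean_z:.3f}, std={std_z:.3f}")

    # Monte Carlo reference, showing where the first-order estimate degrades
    # for strongly nonlinear or non-monotonic g.
    rng = np.random.default_rng(0)
    samples = g(rng.normal(mu, sigma, size=(100000, 2)).T)
    print(f"Monte Carlo reference: mean={samples.mean():.3f}, std={samples.std():.3f}")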

  2. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  3. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  4. Computer aided analysis of phonocardiogram.

    PubMed

    Singh, J; Anand, R S

    2007-01-01

    In the present paper, analyses of phonocardiogram (PCG) records are presented. The analyses have been carried out in both time and frequency domains with the aim of detecting certain correlations between the time and frequency domain representations of the PCG. The analysis is limited to the first and second heart sounds (S1 and S2) only. In the time domain analysis, the moving window averaging technique is used to determine the occurrence of S1 and S2, which helps in determining the cardiac interval, the absolute and relative time durations of individual S1 and S2, and the absolute and relative duration between them. In the frequency domain, fast Fourier transform (FFT) of the complete PCG record, and short time Fourier transform (STFT) and wavelet transform of individual heart sounds have been carried out. The frequency domain analysis gives an idea about the dominant frequency components in individual records and the frequency spectrum of individual heart sounds. A comparative examination of the two analyses reveals some correlation between the time domain and frequency domain representations of the PCG. PMID:17701776
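
    The two analysis stages described above can be sketched as follows: a moving-window average of the signal energy to locate heart sounds in time, and an FFT for the frequency content. The synthetic "PCG" below is a stand-in for a real record, and the window lengths and threshold are illustrative only.

    import numpy as np

    fs = 2000                                    # sampling rate, Hz (assumed)
    t = np.arange(0, 2.0, 1.0/fs)
    pcg = np.zeros_like(t)
    for onset, f0 in [(0.10, 60), (0.42, 90), (0.90, 60), (1.22, 90)]:  # S1, S2, S1, S2
        burst = (t >= onset) & (t < onset + 0.08)
        pcg[burst] += np.sin(2*np.pi*f0*(t[burst]-onset)) * np.hanning(burst.sum())

    # Time domain: moving-window average of the squared signal (energy envelope).
    win = int(0.05 * fs)
    envelope = np.convolve(pcg**2, np.ones(win)/win, mode="same")
    active = envelope > 0.1 * envelope.max()
    edges = np.flatnonzero(np.diff(active.astype(int)) == 1)
    print("detected sound onsets (s):", np.round(t[edges], 2))

    # Frequency domain: FFT of the complete record.
    spectrum = np.abs(np.fft.rfft(pcg))
    freqs = np.fft.rfftfreq(len(pcg), 1.0/fs)
    print("dominant frequency (Hz):", freqs[spectrum.argmax()])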

  5. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  6. Recent advances in flow injection analysis.

    PubMed

    Trojanowicz, Marek; Kołacińska, Kamila

    2016-04-01

    A dynamic development of methodologies of analytical flow injection measurements during four decades since their invention has reinforced the solid position of flow analysis in the arsenal of techniques and instrumentation of contemporary chemical analysis. With the number of published scientific papers exceeding 20 000, and advanced instrumentation available for environmental, food, and pharmaceutical analysis, flow analysis is well established as an extremely vital field of modern flow chemistry, which is developed simultaneously with methods of chemical synthesis carried out under flow conditions. This review is based on almost 300 original papers published mostly in the last decade, with special emphasis placed on presenting novel achievements from the most recent 2-3 years in order to indicate current development trends in this methodology. Besides the evolution of the design of whole measuring systems, and especially new applications of various detection methods, the implications of progress in nanotechnology and the miniaturization of measuring systems for applications in different fields of modern chemical analysis are also discussed. PMID:26906258

  7. Radiological Safety Analysis Computer Program

    2001-08-28

    RSAC-6 is the latest version of the RSAC program. It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from ground surface and plume gamma pathways is calculated. New and exciting updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity, and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, an updated and enhanced radionuclide inventory, and inclusion of the dose-conversion factors from FGR 11 and 12.
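
    The downwind dispersion step in codes of this kind is commonly a Gaussian plume calculation. The sketch below is the textbook ground-reflected plume formula, not RSAC-6's actual implementation; the release rate, stack height, and dispersion coefficients are made up for illustration.

    import numpy as np

    def plume_concentration(Q, u, x, y, z, H, sigma_y, sigma_z):
        """Ground-reflected Gaussian plume concentration (activity / m^3)."""
        lateral = np.exp(-y**2 / (2*sigma_y**2)) / (2*np.pi*u*sigma_y*sigma_z)
        vertical = (np.exp(-(z-H)**2 / (2*sigma_z**2)) +
                    np.exp(-(z+H)**2 / (2*sigma_z**2)))   # image-source reflection
        return Q * lateral * vertical

    Q = 1.0e6      # release rate, Bq/s (hypothetical)
    u = 3.0        # wind speed, m/s
    H = 30.0       # effective release height, m
    # Illustrative dispersion coefficients at 1 km downwind (stability dependent).
    chi = plume_concentration(Q, u, x=1000.0, y=0.0, z=1.5, H=H,
                              sigma_y=70.0, sigma_z=30.0)
    print(f"centerline concentration at 1 km, breathing height: {chi:.2e} Bq/m^3")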

  8. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2000-11-01

    In the first year of the project, solid-fluid mixture flows in ducts and passages at different angles of orientation were analyzed. The model predictions were compared with the experimental data and good agreement was found. Progress was also made in analyzing the gravity chute flows of solid-liquid mixtures. An Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column is being developed. The approach uses an Eulerian analysis of gas-liquid flows in the bubble column, and makes use of the Lagrangian particle tracking procedure to analyze the particle motions. Progress was also made in developing a rate dependent thermodynamically consistent model for multiphase slurry flows in a state of turbulent motion. The new model includes the effect of phasic interactions and leads to anisotropic effective phasic stress tensors. Progress was also made in measuring concentration and velocity of particles of different sizes near a wall in a duct flow. The formulation of a thermodynamically consistent model for chemically active multiphase solid-fluid flows in a turbulent state of motion was also initiated. The general objective of this project is to provide the needed fundamental understanding of three-phase slurry reactors in Fischer-Tropsch (F-T) liquid fuel synthesis. The other main goal is to develop a computational capability for predicting the transport and processing of three-phase coal slurries. The specific objectives are: (1) To develop a thermodynamically consistent rate-dependent anisotropic model for multiphase slurry flows with and without chemical reaction for application to coal liquefaction. Also to establish the material parameters of the model. (2) To provide experimental data for phasic fluctuation and mean velocities, as well as the solid volume fraction in the shear flow devices. (3) To develop an accurate computational capability incorporating the new rate-dependent and anisotropic model for analyzing reacting and

  9. [Advanced data analysis and visualization for clinical laboratory].

    PubMed

    Inada, Masanori; Yoneyama, Akiko

    2011-01-01

    This paper describes visualization techniques that help identify hidden structures in clinical laboratory data. The visualization of data is helpful for a rapid and better understanding of the characteristics of data sets. Various charts help the user identify trends in data. Scatter plots help prevent misinterpretations due to invalid data by identifying outliers. The representation of experimental data in figures is always useful for communicating results to others. Currently, flexible methods such as smoothing methods and latent structure analysis are available owing to the presence of advanced hardware and software. Principal component analysis, which is a well-known technique used to reduce multidimensional data sets, can be carried out on a personal computer. These methods could lead to advanced visualization with regard to exploratory data analysis. In this paper, we present 3 examples in order to introduce advanced data analysis. In the first example, a smoothing spline was fitted to a time-series from a control chart that was not in a state of statistical control. The trend line was clearly extracted from the daily measurements of the control samples. In the second example, principal component analysis was used to identify a new diagnostic indicator for Graves' disease. The multi-dimensional data obtained from patients were reduced to lower dimensions, and the principal components thus obtained summarized the variation in the data set. In the final example, a latent structure analysis for a Gaussian mixture model was used to draw complex density functions suitable for actual laboratory data. As a result, 5 clusters were extracted. The mixed density function of these clusters represented the data distribution graphically. The methods used in the above examples make the creation of complicated models for clinical laboratories simpler and more flexible. PMID:21404582
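
    A minimal principal component analysis of the kind described in the second example is easy to run on a personal computer. The data below are synthetic stand-ins for patient measurements; the diagnostic application itself is not reproduced.

    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 200, 6
    latent = rng.normal(size=(n, 2))                  # two underlying factors
    loadings = rng.normal(size=(2, p))
    X = latent @ loadings + 0.3 * rng.normal(size=(n, p))

    Xc = X - X.mean(axis=0)                           # center each analyte
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False) # PCA via SVD
    explained = s**2 / np.sum(s**2)
    scores = Xc @ Vt.T                                # component scores per patient
    print("variance explained by each component:", np.round(explained, 3))
    print("first two components carry", f"{explained[:2].sum():.0%}", "of the variance")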

  10. Distributed computing and nuclear reactor analysis

    SciTech Connect

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-03-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations.

  11. Computer aided cogeneration feasibility analysis

    SciTech Connect

    Anaya, D.A.; Caltenco, E.J.L.; Robles, L.F.

    1996-12-31

    A successful cogeneration system design depends on several factors, and the optimal configuration can be found using steam and power simulation software. The key characteristics of one such software package are described, and its application to a process plant cogeneration feasibility analysis is shown in this paper. Finally, a case study is illustrated. 4 refs., 2 figs.

  12. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  13. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1981-01-01

    A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power-density spectrum (PDS) of time history data. The program interfaces with the advanced continuous simulation language (ACSL) so that a frequency analysis may be performed on ACSL generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
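
    The example named above can be sketched with an ordinary windowed periodogram. This is not the original program's circular-convolution implementation or its ACSL interface; it simply estimates the PDS of a Van der Pol time history with the FFT.

    import numpy as np
    from scipy.integrate import solve_ivp

    mu = 1.0
    def van_der_pol(t, y):
        return [y[1], mu*(1 - y[0]**2)*y[1] - y[0]]

    dt, T = 0.01, 200.0
    t_eval = np.arange(0.0, T, dt)
    sol = solve_ivp(van_der_pol, (0.0, T), [1.0, 0.0], t_eval=t_eval, rtol=1e-8)
    x = sol.y[0][len(t_eval)//2:]                   # discard start-up transient

    window = np.hanning(len(x))
    X = np.fft.rfft(x * window)
    psd = (np.abs(X)**2) * dt / np.sum(window**2)   # periodogram scaling
    freqs = np.fft.rfftfreq(len(x), dt)
    print(f"dominant frequency: {freqs[psd.argmax()]:.3f} Hz "
          f"(limit-cycle period = {1.0/freqs[psd.argmax()]:.2f} s)")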

  14. Advanced hydrogen/oxygen thrust chamber design analysis

    NASA Technical Reports Server (NTRS)

    Shoji, J. M.

    1973-01-01

    The results of the advanced hydrogen/oxygen thrust chamber design analysis program are reported. The primary objectives of this program were to: (1) provide an in-depth analytical investigation to develop thrust chamber cooling and fatigue life limitations of an advanced, high pressure, high performance H2/O2 engine design of 20,000-pounds (88960.0 N) thrust; and (2) integrate the existing heat transfer analysis, thermal fatigue, and stress aspects for advanced chambers into a comprehensive computer program. Thrust chamber designs and analyses were performed to evaluate various combustor materials, coolant passage configurations (tubes and channels), and cooling circuits to define the nominal 1900 psia (1.31 x 10 to the 7th power N/sq m) chamber pressure, 300-cycle life thrust chamber. The cycle life capability of the selected configuration was then determined for three duty cycles. Also, the influence of cycle life and chamber pressure on thrust chamber design was investigated by varying the cycle life requirements at the nominal chamber pressure and by varying the chamber pressure at the nominal cycle life requirement.

  15. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  16. Advanced Scientific Computing Environment Team new scientific database management task

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the future "computer" will be a network environment that comprises supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database managing system(s) that permits use of relational, hierarchical, object oriented, GIS, et al, databases. To reach this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards, development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms, development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  17. Advancing Behavior Analysis in Zoos and Aquariums.

    PubMed

    Maple, Terry L; Segura, Valerie D

    2015-05-01

    Zoos, aquariums, and other captive animal facilities offer promising opportunities to advance the science and practice of behavior analysis. Zoos and aquariums are necessarily concerned with the health and well-being of their charges and are held to a high standard by their supporters (visitors, members, and donors), organized critics, and the media. Zoos and aquariums offer unique venues for teaching and research and a locus for expanding the footprint of behavior analysis. In North America, Europe, and the UK, formal agreements between zoos, aquariums, and university graduate departments have been operating successfully for decades. To expand on this model, it will be necessary to help zoo and aquarium managers throughout the world to recognize the value of behavior analysis in the delivery of essential animal health and welfare services. Academic institutions, administrators, and invested faculty should consider the utility of training students to meet the growing needs of applied behavior analysis in zoos and aquariums and other animal facilities such as primate research centers, sanctuaries, and rescue centers. PMID:27540508

  18. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for computational science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations

  19. Statistical energy analysis computer program, user's guide

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1981-01-01

    A high-frequency random vibration analysis method, the statistical energy analysis (SEA) method, is examined. The SEA method accomplishes high-frequency response prediction for arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and a complete program listing are presented.
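
    The core of an SEA calculation is a power balance solved for subsystem energies. The sketch below is the standard textbook two-subsystem form, not the program described above; all loss factors, modal densities, and input powers are made up.

    import numpy as np

    omega = 2*np.pi*1000.0        # band center frequency, rad/s
    eta1, eta2 = 0.01, 0.02       # damping loss factors
    eta12 = 0.005                 # coupling loss factor, subsystem 1 -> 2
    n1, n2 = 20.0, 35.0           # modal densities (modes per band)
    eta21 = eta12 * n1 / n2       # SEA reciprocity relation

    P_in = np.array([1.0, 0.0])   # input power: 1 W into subsystem 1 only

    # Power balance: omega * [(eta1+eta12)E1 - eta21 E2, -eta12 E1 + (eta2+eta21)E2] = P_in
    A = omega * np.array([[eta1 + eta12, -eta21],
                          [-eta12,       eta2 + eta21]])
    E = np.linalg.solve(A, P_in)
    print("subsystem energies (J):", E)
    print("energy ratio E2/E1:", E[1]/E[0])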

  20. Use of Some "Discriminant Analysis" Computer Programs

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    1977-01-01

    The objective of this paper is to review the outputs of selected computer programs often used to carry out a "discriminant analysis" with respect to two purposes of such an analysis, discrimination and classification. The programs selected are three BMD programs. (Author/JKS)

  1. Advancing Usability Evaluation through Human Reliability Analysis

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2005-07-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
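
    The quantification idea described above can be sketched as a nominal error probability adjusted by multipliers attached to usability heuristics, treated as SPAR-H-style performance shaping factors. The nominal value and the multipliers below are hypothetical, not the paper's calibrated numbers.

    # Hedged UEP sketch: nominal probability times heuristic-derived multipliers.
    NOMINAL_ERROR_PROB = 0.01

    # Heuristic -> multiplier (>1 degrades performance, 1.0 is neutral).
    psf_multipliers = {
        "visibility of system status": 2.0,    # status hidden from the user
        "match with the real world":   1.0,    # terminology judged adequate
        "error prevention":            5.0,    # destructive action unconfirmed
        "help and documentation":      1.0,
    }

    uep = NOMINAL_ERROR_PROB
    for heuristic, multiplier in psf_multipliers.items():
        uep *= multiplier
    uep = min(uep, 1.0)                        # probabilities cannot exceed 1

    print(f"usability error probability (UEP): {uep:.3f}")
    for h, m in sorted(psf_multipliers.items(), key=lambda kv: -kv[1]):
        if m > 1.0:
            print(f"  prioritize: {h} (x{m})")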

  2. NASTRAN flutter analysis of advanced turbopropellers

    NASA Technical Reports Server (NTRS)

    Elchuri, V.; Smith, G. C. C.

    1982-01-01

    An existing capability developed to conduct modal flutter analysis of tuned bladed-shrouded discs in NASTRAN was modified and applied to investigate the subsonic unstalled flutter characteristics of advanced turbopropellers. The modifications pertain to the inclusion of oscillatory modal aerodynamic loads of blades with large (backward and forward) variable sweep. The two dimensional subsonic cascade unsteady aerodynamic theory was applied in a strip theory manner with appropriate modifications for the sweep effects. Each strip is associated with a chord selected normal to any spanwise reference curve such as the blade leading edge. The stability of three operating conditions of a 10-bladed propeller is analyzed. Each of these operating conditions is iterated once to determine the flutter boundary. A 5-bladed propeller is also analyzed at one operating condition to investigate stability. Analytical results obtained are in very good agreement with those from wind tunnel tests.

  3. Advanced development in chemical analysis of Cordyceps.

    PubMed

    Zhao, J; Xie, J; Wang, L Y; Li, S P

    2014-01-01

    Cordyceps sinensis, also called DongChongXiaCao (winter worm summer grass) in Chinese, is a well-known and valued traditional Chinese medicine. In 2006, we wrote a review discussing the markers and analytical methods in quality control of Cordyceps (J. Pharm. Biomed. Anal. 41 (2006) 1571-1584). Since then this review has been cited more than 60 times, which suggests that scientists have great interest in this special herbal material. Indeed, the number of publications related to Cordyceps after 2006 is about 2-fold that of the two decades before 2006, according to data from the Web of Science. Therefore, it is necessary to review and discuss the advanced development in chemical analysis of Cordyceps since then. PMID:23688494

  4. Automating sensitivity analysis of computer models using computer calculus

    SciTech Connect

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs.
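
    The idea behind a derivative-propagating compiler can be illustrated with forward-mode "computer calculus" in a few lines. The sketch below is a toy dual-number class in Python, not the GRESS FORTRAN instrumentation itself; the model function is an arbitrary stand-in.

    # Each Dual carries a value and its derivative with respect to a chosen input;
    # arithmetic operators propagate both through the calculation.
    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)  # product rule
        __rmul__ = __mul__

    def model(k):
        # Stand-in for an iterative model calculation whose sensitivity is wanted.
        y = Dual(1.0)
        for _ in range(3):
            y = y + k * y * 0.1      # three explicit growth steps
        return y

    k = Dual(2.0, 1.0)               # seed dy/dk = 1 on the input of interest
    result = model(k)
    print(f"response = {result.value:.4f}, d(response)/dk = {result.deriv:.4f}")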

  5. Modeling and analysis of advanced binary cycles

    SciTech Connect

    Gawlik, K.

    1997-12-31

    A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows for incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied spanning the range of 265°F to 375°F. A variety of isobutane and propane based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.
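
    The incremental-LEC comparison at the heart of such a value analysis can be sketched with a simple fixed-charge-rate formulation. Every number below is illustrative; this is not CAST's cost model or its results.

    def lec(capital_cost, om_per_year, net_kw, capacity_factor, fixed_charge_rate):
        """Levelized cost, $/kWh, from annualized capital plus O&M."""
        annual_kwh = net_kw * 8760.0 * capacity_factor
        return (capital_cost * fixed_charge_rate + om_per_year) / annual_kwh

    baseline = lec(capital_cost=30e6, om_per_year=0.9e6, net_kw=10_000,
                   capacity_factor=0.90, fixed_charge_rate=0.11)
    # Modified plant: a mixed working fluid improves output for a small cost adder.
    modified = lec(capital_cost=30.6e6, om_per_year=0.9e6, net_kw=11_500,
                   capacity_factor=0.90, fixed_charge_rate=0.11)
    print(f"baseline LEC: {baseline*100:.2f} cents/kWh")
    print(f"modified LEC: {modified*100:.2f} cents/kWh "
          f"({(modified-baseline)/baseline:+.1%})")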

  6. Computer aided nonlinear electrical networks analysis

    NASA Technical Reports Server (NTRS)

    Slapnicar, P.

    1977-01-01

    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.

  7. Composite Structure Modeling and Analysis of Advanced Aircraft Fuselage Concepts

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Sorokach, Michael R.

    2015-01-01

    The NASA Environmentally Responsible Aviation (ERA) project and the Boeing Company are collaborating to advance the unitized damage arresting composite airframe technology with application to the Hybrid-Wing-Body (HWB) aircraft. The testing of a HWB fuselage section with Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) construction is presently being conducted at NASA Langley. Based on lessons learned from previous HWB structural design studies, improved finite-element models (FEM) of the HWB multi-bay and bulkhead assembly are developed to evaluate the performance of the PRSEUS construction. In order to assess the comparative weight reduction benefits of the PRSEUS technology, conventional skin-stringer-frame models of cylindrical and double-bubble section fuselage concepts are developed. Stress analysis with design cabin-pressure load and scenario-based case studies are conducted for design improvement in each case. Alternate analyses with stitched composite hat-stringers and C-frames are also presented, in addition to the foam-core sandwich frame and pultruded rod-stringer construction. The FEM structural stress, strain, and weights are computed and compared for relative weight/strength benefit assessment. The structural analysis and specific weight comparison of these stitched composite advanced aircraft fuselage concepts demonstrated that the pressurized HWB fuselage section assembly can be structurally as efficient as the conventional cylindrical fuselage section with composite stringer-frame and PRSEUS construction, and significantly better than the conventional aluminum construction and the double-bubble section concept.

  8. Parallel biomolecular computation on surfaces with advanced finite automata.

    PubMed

    Soreni, Michal; Yogev, Sivan; Kossoy, Elizaveta; Shoham, Yuval; Keinan, Ehud

    2005-03-23

    A biomolecular, programmable 3-symbol-3-state finite automaton is reported. This automaton computes autonomously with all of its components, including hardware, software, input, and output being biomolecules mixed together in solution. The hardware consisted of two enzymes: an endonuclease, BbvI, and T4 DNA ligase. The software (transition rules represented by transition molecules) and the input were double-stranded (ds) DNA oligomers. Computation was carried out by autonomous processing of the input molecules via repetitive cycles of restriction, hybridization, and ligation reactions to produce a final-state output in the form of a dsDNA molecule. The 3-symbol-3-state deterministic automaton is an extension of the 2-symbol-2-state automaton previously reported, and theoretically it can be further expanded to a 37-symbol-3-state automaton. The applicability of this design was further amplified by employing surface-anchored input molecules, using the surface plasmon resonance technology to monitor the computation steps in real time. Computation was performed by alternating the feed solutions between endonuclease and a solution containing the ligase, ATP, and appropriate transition molecules. The output detection involved final ligation with one of three soluble detection molecules. Parallel computation and stepwise detection were carried out automatically with a Biacore chip that was loaded with four different inputs. PMID:15771530
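
    The abstract computational model realized biochemically above is an ordinary deterministic finite automaton. The sketch below shows the mechanics of a 3-symbol, 3-state DFA; the restriction/ligation chemistry is not modeled, and the states, symbols, transition table, and accepting set are arbitrary stand-ins.

    states = {"S0", "S1", "S2"}
    symbols = {"a", "b", "c"}
    # (state, symbol) -> next state; one entry per pair, as in a full DFA.
    transitions = {
        ("S0", "a"): "S1", ("S0", "b"): "S0", ("S0", "c"): "S2",
        ("S1", "a"): "S2", ("S1", "b"): "S1", ("S1", "c"): "S0",
        ("S2", "a"): "S0", ("S2", "b"): "S2", ("S2", "c"): "S1",
    }
    accepting = {"S0"}

    def run(word, start="S0"):
        state = start
        for sym in word:               # one restriction/ligation cycle per symbol
            state = transitions[(state, sym)]
        return state, state in accepting

    for w in ["abc", "aab", "ccc"]:
        final, ok = run(w)
        print(f"input {w!r}: final state {final}, accepted={ok}")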

  9. Computer aided stress analysis of long bones utilizing computer tomography

    SciTech Connect

    Marom, S.A.

    1986-01-01

    A computer aided analysis method utilizing computed tomography (CT) has been developed which, together with a finite element program, determines the stress-displacement pattern in a long bone section. The CT data file provides the geometry, the density, and the material properties for the generated finite element model. A three-dimensional finite element model of a tibial shaft is automatically generated from the CT file by a pre-processing procedure for a finite element program. The developed pre-processor includes an edge detection algorithm which determines the boundaries of the reconstructed cross-sectional images of the scanned bone. A mesh generation procedure then automatically generates a three-dimensional mesh of user-selected refinement. The elastic properties needed for the stress analysis are individually determined for each model element using the radiographic density (CT number) of each pixel within the elemental borders. The elastic modulus is determined from the CT radiographic density by using an empirical relationship from the literature. The generated finite element model, together with applied loads determined from existing gait analysis and initial displacements, comprises a formatted input for the SAP IV finite element program. The output of this program, stresses and displacements at the model elements and nodes, is sorted and displayed by a developed post-processor to provide maximum and minimum values at selected locations in the model.
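
    The per-element property assignment described above can be sketched as a two-step mapping: CT numbers averaged over an element are converted to apparent density, and density to an elastic modulus through an empirical power law. The calibration constants below are hypothetical placeholders of the form commonly reported in the bone literature, not the relationship used in the paper.

    import numpy as np

    ct_numbers = np.array([[350, 420, 510],       # Hounsfield units for the pixels
                           [600, 640, 700]])      # falling inside two elements

    def ct_to_density(hu):
        # Hypothetical linear calibration HU -> apparent density (g/cm^3).
        return 1.0 + 0.001 * hu

    def density_to_modulus(rho):
        # Hypothetical power law E = a * rho^b (MPa).
        a, b = 3790.0, 3.0
        return a * rho**b

    for i, element_pixels in enumerate(ct_numbers):
        rho = ct_to_density(element_pixels.mean())
        E = density_to_modulus(rho)
        print(f"element {i}: mean HU={element_pixels.mean():.0f}, "
              f"rho={rho:.2f} g/cm^3, E={E:.0f} MPa")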

  10. Computer analysis of foetal monitoring signals.

    PubMed

    Nunes, Inês; Ayres-de-Campos, Diogo

    2016-01-01

    Five systems for computer analysis of foetal monitoring signals are currently available, incorporating the evaluation of cardiotocographic (CTG) or combined CTG with electrocardiographic ST data. All systems have been integrated with central monitoring stations, allowing the simultaneous monitoring of several tracings on the same computer screen in multiple hospital locations. Computer analysis elicits real-time visual and sound alerts for health care professionals when abnormal patterns are detected, with the aim of prompting a re-evaluation and subsequent clinical action, if considered necessary. Comparison between the CTG analyses provided by the computer and clinical experts has been carried out in all systems, and in three of them, the accuracy of computer alerts in predicting newborn outcomes was evaluated. Comparisons between these studies are hampered by the differences in selection criteria and outcomes. Two of these systems have just completed multicentre randomised clinical trials comparing them with conventional CTG monitoring, and their results are awaited shortly. For the time being, there is limited evidence regarding the impact of computer analysis of foetal monitoring signals on perinatal indicators and on health care professionals' behaviour. PMID:26211832

  11. Some recent advances in computational aerodynamics for helicopter applications

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.; Baeder, J. D.

    1985-01-01

    The growing application of computational aerodynamics to nonlinear helicopter problems is outlined, with particular emphasis on several recent quasi-two-dimensional examples that used the thin-layer Navier-Stokes equations and an eddy-viscosity model to approximate turbulence. Rotor blade section characteristics can now be calculated accurately over a wide range of transonic flow conditions. However, a finite-difference simulation of the complete flow field about a helicopter in forward flight is not currently feasible, despite the impressive progress that is being made in both two and three dimensions. The principal limitations are today's computer speeds and memories, algorithm and solution methods, grid generation, vortex modeling, structural and aerodynamic coupling, and a shortage of engineers who are skilled in both computational fluid dynamics and helicopter aerodynamics and dynamics.

  12. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  13. Development and Validation of a Fast, Accurate and Cost-Effective Aeroservoelastic Method on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Goodwin, Sabine A.; Raj, P.

    1999-01-01

    Progress to date towards the development and validation of a fast, accurate and cost-effective aeroelastic method for advanced parallel computing platforms such as the IBM SP2 and the SGI Origin 2000 is presented in this paper. The ENSAERO code, developed at the NASA-Ames Research Center has been selected for this effort. The code allows for the computation of aeroelastic responses by simultaneously integrating the Euler or Navier-Stokes equations and the modal structural equations of motion. To assess the computational performance and accuracy of the ENSAERO code, this paper reports the results of the Navier-Stokes simulations of the transonic flow over a flexible aeroelastic wing body configuration. In addition, a forced harmonic oscillation analysis in the frequency domain and an analysis in the time domain are done on a wing undergoing a rigid pitch and plunge motion. Finally, to demonstrate the ENSAERO flutter-analysis capability, aeroelastic Euler and Navier-Stokes computations on an L-1011 wind tunnel model including pylon, nacelle and empennage are underway. All computational solutions are compared with experimental data to assess the level of accuracy of ENSAERO. As the computations described above are performed, a meticulous log of computational performance in terms of wall clock time, execution speed, memory and disk storage is kept. Code scalability is also demonstrated by studying the impact of varying the number of processors on computational performance on the IBM SP2 and the Origin 2000 systems.
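
    For readers unfamiliar with how such performance logs are reduced, the sketch below turns wall-clock timings into speedup and parallel-efficiency figures; the timings are invented placeholders, not ENSAERO measurements.

        # Speedup S(p) = T(1)/T(p) and efficiency E(p) = S(p)/p from a
        # wall-clock log. All timings below are invented placeholders.
        timings = {1: 5400.0, 2: 2810.0, 4: 1490.0, 8: 830.0, 16: 510.0}

        base = timings[1]
        for p in sorted(timings):
            speedup = base / timings[p]
            print(f"{p:3d} procs: speedup {speedup:5.2f}, "
                  f"efficiency {speedup / p:5.2f}")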

  14. Advanced Computational Aeroacoustics Methods for Fan Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane (Technical Monitor); Tam, Christopher

    2003-01-01

    Direct computation of fan noise is presently not possible. One of the major difficulties is the geometrical complexity of the problem. In the case of fan noise, the blade geometry is critical to the loading on the blade and hence the intensity of the radiated noise. The precise geometry must be incorporated into the computation. In computational fluid dynamics (CFD), there are two general ways to handle problems with complex geometry. One way is to use unstructured grids. The other is to use body fitted overset grids. In the overset grid method, accurate data transfer is of utmost importance. For acoustic computation, it is not clear that the currently used data transfer methods are sufficiently accurate as not to contaminate the very small amplitude acoustic disturbances. In CFD, low order schemes are, invariably, used in conjunction with unstructured grids. However, low order schemes are known to be numerically dispersive and dissipative. Dissipative errors are extremely undesirable for acoustic wave problems. The objective of this project is to develop a high order unstructured grid Dispersion-Relation-Preserving (DRP) scheme that would minimize numerical dispersion and dissipation errors. This report contains the results of the funded portion of the project. A high order DRP scheme on an unstructured grid has been developed; the scheme is constructed in the wave number space. The characteristics of the scheme can be improved by the inclusion of additional constraints. Stability of the scheme has been investigated. Stability can be improved by adopting the upwinding strategy.
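
    To make the wave-number-space construction concrete, the sketch below compares the numerical wavenumber of a standard 7-point central stencil with that of the classic structured-grid Tam-Webb DRP stencil. It is a one-dimensional illustration of the optimization principle only, not the unstructured-grid scheme developed in this project, and the DRP coefficients are quoted as commonly tabulated.

        import numpy as np

        # Numerical wavenumber of a 7-point antisymmetric stencil:
        # kbar*dx = 2 * sum_j a_j * sin(j*k*dx)
        std = [3/4, -3/20, 1/60]                     # standard 6th-order
        drp = [0.79926643, -0.18941314, 0.02651995]  # Tam-Webb DRP (as tabulated)

        kdx = np.linspace(0.01, np.pi, 300)

        def kbar(coeffs):
            return 2 * sum(a * np.sin((j + 1) * kdx)
                           for j, a in enumerate(coeffs))

        # Largest k*dx resolved to within 1% relative error: the DRP stencil
        # follows the exact relation kbar = k over a wider wavenumber band.
        for name, c in (("std", std), ("drp", drp)):
            ok = np.abs(kbar(c) - kdx) < 0.01 * kdx
            print(name, "resolves k*dx up to %.2f" % kdx[ok].max())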

  15. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  16. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous, event- and data-driven environment. A large-grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic allocation of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test, and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
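
    A toy version of the redundant-execution feature is sketched below: the same task runs on several computers and the results are voted in software. This is a simplified majority vote, not MAX's actual Byzantine-resilient protocol.

        from collections import Counter

        def vote(results):
            # Majority vote over redundant task results; a simplified
            # stand-in for software voting of critical tasks.
            winner, count = Counter(results).most_common(1)[0]
            if count <= len(results) // 2:
                raise RuntimeError("no majority among redundant results")
            return winner

        # Three redundant executions of a critical task, one of them faulty:
        print(vote([42, 42, 41]))  # -> 42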

  17. 77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-12

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY DOE.... Computational Science Graduate Fellowship (CSGF) Longitudinal Study. Update on Exascale. Update from DOE data... contact Melea Baker, (301) 903-7486 or by email at: Melea.Baker@science.doe.gov . You must make...

  18. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  19. Discrete computer analysis in petroleum geology

    SciTech Connect

    Zakharian, A.Z.

    1995-08-01

    Computer analysis should not simply mimic the geologist's work; it must follow its own approach, because geological information is uncertain and scarce even at a mature stage of exploration. Our original system of formal discrete computer analysis, implemented in "FoxPro for Windows" with a probabilistic rather than deterministic representation of the geological situation (without ever drawing the usual maps), was used to pick out the sets of best points for exploration drilling in the southern part of the Dheprovsko-Donetzky oil-gas basin.

  20. Temporal fringe pattern analysis with parallel computing

    SciTech Connect

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-11-20

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution times were reduced by a factor of 1.6 when four virtual processors were used. To achieve even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to be a feasible approach to reducing execution times in temporal fringe pattern analysis.
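
    A minimal single-program multiple-data sketch is given below: the temporal stack of fringe frames is split across worker processes, with a placeholder standing in for the per-frame fringe analysis. It illustrates the parallelization pattern only, not the authors' cluster configuration.

        import numpy as np
        from multiprocessing import Pool

        def analyze_frame(frame):
            # Placeholder for the per-frame fringe analysis
            # (e.g., phase extraction at each pixel).
            return float(np.mean(frame))

        if __name__ == "__main__":
            frames = [np.random.rand(256, 256) for _ in range(64)]  # temporal stack
            with Pool(processes=4) as pool:   # SPMD: same code, disjoint data
                results = pool.map(analyze_frame, frames, chunksize=16)
            print(len(results), "frames processed")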

  1. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  2. Integrated computer aided planning and manufacture of advanced technology jet engines

    NASA Astrophysics Data System (ADS)

    Subhas, B. K.; George, Chacko; Arul Raj, A.

    1987-10-01

    This paper highlights an attempt at evolving a computer aided manufacturing system on a personal computer. A case study of an advanced technology jet engine component is included to illustrate various outputs from the system. The proposed system could be an alternate solution to sophisticated and expensive CAD/CAM workstations.

  3. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory/university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for technology development of the concepts and their safety systems.

  4. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Computers.

    ERIC Educational Resources Information Center

    Ellis, Brenda

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 3-hour introduction to computers. The purpose is to develop the following competencies: (1) orientation to data processing; (2) use of data entry devices; (3) use of computer menus; and (4) entry of data with accuracy and…

  5. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  6. UNEDF: Advanced Scientific Computing Transforms the Low-Energy Nuclear Many-Body Problem

    SciTech Connect

    Stoitsov, Mario; Nam, Hai Ah; Nazarewicz, Witold; Bulgac, Aurel; Hagen, Gaute; Kortelainen, E. M.; Pei, Junchen; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S.

    2011-01-01

    The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper illustrates significant milestones accomplished by UNEDF through integration of the theoretical approaches, advanced numerical algorithms, and leadership class computational resources.

  7. Advanced Coal Wind Hybrid: Economic Analysis

    SciTech Connect

    Phadke, Amol; Goldman, Charles; Larson, Doug; Carr, Tom; Rath, Larry; Balash, Peter; Yih-Huei, Wan

    2008-11-28

    Growing concern over climate change is prompting new thinking about the technologies used to generate electricity. In the future, it is possible that new government policies on greenhouse gas emissions may favor electric generation technology options that release zero or low levels of carbon emissions. The Western U.S. has abundant wind and coal resources. In a world with carbon constraints, the future of coal for new electrical generation is likely to depend on the development and successful application of new clean coal technologies with near zero carbon emissions. This scoping study explores the economic and technical feasibility of combining wind farms with advanced coal generation facilities and operating them as a single generation complex in the Western US. The key questions examined are whether an advanced coal-wind hybrid (ACWH) facility provides sufficient advantages through improvements to the utilization of transmission lines and the capability to firm up variable wind generation for delivery to load centers to compete effectively with other supply-side alternatives in terms of project economics and emissions footprint. The study was conducted by an Analysis Team that consists of staff from the Lawrence Berkeley National Laboratory (LBNL), National Energy Technology Laboratory (NETL), National Renewable Energy Laboratory (NREL), and Western Interstate Energy Board (WIEB). We conducted a screening level analysis of the economic competitiveness and technical feasibility of ACWH generation options located in Wyoming that would supply electricity to load centers in California, Arizona or Nevada. Figure ES-1 is a simple stylized representation of the configuration of the ACWH options. The ACWH consists of a 3,000 MW coal gasification combined cycle power plant equipped with carbon capture and sequestration (G+CC+CCS plant), a fuel production or syngas storage facility, and a 1,500 MW wind plant. The ACWH project is connected to load centers by a 3,000 MW

  8. The ergonomics of computer aided design within advanced manufacturing technology.

    PubMed

    John, P A

    1988-03-01

    Many manufacturing companies have now awakened to the significance of computer aided design (CAD), although the majority of them have only been able to purchase computerised draughting systems of which only a subset produce direct manufacturing data. Such companies are moving steadily towards the concept of computer integrated manufacture (CIM), and this demands CAD to address more than draughting. CAD architects are thus having to rethink the basic specification of such systems, although they typically suffer from an insufficient understanding of the design task and have consequently been working with inadequate specifications. It is at this fundamental level that ergonomics has much to offer, making its contribution by encouraging user-centred design. The discussion considers the relationships between CAD and: the design task; the organisation and people; creativity; and artificial intelligence. It finishes with a summary of the contribution of ergonomics. PMID:15676646

  9. Advanced Techniques for Root Cause Analysis

    2000-09-19

    Five items make up this package; they can also be used individually. The Chronological Safety Management Template utilizes a linear adaptation of the Integrated Safety Management System laid out in the form of a template that greatly enhances the ability of the analyst to perform the first step of any investigation, which is to gather all pertinent facts and identify causal factors. The Problem Analysis Tree is a simple three-level problem analysis tree which is easier for organizations outside of WSRC to use. Another part is the Systemic Root Cause Tree. One of the most basic and unique features of Expanded Root Cause Analysis is the Systemic Root Cause portion of the Expanded Root Cause Pyramid. The Systemic Root Causes are even more basic than the Programmatic Root Causes and represent root causes that cut across multiple (if not all) programs in an organization. The Systemic Root Cause portion contains 51 causes embedded at the bottom level of a three-level Systemic Root Cause Tree that is divided into logical, organizationally based categories to assist the analyst. The Computer Aided Root Cause Analysis component allows the analyst at each level of the Pyramid to (a) obtain a brief description of the cause that is being considered, (b) record a decision that the item is applicable, (c) proceed to the next level of the Pyramid to see only those items at the next level of the tree that are relevant to the particular cause that has been chosen, and (d) at the end of the process automatically print out a summary report of the incident, the causal factors as they relate to the safety management system, and the probable causes, apparent causes, Programmatic Root Causes, and Systemic Root Causes for each causal factor and the associated corrective action.

  10. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  11. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    SciTech Connect

    Nam, H.; Stoitsov, M.; Nazarewicz, W.; Bulgac, A.; Hagen, G.; Kortelainen, M.; Maris, P.; Pei, J. C.; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2012-12-20

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. Finally, we illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  12. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data”, which now reaches terabytes and petabytes in size. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that scientists can consider when running their genomic experiments in order to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  13. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools to help power market participants reduce price risks associated with transmission congestion. FTRs are issued through a process of solving a constrained optimization problem with the objective of maximizing the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal, or annual) are usually coupled, and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, an innovative mathematical reformulation of the FTR problem is first presented which dramatically improves the computational efficiency of the optimization problem. After reformulating the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve the optimization problem. The new formulation and the performance of the NDS solver are benchmarked against widely used linear programming (LP) solvers such as CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable to, and in some cases better than, the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
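
    The underlying auction problem can be stated compactly as a linear program. The toy sketch below awards FTR quantities to maximize bid-weighted welfare subject to line-flow limits expressed through a PTDF matrix; all numbers are invented, and it uses an off-the-shelf LP solver rather than the paper's reformulation or NDS solver.

        import numpy as np
        from scipy.optimize import linprog

        bids = np.array([30.0, 22.0, 15.0])      # $/MW bids on three FTR paths
        ptdf = np.array([[0.5, 0.3, -0.2],       # MW on line 1 per MW of each path
                         [0.2, 0.6, 0.4]])       # MW on line 2
        limits = np.array([100.0, 120.0])        # line ratings, MW

        res = linprog(c=-bids,                        # maximize => minimize -c
                      A_ub=np.vstack([ptdf, -ptdf]),  # limits in both directions
                      b_ub=np.concatenate([limits, limits]),
                      bounds=[(0.0, 200.0)] * 3)      # per-bid quantity caps
        print("awards (MW):", res.x, " welfare ($):", -res.fun)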

  14. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and to process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT provides a means of investigating combinations of such subsystem technologies and thereby assists in determining the most cost-effective technology combination available. ALSSAT can perform sizing analysis of ALS subsystems whether they operate dynamically or in steady state. Using Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies obtained by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
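
    The Equivalent System Mass figure of merit that ALSSAT reports can be illustrated with a short sketch: mass, volume, power, cooling, and crew time are collapsed into one mass-like number through mission-specific equivalency factors. The factors below are illustrative placeholders, not ALSSAT's values.

        def esm(mass_kg, volume_m3, power_kw, cooling_kw, crewtime_hr,
                v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=0.5):
            # Equivalency factors (kg per m^3, kg per kW, kg per crew-hour)
            # are placeholders; real values depend on the mission scenario.
            return (mass_kg
                    + volume_m3 * v_eq
                    + power_kw * p_eq
                    + cooling_kw * c_eq
                    + crewtime_hr * ct_eq)

        print(esm(mass_kg=500, volume_m3=2.5, power_kw=1.2,
                  cooling_kw=1.2, crewtime_hr=80))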

  15. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  16. Computer based terrain analysis for operational planning

    SciTech Connect

    Powell, D.R.

    1987-01-01

    Analysis of operational capability is an ongoing task for military commanders. In peacetime, most analysis is conducted via computer based combat simulations, where selected force structures engage in simulated combat to gain insight into specific scenarios. The command and control (C2) mechanisms that direct combat forces are often neglected relative to the fidelity of representation of mechanical and physical entities. C2 capabilities should include the ability to plan a mission, monitor execution activities, and redirect combat power when appropriate. This paper discusses the development of a computer based approach to mission planning for land warfare. The aspect emphasized is the computation and representation of relevant terrain features in the context of operational planning.

  17. Parametric cost analysis for advanced energy concepts

    SciTech Connect

    Not Available

    1983-10-01

    This report presents results of an exploratory study to develop parametric cost estimating relationships for advanced fossil-fuel energy systems. The first of two tasks was to develop a standard Cost Chart of Accounts to serve as a basic organizing framework for energy systems cost analysis. The second task included development of selected parametric cost estimating relationships (CERs) for individual elements (or subsystems) of a fossil fuel plant, nominally for the Solvent-Refined Coal (SRC) process. Parametric CERs are presented for the following elements: coal preparation; coal slurry preparation; dissolver (reactor); gasification; oxygen production; acid gas/CO2 removal; shift conversion; cryogenic hydrogen recovery; and sulfur removal. While the nominal focus of the study was on the SRC process, each of these elements is found in other fossil fuel processes. Thus, the results of this effort have broader potential application. However, it should also be noted that the CERs presented in this report are based upon a limited data base. Thus, they are applicable over a limited range of values (of the independent variables) and for a limited set of specific technologies (e.g., the gasifier CER is for the multi-train, Koppers-Totzek process). Additional work is required to extend the range of these CERs. 16 figures, 13 tables.

  18. Final Report Computational Analysis of Dynamical Systems

    SciTech Connect

    Guckenheimer, John

    2012-05-08

    This is the final report for DOE Grant DE-FG02-93ER25164, initiated in 1993. This grant supported research of John Guckenheimer on computational analysis of dynamical systems. During that period, seventeen individuals received PhD degrees under the supervision of Guckenheimer and over fifty publications related to the grant were produced. This document contains copies of these publications.

  19. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory

    T Martonen (1) and J Schroeter (2)

    (1) Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and (2) Curriculum in Toxicology, Unive...

  20. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect

    Moore, Kevin L. Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  1. Computational analysis of a multistage axial compressor

    NASA Astrophysics Data System (ADS)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in the aerospace, power generation, and oil & gas industries. Efficiency of these machines is often an important factor and has led to a continuous effort to improve designs to achieve better efficiency. The axial flow compressor is a major component in a gas turbine, and the turbine's overall performance depends strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. Methodology is described for steady-state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, and tip clearance, as well as numerical issues such as turbulence model choice, advection model choice, and parallel processing performance, are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations are given for the flow features observed in the computational study. The total pressure rise versus mass flow rate was computed.

  2. Beam Optics Analysis - An Advanced 3D Trajectory Code

    SciTech Connect

    Ives, R. Lawrence; Bui, Thuc; Vogler, William; Neilson, Jeff; Read, Mike; Shephard, Mark; Bauer, Andrew; Datta, Dibyendu; Beal, Mark

    2006-01-03

    Calabazas Creek Research, Inc. has completed initial development of an advanced, 3D program for modeling electron trajectories in electromagnetic fields. The code is being used to design complex guns and collectors. Beam Optics Analysis (BOA) is a fully relativistic, charged particle code using adaptive, finite element meshing. Geometrical input is imported from CAD programs generating ACIS-formatted files. Parametric data is input using an intuitive, graphical user interface (GUI), which also provides control of convergence, accuracy, and post-processing. The program includes a magnetic field solver, and magnetic information can be imported from Maxwell 2D/3D and other programs. The program supports thermionic emission and injected beams. Secondary electron emission is also supported, including multiple generations. Work on field emission is in progress, as is implementation of computer optimization of both the geometry and operating parameters. The principal features of the program and its capabilities are presented.

  3. Beam Optics Analysis — An Advanced 3D Trajectory Code

    NASA Astrophysics Data System (ADS)

    Ives, R. Lawrence; Bui, Thuc; Vogler, William; Neilson, Jeff; Read, Mike; Shephard, Mark; Bauer, Andrew; Datta, Dibyendu; Beal, Mark

    2006-01-01

    Calabazas Creek Research, Inc. has completed initial development of an advanced, 3D program for modeling electron trajectories in electromagnetic fields. The code is being used to design complex guns and collectors. Beam Optics Analysis (BOA) is a fully relativistic, charged particle code using adaptive, finite element meshing. Geometrical input is imported from CAD programs generating ACIS-formatted files. Parametric data is input using an intuitive, graphical user interface (GUI), which also provides control of convergence, accuracy, and post-processing. The program includes a magnetic field solver, and magnetic information can be imported from Maxwell 2D/3D and other programs. The program supports thermionic emission and injected beams. Secondary electron emission is also supported, including multiple generations. Work on field emission is in progress, as is implementation of computer optimization of both the geometry and operating parameters. The principal features of the program and its capabilities are presented.

  4. An integrated computer system for preliminary design of advanced aircraft.

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Sobieszczanski, J.; Landrum, E. J.

    1972-01-01

    A progress report is given on the first phase of a research project to develop a system of Integrated Programs for Aerospace-Vehicle Design (IPAD) which is intended to automate to the largest extent possible the preliminary and detailed design of advanced aircraft. The approach used is to build a pilot system and simultaneously to carry out two major contractual studies to define a practical IPAD system preparatory to programming. The paper summarizes the specifications and goals of the IPAD system, the progress to date, and the conclusions reached so far regarding its feasibility and scope. Sample calculations obtained with the pilot system are given for aircraft preliminary designs optimized with respect to discipline parameters, such as weight or L/D, and these results are compared with designs optimized with respect to overall performance parameters, such as range or payload.

  5. Nonlinear displacement analysis of advanced propeller structures using NASTRAN

    NASA Technical Reports Server (NTRS)

    Lawrence, C.; Kielb, R. E.

    1984-01-01

    The steady state displacements of a rotating advanced turboprop are computed using the geometrically nonlinear capabilities of COSMIC NASTRAN Rigid Format 4 and MSC NASTRAN Solution 64. A description of the modified Newton-Raphson algorithm used by Solution 64 and the iterative scheme used by Rigid Format 4 is provided. A representative advanced turboprop, SR3, was used for the study. Displacements for SR3 are computed for rotational speeds up to 10,000 rpm. The results show Solution 64 to be superior for computing displacements of flexible rotating structures. This is attributed to its ability to update the displacement dependent centrifugal force during the solution process.

  6. RADTRAN 5: A computer code for transportation risk analysis

    SciTech Connect

    Neuhauser, K. S.; Kanipe, F. L.

    1991-01-01

    RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air.

  7. Recent advances in computer camera methods for machine vision

    NASA Astrophysics Data System (ADS)

    Olson, Gaylord G.; Walker, Jo N.

    1998-10-01

    During the past year, several new computer camera methods (hardware and software) have been developed which have applications in machine vision. These are described below, along with some test results. The improvements are generally in the direction of higher speed and greater parallelism. A PCI interface card has been designed which is adaptable to multiple CCD types, both color and monochrome. A newly designed A/D converter allows for a choice of 8 or 10-bit conversion resolution and a choice of two different analog inputs. Thus, by using four of these converters feeding the 32-bit PCI data bus, up to 8 camera heads can be used with a single PCI card, and four camera heads can be operated in parallel. The card has been designed so that any of 8 different CCD types can be used with it (6 monochrome and 2 color CCDs) ranging in resolution from 192 by 165 pixels up to 1134 by 972 pixels. In the area of software, a method has been developed to better utilize the decision-making capability of the computer along with the sub-array scan capabilities of many CCDs. Specifically, it is shown below how to achieve a dual scan mode camera system wherein one scan mode is a low density, high speed scan of a complete image area, and a higher density sub-array scan is used in those areas where changes have been observed. The name given to this technique is adaptive sub-array scanning.
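
    The dual scan mode logic can be sketched in a few lines: successive low-density full scans are compared block by block, and only the changed blocks are re-read as high-density sub-arrays. The CCD readout itself is hardware-specific, so random frames stand in for it here.

        import numpy as np

        def changed_blocks(prev, curr, block=16, thresh=10.0):
            # Flag blocks whose mean absolute change between two low-density
            # full-area scans exceeds a threshold; these are the regions to
            # re-scan as high-density CCD sub-arrays.
            hits = []
            for y in range(0, curr.shape[0], block):
                for x in range(0, curr.shape[1], block):
                    d = np.abs(curr[y:y+block, x:x+block]
                               - prev[y:y+block, x:x+block])
                    if d.mean() > thresh:
                        hits.append((y, x))
            return hits

        prev = np.random.randint(0, 200, (192, 160)).astype(float)
        curr = prev.copy()
        curr[32:48, 64:80] += 60.0         # a change confined to one region
        print(changed_blocks(prev, curr))  # -> [(32, 64)]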

  8. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule, and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in a timely manner, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  9. Advanced Materials and Solids Analysis Research Core (AMSARC)

    EPA Science Inventory

    The Advanced Materials and Solids Analysis Research Core (AMSARC), centered at the U.S. Environmental Protection Agency's (EPA) Andrew W. Breidenbach Environmental Research Center in Cincinnati, Ohio, is the foundation for the Agency's solids and surfaces analysis capabilities. ...

  10. ADVANCES IN X-RAY COMPUTED MICROTOMOGRAPHY AT THE NSLS.

    SciTech Connect

    DOWD,B.A.

    1998-08-07

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines, from industrial materials processing to environmental science. The most recent applications are presented here, as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented that is based on a refinement of the "gridding" algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 s for a 929 x 929-pixel slice on an R10,000 CPU, more than an 8x reduction compared with the Filtered Back-Projection method.

  11. Advances in x-ray computed microtomography at the NSLS

    SciTech Connect

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines, from industrial materials processing to environmental science. The most recent applications are presented here, as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented that is based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 s for a 929 x 929-pixel slice on an R10,000 CPU, more than an 8x reduction compared with the Filtered Back-Projection method.
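
    For orientation, the sketch below times the conventional filtered back-projection that the FFBT is compared against, using scikit-image's radon/iradon on a standard phantom. It does not reproduce the gridding-based FFBT itself, and it assumes a recent scikit-image release.

        import time
        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon

        image = shepp_logan_phantom()                 # 400x400 test slice
        angles = np.linspace(0.0, 180.0, 180, endpoint=False)
        sinogram = radon(image, theta=angles)

        t0 = time.perf_counter()
        recon = iradon(sinogram, theta=angles, filter_name="ramp")
        print("filtered back-projection: %.2f s" % (time.perf_counter() - t0))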

  12. Experimental and computing strategies in advanced material characterization problems

    NASA Astrophysics Data System (ADS)

    Bolzon, G.

    2015-10-01

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that permit the acquisition of a large amount of data while, at the same time, reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses, while in some situations reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  13. Experimental and computing strategies in advanced material characterization problems

    SciTech Connect

    Bolzon, G.

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that permit the acquisition of a large amount of data while, at the same time, reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses, while in some situations reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  14. Identifying human disease genes: advances in molecular genetics and computational approaches.

    PubMed

    Bakhtiar, S M; Ali, A; Baig, S M; Barh, D; Miyoshi, A; Azevedo, V

    2014-01-01

    The human genome project is one of the significant achievements that have provided detailed insight into our genetic legacy. During the last two decades, biomedical investigations have gathered a considerable body of evidence by detecting more than 2000 disease genes. Despite these imperative advances in the genetic understanding of various diseases, the pathogenesis of many others remains obscure. With recent advances, the laborious methodologies used to identify DNA variations have been replaced by direct sequencing of genomic DNA to detect genetic changes. The ability to perform such studies depends equally on the development of high-throughput and economical genotyping methods. Currently, for essentially every disease whose origin is still unknown, genetic approaches are available, which may be pedigree-dependent or -independent, with the capacity to elucidate fundamental disease mechanisms. Computer algorithms and programs for linkage analysis have formed the foundation for many disease gene detection projects; similarly, databases of clinical findings have been widely used to support diagnostic decisions in dysmorphology and general human disease. For every disease type, genome sequence variations, particularly single nucleotide polymorphisms, are mapped by comparing the genetic makeup of case and control groups. Methods that predict the effects of polymorphisms on protein stability are useful for the identification of possible disease associations, whereas structural effects can be assessed using methods to predict stability changes in proteins using sequence and/or structural information. PMID:25061732
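
    In its simplest form, the case/control comparison of a single nucleotide polymorphism reduces to a contingency-table test. The sketch below runs a chi-square test on invented allele counts; real studies add corrections for multiple testing and population structure.

        from scipy.stats import chi2_contingency

        #            allele A  allele a   (invented counts)
        table = [[420, 180],   # cases
                 [380, 220]]   # controls

        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, p = {p:.3g}")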

  15. Activities and operations of the Advanced Computing Research Facility, July-October 1986

    SciTech Connect

    Pieper, G.W.

    1986-01-01

    Research activities and operations of the Advanced Computing Research Facility (ACRF) at Argonne National Laboratory are discussed for the period from July 1986 through October 1986. The facility is currently supported by the Department of Energy and is operated by the Mathematics and Computer Science Division at Argonne. Over the past four-month period, a new commercial multiprocessor, the Intel iPSC-VX/d4 hypercube, was installed. In addition, four other commercial multiprocessors continue to be available for research - an Encore Multimax, a Sequent Balance 21000, an Alliant FX/8, and an Intel iPSC/d5 - as well as a locally designed multiprocessor, the Lemur. These machines are being actively used by scientists at Argonne and throughout the nation in a wide variety of projects concerning computer systems with parallel and vector architectures. A variety of classes, workshops, and seminars have been sponsored to train researchers on computing techniques for the advanced computer systems at the facility. For example, courses were offered on writing programs for parallel computer systems, and the facility hosted the first annual Alliant users group meeting. A Sequent users group meeting and a two-day workshop on performance evaluation of parallel computers and programs are also being organized.

  16. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  17. Advanced Computer Simulations Of Nanomaterials And Stochastic Biological Processes

    NASA Astrophysics Data System (ADS)

    Minakova, Maria S.

    This dissertation consists of several parts. The first two chapters are devoted to the study of dynamic processes in cellular organelles called filopodia. A stochastic kinetics approach is used to describe non-equilibrium evolution of the filopodial system from nano- to micro scales. Dynamic coupling between chemistry and mechanics is also taken into account in order to investigate the influence of focal adhesions on cell motility. The second chapter explores the possibilities and effects of motor enhanced delivery of actin monomers to the polymerizing tips of filopodia, and how the steady-state filopodial length can exceed the limit set by pure diffusion. Finally, we also challenge the currently existing view of active transport and propose a new theoretical model that accurately describes the motor dynamics and concentration profiles seen in experiments in a physically meaningful way. The third chapter is a result of collaboration between three laboratories, as a part of the Energy Frontier Research Center at the University of North Carolina at Chapel Hill. The work presented here unified the fields of synthetic chemistry, photochemistry, and computational physical chemistry in order to investigate a novel bio-synthetic compound and its energy transfer capabilities. This particular peptide-based design has never been studied via Molecular Dynamics with high precision, and it is the first attempt known to us to simulate the whole chromophore-peptide complex in solution in order to gain detailed information about its structural and dynamic features. The fourth chapter deals with the non-equilibrium relaxation induced transport of water molecules in a microemulsion. This problem required a different set of methodologies and a more detailed, all-atomistic treatment of the system. We found interesting water clustering effects and elucidated the most probable mechanism of water transfer through oil under the condition of saturated Langmuir monolayers. Together these

  18. Computational strategies for tire monitoring and analysis

    NASA Technical Reports Server (NTRS)

    Danielson, Kent T.; Noor, Ahmed K.; Green, James S.

    1995-01-01

    Computational strategies are presented for the modeling and analysis of tires in contact with pavement. A procedure is introduced for simple and accurate determination of tire cross-sectional geometric characteristics from a digitally scanned image. Three new strategies for reducing the computational effort in the finite element solution of tire-pavement contact are also presented. These strategies take advantage of the observation that footprint loads do not usually stimulate a significant tire response away from the pavement contact region. The finite element strategies differ in their level of approximation and required amount of computer resources. The effectiveness of the strategies is demonstrated by numerical examples of frictionless and frictional contact of the space shuttle Orbiter nose-gear tire. Both an in-house research code and a commercial finite element code are used in the numerical studies.

  19. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess an object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements made over the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on a camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
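
    The template matching step can be illustrated with OpenCV: the marked target region is correlated against the scene, and a strong secondary peak away from the target indicates similar background structure, i.e., low structural conspicuity. This is a generic sketch of the principle, not CART's implementation.

        import numpy as np
        import cv2  # assumes OpenCV is installed

        scene = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
        template = scene[100:132, 150:182].copy()   # the marked target region

        # Normalized cross-correlation of the template against the scene.
        response = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(response)
        print("best match score %.2f at %s" % (max_val, max_loc))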

  20. Block sparse Cholesky algorithms on advanced uniprocessor computers

    SciTech Connect

    Ng, E.G.; Peyton, B.W.

    1991-12-01

    As with many other linear algebra algorithms, devising a portable implementation of sparse Cholesky factorization that performs well on the broad range of computer architectures currently available is a formidable challenge. Even after limiting our attention to machines with only one processor, as we have done in this report, there are still several interesting issues to consider. For dense matrices, it is well known that block factorization algorithms are the best means of achieving this goal. We take this approach for sparse factorization as well. This paper has two primary goals. First, we examine two sparse Cholesky factorization algorithms, the multifrontal method and a blocked left-looking sparse Cholesky method, in a systematic and consistent fashion, both to illustrate the strengths of the blocking techniques in general and to obtain a fair evaluation of the two approaches. Second, we assess the impact of various implementation techniques on time and storage efficiency, paying particularly close attention to the work-storage requirement of the two methods and their variants.
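
    The report's algorithms target sparse matrices with supernodal structure; as a hedged illustration of the blocking idea alone, here is a dense blocked left-looking Cholesky factorization in NumPy/SciPy. The block size `b` is a tuning parameter, echoing the report's time/storage trade-offs; the sparse variants add frontal-matrix and supernode bookkeeping not shown here.

    ```python
    # Dense sketch of a blocked left-looking Cholesky factorization.
    import numpy as np
    from scipy.linalg import solve_triangular

    def blocked_cholesky(A, b=64):
        """Return the Cholesky factor L of an SPD matrix A, computed block-column-wise."""
        A = A.copy()
        n = A.shape[0]
        for j in range(0, n, b):
            jb = min(b, n - j)
            # Left-looking update: apply all previously computed block columns.
            A[j:, j:j+jb] -= A[j:, :j] @ A[j:j+jb, :j].T
            # Factor the diagonal block.
            A[j:j+jb, j:j+jb] = np.linalg.cholesky(A[j:j+jb, j:j+jb])
            # Triangular solve for the subdiagonal blocks: X = A21 L11^{-T}.
            if j + jb < n:
                L11 = A[j:j+jb, j:j+jb]
                A[j+jb:, j:j+jb] = solve_triangular(
                    L11, A[j+jb:, j:j+jb].T, lower=True).T
        return np.tril(A)
    ```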

  1. Seeing Is Believing: Quantifying Is Convincing: Computational Image Analysis in Biology.

    PubMed

    Sbalzarini, Ivo F

    2016-01-01

    Imaging is center stage in biology. Advances in microscopy and labeling techniques have enabled unprecedented observations and continue to inspire new developments. Efficient and accurate quantification and computational analysis of the acquired images, however, are becoming the bottleneck. We review different paradigms of computational image analysis for intracellular, single-cell, and tissue-level imaging, providing pointers to the specialized literature and listing available software tools. We place particular emphasis on clear categorization of image-analysis frameworks and on identifying current trends and challenges in the field. We further outline some of the methodological advances that are required in order to use images as quantitative scientific measurements. PMID:27207361

  2. Development of Advanced Light-Duty Powertrain and Hybrid Analysis Tool (SAE 2013-01-0808)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by Environmental Protection Agency to evaluate the Greenhouse gas emissions and fuel efficiency from light-duty vehicles. It is a physics-based, forward-looking, full vehicle computer simulator, which is cap...

  3. Ubiquitous computing in sports: A review and analysis.

    PubMed

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for a synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated in applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in the future will shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols. PMID:19764000

  4. Computational analysis of forebody tangential slot blowing

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Agosta-Greenman, Roxana M.; Rizk, Yehia M.; Schiff, Lewis B.; Cummings, Russell M.

    1994-01-01

    An overview of the computational effort to analyze forebody tangential slot blowing is presented. Tangential slot blowing generates side force and yawing moment which may be used to control an aircraft flying at high angles of attack. Two different geometries are used in the analysis: (1) the High Alpha Research Vehicle; and (2) a generic chined forebody. Computations using the isolated F/A-18 forebody are obtained at full-scale wind tunnel test conditions for direct comparison with available experimental data. The effects of over- and under-blowing on force and moment production are analyzed. Time-accurate solutions using the isolated forebody are obtained to study the force onset timelag of tangential slot blowing. Computations using the generic chined forebody are obtained at experimental wind tunnel conditions, and the results compared with available experimental data. This computational analysis complements the experimental results and provides a detailed understanding of the effects of tangential slot blowing on the flow field about simple and complex geometries.

  5. Probabilistic structural analysis computer code (NESSUS)

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.

    1988-01-01

    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high-performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structure Under Stress) was developed to serve as a primary computation tool for the statistical characterization of the probabilistic structural response to stochastic environments. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI is a Fast Probability Integration module by which a cumulative distribution function or a probability density function is calculated.
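
    NESSUS/PRE's modal-analysis decomposition is not reproduced here; the sketch below only shows the generic idea such a preprocessor rests on, assuming Gaussian inputs: an eigendecomposition of the covariance matrix maps a set of uncorrelated standard normals onto the correlated physical variables. Function and argument names are hypothetical.

    ```python
    # Generic decorrelation sketch: represent correlated Gaussian inputs
    # through uncorrelated standard normals via an eigendecomposition
    # of the covariance matrix (cov = V diag(w) V^T).
    import numpy as np

    def sample_correlated(mean, cov, n_samples, rng=None):
        rng = np.random.default_rng(rng)
        mean = np.asarray(mean, dtype=float)
        w, V = np.linalg.eigh(cov)                       # modal decomposition
        z = rng.standard_normal((len(mean), n_samples))  # uncorrelated N(0, 1)
        x = mean[:, None] + V @ (np.sqrt(w)[:, None] * z)
        return x.T  # each row is one sample of the correlated variables
    ```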

  6. Statistical Data Analysis in the Computer Age

    NASA Astrophysics Data System (ADS)

    Efron, Bradley; Tibshirani, Robert

    1991-07-01

    Most of our familiar statistical methods, such as hypothesis testing, linear regression, analysis of variance, and maximum likelihood estimation, were designed to be implemented on mechanical calculators. Modern electronic computation has encouraged a host of new statistical methods that require fewer distributional assumptions than their predecessors and can be applied to more complicated statistical estimators. These methods allow the scientist to explore and describe data and draw valid statistical inferences without the usual concerns for mathematical tractability. This is possible because traditional methods of mathematical analysis are replaced by specially constructed computer algorithms. Mathematics has not disappeared from statistical theory. It is the main method for deciding which algorithms are correct and efficient tools for automating statistical inference.
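
    A canonical example of the computer-intensive methods the article describes is the bootstrap, which replaces distributional assumptions with resampling. The sketch below computes a percentile bootstrap confidence interval for the mean; the data are hypothetical and purely illustrative.

    ```python
    # Percentile bootstrap confidence interval: resample the data with
    # replacement many times and read the interval off the empirical quantiles.
    import numpy as np

    def bootstrap_ci(data, stat=np.mean, n_boot=10_000, alpha=0.05, rng=None):
        rng = np.random.default_rng(rng)
        data = np.asarray(data, dtype=float)
        boot = np.array([stat(rng.choice(data, size=data.size, replace=True))
                         for _ in range(n_boot)])
        return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

    # Example: 95% CI for the mean of a small (made-up) sample.
    print(bootstrap_ci([5.1, 4.9, 6.2, 5.5, 4.8, 5.9, 5.3], rng=0))
    ```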

  7. Advanced entry guidance algorithm with landing footprint computation

    NASA Astrophysics Data System (ADS)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  8. Teaching Advance Care Planning to Medical Students with a Computer-Based Decision Aid

    PubMed Central

    Levi, Benjamin H.

    2013-01-01

    Discussing end-of-life decisions with cancer patients is a crucial skill for physicians. This article reports findings from a pilot study evaluating the effectiveness of a computer-based decision aid for teaching medical students about advance care planning. Second-year medical students at a single medical school were randomized to use a standard advance directive or a computer-based decision aid to help patients with advance care planning. Students' knowledge, skills, and satisfaction were measured by self-report; their performance was rated by patients. 121/133 (91%) of students participated. The Decision-Aid Group (n=60) outperformed the Standard Group (n=61) in terms of students' knowledge (p<0.01), confidence in helping patients with advance care planning (p<0.01), knowledge of what matters to patients (p=0.05), and satisfaction with their learning experience (p<0.01). Likewise, patients in the Decision Aid Group were more satisfied with the advance care planning method (p<0.01) and with several aspects of student performance. Use of a computer-based decision aid may be an effective way to teach medical students how to discuss advance care planning with cancer patients. PMID:20632222

  9. Advanced probabilistic risk analysis using RAVEN and RELAP-7

    SciTech Connect

    Rabiti, Cristian; Alfonsi, Andrea; Mandelli, Diego; Cogliati, Joshua; Kinoshita, Robert

    2014-06-01

    RAVEN, under the support of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program [1], is advancing its capability to perform statistical analyses of stochastic dynamic systems. This is aligned with its mission to provide the tools needed by the Risk Informed Safety Margin Characterization (RISMC) path-lead [2] under the Department of Energy (DOE) Light Water Reactor Sustainability program [3]. In particular this task is focused on the synergetic development with the RELAP-7 [4] code to advance the state of the art on the safety analysis of nuclear power plants (NPP). The investigation of the probabilistic evolution of accident scenarios for a complex system such as a nuclear power plant is not a trivial challenge. The complexity of the system to be modeled leads to demanding computational requirements even to simulate one of the many possible evolutions of an accident scenario (tens of CPU-hours). At the same time, the probabilistic analysis requires thousands of runs to investigate outcomes characterized by low probability and severe consequence (tail problem). The milestone reported in June of 2013 [5] described the capability of RAVEN to implement complex control logic and provide adequate support for the exploration of the probabilistic space using a Monte Carlo sampling strategy. Unfortunately, the Monte Carlo approach is ineffective with a problem of this complexity. In the following year of development, the RAVEN code has been extended with more sophisticated sampling strategies (grids, Latin Hypercube, and adaptive sampling). This milestone report illustrates the effectiveness of those methodologies in performing the assessment of the probability of core damage following the onset of a Station Black Out (SBO) situation in a boiling water reactor (BWR). The first part of the report provides an overview of the available probabilistic analysis capabilities, ranging from the different types of distributions available, possible sampling
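
    RAVEN's sampling machinery is far more general, but the core of Latin Hypercube sampling is compact enough to sketch: each dimension is split into n equal-probability strata and every stratum is sampled exactly once, which covers the input space far more evenly than naive Monte Carlo at the same run count. This is a generic illustration, not RAVEN's implementation.

    ```python
    # Generic Latin Hypercube sampling on the unit hypercube [0, 1)^d.
    import numpy as np

    def latin_hypercube(n, d, rng=None):
        rng = np.random.default_rng(rng)
        samples = np.empty((n, d))
        for k in range(d):
            # One point per stratum, shuffled so strata pair up randomly across dims.
            strata = (np.arange(n) + rng.uniform(size=n)) / n
            samples[:, k] = rng.permutation(strata)
        return samples  # map through inverse CDFs for non-uniform distributions
    ```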

  10. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  11. Unique Systems Analysis Task 7, Advanced Subsonic Technologies Evaluation Analysis

    NASA Technical Reports Server (NTRS)

    Eisenberg, Joseph D. (Technical Monitor); Bettner, J. L.; Stratton, S.

    2004-01-01

    To retain a preeminent U.S. position in the aircraft industry, aircraft passenger mile costs must be reduced while at the same time meeting anticipated, more stringent environmental regulations. A significant portion of these improvements will come from the propulsion system. A technology evaluation and system analysis was accomplished under this task, including areas such as aerodynamics and materials and improved methods for obtaining low noise and emissions. Previous subsonic evaluation analyses have identified key technologies in selected components for propulsion systems for year 2015 and beyond. Based on the current economic and competitive environment, it is clear that studies with a nearer-term focus that have a direct impact on the propulsion industry's next-generation product are required. This study will emphasize the year 2005 entry-into-service time period. The objective of this study was to determine which technologies and materials offer the greatest opportunities for improving propulsion systems. The goals are twofold. The first goal is to determine an acceptable compromise between the thermodynamic operating conditions for A) best performance, and B) acceptable noise and chemical emissions. The second goal is the evaluation of performance, weight and cost of advanced materials and concepts on the direct operating cost of an advanced regional transport of comparable technology level.

  12. Observations Regarding Use of Advanced CFD Analysis, Sensitivity Analysis, and Design Codes in MDO

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Hou, Gene J. W.; Taylor, Arthur C., III

    1996-01-01

    Observations regarding the use of advanced computational fluid dynamics (CFD) analysis, sensitivity analysis (SA), and design codes in gradient-based multidisciplinary design optimization (MDO) reflect our perception of the interactions required of CFD and our experience in recent aerodynamic design optimization studies using CFD. Sample results from these latter studies are summarized for conventional optimization (analysis - SA codes) and simultaneous analysis and design optimization (design code) using both Euler and Navier-Stokes flow approximations. The amount of computational resources required for aerodynamic design using CFD via analysis - SA codes is greater than that required for design codes. Thus, an MDO formulation that utilizes the more efficient design codes where possible is desired. However, in the aerovehicle MDO problem, the various disciplines that are involved have different design points in the flight envelope; therefore, CFD analysis - SA codes are required at the aerodynamic 'off design' points. The suggested MDO formulation is a hybrid multilevel optimization procedure that consists of both multipoint CFD analysis - SA codes and multipoint CFD design codes that perform suboptimizations.

  13. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs by performing computations using Navier-Stokes equations solution algorithms and permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry, therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  14. The Tuition Advance Fund: An Analysis Prepared for Boston University.

    ERIC Educational Resources Information Center

    Botsford, Keith

    Three models for analyzing the Tuition Advance Fund (TAF) are examined. The three models are: projections by the Institute for Demographic and Economic Studies (IDES), projections by Data Resources, Inc. (DRI), and the Tuition Advance Fund Simulation (TAFSIM) models from Boston University. Analysis of the TAF is based on enrollment, price, and…

  15. A Meta-Analysis of Advance-Organizer Studies.

    ERIC Educational Resources Information Center

    Stone, Carol Leth

    Long term studies of advance organizers (AO) were analyzed with Glass's meta-analysis technique. AO's were defined as bridges from reader's previous knowledge to what is to be learned. The results were compared with predictions from Ausubel's model of assimilative learning. The results of the study indicated that advance organizers were associated…

  16. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer based-systems with the CAD/CAM packages evaluated. (CW)

  17. Aerodynamic analysis of Pegasus - Computations vs reality

    NASA Technical Reports Server (NTRS)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  18. Semiconductor Device Analysis on Personal Computers

    1993-02-08

    PC-1D models the internal operation of bipolar semiconductor devices by solving for the concentrations and quasi-one-dimensional flow of electrons and holes resulting from either electrical or optical excitation. PC-1D uses the same detailed physical models incorporated in mainframe computer programs, yet runs efficiently on personal computers. PC-1D was originally developed with DOE funding to analyze solar cells. That continues to be its primary mode of usage, with registered copies in regular use at more than 100 locations worldwide. The program has been successfully applied to the analysis of silicon, gallium-arsenide, and indium-phosphide solar cells. The program is also suitable for modeling bipolar transistors and diodes, including heterojunction devices. Its easy-to-use graphical interface makes it useful as a teaching tool as well.

  19. Meaningful statistical analysis of large computational clusters.

    SciTech Connect

    Gentile, Ann C.; Marzouk, Youssef M.; Brandt, James M.; Pebay, Philippe Pierre

    2005-07-01

    Effective monitoring of large computational clusters demands the analysis of a vast amount of raw data from a large number of machines. The fundamental interactions of the system are not, however, well-defined, making it difficult to draw meaningful conclusions from this data, even if one were able to efficiently handle and process it. In this paper we show that computational clusters, because they are composed of a large number of identical machines, behave in a statistically meaningful fashion. We can therefore employ normal statistical methods to derive information about individual systems and their environment and to detect problems sooner than with traditional mechanisms. We discuss design details necessary to use these methods on a large system in a timely and low-impact fashion.
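
    As a toy illustration of the paper's premise that a fleet of identical machines can be screened with ordinary statistics, the sketch below flags nodes whose metric (say, temperature or message latency) deviates from the fleet median by a robust z-score. The threshold, metric, and function name are hypothetical, not the paper's method.

    ```python
    # Robust outlier screening across nominally identical cluster nodes.
    import numpy as np

    def flag_outlier_nodes(metric_by_node, threshold=3.5):
        """Return indices of nodes whose metric deviates from the fleet median."""
        x = np.asarray(metric_by_node, dtype=float)
        med = np.median(x)
        mad = np.median(np.abs(x - med)) or 1e-12   # guard against zero MAD
        robust_z = 0.6745 * (x - med) / mad          # 0.6745 scales MAD to sigma
        return np.flatnonzero(np.abs(robust_z) > threshold)
    ```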

  20. Computational frameworks for discrete Gabor analysis

    NASA Astrophysics Data System (ADS)

    Strohmer, Thomas

    1997-10-01

    The Gabor transform yields a discrete representation of a signal in the phase space. Since the Gabor transform is non-orthogonal, efficient reconstruction of a signal from its phase space samples is not straightforward and involves the computation of the so- called dual Gabor function. We present a unifying approach to the derivation of numerical algorithms for discrete Gabor analysis, based on unitary matrix factorization. The factorization point of view is notably useful for the design of efficient numerical algorithms. This presentation is the first systematic account of its kind. In particular, it is shown that different algorithms for the computation of the dual window correspond to different factorizations of the frame operator. Simple number theoretic conditions on the time-frequency lattice parameters imply additional structural properties of the frame operator.
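
    In common notation (not taken verbatim from the paper), the objects involved are the Gabor system on a lattice with parameters (a, b), its frame operator, and the dual window whose computation the paper's matrix factorizations address:

    ```latex
    \begin{align*}
      g_{k,l}(t) &= g(t - ka)\, e^{2\pi i l b t}
                   && \text{(Gabor system with window } g\text{)}, \\
      c_{k,l}    &= \langle f,\, g_{k,l} \rangle
                   && \text{(phase-space coefficients)}, \\
      Sf         &= \sum_{k,l} \langle f,\, g_{k,l} \rangle\, g_{k,l}
                   && \text{(frame operator)}, \\
      \gamma     &= S^{-1} g
                   && \text{(dual Gabor window)}, \\
      f          &= \sum_{k,l} c_{k,l}\, \gamma_{k,l}
                   && \text{(reconstruction, since } S \text{ is invertible).}
    \end{align*}
    ```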

  1. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines used in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, are reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was devoted to improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  2. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  3. Advanced Telecommunications and Computer Technologies in Georgia Public Elementary School Library Media Centers.

    ERIC Educational Resources Information Center

    Rogers, Jackie L.

    The purpose of this study was to determine what recent progress had been made in Georgia public elementary school library media centers regarding access to advanced telecommunications and computer technologies as a result of special funding. A questionnaire addressed the following areas: automation and networking of the school library media center…

  4. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

    Computational Toxicology Advances: Emerging capabilities for data exploration and SAR model development
    Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  5. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    ERIC Educational Resources Information Center

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety of ABAP…

  6. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  7. Computer-Assisted Handwriting Analysis: Interaction with Legal Issues in U.S. Courts

    NASA Astrophysics Data System (ADS)

    Manning, Kenneth A.; Srihari, Sargur N.

    Advances in the development of computer-assisted handwriting analysis have led to the consideration of a computational system by courts in the United States. Computer-assisted handwriting analysis has been introduced in the context of Frye or Daubert hearings conducted to determine the admissibility of handwriting testimony by questioned document examiners, as expert witnesses, in civil and criminal proceedings. This paper provides a comparison of scientific and judicial methods, and examines concerns over reliability of handwriting analysis expressed in judicial decisions. Recently, the National Research Council assessed that “the scientific basis for handwriting comparisons needs to be strengthened”. Recent studies involving computer-assisted handwriting analysis are reviewed in light of the concerns expressed by the judiciary and National Research Council. A future potential role for computer-assisted handwriting analysis in the courts is identified.

  8. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..; Easter, Richard C; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  9. Advanced Fingerprint Analysis Project Fingerprint Constituents

    SciTech Connect

    GM Mong; CE Petersen; TRW Clauss

    1999-10-29

    The work described in this report was focused on generating fundamental data on fingerprint components which will be used to develop advanced forensic techniques to enhance fluorescent detection, and visualization of latent fingerprints. Chemical components of sweat gland secretions are well documented in the medical literature and many chemical techniques are available to develop latent prints, but there have been no systematic forensic studies of fingerprint sweat components or of the chemical and physical changes these substances undergo over time.

  10. A First Attempt to Bring Computational Biology into Advanced High School Biology Classrooms

    PubMed Central

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S.

    2011-01-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology element to teach genetic evolution into advanced biology classes in two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach this alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors. PMID:22046118

  11. Advanced nuclear rocket engine mission analysis

    SciTech Connect

    Ramsthaler, J.; Farbman, G.; Sulmeisters, T.; Buden, D.; Harris, P.

    1987-12-01

    The use of a derivative of the NERVA engine developed from 1955 to 1973 was evaluated for potential application to Air Force orbital transfer and maneuvering missions in the time period 1995 to 2020. The NERVA stage was found to have lower life cycle costs (LCC) than an advanced chemical stage for performing low earth orbit (LEO) to geosynchronous orbit (GEO) missions at any level of activity greater than three missions per year. It had lower life cycle costs than a high-performance nuclear electric engine at any level of LEO-to-GEO mission activity. An examination of all unmanned orbital transfer and maneuvering missions from the Space Transportation Architecture study (STAS 111-3) indicated an LCC advantage for the NERVA stage over the advanced chemical stage of fifteen million dollars. The cost advantage accrued from both the orbital transfer and maneuvering missions. Parametric analyses showed that the specific impulse of the NERVA stage and the cost of delivering material to low earth orbit were the most significant factors in the LCC advantage over the chemical stage. Lower development costs and a higher thrust gave the NERVA engine an LCC advantage over the nuclear electric stage. An examination of technical data from the Rover/NERVA program indicated that development of the NERVA stage has a low technical risk and the potential for high reliability and safe operation. The data indicated the NERVA engine had great flexibility which would permit a single stage to perform all Air Force missions.

  12. Analysis of advanced programs (Study 2.3). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An analysis of advanced programs for space missions is presented. The subjects discussed are (1) alternate space shuttle configurations, (2) space transportation systems, (3) computer programs and methodology to estimate program cost implications of space vehicle program uncertainties, (4) vehicle synthesis program, (5) relative effectiveness of various proposals for space vehicles, and (6) cost analysis of solid propellant rocket engine program to support space shuttle vehicle.

  13. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M&S environments and infrastructure.

  14. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
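
    The program itself is FORTRAN and not reproduced here, but the hazard integral it evaluates is standard. Below is a hedged Python sketch of a Cornell/McGuire-type discretization for a single seismic source, assuming lognormal scatter about the attenuation function; the argument names are hypothetical.

    ```python
    # Schematic seismic hazard integration: annual rate of exceeding ground
    # motion level a, summed over discretized magnitude and distance bins.
    import numpy as np
    from scipy.stats import norm

    def exceedance_rate(a, rate, mags, p_mag, dists, p_dist,
                        atten_mean, sigma_ln=0.6):
        """rate: annual rate of earthquakes above the minimum magnitude.
        atten_mean(m, r): median ln ground motion from the attenuation function.
        p_mag, p_dist: probability masses of the magnitude and distance bins.
        """
        lam = 0.0
        for m, pm in zip(mags, p_mag):
            for r, pr in zip(dists, p_dist):
                # P[A > a | m, r] under lognormal attenuation scatter.
                p_exceed = norm.sf((np.log(a) - atten_mean(m, r)) / sigma_ln)
                lam += rate * pm * pr * p_exceed
        return lam
    ```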

  15. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  16. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    SciTech Connect

    Handler, B.H. ); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. ); Hunnum, W.H. ); Smith, D.L. )

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model, comprises related papers encompassing research on computer aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevancy to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  17. Computer network environment planning and analysis

    NASA Technical Reports Server (NTRS)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  18. Advanced surface design for logistics analysis

    NASA Astrophysics Data System (ADS)

    Brown, Tim R.; Hansen, Scott D.

    The development of anthropometric arm/hand and tool models and their manipulation in a large system model for maintenance simulation are discussed. The use of Advanced Surface Design and s-fig technology in anthropometrics, and three-dimensional graphics simulation tools, are found to achieve a good balance between model manipulation speed and model accuracy. The present second generation models are shown to be twice as fast to manipulate as the first generation b-surf models, to be easier to manipulate into various configurations, and to more closely approximate human contours.

  19. Advanced tracking systems design and analysis

    NASA Technical Reports Server (NTRS)

    Potash, R.; Floyd, L.; Jacobsen, A.; Cunningham, K.; Kapoor, A.; Kwadrat, C.; Radel, J.; Mccarthy, J.

    1989-01-01

    The results of an assessment of several types of high-accuracy tracking systems proposed to track the spacecraft in the National Aeronautics and Space Administration (NASA) Advanced Tracking and Data Relay Satellite System (ATDRSS) are summarized. Tracking systems based on the use of interferometry and ranging are investigated. For each system, the top-level system design and operations concept are provided. A comparative system assessment is presented in terms of orbit determination performance, ATDRSS impacts, life-cycle cost, and technological risk.

  20. A Computational Future for Preventing HIV in Minority Communities: How Advanced Technology Can Improve Implementation of Effective Programs

    PubMed Central

    Brown, C Hendricks; Mohr, David C.; Gallo, Carlos G.; Mader, Christopher; Palinkas, Lawrence; Wingood, Gina; Prado, Guillermo; Kellam, Sheppard G.; Pantin, Hilda; Poduska, Jeanne; Gibbons, Robert; McManus, John; Ogihara, Mitsunori; Valente, Thomas; Wulczyn, Fred; Czaja, Sara; Sutcliffe, Geoff; Villamar, Juan; Jacobs, Christopher

    2013-01-01

    African Americans and Hispanics in the U.S. have much higher rates of HIV than non-minorities. There is now strong evidence that a range of behavioral interventions are efficacious in reducing sexual risk behavior in these populations. While a handful of these programs are just beginning to be disseminated widely, we still have not implemented effective programs to a level that would reduce the population incidence of HIV for minorities. We propose that innovative approaches involving computational technologies be explored for their use in both developing new interventions as well as in supporting wide-scale implementation of effective behavioral interventions. Mobile technologies have a place in both of these activities. First, mobile technologies can be used to sense context and respond to the unique preferences and needs of individuals at times when intervention to reduce risk would be most impactful. Second, mobile technologies can be used to improve the delivery of interventions by facilitators and their agencies. Systems science methods, including social network analysis, agent based models, computational linguistics, intelligent data analysis, and systems and software engineering all have strategic roles that can bring about advances in HIV prevention in minority communities. Using an existing mobile technology for depression and three effective HIV prevention programs, we illustrate how eight areas in the intervention/implementation process can use innovative computational approaches to advance intervention adoption, fidelity, and sustainability. PMID:23673892

  1. Good relationships between computational image analysis and radiological physics

    SciTech Connect

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  2. Good relationships between computational image analysis and radiological physics

    NASA Astrophysics Data System (ADS)

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-01

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  3. Study of flutter related computational procedures for minimum weight structural sizing of advanced aircraft

    NASA Technical Reports Server (NTRS)

    Oconnell, R. F.; Hassig, H. J.; Radovcich, N. A.

    1976-01-01

    Results of a study of the development of flutter modules applicable to automated structural design of advanced aircraft configurations, such as a supersonic transport, are presented. Automated structural design is restricted to automated sizing of the elements of a given structural model. It includes a flutter optimization procedure; i.e., a procedure for arriving at a structure with minimum mass for satisfying flutter constraints. Methods of solving the flutter equation and computing the generalized aerodynamic force coefficients in the repetitive analysis environment of a flutter optimization procedure are studied, and recommended approaches are presented. Five approaches to flutter optimization are explained in detail and compared. An approach to flutter optimization incorporating some of the methods discussed is presented. Problems related to flutter optimization in a realistic design environment are discussed and an integrated approach to the entire flutter task is presented. Recommendations for further investigations are made. Results of numerical evaluations, applying the five methods of flutter optimization to the same design task, are presented.
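
    For context, one common statement of the flutter equation solved repeatedly inside such an optimization loop is the k-method form below; the study's own formulation may differ in detail. Here M and K are generalized mass and stiffness matrices, Q(ik) the generalized aerodynamic force matrix at reduced frequency k, and g an artificial structural damping.

    ```latex
    \[
      \left[ -\omega^{2}\mathbf{M} + (1 + ig)\,\mathbf{K}
             - \tfrac{1}{2}\rho V^{2}\,\mathbf{Q}(ik) \right] \bar{\mathbf{q}} = 0,
      \qquad k = \frac{\omega b}{V}.
    \]
    % Flutter occurs at the airspeed V where the required damping g crosses zero.
    ```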

  4. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    SciTech Connect

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily-repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  5. Improved computational neutronics methods and validation protocols for the advanced test reactor

    SciTech Connect

    Nigg, D. W.; Nielsen, J. W.; Chase, B. M.; Murray, R. K.; Steuhm, K. A.; Unruh, T.

    2012-07-01

    The Idaho National Laboratory (INL) is in the process of updating the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purposes. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry have been conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for flexible and repeatable ATR physics code validation protocols that are consistent with applicable national standards. (authors)

  6. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1996-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economical rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: First, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities. PMID:9462062

  7. Recent Advances in Anthocyanin Analysis and Characterization

    PubMed Central

    Welch, Cara R.; Wu, Qingli; Simon, James E.

    2009-01-01

    Anthocyanins are a class of polyphenols responsible for the orange, red, purple and blue colors of many fruits, vegetables, grains, flowers and other plants. Consumption of anthocyanins has been linked to protection against many chronic diseases; anthocyanins also possess strong antioxidant properties, leading to a variety of health benefits. In this review, we examine the advances in the chemical profiling of natural anthocyanins in plant and biological matrices using various chromatographic separations (HPLC and CE) coupled with different detection systems (UV, MS and NMR). An overview of anthocyanin chemistry, prevalence in plants, biosynthesis and metabolism, bioactivities and health properties, sample preparation and phytochemical investigations is given, while the major focus examines the comparative advantages and disadvantages of each analytical technique. PMID:19946465

  8. Crashworthiness analysis using advanced material models in DYNA3D

    SciTech Connect

    Logan, R.W.; Burger, M.J.; McMichael, L.D.; Parkinson, R.D.

    1993-10-22

    As part of an electric vehicle consortium, LLNL and Kaiser Aluminum are conducting experimental and numerical studies on crashworthy aluminum spaceframe designs. They have jointly explored the effect of heat treat on crush behavior and duplicated the experimental behavior with finite-element simulations. The major technical contributions to the state of the art in numerical simulation arise from the development and use of advanced material model descriptions for LLNL's DYNA3D code. Constitutive model enhancements in both flow and failure have been employed for conventional materials such as low-carbon steels, and also for lighter weight materials such as aluminum and fiber composites being considered for future vehicles. The constitutive model enhancements are developed as extensions from LLNL's work in anisotropic flow and multiaxial failure modeling. Analysis quality as a function of level of simplification of material behavior and mesh is explored, as well as the penalty in computation cost that must be paid for using more complex models and meshes. The lightweight material modeling technology is being used at the vehicle component level to explore the safety implications of small neighborhood electric vehicles manufactured almost exclusively from these materials.

  9. G-computation demonstration in causal mediation analysis.

    PubMed

    Wang, Aolin; Arah, Onyebuchi A

    2015-10-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings. PMID:26537707
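
    As a minimal sketch of the parametric g-computation recipe the abstract describes, assume a continuous mediator and outcome, linear models without exposure-mediator interaction, and a single confounder. Nested potential outcomes are simulated by Monte Carlo and contrasted to yield natural direct and indirect effects; all variable names are hypothetical, and in practice one would bootstrap the whole procedure for confidence intervals, as the authors do.

    ```python
    # Hedged sketch of parametric g-computation for natural effects.
    import numpy as np

    def natural_effects(A, M, Y, C, n_sim=20_000, rng=None):
        rng = np.random.default_rng(rng)
        A, M, Y, C = (np.asarray(v, dtype=float) for v in (A, M, Y, C))
        # Fit the mediator model M ~ A + C and outcome model Y ~ A + M + C.
        X_m = np.column_stack([np.ones_like(A), A, C])
        alpha, *_ = np.linalg.lstsq(X_m, M, rcond=None)
        resid_sd = np.std(M - X_m @ alpha)
        X_y = np.column_stack([np.ones_like(A), A, M, C])
        beta, *_ = np.linalg.lstsq(X_y, Y, rcond=None)

        def mean_Y(a_set, a_med):
            # Simulate M under exposure a_med, then Y under exposure a_set,
            # averaging over the empirical confounder distribution.
            c = rng.choice(C, size=n_sim, replace=True)
            m = (alpha[0] + alpha[1] * a_med + alpha[2] * c
                 + rng.normal(0, resid_sd, n_sim))
            return np.mean(beta[0] + beta[1] * a_set + beta[2] * m + beta[3] * c)

        nde = mean_Y(1, 0) - mean_Y(0, 0)   # natural direct effect
        nie = mean_Y(1, 1) - mean_Y(1, 0)   # natural indirect effect
        return nde, nie
    ```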

  10. Distributed Design and Analysis of Computer Experiments

    SciTech Connect

    Doak, Justin

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria. Or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis on the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation of an

  12. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1994-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: first, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data; most of the data have just recently been released and, consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the dentist who will offer this new technology directly to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of its capabilities. PMID:8032444

  13. Analysis of an advanced technology subsonic turbofan incorporating revolutionary materials

    NASA Technical Reports Server (NTRS)

    Knip, Gerald, Jr.

    1987-01-01

    Successful implementation of revolutionary composite materials in an advanced turbofan offers the possibility of further improvements in engine performance and thrust-to-weight ratio relative to current metallic materials. The present analysis determines the approximate engine cycle and configuration for an early 21st century subsonic turbofan incorporating all composite materials. The advanced engine is evaluated relative to a current technology baseline engine in terms of its potential fuel savings for an intercontinental quadjet having a design range of 5500 nmi and a payload of 500 passengers. The resultant near optimum, uncooled, two-spool, advanced engine has an overall pressure ratio of 87, a bypass ratio of 18, a geared fan, and a turbine rotor inlet temperature of 3085 R. Improvements result in a 33-percent fuel saving for the specified mission. Various advanced composite materials are used throughout the engine. For example, advanced polymer composite materials are used for the fan and the low pressure compressor (LPC).

  14. An advanced approach for computer modeling and prototyping of the human tooth.

    PubMed

    Chang, Kuang-Hua; Magdum, Sheetalkumar; Khera, Satish C; Goel, Vijay K

    2003-05-01

    This paper presents a systematic and practical method for constructing accurate computer and physical models that can be employed for the study of human tooth mechanics. The proposed method starts with a histological section preparation of a human tooth. Through tracing outlines of the tooth on the sections, discrete points are obtained and are employed to construct B-spline curves that represent the exterior contours and dentino-enamel junction (DEJ) of the tooth using a least square curve fitting technique. The surface skinning technique is then employed to quilt the B-spline curves to create a smooth boundary and DEJ of the tooth using B-spline surfaces. These surfaces are respectively imported into SolidWorks via its application programming interface to create solid models. The solid models are then imported into Pro/MECHANICA Structure for finite element analysis (FEA). The major advantage of the proposed method is that it first generates smooth solid models, instead of finite element models in discretized form. As a result, a more advanced p-FEA can be employed for structural analysis, which usually provides superior results to traditional h-FEA. In addition, the solid model constructed is smooth and can be fabricated at various scales using solid freeform fabrication technology. This method is especially useful in supporting bioengineering applications, where the shape of the object is usually complicated. A human maxillary second molar is presented to illustrate and demonstrate the proposed method. Note that both the solid and p-FEA models of the molar are presented; however, comparison between p- and h-FEA models is beyond the scope of the paper. PMID:12757205
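
    The curve-fitting step lends itself to a short sketch: fit a smooth closed B-spline to noisy points digitized from a section outline, then evaluate it densely for surface skinning or CAD export. The synthetic contour, noise level, and smoothing parameter below are illustrative assumptions, not the authors' procedure.

    import numpy as np
    from scipy.interpolate import splprep, splev

    rng = np.random.default_rng(0)

    # Points digitized from the traced outline of one histological section (synthetic stand-in)
    theta = np.linspace(0.0, 2.0 * np.pi, 80, endpoint=False)
    x = 5.0 * np.cos(theta) + rng.normal(0.0, 0.05, theta.size)
    y = 7.0 * np.sin(theta) + rng.normal(0.0, 0.05, theta.size)

    # Fit a smooth closed cubic B-spline through the noisy points
    # (s > 0 trades fidelity for smoothness; per=True closes the contour)
    tck, u = splprep([x, y], s=theta.size * 0.05 ** 2, per=True, k=3)

    # Evaluate the curve densely, e.g. for surface skinning or export to a CAD system
    xs, ys = splev(np.linspace(0.0, 1.0, 400), tck)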

  15. Advanced Satellite Research Project: SCAR Research Database. Bibliographic analysis

    NASA Technical Reports Server (NTRS)

    Pelton, Joseph N.

    1991-01-01

    A literature search was performed to locate and analyze the most recent literature relevant to the research. This was done by cross-relating books, articles, monographs, and journals on the following topics: (1) Experimental Systems - Advanced Communications Technology Satellite (ACTS); and (2) Integrated Services Digital Network (ISDN) and Advanced Communication Techniques (ISDN and satellites, ISDN standards, broadband ISDN, frame relay and switching, computer networks and satellites, satellite orbits and technology, satellite transmission quality, and network configuration). A bibliographic essay on the literature citations and articles reviewed during the literature search task is provided.

  16. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations carries uncertainty. Typical sources of uncertainty are material properties, production and/or assembly inaccuracies in the geometry, and the environment where the structure will be located. The paper is focused on methods for the calculation of failure probabilities in structural failure and reliability analysis, with special attention to the newly developed Direct Optimized Probabilistic Calculation (DOProC) method, which is highly efficient in terms of calculation time and the accuracy of the solution. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in software applications and has been used several times in probabilistic tasks and reliability assessments.
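
    The non-simulation character of such a method can be illustrated on a toy two-variable limit state g = R - E: the failure probability comes from direct numerical integration over discretized distributions, with no random sampling. DOProC's actual optimized integration is more sophisticated than this sketch, and the distributions and units below are hypothetical.

    import numpy as np
    from scipy import stats

    # Discretize each random variable of the limit state g = R - E onto a grid
    # (normal resistance R, Gumbel load effect E; parameters hypothetical)
    r = np.linspace(150.0, 450.0, 1500)
    e = np.linspace(0.0, 400.0, 1500)
    pr = stats.norm(300.0, 25.0).pdf(r); pr /= pr.sum()
    pe = stats.gumbel_r(150.0, 20.0).pdf(e); pe /= pe.sum()

    # Failure probability by direct numerical integration over the grid:
    # P_f = sum of P(R = r_i) P(E = e_j) over all cells with r_i - e_j < 0 (no sampling)
    R, E = np.meshgrid(r, e, indexing="ij")
    pf = (pr[:, None] * pe[None, :])[R - E < 0.0].sum()
    print(f"failure probability ~ {pf:.3e}")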

  17. Computed tomographic analysis of meteorite inclusions

    NASA Technical Reports Server (NTRS)

    Arnold, J. R.; Testa, J. P., Jr.; Friedman, P. J.; Kambic, G. X.

    1983-01-01

    The feasibility of obtaining nondestructively a cross-sectional display of very dense heterogeneous rocky specimens, whether lunar, terrestrial or meteoritic, by using a fourth generation computed tomographic (CT) scanner, with modifications to the software only, is discussed. A description of the scanner, and of the experimental and analytical procedures is given. Using this technique, the interior of heterogeneous materials such as Allende can be probed nondestructively. The regions of material with high and low atomic numbers are displayed quickly; the object can then be cut to obtain for analysis just the areas of interest. A comparison of this technique with conventional industrial and medical techniques is made in terms of image resolution and density distribution display precision.

  18. Computational based functional analysis of Bacillus phytases.

    PubMed

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

    Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate, digesting the indigestible phytate fraction present in seeds and grains and therefore providing digestible phosphorus, calcium, and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals so that the bioavailability of phytic acid-bound phosphate increases, ultimately enhancing the nutritional value of diets. Bacillus phytase is well suited for use in animal feed because of its pH optimum and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens to explore physico-chemical properties using various bio-computational tools. All the proteins are acidic and thermostable and are suitable candidates for the feed industry. PMID:26672917
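
    The kind of in silico physico-chemical characterization described here can be sketched with Biopython's ProtParam module; the sequence below is a toy stand-in built from standard amino-acid letters, not an actual Bacillus phytase.

    from Bio.SeqUtils.ProtParam import ProteinAnalysis

    # Toy sequence (valid amino-acid letters only); substitute a real phytase sequence
    seq = "MKKTAIAIAVALAGFATVAQAAPKDNTWYTGAKLGWSQYHDTGFINNNGPTHENQLGAGAF"
    pa = ProteinAnalysis(seq)
    print(f"molecular weight : {pa.molecular_weight():.1f} Da")
    print(f"isoelectric point: {pa.isoelectric_point():.2f}  (< 7 suggests an acidic protein)")
    print(f"instability index: {pa.instability_index():.1f}  (< 40 suggests a stable protein)")
    print(f"aromaticity      : {pa.aromaticity():.3f}")
    print(f"GRAVY            : {pa.gravy():.3f}")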

  19. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-FI technique, is presented in detail.

  20. Development of advanced structural analysis methodologies for predicting widespread fatigue damage in aircraft structures

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.

    1995-01-01

    NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.

  1. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    SciTech Connect

    Lucas, Robert; Ang, James; Bergman, Keren; Borkar, Shekhar; Carlson, William; Carrington, Laura; Chiu, George; Colwell, Robert; Dally, William; Dongarra, Jack; Geist, Al; Haring, Rud; Hittinger, Jeffrey; Hoisie, Adolfy; Klein, Dean Micron; Kogge, Peter; Lethin, Richard; Sarkar, Vivek; Schreiber, Robert; Shalf, John; Sterling, Thomas; Stevens, Rick; Bashor, Jon; Brightwell, Ron; Coteus, Paul; Debenedictus, Erik; Hiller, Jon; Kim, K. H.; Langston, Harper; Murphy, Richard Micron; Webster, Clayton; Wild, Stefan; Grider, Gary; Ross, Rob; Leyffer, Sven; Laros III, James

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  2. A comparison of computer architectures for the NASA demonstration advanced avionics system

    NASA Technical Reports Server (NTRS)

    Seacord, C. L.; Bailey, D. G.; Larson, J. C.

    1979-01-01

    The paper compares computer architectures for the NASA demonstration advanced avionics system. Two computer architectures are described with an unusual approach to fault tolerance: a single spare processor can correct for faults in any of the distributed processors by taking on the role of a failed module. It was shown that the system must be viewed from a functional point of view to properly apply redundancy and achieve fault tolerance and ultra-reliability. Data are presented on complexity and mission failure probability which show that the revised version offers equivalent mission reliability at lower cost as measured by hardware and software complexity.

  3. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    SciTech Connect

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data is a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamic models developed in this study will improve our understanding of these phenomena. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. Results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.

  4. Computational analysis of irradiation facilities at the JSI TRIGA reactor.

    PubMed

    Snoj, Luka; Zerovnik, Gašper; Trkov, Andrej

    2012-03-01

    Characterization and optimization of irradiation facilities in a research reactor are important for optimal performance. Nowadays this is commonly done with advanced Monte Carlo neutron transport computer codes such as MCNP. However, the computational models used in such calculations should be verified and validated against experiments. In this paper we describe the irradiation facilities at the JSI TRIGA reactor and demonstrate their computational characterization to support experimental campaigns by providing information on the characteristics of the irradiation facilities. PMID:22154389

  5. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  7. Robotics, Stem Cells and Brain Computer Interfaces in Rehabilitation and Recovery from Stroke; Updates and Advances

    PubMed Central

    Boninger, Michael L; Wechsler, Lawrence R.; Stein, Joel

    2014-01-01

    Objective: To describe the current state and latest advances in robotics, stem cells, and brain computer interfaces in rehabilitation and recovery from stroke. Design: The authors of this summary recently reviewed this work as part of a national presentation; the paper presents the information covered in each area. Results: Each area has seen great advances and challenges as products move to market and experiments are ongoing. Conclusion: Robotics, stem cells, and brain computer interfaces all have tremendous potential to reduce disability and lead to better outcomes for patients with stroke. Continued research and investment will be needed as the field moves forward. With this investment, the potential for recovery of function is likely substantial. PMID:25313662

  8. Computational methods in the prediction of advanced subsonic and supersonic propeller induced noise: ASSPIN users' manual

    NASA Technical Reports Server (NTRS)

    Dunn, M. H.; Tarkenton, G. M.

    1992-01-01

    This document describes the computational aspects of propeller noise prediction in the time domain and the use of the high-speed propeller noise prediction program ASSPIN (Advanced Subsonic and Supersonic Propeller Induced Noise). The underlying formulations are valid in both the near and far fields. Two formulations are utilized by ASSPIN: (1) the first is used for subsonic portions of the propeller blade; and (2) the second is used for transonic and supersonic regions on the blade. Switching between the two formulations is done automatically. ASSPIN incorporates advanced blade geometry and surface pressure modelling, adaptive observer time grid strategies, and enhanced numerical algorithms that result in reduced computational time. In addition, the ability to treat the nonaxial inflow case has been included.

  9. Graphical and Normative Analysis of Binocular Vision by Mini Computer: A Teaching Aid and Clinical Tool.

    ERIC Educational Resources Information Center

    Kees, Martin; Schor, Clifton

    1981-01-01

    An inexpensive computer graphics system (Commodore PET), used as a video aid for teaching students advanced case analysis, is described. The course provides students with the analytical tools for evaluating various anomalies of binocular vision with graphical and statistical techniques and for treating them with lenses, prisms, and orthoptics. (MLW)

  10. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual-energy CT (DECT), spectral CT, and CT-based molecular imaging. By harnessing these advances, cardiac CT has moved beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion, and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  11. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985). Final Report

    SciTech Connect

    Denning, P.J.

    1986-04-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  12. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted in the development of a specialized computer architecture for the real-time algorithmic execution of an avionics guidance and control problem is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
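
    The critical-path-based allocation idea can be sketched as a simple list scheduler: rank ready tasks by the length of their longest downstream path and assign each to the earliest-free processing element. The task graph and timings below are hypothetical, and this illustrates the general technique rather than the report's actual algorithm.

    import heapq

    # Task graph: name -> (compute time, dependencies); hypothetical avionics tasks
    tasks = {
        "sense":     (2, []),
        "estimate":  (4, ["sense"]),
        "integrate": (5, ["sense"]),
        "guidance":  (3, ["estimate"]),
        "command":   (2, ["guidance", "integrate"]),
    }

    def cp_length(name, memo={}):
        # Longest path (sum of compute times) from this task to the exit of the graph
        if name not in memo:
            t, _ = tasks[name]
            succs = [n for n, (_, deps) in tasks.items() if name in deps]
            memo[name] = t + max((cp_length(s) for s in succs), default=0)
        return memo[name]

    n_pe = 2
    pe_free = [(0.0, i) for i in range(n_pe)]   # (time the PE becomes free, PE id)
    heapq.heapify(pe_free)
    done_at, schedule, remaining = {}, [], set(tasks)
    while remaining:
        ready = [n for n in remaining if all(d in done_at for d in tasks[n][1])]
        name = max(ready, key=cp_length)        # highest critical-path priority first
        t_free, pe = heapq.heappop(pe_free)     # earliest-free processing element
        start = max([t_free] + [done_at[d] for d in tasks[name][1]])
        done_at[name] = start + tasks[name][0]
        schedule.append((name, pe, start))
        remaining.remove(name)
        heapq.heappush(pe_free, (done_at[name], pe))
    print(schedule)   # e.g. [('sense', 0, 0.0), ('estimate', 1, 2.0), ...]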

  14. Advanced methods for the computation of particle beam transport and the computation of electromagnetic fields and beam-cavity interactions. Final report

    SciTech Connect

    Dragt, A.J.; Gluckstern, R.L.

    1992-11-01

    The University of Maryland Dynamical Systems and Accelerator Theory Group carries out research in two broad areas: the computation of charged particle beam transport using Lie algebraic methods and advanced methods for the computation of electromagnetic fields and beam-cavity interactions. Important improvements in the state of the art are believed to be possible in both of these areas. In addition, applications of these methods are made to problems of current interest in accelerator physics including the theoretical performance of present and proposed high energy machines. The Lie algebraic method of computing and analyzing beam transport handles both linear and nonlinear beam elements. Tests show this method to be superior to the earlier matrix or numerical integration methods. It has wide application to many areas including accelerator physics, intense particle beams, ion microprobes, high resolution electron microscopy, and light optics. With regard to the area of electromagnetic fields and beam cavity interactions, work is carried out on the theory of beam breakup in single pulses. Work is also done on the analysis of the high frequency behavior of longitudinal and transverse coupling impedances, including the examination of methods which may be used to measure these impedances. Finally, work is performed on the electromagnetic analysis of coupled cavities and on the coupling of cavities to waveguides.
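
    For background, the Lie-algebraic representation of beam transport referred to in this abstract is commonly written as a factored product of Lie transformations (the Dragt-Finn factorization); in LaTeX notation, stated here as standard context rather than as the report's own formulation:

    \mathcal{M} \;=\; \mathcal{M}_2\, e^{:f_3:}\, e^{:f_4:} \cdots, \qquad :f:\,g \;=\; [f, g],

    where \mathcal{M}_2 is the linear (matrix) part of the map, each f_n is a homogeneous polynomial of degree n in the phase-space variables, and [f, g] is the Poisson bracket; the nonlinear factors capture aberrations order by order.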

  15. Progress in Advanced Spectral Analysis of Radioxenon

    SciTech Connect

    Haas, Derek A.; Schrom, Brian T.; Cooper, Matthew W.; Ely, James H.; Flory, Adam E.; Hayes, James C.; Heimbigner, Tom R.; McIntyre, Justin I.; Saunders, Danielle L.; Suckow, Thomas J.

    2010-09-21

    Improvements to a Java-based software package developed at Pacific Northwest National Laboratory (PNNL) for display and analysis of radioxenon spectra acquired by the International Monitoring System (IMS) are described here. The current version of the Radioxenon JavaViewer implements the region of interest (ROI) method for analysis of beta-gamma coincidence data. Upgrades to the Radioxenon JavaViewer will include routines to analyze high-purity germanium detector (HPGe) data, the Standard Spectrum Method to analyze beta-gamma coincidence data, and calibration routines to characterize beta-gamma coincidence detectors. These upgrades are currently under development; the status and initial results will be presented. Implementation of these routines into the JavaViewer and subsequent release is planned for FY 2011-2012.

  16. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary part of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  17. Advanced CMOS Radiation Effects Testing Analysis

    NASA Technical Reports Server (NTRS)

    Pellish, Jonathan Allen; Marshall, Paul W.; Rodbell, Kenneth P.; Gordon, Michael S.; LaBel, Kenneth A.; Schwank, James R.; Dodds, Nathaniel A.; Castaneda, Carlos M.; Berg, Melanie D.; Kim, Hak S.; Phan, Anthony M.; Seidleck, Christina M.

    2014-01-01

    Presentation at the annual NASA Electronic Parts and Packaging (NEPP) Program Electronic Technology Workshop (ETW). The material includes an update of progress in this NEPP task area over the past year, which includes testing, evaluation, and analysis of radiation effects data on the IBM 32 nm silicon-on-insulator (SOI) complementary metal oxide semiconductor (CMOS) process. The testing was conducted using test vehicles supplied directly by IBM.

  19. Low-latency analysis pipeline for compact binary coalescences in the advanced gravitational wave detector era

    NASA Astrophysics Data System (ADS)

    Adams, T.; Buskulic, D.; Germain, V.; Guidi, G. M.; Marion, F.; Montani, M.; Mours, B.; Piergiovanni, F.; Wang, G.

    2016-09-01

    The multi-band template analysis (MBTA) pipeline is a low-latency coincident analysis pipeline for the detection of gravitational waves (GWs) from compact binary coalescences. MBTA runs with a low computational cost, and can identify candidate GW events online with a sub-minute latency. The low computational running cost of MBTA also makes it useful for data quality studies. Events detected by MBTA online can be used to alert astronomical partners for electromagnetic follow-up. We outline the current status of MBTA and give details of recent pipeline upgrades and validation tests that were performed in preparation for the first advanced detector observing period. The MBTA pipeline is ready for the outset of the advanced detector era and the exciting prospects it will bring.

  20. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging techniques and in 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging. PMID:24464989

  1. Computational analysis of bacterial RNA-Seq data

    PubMed Central

    McClure, Ryan; Balasubramanian, Divya; Sun, Yan; Bobrovskyy, Maksym; Sumby, Paul; Genco, Caroline A.; Vanderpool, Carin K.; Tjaden, Brian

    2013-01-01

    Recent advances in high-throughput RNA sequencing (RNA-seq) have enabled tremendous leaps forward in our understanding of bacterial transcriptomes. However, computational methods for analysis of bacterial transcriptome data have not kept pace with the large and growing data sets generated by RNA-seq technology. Here, we present new algorithms, specific to bacterial gene structures and transcriptomes, for analysis of RNA-seq data. The algorithms are implemented in an open source software system called Rockhopper that supports various stages of bacterial RNA-seq data analysis, including aligning sequencing reads to a genome, constructing transcriptome maps, quantifying transcript abundance, testing for differential gene expression, determining operon structures and visualizing results. We demonstrate the performance of Rockhopper using 2.1 billion sequenced reads from 75 RNA-seq experiments conducted with Escherichia coli, Neisseria gonorrhoeae, Salmonella enterica, Streptococcus pyogenes and Xenorhabdus nematophila. We find that the transcriptome maps generated by our algorithms are highly accurate when compared with focused experimental data from E. coli and N. gonorrhoeae, and we validate our system’s ability to identify novel small RNAs, operons and transcription start sites. Our results suggest that Rockhopper can be used for efficient and accurate analysis of bacterial RNA-seq data, and that it can aid with elucidation of bacterial transcriptomes. PMID:23716638
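
    The quantification and differential-expression step can be sketched on a toy count matrix: normalize counts for gene length and sequencing depth, then test between conditions. Rockhopper's own normalization and test statistics differ; the counts, gene lengths, and naive t-test below are illustrative only.

    import numpy as np
    from scipy import stats

    # Toy count matrix: rows = genes, columns = [condA rep1, condA rep2, condB rep1, condB rep2]
    counts = np.array([
        [520, 480, 1500, 1620],   # gene 0: up in condition B (hypothetical)
        [300, 310,  290,  305],   # gene 1: unchanged
        [ 80,  95,   60,   70],   # gene 2: mildly down
    ])
    lengths = np.array([900, 1200, 450])          # gene lengths in nucleotides

    # Reads per kilobase per million mapped reads (RPKM): depth and length normalization
    per_million = counts.sum(axis=0) / 1e6
    rpkm = counts / lengths[:, None] * 1e3 / per_million

    # Naive differential expression: t-test on log2 RPKM between the two conditions
    log_rpkm = np.log2(rpkm + 1.0)
    t, p = stats.ttest_ind(log_rpkm[:, :2], log_rpkm[:, 2:], axis=1)
    for i, pv in enumerate(p):
        fc = log_rpkm[i, 2:].mean() - log_rpkm[i, :2].mean()
        print(f"gene {i}: log2 fold change = {fc:+.2f}, p = {pv:.3f}")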

  2. Advances in Analysis of Longitudinal Data

    PubMed Central

    Gibbons, Robert D.; Hedeker, Donald; DuToit, Stephen

    2010-01-01

    In this review, we explore recent developments in the area of linear and nonlinear generalized mixed-effects regression models and various alternatives, including generalized estimating equations for analysis of longitudinal data. Methods are described for continuous and normally distributed as well as categorical (binary, ordinal, nominal) and count (Poisson) variables. Extensions of the model to three and four levels of clustering, multivariate outcomes, and incorporation of design weights are also described. Linear and nonlinear models are illustrated using an example involving a study of the relationship between mood and smoking. PMID:20192796

  3. Analysis on the security of cloud computing

    NASA Astrophysics Data System (ADS)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it is leading a revolution in IT and the information field. However, in cloud computing, data and application software are stored in large data centers, and the management of data and services is not completely trustworthy; the resulting security problems are a key obstacle to improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  4. Computer Analysis of a Physical Pendulum.

    ERIC Educational Resources Information Center

    Priest, Joseph; Potts, Larry

    1990-01-01

    The interfacing of a physical pendulum to an Apple IIe computer and the physics instruction associated with it are discussed. Laboratory procedures, software commands, and computations used in this lesson are described. (CW)

  5. Advanced Orion Optimized Laser System Analysis

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The contractor shall perform a complete analysis of the potential of the solid state laser in the very long pulse mode (100 ns pulse width, 10-30 Hz rep-rate) and in the very short pulse mode (100 ps pulse width, 10-30 Hz rep-rate), concentrating on the operation of the device in the 'hot-rod' mode, where no active cooling of the laser is attempted. The contractor's calculations shall cover the phase aberrations which develop during the repped-pulse train, and the results shall feed into the adaptive optics analyses. The contractor shall devise solutions to work around ORION track issues. A final report shall be furnished to the MSFC COTR including all calculations and analysis, with estimates of the bulk phase and intensity aberration distribution in the laser output beam as a function of time during the repped-pulse train for both wave forms (high-energy/long-pulse as well as low-energy/short-pulse). Recommendations shall be made for mitigating the aberrations by laser re-design and/or changes in operating parameters of optical pump sources and/or designs.

  6. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared to traditional hosting, the flexibility of cloud computing comes at the cost of less predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
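
    The modeling approach can be sketched as a minimal discrete event simulation: service requests with exponentially distributed interarrival and service times contend for a fixed pool of virtualized servers, and the event loop records response times. The rates and pool size are hypothetical, and this M/M/c-style toy is far simpler than the two-part enterprise model the abstract describes.

    import heapq, random

    random.seed(1)
    ARRIVAL_RATE, SERVICE_RATE = 8.0, 1.0     # requests/s and per-server service rate (hypothetical)
    N_SERVERS, N_REQS = 10, 20000

    t, free, queue, events = 0.0, N_SERVERS, [], []
    heapq.heappush(events, (random.expovariate(ARRIVAL_RATE), "arrive", 0))
    arrivals, responses, next_id = {}, [], 1
    while events and len(responses) < N_REQS:
        t, kind, rid = heapq.heappop(events)
        if kind == "arrive":
            arrivals[rid] = t
            if free:                           # a server is available: start service now
                free -= 1
                heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "done", rid))
            else:                              # all servers busy: wait in the queue
                queue.append(rid)
            heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrive", next_id))
            next_id += 1
        else:                                  # service completion
            responses.append(t - arrivals[rid])
            if queue:                          # hand the freed server to the next request
                heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "done", queue.pop(0)))
            else:
                free += 1
    print(f"mean response time: {sum(responses) / len(responses):.3f} s")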

  7. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e., nuclear gauge densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spouted diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design, and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  8. Computing in Qualitative Analysis: A Healthy Development?

    ERIC Educational Resources Information Center

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a software package for qualitative data analysis. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  9. Value analysis for advanced technology products

    NASA Astrophysics Data System (ADS)

    Soulliere, Mark

    2011-03-01

    Technology by itself can be wondrous, but buyers of technology factor in the price they have to pay along with performance in their decisions. As a result, the "best" technology may not always win in the marketplace when "good enough" can be had at a lower price. Technology vendors often set pricing by "cost plus margin," or by competitors' offerings. What if the product is new (or has yet to be invented)? Value pricing is a methodology to price products based on the value generated (e.g. money saved) by using one product vs. the next best technical alternative. Value analysis can often clarify what product attributes generate the most value. It can also assist in identifying market forces outside of the control of the technology vendor that also influence pricing. These principles are illustrated with examples.
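
    A worked toy example makes the methodology concrete (all numbers hypothetical): the value-based price is anchored to the economic value created relative to the next best technical alternative, and the pricing decision is how to split that value with the buyer.

    # Value pricing sketch; every figure below is illustrative
    ref_price = 100.0      # price of the next best technical alternative
    added_value = 30.0     # e.g. lifetime money saved by the new product vs that alternative
    evc = ref_price + added_value          # economic value to the customer: the rational ceiling
    price = ref_price + 0.5 * added_value  # split the created value evenly with the buyer
    print(f"EVC = ${evc:.0f}; price ${price:.0f} leaves the buyer ${evc - price:.0f} better off")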

  10. Advanced stability analysis for laminar flow control

    NASA Technical Reports Server (NTRS)

    Orszag, S. A.

    1981-01-01

    Five classes of problems are addressed: (1) the extension of the SALLY stability analysis code to the full eighth-order compressible stability equations for three-dimensional boundary layers; (2) a comparison of methods for prediction of transition using SALLY for incompressible flows; (3) a study of instability and transition in rotating-disk flows in which the effects of Coriolis forces and streamline curvature are included; (4) a new linear three-dimensional instability mechanism that predicts Reynolds numbers for transition to turbulence in planar shear flows in good agreement with experiment; and (5) a study of the stability of finite amplitude disturbances in axisymmetric pipe flow showing the stability of this flow to all nonlinear axisymmetric disturbances.

  11. Performance analysis of advanced spacecraft TPS

    NASA Technical Reports Server (NTRS)

    Pitts, William C.

    1987-01-01

    The analysis of the feasibility of using metal hydrides in the thermal protection system of cryogenic tanks in space was based on the heat capacity of ice as the phase change material (PCM). It was found that with ice the thermal protection system weight could be reduced by, at most, about 20 percent relative to an all-LI-900 insulation. For this concept to be viable, a metal hydride with considerably more capacity than water would be required; none were found. Special metal hydrides were developed for hydrogen fuel storage applications, and it may be possible to do so for the current application. Until this appears promising, further effort on this feasibility study does not seem warranted.

  12. User's manual for a computer program to calculate discrete frequency noise of conventional and advanced propellers

    NASA Technical Reports Server (NTRS)

    Martin, R. M.; Farassat, F.

    1981-01-01

    A user's manual is presented for a computer program for the calculation of discrete frequency noise of conventional and advanced propellers. The structure of the program and the subroutines describing the input functions are discussed. Input variables and their default values and the variables in the output data sheet are defined. Two versions of the program are available; these differ only in their graphic output capability. One version has only printed output capability, while a second version with extensive graphic output capability is available for the computer system at Langley. This manual includes four detailed examples of both the printed and graphic outputs. These examples may be reproduced by users to check their code on their computer system.

  13. CRADA ORNL 91-0046B final report: Assessment of IBM advanced computing architectures

    SciTech Connect

    Geist, G.A.

    1996-02-01

    This was a Cooperative Research and Development Agreement (CRADA) with IBM to assess their advanced computer architectures. Over the course of this project three different architectures were evaluated: the POWER/4 RIOS1 based shared memory multiprocessor, the POWER/2 RIOS2 based high performance workstation, and the J30 PowerPC based shared memory multiprocessor. In addition to this hardware, several software packages were beta tested for IBM, including the ESSL scientific computing library, the nv video-conferencing package, the Ultimedia multimedia display environment, FORTRAN 90 and C++ compilers, and the AIX 4.1 operating system. Both IBM and ORNL benefited from the research performed in this project and, even though access to the POWER/4 computer was delayed several months, all milestones were met.

  14. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    SciTech Connect

    G. R. Odette; G. E. Lucas

    2005-11-15

    This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture, consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement ΔTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations, which generally can be accessed on the internet or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

  15. Ferrofluids: Modeling, numerical analysis, and scientific computation

    NASA Astrophysics Data System (ADS)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable, and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
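
    For context, one common statement of the micropolar Navier-Stokes system coupling the linear velocity u, angular velocity w, and pressure p reads, in LaTeX notation (coefficients and normalizations vary by author, so this is background rather than the dissertation's exact formulation):

    u_t + (u\cdot\nabla)u - (\nu+\nu_r)\Delta u + \nabla p = 2\nu_r\,\nabla\times w, \qquad \nabla\cdot u = 0,
    j\,\big(w_t + (u\cdot\nabla)w\big) - c_1\Delta w - c_2\nabla(\nabla\cdot w) + 4\nu_r\,w = 2\nu_r\,\nabla\times u,

    with kinematic viscosity \nu, vortex viscosity \nu_r, microinertia j, and angular viscosities c_1, c_2.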

  16. TAIR- TRANSONIC AIRFOIL ANALYSIS COMPUTER CODE

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.

    1994-01-01

    The Transonic Airfoil analysis computer code, TAIR, was developed to employ a fast, fully implicit algorithm to solve the conservative full-potential equation for the steady transonic flow field about an arbitrary airfoil immersed in a subsonic free stream. The full-potential formulation is considered exact under the assumptions of irrotational, isentropic, and inviscid flow. These assumptions are valid for a wide range of practical transonic flows typical of modern aircraft cruise conditions. The primary features of TAIR include: a new fully implicit iteration scheme which is typically many times faster than classical successive line overrelaxation algorithms; a new, reliable artificial density spatial differencing scheme treating the conservative form of the full-potential equation; and a numerical mapping procedure capable of generating curvilinear, body-fitted finite-difference grids about arbitrary airfoil geometries. Three aspects emphasized during the development of the TAIR code were reliability, simplicity, and speed. The reliability of TAIR comes from two sources: the new algorithm employed and the implementation of effective convergence monitoring logic. TAIR achieves ease of use by employing a "default mode" that greatly simplifies code operation, especially by inexperienced users, and many useful options including: several airfoil-geometry input options, flexible user controls over program output, and a multiple solution capability. The speed of the TAIR code is attributed to the new algorithm and the manner in which it has been implemented. Input to the TAIR program consists of airfoil coordinates, aerodynamic and flow-field convergence parameters, and geometric and grid convergence parameters. The airfoil coordinates for many airfoil shapes can be generated in TAIR from just a few input parameters. Most of the other input parameters have default values which allow the user to run an analysis in the default mode by specifying only a few input parameters.
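
    The conservative full-potential equation that TAIR solves can be written in standard nondimensional form as follows (LaTeX notation; shown as background, since the program documentation itself is not quoted here):

    \nabla\cdot(\rho\,\nabla\Phi) = 0, \qquad \rho = \Big[1 + \tfrac{\gamma-1}{2}\,M_\infty^2\,\big(1 - |\nabla\Phi|^2\big)\Big]^{1/(\gamma-1)},

    where \Phi is the velocity potential normalized by the freestream speed, \rho the density normalized by its freestream value, M_\infty the freestream Mach number, and \gamma the ratio of specific heats.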

  17. Advanced space system analysis software. Technical, user, and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Zimbelman, H. F.

    1981-01-01

    The LASS computer program provides a tool for interactive preliminary and conceptual design of large space systems (LSS). Eight program modules were developed, including four automated model geometry generators, an associated mass properties module, an appendage synthesizer module, an RF analysis module, and an orbital transfer analysis module. The existing rigid-body controls analysis module was modified to permit analysis of the effects of solar pressure on orbital performance. A description of each module, user instructions, and programmer information are included.

  18. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication (M) and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  19. Multispectral laser imaging for advanced food analysis

    NASA Astrophysics Data System (ADS)

    Senni, L.; Burrascano, P.; Ricci, M.

    2016-07-01

    A hardware-software apparatus for food inspection capable of realizing multispectral NIR laser imaging at four different wavelengths is herein discussed. The system was designed to operate in a through-transmission configuration to detect the presence of unwanted foreign bodies inside samples, whether packed or unpacked. A modified Lock-In technique was employed to counterbalance the significant signal intensity attenuation due to transmission across the sample and to extract the multispectral information more efficiently. The NIR laser wavelengths used to acquire the multispectral images can be varied to deal with different materials and to focus on specific aspects. In the present work the wavelengths were selected after a preliminary analysis to enhance the image contrast between foreign bodies and food in the sample, thus identifying the location and nature of the defects. Experimental results obtained from several specimens, with and without packaging, are presented and the multispectral image processing as well as the achievable spatial resolution of the system are discussed.
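
    The modified Lock-In technique itself is not spelled out in the abstract, but the core of any digital lock-in measurement is demodulation against references at the modulation frequency followed by low-pass filtering. A minimal illustrative sketch (the single-frequency setup and all names are assumptions, not the authors' implementation):

      import numpy as np

      def lockin_amplitude(signal, fs, f_mod):
          """Estimate the amplitude of the f_mod component of a noisy signal."""
          t = np.arange(signal.size) / fs
          ref_i = np.sin(2 * np.pi * f_mod * t)   # in-phase reference
          ref_q = np.cos(2 * np.pi * f_mod * t)   # quadrature reference
          # Multiplication plus a mean over many periods acts as the low-pass stage.
          x = np.mean(signal * ref_i)
          y = np.mean(signal * ref_q)
          return 2.0 * np.hypot(x, y)

      # Example: a strongly attenuated 1 kHz modulation buried in noise.
      rng = np.random.default_rng(0)
      fs, f_mod, n = 100_000.0, 1_000.0, 100_000
      t = np.arange(n) / fs
      sig = 0.05 * np.sin(2 * np.pi * f_mod * t) + rng.normal(0.0, 1.0, n)
      print(lockin_amplitude(sig, fs, f_mod))   # close to 0.05 despite SNR << 1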

  20. Advances in carbonate exploration and reservoir analysis

    USGS Publications Warehouse

    Garland, J.; Neilson, J.; Laubach, S.E.; Whidden, Katherine J.

    2012-01-01

    The development of innovative techniques and concepts, and the emergence of new plays in carbonate rocks are creating a resurgence of oil and gas discoveries worldwide. The maturity of a basin and the application of exploration concepts have a fundamental influence on exploration strategies. Exploration success often occurs in underexplored basins by applying existing established geological concepts. This approach is commonly undertaken when new basins ‘open up’ owing to previous political upheavals. The strategy of using new techniques in a proven mature area is particularly appropriate when dealing with unconventional resources (heavy oil, bitumen, stranded gas), while the application of new play concepts (such as lacustrine carbonates) to new areas (i.e. ultra-deep South Atlantic basins) epitomizes frontier exploration. Many low-matrix-porosity hydrocarbon reservoirs are productive because permeability is controlled by fractures and faults. Understanding basic fracture properties is critical in reducing geological risk and therefore reducing well costs and increasing well recovery. The advent of resource plays in carbonate rocks, and the long-standing recognition of naturally fractured carbonate reservoirs means that new fracture and fault analysis and prediction techniques and concepts are essential.

  1. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  2. New computing systems and their impact on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    A review is given of the recent advances in computer technology that are likely to impact structural analysis and design. The computational needs for future structures technology are described. The characteristics of new and projected computing systems are summarized. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed, and a novel partitioning strategy is outlined for maximizing the degree of parallelism. The strategy is designed for computers with a shared memory and a small number of powerful processors (or a small number of clusters of medium-range processors). It is based on approximating the response of the structure by a combination of symmetric and antisymmetric response vectors, each obtained using a fraction of the degrees of freedom of the original finite element model. The strategy was implemented on the CRAY X-MP/4 and the Alliant FX/8 computers. For nonlinear dynamic problems on the CRAY X-MP with four CPUs, it resulted in an order of magnitude reduction in total analysis time, compared with the direct analysis on a single-CPU CRAY X-MP machine.
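
    The partitioning strategy rests on a simple identity: for a structure with a plane of reflection symmetry, any response vector splits uniquely into a symmetric and an antisymmetric part, each computable on a half-size model. A toy illustration of the splitting step (the DOF ordering and sign conventions are invented for the example, not taken from the paper):

      import numpy as np

      def split_response(u, mirror, sign):
          """Split u into symmetric/antisymmetric parts w.r.t. a reflection.

          mirror[i] : index of the DOF that is the mirror image of DOF i
          sign[i]   : +1 or -1, depending on how DOF i transforms under reflection
          """
          Ru = sign * u[mirror]          # reflected response vector
          u_sym = 0.5 * (u + Ru)         # satisfies R u_sym =  u_sym
          u_anti = 0.5 * (u - Ru)        # satisfies R u_anti = -u_anti
          return u_sym, u_anti

      # Four DOFs where (0, 3) and (1, 2) are mirror pairs:
      u = np.array([1.0, 2.0, 5.0, -3.0])
      u_s, u_a = split_response(u, np.array([3, 2, 1, 0]), np.ones(4))
      assert np.allclose(u_s + u_a, u)

    Because each part satisfies known conditions on the symmetry plane, the two half-size solves are independent of one another, which is the source of the parallelism exploited on the CRAY X-MP/4 and the Alliant FX/8.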

  3. Shell stability analysis in a computer aided engineering (CAE) environment

    NASA Technical Reports Server (NTRS)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde, is described. The purpose of this project is to make the accumulated theoretical, numerical, and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling-sensitive structures. With this open-ended, hierarchical, interactive computer code, the user can successively access programs of increasing complexity from his workstation. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from the workstation with one of the current generation of two-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  4. Computational method for analysis of polyethylene biodegradation

    NASA Astrophysics Data System (ADS)

    Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

    2003-12-01

    In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique to introduce experimental results into analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of the polyethylene biodegradation posed in modeling are indeed appropriate.
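
    As a concrete toy version of the kind of linear system involved (illustrative only; the rates, class count, and structure below are invented, not the authors' exact discretization), one can track weight classes in which β-oxidation shifts weight one class down while the lightest classes are consumed directly:

      import numpy as np
      from scipy.integrate import solve_ivp

      n = 20                    # number of molecular-weight classes (toy value)
      alpha, beta = 0.3, 0.1    # direct-consumption and beta-oxidation rates (assumed)

      # Linear system dw/dt = A w.
      A = np.zeros((n, n))
      for i in range(n):
          if i > 0:
              A[i, i] -= beta       # weight leaving class i toward class i-1
              A[i - 1, i] += beta   # ... arriving in class i-1
          if i < 3:
              A[i, i] -= alpha      # direct microbial consumption of small molecules

      w0 = np.exp(-0.5 * ((np.arange(n) - 12) / 3.0) ** 2)   # GPC-like initial profile
      sol = solve_ivp(lambda t, w: A @ w, (0.0, 35.0), w0, t_eval=[0.0, 35.0])
      print("total weight before/after 5 weeks:", w0.sum(), sol.y[:, -1].sum())

    In the inverse problem described above, the rates (here α and β) would instead be determined so that the simulated profile after 5 weeks matches the measured GPC pattern.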

  5. [Computational genome analysis of three marine algoviruses].

    PubMed

    Stepanova, O A; Boĭko, A L; Shcherbatenko, I S

    2013-01-01

    Computational analysis of genomic sequences of three new marine algoviruses, Tetraselmis viridis virus (TvV-S20 and TvV-SI1 strains) and Dunaliella viridis virus (DvV-SI2 strain), was conducted. Both considerable similarity and essential distinctions between the studied strains and the most studied marine algoviruses of the Phycodnaviridae family were revealed. Our data show that the tested strains are new viruses with the following features: only they were isolated from the marine eukaryotic microalgae T. viridis and D. viridis; coding sequences (CDSs) of their genomes are localized mainly on one of the DNA strands and form several clusters with short intergenic spaces; there are considerable variations in genome structure within viruses and their strains; viral genomic DNA has a high GC-content (55.5 - 67.4%); their genes contain no well-known optimal contexts of translation start codons, nor contexts favoring terminal codon read-through; and the vast majority of viral genes and proteins have no matches in gene banks. PMID:24479317
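
    Of the genome statistics reported, GC-content is the simplest to reproduce; a minimal illustration (the sequence fragment is made up):

      def gc_content(seq):
          """Percentage of G and C bases in a DNA sequence."""
          seq = seq.upper()
          return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

      print(round(gc_content("ATGGCGCCATTGCGC"), 1))   # 66.7 for this invented fragment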

  6. Computer-aided petrographic analysis of sandstones

    SciTech Connect

    Thayer, P.A.; Helmold, K.P.

    1987-05-01

    Thin-section point counting, mathematical and statistical analysis of petrographic-petrophysical data, report generation, and graphical presentation of results can be done efficiently by computer. Compositional and textural data are collected with a modified Schares point-counting system. The system uses an MS-DOS microcomputer programmed in BASIC to drive a motorized stage attached to a polarizing microscope. Numeric codes for up to 500 different categories of minerals, cements, pores, etc., are input using a separate keypad. Calculation and printing of constituent percentages, QFR, Folk name, and grain-size distribution are completed in seconds after data entry. Raw data files, compatible with software such as Lotus 1-2-3, SPSS, and SAS, are stored on floppy disk. Petrographic data files are transferred directly to a mainframe, merged with log and petrophysical data, analyzed statistically with SAS, and reports generated. SAS/GRAPH and TELL-A-GRAF routines linked with SAS generate a variety of cross plots, histograms, pie and bar charts, ternary diagrams, and vertical variation diagrams (e.g., depth vs. porosity, permeability, mean size, sorting, and percent grains-matrix-cement).
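
    The constituent-percentage step of such a point-counting system is a simple normalization; a sketch of the idea (the category names and counts are invented):

      counts = {"quartz": 212, "feldspar": 61, "rock_fragments": 27,
                "calcite_cement": 55, "porosity": 45}   # invented point counts

      total = sum(counts.values())
      percentages = {k: 100.0 * v / total for k, v in counts.items()}

      # QFR recalculates the three framework-grain classes to 100%,
      # the usual basis for a Folk-style sandstone name.
      framework = {k: counts[k] for k in ("quartz", "feldspar", "rock_fragments")}
      fw_total = sum(framework.values())
      qfr = {k: 100.0 * v / fw_total for k, v in framework.items()}

      print(percentages)
      print(qfr)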

  7. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  8. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  9. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  10. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bickel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  11. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  12. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  13. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  14. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  15. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  16. Study of flutter related computational procedures for minimum weight structural sizing of advanced aircraft, supplemental data

    NASA Technical Reports Server (NTRS)

    Oconnell, R. F.; Hassig, H. J.; Radovcich, N. A.

    1975-01-01

    Computational aspects of (1) flutter optimization (minimization of structural mass subject to specified flutter requirements), (2) methods for solving the flutter equation, and (3) efficient methods for computing generalized aerodynamic force coefficients in the repetitive analysis environment of computer-aided structural design are discussed. Specific areas included: a two-dimensional Regula Falsi approach to solving the generalized flutter equation; method of incremented flutter analysis and its applications; the use of velocity potential influence coefficients in a five-matrix product formulation of the generalized aerodynamic force coefficients; options for computational operations required to generate generalized aerodynamic force coefficients; theoretical considerations related to optimization with one or more flutter constraints; and expressions for derivatives of flutter-related quantities with respect to design variables.
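
    The report's Regula Falsi approach is two-dimensional, but the underlying root-finding idea is the classical one-dimensional false-position method; a short sketch for orientation (not the report's algorithm):

      def regula_falsi(f, a, b, tol=1e-10, max_iter=100):
          """Find a root of f in [a, b], assuming f(a) and f(b) bracket it."""
          fa, fb = f(a), f(b)
          if fa * fb > 0:
              raise ValueError("f(a) and f(b) must have opposite signs")
          c = a
          for _ in range(max_iter):
              c = b - fb * (b - a) / (fb - fa)   # secant through the bracket ends
              fc = f(c)
              if abs(fc) < tol:
                  break
              if fa * fc < 0:
                  b, fb = c, fc                  # root lies in [a, c]
              else:
                  a, fa = c, fc                  # root lies in [c, b]
          return c

      # In a flutter setting, f might be the damping of the critical mode as a
      # function of airspeed; its zero crossing marks the flutter boundary.
      print(regula_falsi(lambda v: v**2 - 2.0, 0.0, 2.0))   # ~1.4142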

  17. Parallel Analysis and Visualization on Cray Compute Node Linux

    SciTech Connect

    Pugmire, Dave; Ahern, Sean

    2008-01-01

    Capability computer systems are deployed to give researchers the computational power required to investigate and solve key challenges facing the scientific community. As the power of these computer systems increases, the computational problem domain typically increases in size, complexity and scope. These increases strain the ability of commodity analysis and visualization clusters to effectively perform post-processing tasks and provide critical insight and understanding to the computed results. An alternative to purchasing increasingly larger, separate analysis and visualization commodity clusters is to use the computational system itself to perform post-processing tasks. In this paper, the recent successful port of VisIt, a parallel, open source analysis and visualization tool, to Compute Node Linux running on the Cray is detailed. Additionally, the unprecedented ability of this resource for analysis and visualization is discussed and a report on obtained results is presented.

  18. Advanced earth observation spacecraft computer-aided design software: Technical, user and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Krauze, L. D.

    1983-01-01

    The IDEAS computer program of NASA is a tool for interactive preliminary design and analysis of LSS (Large Space Systems). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.

  19. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kwe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAU's under conditions of possible modes of failure which still permit continued system operation.

  20. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    EPA Science Inventory

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-Agency-sponsored study, authorized under the...

  1. Polybrominated Diphenyl Ethers in Dryer Lint: An Advanced Analysis Laboratory

    ERIC Educational Resources Information Center

    Thompson, Robert Q.

    2008-01-01

    An advanced analytical chemistry laboratory experiment is described that involves environmental analysis and gas chromatography-mass spectrometry. Students analyze lint from clothes dryers for traces of flame retardant chemicals, polybrominated diphenylethers (PBDEs), compounds receiving much attention recently. In a typical experiment, ng/g…

  2. A Meta-Analysis of Advanced Organizer Studies.

    ERIC Educational Resources Information Center

    Stone, Carol Leth

    1983-01-01

    Twenty-nine reports yielding 112 studies were analyzed with Glass's meta-analysis technique, and results were compared with predictions from Ausubel's model of assimilative learning. Overall, advance organizers were shown to be associated with increased learning and retention of material to be learned. (Author)

  3. Advanced GIS Exercise: Predicting Rainfall Erosivity Index Using Regression Analysis

    ERIC Educational Resources Information Center

    Post, Christopher J.; Goddard, Megan A.; Mikhailova, Elena A.; Hall, Steven T.

    2006-01-01

    Graduate students from a variety of agricultural and natural resource fields are incorporating geographic information systems (GIS) analysis into their graduate research, creating a need for teaching methodologies that help students understand advanced GIS topics for use in their own research. Graduate-level GIS exercises help students understand…

  4. NASTRAN documentation for flutter analysis of advanced turbopropellers

    NASA Technical Reports Server (NTRS)

    Elchuri, V.; Gallo, A. M.; Skalski, S. C.

    1982-01-01

    An existing capability developed to conduct modal flutter analysis of tuned bladed-shrouded discs was modified to facilitate investigation of the subsonic unstalled flutter characteristics of advanced turbopropellers. The modifications pertain to the inclusion of oscillatory modal aerodynamic loads of blades with large (backward and forward) varying sweep.

  5. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  6. Implementing computer-based testing in distance education for advanced practice nurses: lessons learned.

    PubMed

    Caudle, Patricia; Bigness, Joanne; Daniels, Judi; Gillmor-Kahn, Mickey; Knestrick, Joyce

    2011-01-01

    A distance education program utilized by graduate nursing students worldwide faces unique problems with testing. This article presents the results of a pilot study on the implementation of computer-based testing at the Frontier Nursing University. A detailed analysis of the evaluative survey completed by students in the pilot study revealed issues of bi-directional respect and trust between faculty and students and technological anxiety among students using computer-based testing. PMID:22029246

  7. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. The construction of special elements containing traction-free circular boundaries is investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for the suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed-stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.

  8. Recent Advances in Multidisciplinary Analysis and Optimization, part 3

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: aircraft design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  9. Recent Advances in Multidisciplinary Analysis and Optimization, part 1

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: helicopter design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  10. Recent Advances in Multidisciplinary Analysis and Optimization, part 2

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: helicopter design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  11. High Resolution Traction Force Microscopy Based on Experimental and Computational Advances

    PubMed Central

    Sabass, Benedikt; Gardel, Margaret L.; Waterman, Clare M.; Schwarz, Ulrich S.

    2008-01-01

    Cell adhesion and migration crucially depend on the transmission of actomyosin-generated forces through sites of focal adhesion to the extracellular matrix. Here we report experimental and computational advances in improving the resolution and reliability of traction force microscopy. First, we introduce the use of two differently colored nanobeads as fiducial markers in polyacrylamide gels and explain how the displacement field can be computationally extracted from the fluorescence data. Second, we present different improvements regarding standard methods for force reconstruction from the displacement field, which are the boundary element method, Fourier-transform traction cytometry, and traction reconstruction with point forces. Using extensive data simulation, we show that the spatial resolution of the boundary element method can be improved considerably by splitting the elastic field into near, intermediate, and far field. Fourier-transform traction cytometry requires considerably less computer time, but can achieve a comparable resolution only when combined with Wiener filtering or appropriate regularization schemes. Both methods tend to underestimate forces, especially at small adhesion sites. Traction reconstruction with point forces does not suffer from this limitation, but is only applicable with stationary and well-developed adhesion sites. Third, we combine these advances and for the first time reconstruct fibroblast traction with a spatial resolution of ∼1 μm. PMID:17827246
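
    The regularization schemes mentioned for force reconstruction are typically of Tikhonov type: rather than inverting the ill-conditioned displacement-from-traction operator G directly, one solves min_f ||G f - u||^2 + λ^2 ||f||^2. A generic sketch of that inversion step (the synthetic smoothing kernel below stands in for the elastic Green's function and is not the paper's kernel):

      import numpy as np

      def tikhonov_solve(G, u, lam):
          """Solve min_f ||G f - u||^2 + lam^2 ||f||^2 via the normal equations."""
          n = G.shape[1]
          return np.linalg.solve(G.T @ G + lam**2 * np.eye(n), G.T @ u)

      rng = np.random.default_rng(1)
      n = 50
      x = np.linspace(0.0, 1.0, n)
      G = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)   # stand-in kernel
      f_true = np.zeros(n)
      f_true[20:25] = 1.0                                    # a localized "adhesion"
      u = G @ f_true + 0.01 * rng.normal(size=n)             # noisy displacement data

      f_hat = tikhonov_solve(G, u, lam=0.1)
      print(np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true))

    Increasing λ suppresses noise but also smears and underestimates localized tractions, consistent with the underestimation at small adhesion sites noted above.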

  12. An expanded framework for the advanced computational testing and simulation toolkit

    SciTech Connect

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability, further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  13. Developing a Comprehensive Software Suite for Advanced Reactor Performance and Safety Analysis

    SciTech Connect

    Pointer, William David; Bradley, Keith S; Fischer, Paul F; Smith, Micheal A; Tautges, Timothy J; Ferencz, Robert M; Martineau, Richard C; Jain, Rajeev; Obabko, Aleksandr; Billings, Jay Jay

    2013-01-01

    This paper provides an introduction to the reactor analysis capabilities of the nuclear power reactor simulation tools that are being developed as part of the U.S. Department of Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) Toolkit. The NEAMS Toolkit is an integrated suite of multi-physics simulation tools that leverage high-performance computing to reduce uncertainty in the prediction of performance and safety of advanced reactor and fuel designs. The Toolkit effort comprises two major components: the Fuels Product Line (FPL), which provides tools for fuel performance analysis, and the Reactor Product Line (RPL), which provides tools for reactor performance and safety analysis. This paper provides an overview of the NEAMS RPL development effort.

  14. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    ERIC Educational Resources Information Center

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  15. Frequency Analysis Program for a Computer Assisted Laboratory.

    ERIC Educational Resources Information Center

    Aburdene, Maurice F.

    1983-01-01

    Describes a Fortran program used in a computer-assisted-laboratory course. Program utilizes computer-controlled frequency sweeping to measure response (amplitude/phase) of a series RLC circuit, modeling the circuit and comparing experimental/theoretical results for system gain with computer gain using least squares analysis. Plots of both gain…
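
    For a series RLC circuit with the output taken across the resistor, the theoretical gain being compared is |H(jω)| = R / sqrt(R^2 + (ωL - 1/(ωC))^2). A short sketch of the comparison step (the component values and the 2% measurement noise are invented):

      import numpy as np

      def gain(f, R, L, C):
          """Series RLC, output across R: |H| = R / |R + j(wL - 1/(wC))|."""
          w = 2 * np.pi * f
          return R / np.hypot(R, w * L - 1.0 / (w * C))

      R, L, C = 100.0, 10e-3, 1e-6                    # invented component values
      f = np.logspace(2, 5, 200)                      # swept 100 Hz .. 100 kHz
      rng = np.random.default_rng(2)
      measured = gain(f, R, L, C) * (1.0 + 0.02 * rng.normal(size=f.size))

      theory = gain(f, R, L, C)
      sse = np.sum((measured - theory) ** 2)          # least-squares comparison
      f0 = 1.0 / (2 * np.pi * np.sqrt(L * C))         # resonance, ~1.6 kHz here
      print(f"sum of squared errors: {sse:.3g}, resonance: {f0:.0f} Hz")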

  16. Numerical Package in Computer Supported Numeric Analysis Teaching

    ERIC Educational Resources Information Center

    Tezer, Murat

    2007-01-01

    At universities in the faculties of Engineering, Sciences, Business and Economics together with higher education in Computing, it is stated that because of the difficulty, calculators and computers can be used in Numerical Analysis (NA). In this study, the learning computer supported NA will be discussed together with important usage of the…

  17. Isolation and analysis of ginseng: advances and challenges

    PubMed Central

    Wang, Chong-Zhi

    2011-01-01

    Ginseng occupies a prominent position in the list of best-selling natural products in the world. Because of its complex constituents, multidisciplinary techniques are needed to validate the analytical methods that support ginseng’s use worldwide. In the past decade, rapid development of technology has advanced many aspects of ginseng research. The aim of this review is to illustrate the recent advances in the isolation and analysis of ginseng, and to highlight their new applications and challenges. Emphasis is placed on recent trends and emerging techniques. The current article reviews the literature between January 2000 and September 2010. PMID:21258738

  18. Design and analysis of advanced flight planning concepts

    NASA Technical Reports Server (NTRS)

    Sorensen, John A.

    1987-01-01

    The objectives of this continuing effort are to develop and evaluate new algorithms and advanced concepts for flight management and flight planning. This includes the minimization of fuel or direct operating costs, the integration of the airborne flight management and ground-based flight planning processes, and the enhancement of future traffic management systems design. Flight management (FMS) concepts are for on-board profile computation and steering of transport aircraft in the vertical plane between a city pair and along a given horizontal path. Flight planning (FPS) concepts are for the pre-flight ground based computation of the three-dimensional reference trajectory that connects the city pair and specifies the horizontal path, fuel load, and weather profiles for initializing the FMS. As part of these objectives, a new computer program called EFPLAN has been developed and utilized to study advanced flight planning concepts. EFPLAN represents an experimental version of an FPS. It has been developed to generate reference flight plans compatible as input to an FMS and to provide various options for flight planning research. This report describes EFPLAN and the associated research conducted in its development.

  19. Computational Models of Exercise on the Advanced Resistance Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Newby, Nate; Caldwell, Erin; Scott-Pandorf, Melissa; Peters, Brian; Fincke, Renita; DeWitt, John; Poutz-Snyder, Lori

    2011-01-01

    Muscle and bone loss remain a concern for crew returning from space flight. The advanced resistance exercise device (ARED) is used for on-orbit resistance exercise to help mitigate these losses. However, characterization of how the ARED loads the body in microgravity has yet to be determined. Computational models allow us to analyze ARED exercise in both 1G and 0G environments. To this end, biomechanical models of the squat, single-leg squat, and deadlift exercise on the ARED have been developed to further investigate bone and muscle forces resulting from the exercises.

  20. Advanced Computing Technologies for Rocket Engine Propulsion Systems: Object-Oriented Design with C++

    NASA Technical Reports Server (NTRS)

    Bekele, Gete

    2002-01-01

    This document explores the use of advanced computer technologies with an emphasis on object-oriented design to be applied in the development of software for a rocket engine to improve vehicle safety and reliability. The primary focus is on phase one of this project, the smart start sequence module. The objectives are: 1) To use current sound software engineering practices, object-orientation; 2) To improve on software development time, maintenance, execution and management; 3) To provide an alternate design choice for control, implementation, and performance.

  1. NDE of advanced turbine engine components and materials by computed tomography

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Baaklini, George Y.; Klima, Stanley J.

    1991-01-01

    Computed tomography (CT) is an X-ray technique that provides quantitative 3D density information of materials and components and can accurately detail spatial distributions of cracks, voids, and density variations. CT scans of ceramic materials, composites, and engine components were taken and the resulting images will be discussed. Scans were taken with two CT systems with different spatial resolution capabilities. The scans showed internal damage, density variations, and geometrical arrangement of various features in the materials and components. It was concluded that CT can play an important role in the characterization of advanced turbine engine materials and components. Future applications of this technology will be outlined.

  2. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    SciTech Connect

    Hey, Tony; Agarwal, Deborah; Borgman, Christine; Cartaro, Concetta; Crivelli, Silvia; Van Dam, Kerstin Kleese; Luce, Richard; Arjun, Shankar; Trefethen, Anne; Wade, Alex; Williams, Dean

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI), beginning with an assessment of the quality and effectiveness of OSTI’s recent and current products and services and commentary on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.

  3. CAVASS: a computer-assisted visualization and analysis software system.

    PubMed

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-11-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-source toolkits. The development of CAVASS by our group is the next generation of 3DVIEWNIX. CAVASS will be freely available and open source, and it is integrated with toolkits such as Insight Toolkit and Visualization Toolkit. CAVASS runs on Windows, Unix, Linux, and Mac but shares a single code base. Rather than requiring expensive multiprocessor systems, it seamlessly provides for parallel processing via inexpensive clusters of workstations for more time-consuming algorithms. Most importantly, CAVASS is directed at the visualization, processing, and analysis of 3-dimensional and higher-dimensional medical imagery, so support for Digital Imaging and Communications in Medicine (DICOM) data and the efficient implementation of algorithms is given paramount importance. PMID:17786517

  4. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  5. Application of advanced reliability methods to local strain fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, T. T.; Wirsching, P. H.

    1983-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) might become extremely difficult or very inefficient. This study suggests using a simple, and easily constructed, second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
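
    As a concrete illustration of the response-surface idea above, the following minimal sketch fits a second-degree polynomial to a handful of "expensive" limit-state evaluations and then performs a first-order reliability computation in the spirit of the Rackwitz-Fiessler approach. The limit state, the sample points, and all numbers are hypothetical stand-ins, not the paper's fatigue model; in the paper's setting each evaluation of g would be a full local strain analysis, which is why so few points are used.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def g_expensive(u):
        # Stand-in for the costly computer analysis; inputs are assumed
        # already transformed to standard normal space. Failure when g <= 0.
        u1, u2 = u
        return 5.0 - u1**2 / 4.0 - u2 - 0.2 * u1 * u2

    # Evaluate the "expensive" model at a small grid of selected points.
    pts = np.array([(a, b) for a in (-2, 0, 2) for b in (-2, 0, 2)], float)
    gv = np.array([g_expensive(p) for p in pts])

    # Fit a second-degree polynomial surrogate by least squares.
    def features(u):
        u1, u2 = u[..., 0], u[..., 1]
        return np.stack([np.ones_like(u1), u1, u2, u1*u2, u1**2, u2**2],
                        axis=-1)

    coef, *_ = np.linalg.lstsq(features(pts), gv, rcond=None)
    g_hat = lambda u: features(np.asarray(u, float)) @ coef

    # First-order reliability: beta is the distance from the origin to the
    # nearest point on the surrogate limit state g_hat(u) = 0.
    res = minimize(lambda u: u @ u, x0=[1.0, 1.0],
                   constraints={"type": "eq", "fun": g_hat})
    beta = np.sqrt(res.fun)
    print(f"beta = {beta:.3f}, Pf ~ {norm.cdf(-beta):.2e}")
    ```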

  6. Advanced Post-Irradiation Examination Capabilities Alternatives Analysis Report

    SciTech Connect

    Jeff Bryan; Bill Landman; Porter Hill

    2012-12-01

    An alternatives analysis was performed for the Advanced Post-Irradiation Examination Capabilities (APIEC) project in accordance with U.S. Department of Energy (DOE) Order DOE O 413.3B, “Program and Project Management for the Acquisition of Capital Assets”. The analysis considered six major alternatives: (1) no action; (2) modify existing DOE facilities, with capabilities distributed among multiple locations; (3) modify existing DOE facilities, with capabilities consolidated at a few locations; (4) construct a new facility; (5) commercial partnership; and (6) international partnerships. Based on the alternatives analysis documented herein, it is recommended to DOE that the advanced post-irradiation examination capabilities be provided by a new facility constructed at the Materials and Fuels Complex at the Idaho National Laboratory.

  7. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    SciTech Connect

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  8. "ATLAS" Advanced Technology Life-cycle Analysis System

    NASA Technical Reports Server (NTRS)

    Lollar, Louis F.; Mankins, John C.; ONeil, Daniel A.

    2004-01-01

    Making good decisions concerning research and development portfolios, and concerning the best systems concepts to pursue, as early as possible in the life cycle of advanced technologies is a key goal of R&D management. This goal depends upon the effective integration of information from a wide variety of sources, as well as focused, high-level analyses intended to inform such decisions. The presentation provides a summary of the Advanced Technology Life-cycle Analysis System (ATLAS) methodology and toolkit. ATLAS encompasses a wide range of methods and tools. A key foundation for ATLAS is the NASA-created Technology Readiness Level (TRL) system. The toolkit is largely spreadsheet based (as of August 2003). This product is being funded by the Human and Robotics Technology Program Office, Office of Exploration Systems, NASA Headquarters, Washington D.C., and is being integrated by Dan O'Neil of the Advanced Projects Office, NASA/MSFC, Huntsville, AL.

  9. Alternative Computational Approaches for Probabilistic Fatigue Analysis

    NASA Technical Reports Server (NTRS)

    Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Moore, N. R.; Grigoriu, M.

    1995-01-01

    The feasibility of alternative methods to direct Monte Carlo simulation for failure probability computations is discussed. First- and second-order reliability methods are applied to fatigue crack growth and low cycle fatigue structural failure modes to illustrate typical problems.
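
    For context, below is a minimal sketch of the direct Monte Carlo baseline against which such alternatives are measured; the toy limit state is hypothetical. The standard error shows why small failure probabilities demand very large sample sizes when each evaluation is an expensive structural computation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 1_000_000
    u = rng.standard_normal((N, 2))
    fail = (4.0 - u[:, 0] - u[:, 1]) <= 0.0   # toy limit state: g(u) <= 0
    pf = fail.mean()
    se = np.sqrt(pf * (1.0 - pf) / N)         # standard error of the estimate
    print(f"Pf ~ {pf:.2e} +/- {se:.1e} (N = {N})")
    ```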

  10. System balance analysis for vector computers

    NASA Technical Reports Server (NTRS)

    Knight, J. C.; Poole, W. G., Jr.; Voight, R. G.

    1975-01-01

    The availability of vector processors capable of sustaining computing rates of 10 to the 8th power arithmetic results per second raises the question of whether peripheral storage devices representing current technology can keep such processors supplied with data. By examining the solution of a large banded linear system on these computers, it was found that, even under ideal conditions, the processors will frequently be waiting for problem data.
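
    The balance question can be made concrete with back-of-the-envelope arithmetic, as in the sketch below; the operand-traffic and device-bandwidth figures are illustrative assumptions, not values from the study.

    ```python
    # System balance estimate: can storage feed the arithmetic pipeline?
    rate_flops = 1.0e8        # sustained results per second (from the abstract)
    words_per_result = 0.5    # assumed operand traffic per result after reuse
    bytes_per_word = 8
    io_bw = 1.0e6 * bytes_per_word    # assumed device bandwidth: 1e6 words/s

    needed = rate_flops * words_per_result * bytes_per_word  # bytes/s required
    busy_fraction = min(1.0, io_bw / needed)
    print(f"required I/O: {needed:.2e} B/s, device supplies {io_bw:.2e} B/s")
    print(f"processor kept busy at most {100 * busy_fraction:.2f}% of the time")
    ```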

  11. Analysis of an advanced ducted propeller subsonic inlet

    NASA Technical Reports Server (NTRS)

    Iek, Chanthy; Boldman, Donald R.; Ibrahim, Mounir

    1992-01-01

    It is shown that a time marching Navier-Stokes code called PARC can be utilized to provide a reasonable prediction of the flow field within an inlet for an advanced ducted propeller. The code validation was implemented for a nonseparated flow condition associated with the inlet functioning at angles-of-attack of zero and 25 deg. Comparison of the computational results with the test data shows that the PARC code with the propeller face fixed flow properties boundary conditions (BC) provided a better prediction of the inlet surface static pressures than the prediction when the mass flow BC was employed.

  12. Three parallel computation methods for structural vibration analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf; Bostic, Susan; Patrick, Merrell; Mahajan, Umesh; Ma, Shing

    1988-01-01

    The Lanczos (1950), multisectioning, and subspace iteration sequential methods for vibration analysis, used here as the bases for three parallel algorithms, are found in three example problems to maintain reasonable accuracy in the computation of vibration frequencies. Significant reductions in computation time are obtained as the number of processors increases. An analysis is made of the performance of each method in order to characterize relative strengths and weaknesses, as well as to identify those parameters that most strongly affect computational efficiency.
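
    A minimal sketch of the sequential Lanczos building block follows, using SciPy's Lanczos-based sparse eigensolver on a stand-in spring-mass chain; the parallel decomposition itself is not shown.

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigsh

    n = 200                                     # degrees of freedom
    k, m = 1.0e4, 1.0                           # spring stiffness, lumped mass
    main = 2.0 * k * np.ones(n); main[-1] = k   # fixed-free chain
    K = sp.diags([main, -k * np.ones(n - 1), -k * np.ones(n - 1)],
                 [0, -1, 1], format="csc")
    M = sp.identity(n, format="csc") * m

    # Shift-invert about zero targets the smallest eigenvalues (w^2) of
    # the generalized problem K x = w^2 M x.
    vals, vecs = eigsh(K, k=6, M=M, sigma=0.0, which="LM")
    freqs_hz = np.sqrt(np.sort(vals)) / (2.0 * np.pi)
    print("lowest natural frequencies (Hz):", np.round(freqs_hz, 3))
    ```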

  13. Recent advances in the analysis of nanotube-reinforced polymeric biomaterials

    NASA Astrophysics Data System (ADS)

    Reddy, J. N.; Unnikrishnan, Vinu U.; Unnikrishnan, Ginu U.

    2013-12-01

    Conventional experimental or computational techniques are often inadequate for the analysis and development of nanocomposite-based materials, as they are tedious (e.g., experimental methods) or unsuitable for capturing the properties of these novel materials (e.g., conventional computational techniques), thereby requiring multiscale computational strategies. During the last 5 years, major developments were made by the authors in the formulation and implementation of multiscale computational models, using atomistic simulation and micromechanics-based techniques, to study the mechanical and thermal behavior of nanocomposite-based materials. In this article, the advances made in the computational analysis of nanocomposites for tissue engineering applications (e.g., scaffolds and bioreactors) are discussed. The material properties of the nanocomposites at the lower scales were determined using molecular dynamics and were then transferred to the macroscale using various homogenization techniques. The authors also discuss the development of a mixture-theory-based finite element model for nutrient flow in a hollow fiber membrane bioreactor and the use of computational tools to improve the efficiency of the bioreactor.

  14. Advanced computational sensors technology: testing and evaluation in visible, SWIR, and LWIR imaging

    NASA Astrophysics Data System (ADS)

    Rizk, Charbel G.; Wilson, John P.; Pouliquen, Philippe

    2015-05-01

    The Advanced Computational Sensors Team at the Johns Hopkins University Applied Physics Laboratory and the Johns Hopkins University Department of Electrical and Computer Engineering has been developing advanced readout integrated circuit (ROIC) technology for more than 10 years with a particular focus on the key challenges of dynamic range, sampling rate, system interface and bandwidth, and detector materials or band dependencies. Because the pixel array offers parallel sampling by default, the team successfully demonstrated that adding smarts in the pixel and the chip can increase performance significantly. Each pixel becomes a smart sensor and can operate independently in collecting, processing, and sharing data. In addition, building on the digital circuit revolution, the effective well size can be increased by orders of magnitude within the same pixel pitch over analog designs. This research has yielded an innovative class of a system-on-chip concept: the Flexible Readout and Integration Sensor (FRIS) architecture. All key parameters are programmable and/or can be adjusted dynamically, and this architecture can potentially be sensor and application agnostic. This paper reports on the testing and evaluation of one prototype that can support either detector polarity and includes sample results with visible, short-wavelength infrared (SWIR), and long-wavelength infrared (LWIR) imaging.

  15. Computational analysis of LDDMM for brain mapping

    PubMed Central

    Ceritoglu, Can; Tang, Xiaoying; Chow, Margaret; Hadjiabadi, Darian; Shah, Damish; Brown, Timothy; Burhanullah, Muhammad H.; Trinh, Huong; Hsu, John T.; Ament, Katarina A.; Crocetti, Deana; Mori, Susumu; Mostofsky, Stewart H.; Yantis, Steven; Miller, Michael I.; Ratnanather, J. Tilak

    2013-01-01

    One goal of computational anatomy (CA) is to develop tools to accurately segment brain structures in healthy and diseased subjects. In this paper, we examine the performance and complexity of such segmentation in the framework of the large deformation diffeomorphic metric mapping (LDDMM) registration method with reference to atlases and parameters. First, we report the application of a multi-atlas segmentation approach to define basal ganglia structures in healthy and diseased children's brains. The segmentation accuracy of the multi-atlas approach is compared with the single-atlas LDDMM implementation and two state-of-the-art segmentation algorithms, FreeSurfer and FSL, by computing the overlap errors between automatic and manual segmentations of the six basal ganglia nuclei in healthy subjects as well as subjects with diseases including ADHD and autism. The high accuracy of multi-atlas segmentation is obtained at the cost of increased computational complexity because of the calculations necessary between the atlases and a subject. Second, we examine the effect of parameters on total LDDMM computation time and segmentation accuracy for basal ganglia structures. The single-atlas LDDMM method is used to automatically segment the structures in a population of 16 subjects using different sets of parameters. The results show that a cascade approach and using fewer time steps can reduce computational complexity by as much as five times while maintaining reliable segmentations. PMID:23986653
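
    The overlap error reported above is typically computed from the Dice coefficient between automatic and manual label volumes; a minimal sketch on synthetic labels follows.

    ```python
    import numpy as np

    def dice(a, b):
        """Dice overlap between two boolean volumes of identical shape."""
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    rng = np.random.default_rng(1)
    manual = rng.random((64, 64, 64)) < 0.1    # stand-in manual segmentation
    auto = manual.copy()
    auto[:2] = False                           # simulate a small missed region
    d = dice(manual, auto)
    print(f"Dice = {d:.3f}, overlap error = {1 - d:.3f}")
    ```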

  16. High Performance Computing: Advanced Research Projects Agency Should Do More To Foster Program Goals. Report to the Chairman, Committee on Armed Services, House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Information Management and Technology Div.

    High-performance computing refers to the use of advanced computing technologies to solve highly complex problems in the shortest possible time. The federal High Performance Computing and Communications Initiative of the Advanced Research Project Agency (ARPA) attempts to accelerate availability and use of high performance computers and networks.…

  17. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews of modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMMs). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
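
    At the core of such tools is the HMM forward recurrence, the dynamic-programming kernel that accelerated implementations optimize; a minimal sketch on a toy discrete-emission model (not HMMER's profile HMM) follows.

    ```python
    import numpy as np

    A = np.array([[0.9, 0.1],      # state transition probabilities
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.3],      # emission probabilities per state
                  [0.1, 0.9]])
    pi = np.array([0.5, 0.5])      # initial state distribution
    obs = [0, 1, 1, 0, 1]          # observed symbol indices

    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # one O(n_states^2) step per symbol
    print(f"P(observations | model) = {alpha.sum():.6f}")
    ```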

  18. Global detailed geoid computation and model analysis

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Vincent, S.

    1974-01-01

    Comparisons and analyses were carried out through the use of detailed gravimetric geoids which we have computed by combining models with a set of 26,000 1 deg x 1 deg mean free air gravity anomalies. The accuracy of the detailed gravimetric geoid computed using the most recent Goddard earth model (GEM-6) in conjunction with the set of 1 deg x 1 deg mean free air gravity anomalies is assessed at + or - 2 meters on the continents of North America, Europe, and Australia, 2 to 5 meters in the Northeast Pacific and North Atlantic areas, and 5 to 10 meters in other areas where surface gravity data are sparse. The R.M.S. differences between this detailed geoid and the detailed geoids computed using the other satellite gravity fields in conjunction with the same set of surface data range from 3 to 7 meters.
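
    A minimal sketch of computing an area-weighted R.M.S. difference between two 1 deg x 1 deg gridded geoids follows; the grids here are synthetic stand-ins, and the cos(latitude) weighting keeps polar cells from dominating.

    ```python
    import numpy as np

    lats = np.arange(-89.5, 90.0, 1.0)          # cell-center latitudes
    lons = np.arange(0.5, 360.0, 1.0)
    w = np.cos(np.radians(lats))[:, None] * np.ones((1, lons.size))

    rng = np.random.default_rng(2)
    base = 10.0 * np.sin(np.radians(lats))[:, None]
    geoid_a = base + rng.normal(0.0, 2.0, w.shape)   # stand-in geoid grids
    geoid_b = base + rng.normal(0.0, 2.0, w.shape)

    diff = geoid_a - geoid_b
    rms = np.sqrt((w * diff**2).sum() / w.sum())
    print(f"area-weighted RMS geoid difference: {rms:.2f} m")
    ```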

  19. Hypercube-Computer Analysis Of Electromagnetic Scattering

    NASA Technical Reports Server (NTRS)

    Patterson, J. E.; Liewer, P. C.; Calalo, R. H.; Manshadi, F.

    1990-01-01

    Capabilities of hypercube and parallel processing demonstrated. Report describes use of Mark III Hypercube computer to analyze scattering of electromagnetic waves. Purpose of study to assess utility of parallel computing in such computation-intensive problems as large-scale electromagnetic scattering. Two electromagnetic codes based on different algorithms converted to run on Mark III Hypercube. First code implements finite-difference, time-domain solution of Maxwell's curl equations. Second code is Numerical Electromagnetics Code (NEC-2) which embodies frequency-domain method and developed to analyze electromagnetic responses of antennas and other metallic structures. On Mark III Hypercube with 32 active nodes, largest lattice contains about 2,048,000 unit cells.
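
    The kernel of the first code, a finite-difference time-domain (FDTD) update of Maxwell's curl equations on a staggered (Yee) grid, looks like the following 1-D sketch in normalized units; the hypercube domain decomposition is not shown.

    ```python
    import numpy as np

    nx, nt = 400, 600
    ez = np.zeros(nx)          # electric field
    hy = np.zeros(nx - 1)      # magnetic field, staggered half a cell
    c = 0.5                    # Courant number dt/dx (stable for c <= 1)

    for t in range(nt):
        hy += c * (ez[1:] - ez[:-1])            # update H from curl of E
        ez[1:-1] += c * (hy[1:] - hy[:-1])      # update E from curl of H
        ez[nx // 4] += np.exp(-((t - 40) / 12.0) ** 2)  # soft Gaussian source

    print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")
    ```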

  20. Computational Analysis of SAXS Data Acquisition.

    PubMed

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals. PMID:26244255
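
    A minimal sketch of the forward model's central object, the pair distribution function of a known structure, approximated here as a histogram of all inter-point distances for a set of stand-in scattering centers; the paper's recursive spherical-Bessel computation is not reproduced.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(3)
    points = rng.normal(scale=10.0, size=(500, 3))  # stand-in density samples

    d = pdist(points)                               # all pairwise distances
    hist, edges = np.histogram(d, bins=50, range=(0.0, d.max()))
    pr = hist / hist.sum()                          # normalized p(r)
    r_peak = 0.5 * (edges[:-1] + edges[1:])[pr.argmax()]
    print(f"most probable pair distance: {r_peak:.1f} (arbitrary units)")
    ```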

  1. Analysis of life cycle costs for electric vans with advanced battery systems

    SciTech Connect

    Marr, W.W.; Walsh, W.J.; Miller, J.F.

    1988-11-01

    The performance of advanced Zn/Br/sub 2/, LiAl/FeS, Na/S, Ni/Fe, and Fe/Air batteries in electric vans was compared to that of tubular lead-acid technology. The MARVEL computer analysis system evaluated these batteries for the G-Van and IDSEP vehicles over two driving schedules. Each of the advanced batteries exhibited the potential for major improvements in both range and life cycle cost compared with tubular lead-acid. A sensitivity analysis revealed specific energy, battery initial cost, and cycle life to be the dominant factors in reducing life cycle cost for the case of vans powered by tubular lead-acid batteries. 5 refs., 8 figs., 2 tabs.

  2. Analysis of life cycle costs for electric vans with advanced battery systems

    SciTech Connect

    Marr, W.W.; Walsh, W.J.; Miller, J.F.

    1989-01-01

    The performance of advanced Zn/Br/sub 2/, LiAl/FeS, Na/S, Ni/Fe, and Fe/Air batteries in electric vans was compared to that of tubular lead-acid technology. The MARVEL computer analysis system evaluated these batteries for the G-Van and IDSEP vehicles over two driving schedules. Each of the advanced batteries exhibited the potential for major improvements in both range and life cycle cost compared with tubular lead-acid. A sensitivity analysis reveals specific energy, battery initial cost, and cycle life to be the dominant factors in reducing life cycle cost for the case of vans powered by tubular lead-acid batteries.
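
    The one-at-a-time sensitivity screening described in these two records can be sketched as follows; the cost model and every number are illustrative assumptions, not MARVEL's model or data.

    ```python
    def life_cycle_cost(initial_cost, cycle_life, kwh_per_mile,
                        miles_per_cycle=60.0, elec_price=0.08):
        battery = initial_cost / (cycle_life * miles_per_cycle)  # $/mile
        energy = kwh_per_mile * elec_price                       # $/mile
        return battery + energy

    base = dict(initial_cost=6000.0, cycle_life=800.0, kwh_per_mile=0.8)
    ref = life_cycle_cost(**base)
    for name in base:
        hi = dict(base, **{name: base[name] * 1.2})   # perturb one factor +20%
        pct = 100.0 * (life_cycle_cost(**hi) - ref) / ref
        print(f"+20% {name:13s} -> {pct:+.1f}% life cycle cost")
    ```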

  3. Advances in Mid-Infrared Spectroscopy for Chemical Analysis.

    PubMed

    Haas, Julian; Mizaikoff, Boris

    2016-06-12

    Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review. PMID:27070183

  4. Advances in Mid-Infrared Spectroscopy for Chemical Analysis

    NASA Astrophysics Data System (ADS)

    Haas, Julian; Mizaikoff, Boris

    2016-06-01

    Infrared spectroscopy in the 3–20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.

  5. Advanced gamma ray balloon experiment ground checkout and data analysis

    NASA Technical Reports Server (NTRS)

    Blackstone, M.

    1976-01-01

    A software programming package to be used in the ground checkout and handling of data from the advanced gamma ray balloon experiment is described. The Operator's Manual permits someone unfamiliar with the inner workings of the software system (called LEO) to operate on the experimental data as it comes from the Pulse Code Modulation interface, converting it to a form for later analysis, and monitoring the program of an experiment. A Programmer's Manual is included.

  6. [Advances in independent component analysis and its application].

    PubMed

    Chen, Huafu; Yao, Dezhong

    2003-06-01

    Independent component analysis (ICA) is a new technique in statistical signal processing that decomposes mixed signals into statistically independent components. Reported applications to biomedical and radar signals have demonstrated its good prospects for various blind signal separation problems. In this paper, the progress of ICA in its principles, algorithms, and applications, together with future research directions, is reviewed. The aim is to promote further research on both theory and applications. PMID:12856621
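
    A minimal sketch of ICA as blind source separation, using scikit-learn's FastICA on two synthetic mixtures rather than the biomedical or radar data discussed in the review.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0.0, 1.0, 2000)
    s1 = np.sin(2 * np.pi * 5 * t)                 # source 1: sinusoid
    s2 = np.sign(np.sin(2 * np.pi * 3 * t))       # source 2: square wave
    S = np.c_[s1, s2]
    A = np.array([[1.0, 0.6], [0.4, 1.0]])        # unknown mixing matrix
    X = S @ A.T                                    # observed mixtures

    ica = FastICA(n_components=2, random_state=0)
    S_hat = ica.fit_transform(X)                   # recovered sources
    corr = np.corrcoef(S.T, S_hat.T)[:2, 2:]      # sources vs estimates
    print("source/estimate correlations:\n", np.round(corr, 2))
    ```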

  7. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    PubMed

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in <2 minutes, store information for ≥ 30 frames and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible

  8. Advanced superposition methods for high speed turbopump vibration analysis

    NASA Technical Reports Server (NTRS)

    Nielson, C. E.; Campany, A. D.

    1981-01-01

    The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.

  9. Advanced image analysis for the preservation of cultural heritage

    NASA Astrophysics Data System (ADS)

    France, Fenella G.; Christens-Barry, William; Toth, Michael B.; Boydston, Kenneth

    2010-02-01

    The Library of Congress' Preservation Research and Testing Division has established an advanced preservation studies scientific program for research and analysis of the diverse range of cultural heritage objects in its collection. Using this system, the Library is currently developing specialized integrated research methodologies for extending preservation analytical capacities through non-destructive hyperspectral imaging of cultural objects. The research program has revealed key information to support preservation specialists, scholars, and other institutions. The approach requires close and ongoing collaboration between a range of scientific and cultural heritage personnel: imaging and preservation scientists, art historians, curators, conservators, and technology analysts. A research project on the Pierre L'Enfant Plan of Washington DC, 1791, has been undertaken to implement and advance the image analysis capabilities of the imaging system. Innovative imaging options and analysis techniques allow greater processing and analysis capacities, establishing the imaging technique as the initial non-invasive analysis and documentation step in all cultural heritage analyses. Mapping spectral responses, organic and inorganic data, and topographic and semi-microscopic imaging, and creating full-spectrum images, have greatly extended this capacity beyond a simple image capture technique. Linking hyperspectral data with other non-destructive analyses has further enhanced the research potential of this image analysis technique.

  10. Advanced Simulation and Computing: A Summary Report to the Director's Review

    SciTech Connect

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, has been reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and identifies the documentation expected to be included in the "Assessment File".

  11. World Wide Web interface for advanced SPECT reconstruction algorithms implemented on a remote massively parallel computer.

    PubMed

    Formiconi, A R; Passeri, A; Guelfi, M R; Masoni, M; Pupi, A; Meldolesi, U; Malfetti, P; Calori, L; Guidazzoli, A

    1997-11-01

    Data from Single Photon Emission Computed Tomography (SPECT) studies are blurred by inevitable physical phenomena occurring during data acquisition. These errors may be compensated by means of reconstruction algorithms which take into account accurate physical models of the data acquisition procedure. Unfortunately, this approach involves high memory requirements as well as a high computational burden which cannot be afforded by the computer systems of SPECT acquisition devices. In this work the possibility of accessing High Performance Computing and Networking (HPCN) resources through a World Wide Web interface for the advanced reconstruction of SPECT data in a clinical environment was investigated. An iterative algorithm with an accurate model of the variable system response was ported on the Multiple Instruction Multiple Data (MIMD) parallel architecture of a Cray T3D massively parallel computer. The system was accessible even from low cost PC-based workstations through standard TCP/IP networking. A speedup factor of 148 was predicted by the benchmarks run on the Cray T3D. A complete brain study of 30 (64 x 64) slices was reconstructed from a set of 90 (64 x 64) projections with ten iterations of the conjugate gradients algorithm in 9 s, which corresponds to an actual speedup factor of 135. The technique was extended to a more accurate 3D modeling of the system response for a true 3D reconstruction of SPECT data; the reconstruction time of the same data set with this more accurate model was 5 min. This work demonstrates the possibility of exploiting remote HPCN resources from hospital sites by means of low cost workstations using standard communication protocols and a user-friendly WWW interface without particular problems for routine use. PMID:9506406
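
    A minimal sketch of the iterative core: conjugate-gradient iterations applied to a linear imaging model y = A x through the normal equations. The toy smoothing matrix stands in for the accurate SPECT system-response model, and no parallel decomposition is shown.

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import LinearOperator, cg

    n = 64 * 64
    # Toy system matrix: local smoothing along the raster, as a stand-in.
    A = sp.diags([0.2, 0.6, 0.2], [-1, 0, 1], shape=(n, n), format="csr")
    x_true = np.zeros(n); x_true[n // 2 - 100 : n // 2 + 100] = 1.0
    y = A @ x_true                                  # simulated projections

    AtA = LinearOperator((n, n), matvec=lambda v: A.T @ (A @ v))
    x_rec, info = cg(AtA, A.T @ y, maxiter=10)      # ten CG iterations
    print("converged" if info == 0 else f"stopped (info={info})",
          "| max abs error:", np.abs(x_rec - x_true).max().round(3))
    ```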

  12. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  13. Thermoelectric pump performance analysis computer code

    NASA Technical Reports Server (NTRS)

    Johnson, J. L.

    1973-01-01

    A computer program is presented that was used to analyze and design dual-throat electromagnetic dc conduction pumps for the 5-kwe ZrH reactor thermoelectric system. In addition to a listing of the code and corresponding identification of symbols, the bases for this analytical model are provided.

  14. Computer program performs stiffness matrix structural analysis

    NASA Technical Reports Server (NTRS)

    Bamford, R.; Batchelder, R.; Schmele, L.; Wada, B. K.

    1968-01-01

    Computer program generates the stiffness matrix for a particular type of structure from geometrical data, and performs static and normal mode analyses. It requires the structure to be modeled as a stable framework of uniform, weightless members, and joints at which loads are applied and weights are lumped.
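
    A minimal sketch of what such a program does: assemble the global stiffness matrix of a framework of uniform axial members from geometry, then solve the static problem K u = f with supports applied. The two-bar truss and all numbers are hypothetical.

    ```python
    import numpy as np

    nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.8]])  # x, y per joint
    members = [(0, 2), (1, 2)]                               # node pairs
    EA = 1.0e6                                               # axial rigidity

    ndof = 2 * len(nodes)
    K = np.zeros((ndof, ndof))
    for i, j in members:
        d = nodes[j] - nodes[i]
        L = np.hypot(*d)
        c, s = d / L                          # member direction cosines
        k_local = (EA / L) * np.outer([-c, -s, c, s], [-c, -s, c, s])
        dofs = [2*i, 2*i + 1, 2*j, 2*j + 1]
        K[np.ix_(dofs, dofs)] += k_local      # scatter into global matrix

    f = np.zeros(ndof); f[5] = -1000.0    # downward load at the apex (node 2)
    free = [4, 5]                         # nodes 0 and 1 fully supported
    u = np.zeros(ndof)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
    print("apex displacement (m):", u[4:].round(6))
    ```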

  15. Computational and Physical Analysis of Catalytic Compounds

    NASA Astrophysics Data System (ADS)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, the synthesis of nanoparticles with controlled shape and size is important for exploiting their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as GAMESS and Chemcraft have been used to compute the efficiencies of the catalytic compounds and the bonding energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps involved. The curve of computation step (N) versus energy (kcal/mol) shows that the energy of the titania converges at the 7th iteration, whereas the silica converges at the 9th iteration.

  16. Computed Tomography Analysis of NASA BSTRA Balls

    SciTech Connect

    Perry, R L; Schneberk, D J; Thompson, R R

    2004-10-12

    Fifteen 1.25 inch BSTRA balls were scanned with the high energy computed tomography system at LLNL. This system has a resolution limit of approximately 210 microns. A threshold of 238 microns (two voxels) was used, and no anomalies at or greater than this were observed.

  17. Reliability computation using fault tree analysis

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.

    1971-01-01

    A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
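
    A minimal sketch of computing an exact top-event probability for a small hypothetical fault tree by enumerating basic-event states; because every joint state is enumerated, a basic failure shared by several fault paths is handled consistently, which is the situation the conditional-probability treatment above addresses.

    ```python
    from itertools import product

    p = {"pump_fails": 0.01, "valve_fails": 0.02, "power_loss": 0.005}
    events = list(p)

    def top(state):
        # TOP = power_loss OR (pump_fails AND valve_fails)
        return state["power_loss"] or (state["pump_fails"]
                                       and state["valve_fails"])

    prob = 0.0
    for bits in product([False, True], repeat=len(events)):
        state = dict(zip(events, bits))
        weight = 1.0
        for e, failed in state.items():
            weight *= p[e] if failed else (1.0 - p[e])
        if top(state):
            prob += weight          # accumulate probability of failing states
    print(f"P(top event) = {prob:.6f}")
    ```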

  18. Detecting Nano-Scale Vibrations in Rotating Devices by Using Advanced Computational Methods

    PubMed Central

    del Toro, Raúl M.; Haber, Rodolfo E.; Schmittdiel, Michael C.

    2010-01-01

    This paper presents a computational method for detecting vibrations related to eccentricity in ultra precision rotation devices used for nano-scale manufacturing. The vibration is indirectly measured via a frequency domain analysis of the signal from a piezoelectric sensor attached to the stationary component of the rotating device. The algorithm searches for particular harmonic sequences associated with the eccentricity of the device rotation axis. The detected sequence is quantified and serves as input to a regression model that estimates the eccentricity. A case study presents the application of the computational algorithm during precision manufacturing processes. PMID:22399918

  19. Detecting nano-scale vibrations in rotating devices by using advanced computational methods.

    PubMed

    del Toro, Raúl M; Haber, Rodolfo E; Schmittdiel, Michael C

    2010-01-01

    This paper presents a computational method for detecting vibrations related to eccentricity in ultra precision rotation devices used for nano-scale manufacturing. The vibration is indirectly measured via a frequency domain analysis of the signal from a piezoelectric sensor attached to the stationary component of the rotating device. The algorithm searches for particular harmonic sequences associated with the eccentricity of the device rotation axis. The detected sequence is quantified and serves as input to a regression model that estimates the eccentricity. A case study presents the application of the computational algorithm during precision manufacturing processes. PMID:22399918
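
    The detection idea shared by the two records above can be sketched as follows: take a frequency-domain view of a sensor signal and search for a harmonic sequence at multiples of the rotation frequency. The signal, amplitudes, and detection threshold here are synthetic assumptions, not the authors' piezoelectric data.

    ```python
    import numpy as np

    fs, f_rot, T = 20_000.0, 120.0, 2.0     # sample rate, rotation freq, secs
    t = np.arange(0.0, T, 1.0 / fs)
    rng = np.random.default_rng(4)
    signal = (0.20 * np.sin(2*np.pi*f_rot*t)       # eccentricity harmonics
              + 0.10 * np.sin(2*np.pi*2*f_rot*t)
              + 0.05 * np.sin(2*np.pi*3*f_rot*t)
              + rng.normal(0.0, 0.5, t.size))      # broadband noise

    spec = np.abs(np.fft.rfft(signal)) / t.size
    freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
    noise_floor = np.median(spec)
    for k in range(1, 4):
        idx = np.argmin(np.abs(freqs - k * f_rot))
        snr = spec[idx] / noise_floor
        print(f"harmonic {k} at {freqs[idx]:6.1f} Hz: {snr:5.1f}x noise floor")
    ```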

  20. Wing analysis using a transonic potential flow computational method

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    The ability of the method to compute wing transonic performance was determined by comparing computed results with both experimental data and results computed by other theoretical procedures. Both pressure distributions and aerodynamic forces were evaluated. Comparisons indicated that the method is a significant improvement in transonic wing analysis capability. In particular, the computational method generally calculated the correct development of three-dimensional pressure distributions from subcritical to transonic conditions. Complicated, multiple shocked flows observed experimentally were reproduced computationally. The ability to identify the effects of design modifications was demonstrated both in terms of pressure distributions and shock drag characteristics.

  1. Efficient curve-skeleton computation for the analysis of biomedical 3d images - biomed 2010.

    PubMed

    Brun, Francesco; Dreossi, Diego

    2010-01-01

    Advances in three-dimensional (3D) biomedical imaging techniques, such as magnetic resonance (MR) imaging and computed tomography (CT), make it easy to reconstruct high quality 3D models of portions of the human body and other biological specimens. A major challenge lies in the quantitative analysis of the resulting models, which allows a more comprehensive characterization of the object under investigation. An interesting approach is based on curve-skeleton (or medial axis) extraction, which gives basic information concerning the topology and the geometry. Curve-skeletons have been applied in the analysis of vascular networks and the diagnosis of tracheal stenoses, as well as to 3D flight paths in virtual endoscopy. However, curve-skeleton computation is a demanding task. An effective skeletonization algorithm was introduced by N. Cornea in [1], but it suffers from poor computational performance. Thanks to advances in imaging techniques, the resolution of 3D images is steadily increasing, so efficient algorithms are needed to analyze significant Volumes of Interest (VOIs). In the present paper, an improved skeletonization algorithm based on the idea proposed in [1] is presented. A computational comparison between the original and the proposed method is also reported. The obtained results show that the proposed method yields a significant computational improvement, making the adoption of the skeleton representation in biomedical image analysis applications more appealing. PMID:20467122

  2. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    PubMed Central

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

    Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused review of the recent literature on finite element methods and optimization for VAD simulations. As VAD designs typically fall into two categories, pulsatile and continuous flow devices, we separately address computational challenges of both types of designs, and the interaction with the circulatory system with three representative case studies. In particular, we focus on recent advancements in finite element methodology that has increased the fidelity of VAD simulations. We outline key challenges, which extend to the incorporation of biological response such as thrombosis and hemolysis, as well as shape optimization methods and challenges in computational methodology. PMID:24449607

  3. Advances in the operation of the DIII-D neutral beam computer systems

    SciTech Connect

    Phillips, J.C.; Busath, J.L.; Penaflor, B.G.; Piglowski, D.; Kellman, D.H.; Chiu, H.K.; Hong, R.M.

    1998-02-01

    The DIII-D neutral beam system routinely provides up to 20 MW of deuterium neutral beam heating in support of experiments on the DIII-D tokamak, and is a critical part of the DIII-D physics experimental program. The four computer systems previously used to control neutral beam operation and data acquisition were designed and implemented in the late 1970s and used on DIII and DIII-D from 1981-1996. By comparison to modern standards, they had become expensive to maintain, slow, and cumbersome, making it difficult to implement improvements. Most critical of all, they were not networked computers. During the 1997 experimental campaign, these systems were replaced with new Unix compliant hardware and, for the most part, commercially available software. This paper describes operational experience with the new neutral beam computer systems, and new advances made possible by using features not previously available. These include retention of and access to historical data, an asynchronously fired "rules" base, and a relatively straightforward programming interface. Methods and principles for extending the availability of data beyond the scope of the operator consoles will be discussed.

  4. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, Theodore H. H.

    1991-01-01

    The following tasks in the study of advanced stress analysis methods applicable to turbine engine structures are described: (1) construction of special elements which contain traction-free circular boundaries; (2) formulation of new versions of mixed variational principles and of hybrid stress elements; (3) establishment of methods for suppression of kinematic deformation modes; (4) construction of semi-Loof plate and shell elements by the assumed stress hybrid method; and (5) elastic-plastic analysis by viscoplasticity theory using the mechanical subelement model.

  5. Advanced Signal Analysis for Forensic Applications of Ground Penetrating Radar

    SciTech Connect

    Steven Koppenjan; Matthew Streeton; Hua Lee; Michael Lee; Sashi Ono

    2004-06-01

    Ground penetrating radar (GPR) systems have traditionally been used to image subsurface objects. The main focus of this paper is to evaluate an advanced signal analysis technique. Instead of compiling spatial data for the analysis, this technique conducts object recognition procedures based on spectral statistics. The identification feature of an object type is formed from the training vectors by a singular-value decomposition procedure. To illustrate its capability, this procedure is applied to experimental data and compared to the performance of the neural-network approach.
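
    A minimal sketch of the identification step described above, on synthetic spectra: build a signature subspace for each object type from training vectors via a singular-value decomposition, then classify a new spectrum by its projection residual onto each subspace. This is the general pattern, not the authors' code.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    nbins, ntrain, rank = 128, 20, 3

    def make_class(seed_shape):
        # Training spectra: random combinations of a few shifted base shapes.
        base = np.stack([np.roll(seed_shape, s) for s in range(rank)])
        return base.T @ rng.random((rank, ntrain)) + 0.01 * rng.normal(
            size=(nbins, ntrain))

    shape = np.exp(-0.5 * ((np.arange(nbins) - 40) / 6.0) ** 2)
    classes = {"pipe": make_class(shape), "void": make_class(np.roll(shape, 50))}

    # Signature feature: leading left singular vectors per training matrix.
    bases = {name: np.linalg.svd(X, full_matrices=False)[0][:, :rank]
             for name, X in classes.items()}

    test = classes["void"][:, 0] + 0.05 * rng.normal(size=nbins)
    for name, U in bases.items():
        resid = np.linalg.norm(test - U @ (U.T @ test))   # smaller = closer
        print(f"residual vs {name:4s}: {resid:8.4f}")
    ```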

  6. Advanced Models for Aeroelastic Analysis of Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.; Mahajan, Aparajit

    1996-01-01

    This report describes an integrated, multidisciplinary simulation capability for aeroelastic analysis and optimization of advanced propulsion systems. This research is intended to reduce engine development, acquisition, and maintenance costs. One of the proposed simulations is the aeroelasticity of blades, cowls, and struts in an ultra-high-bypass fan. These ducted fans are expected to offer significant performance, fuel, and noise improvements over existing engines. An interface program was written to use modal information from COBSTAN and NASTRAN blade models in aeroelastic analysis with a single-rotation ducted fan aerodynamic code.

  7. Computational thermo-fluid analysis of a disk brake

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kuraishi, Takashi; Tabata, Shinichiro; Takagi, Hirokazu

    2016-06-01

    We present computational thermo-fluid analysis of a disk brake, including thermo-fluid analysis of the flow around the brake and heat conduction analysis of the disk. The computational challenges include proper representation of the small-scale thermo-fluid behavior, high-resolution representation of the thermo-fluid boundary layers near the spinning solid surfaces, and bringing the heat transfer coefficient (HTC) calculated in the thermo-fluid analysis of the flow to the heat conduction analysis of the spinning disk. The disk brake model used in the analysis closely represents the actual configuration, and this adds to the computational challenges. The components of the method we have developed for computational analysis of this class of problems include the Space-Time Variational Multiscale method for coupled incompressible flow and thermal transport, the ST Slip Interface method for high-resolution representation of the thermo-fluid boundary layers near spinning solid surfaces, and a set of projection methods for bringing the HTC calculated in the thermo-fluid analysis to the different parts of the disk. With the HTC coming from the thermo-fluid analysis of the flow around the brake, we carry out the heat conduction analysis of the disk, from the start of braking until the disk stops spinning, demonstrating how the method works in computational analysis of this complex and challenging problem.

  8. Structural Configuration Systems Analysis for Advanced Aircraft Fuselage Concepts

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Welstead, Jason R.; Quinlan, Jesse R.; Guynn, Mark D.

    2016-01-01

    Structural configuration analysis of an advanced aircraft fuselage concept is investigated. This concept is characterized by a double-bubble section fuselage with rear mounted engines. Based on lessons learned from structural systems analysis of unconventional aircraft, high-fidelity finite-element models (FEM) are developed for evaluating structural performance of three double-bubble section configurations. Structural sizing and stress analysis are applied for design improvement and weight reduction. Among the three double-bubble configurations, the double-D cross-section fuselage design was found to have a relatively lower structural weight. The structural FEM weights of these three double-bubble fuselage section concepts are also compared with several cylindrical fuselage models. Since these fuselage concepts are different in size, shape and material, the fuselage structural FEM weights are normalized by the corresponding passenger floor area for a relative comparison. This structural systems analysis indicates that an advanced composite double-D section fuselage may have a relative structural weight ratio advantage over a conventional aluminum fuselage. Ten commercial and conceptual aircraft fuselage structural weight estimates, which are empirically derived from the corresponding maximum takeoff gross weight, are also presented and compared with the FEM-based estimates for possible correlation. A conceptual full vehicle FEM model with a double-D fuselage is also developed for preliminary structural analysis and weight estimation.

  9. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design which saves time and ultimately spacecraft weight.

  10. Computer applications for engineering/structural analysis. Revision 1

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-12-31

    Analysts and organizations have a tendency to lock themselves into specific codes, with the obvious consequence of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described for the following problem areas: soil-structure interaction; structural analysis of nuclear reactors; analysis of waste tanks where fluid-structure interaction is important; analysis of equipment; structure-structure interaction; analysis of the operation of the Superconducting Super Collider, which includes friction and transient temperature; and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  11. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.

  12. Scalable Computer Performance and Analysis (Hierarchical INTegration)

    1999-09-02

    HINT is a program to measure a wide variety of scalable computer systems. It is capable of demonstrating the benefits of using more memory or processing power, and of improving communications within the system. HINT can be used for measurement of an existing system, while the associated program ANALYTIC HINT can be used to explain the measurements or as a design tool for proposed systems.

  13. Advanced Mesh-Enabled Monte carlo capability for Multi-Physics Reactor Analysis

    SciTech Connect

    Wilson, Paul; Evans, Thomas; Tautges, Tim

    2012-12-24

    This project will accumulate high-precision fluxes throughout reactor geometry on a non-orthogonal grid of cells to support multi-physics coupling, in order to more accurately calculate parameters such as reactivity coefficients and to generate multi-group cross sections. This work will be based upon recent developments to incorporate advanced geometry and mesh capability in a modular Monte Carlo toolkit with computational science technology that is in use in related reactor simulation software development. Coupling this capability with production-scale Monte Carlo radiation transport codes can provide advanced and extensible test-beds for these developments. Continuous energy Monte Carlo methods are generally considered to be the most accurate computational tool for simulating radiation transport in complex geometries, particularly neutron transport in reactors. Nevertheless, there are several limitations for their use in reactor analysis. Most significantly, there is a trade-off between the fidelity of results in phase space, statistical accuracy, and the amount of computer time required for simulation. Consequently, to achieve an acceptable level of statistical convergence in high-fidelity results required for modern coupled multi-physics analysis, the required computer time makes Monte Carlo methods prohibitive for design iterations and detailed whole-core analysis. More subtly, the statistical uncertainty is typically not uniform throughout the domain, and the simulation quality is limited by the regions with the largest statistical uncertainty. In addition, the formulation of neutron scattering laws in continuous energy Monte Carlo methods makes it difficult to calculate adjoint neutron fluxes required to properly determine important reactivity parameters. Finally, most Monte Carlo codes available for reactor analysis have relied on orthogonal hexahedral grids for tallies that do not conform to the geometric boundaries and are thus generally not well

  14. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    SciTech Connect

    Simunovic, S.; Aramayo, G.A.; Zacharia, T.; Toridis, T.G.; Bandak, F.; Ragland, C.L.

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Transportation Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted and digitized and directly compared with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows a very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.

  15. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and, more recently, the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
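
    A minimal sketch of the correlation-coefficient-matrix idea for network traffic: compute the correlation matrix over sliding windows of traffic features and flag windows whose matrix deviates strongly from a clean baseline. The feature streams, injected burst, and threshold are synthetic assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n, win = 2000, 200
    pkts = rng.normal(100, 10, n)
    bytes_ = pkts * 50 + rng.normal(0, 50, n)    # normally tracks packets
    flows = rng.normal(20, 3, n)
    pkts[1200:1400] += rng.normal(300, 80, 200)  # injected flood-like burst

    X = np.stack([pkts, bytes_, flows])
    baseline = np.corrcoef(X[:, :600])           # learned on clean traffic

    for start in range(600, n - win + 1, win):
        C = np.corrcoef(X[:, start:start + win])
        score = np.linalg.norm(C - baseline)     # Frobenius-norm deviation
        flag = "ANOMALY" if score > 0.5 else "ok"
        print(f"window {start:4d}-{start + win:4d}: score {score:.2f} {flag}")
    ```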

  16. Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maltach, E. G.

    1969-01-01

    The design of the next generation of spaceborne digital computers is described. It analyzes a possible multiprocessor computer configuration. For the analysis, a set of representative space computing tasks was abstracted from the Lunar Module Guidance Computer programs as executed during the lunar landing, from the Apollo program. This computer performs at this time about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is a critical one for efficiency of the multiprocessor. It is recommended that research into the area of automatic job scheduling be performed.

  17. Application of advanced computational codes in the design of an experiment for a supersonic throughflow fan rotor

    NASA Technical Reports Server (NTRS)

    Wood, Jerry R.; Schmidt, James F.; Steinke, Ronald J.; Chima, Rodrick V.; Kunik, William G.

    1987-01-01

    Increased emphasis on sustained supersonic or hypersonic cruise has revived interest in the supersonic throughflow fan as a possible component in advanced propulsion systems. Use of a fan that can operate with a supersonic inlet axial Mach number is attractive from the standpoint of reducing the inlet losses incurred in diffusing the flow from a supersonic flight Mach number to a subsonic one at the fan face. The design of the experiment, in which advanced computational codes were used to design the required components, is described. The rotor was designed using existing turbomachinery design and analysis codes modified to handle fully supersonic axial flow through the rotor. A two-dimensional axisymmetric throughflow design code plus a blade element code were used to generate fan rotor velocity diagrams and blade shapes. A quasi-three-dimensional, thin shear layer Navier-Stokes code was used to assess the performance of the fan rotor blade shapes. The final design was stacked and checked for three-dimensional effects using a three-dimensional Euler code interactively coupled with a two-dimensional boundary layer code. The nozzle design in the expansion region was analyzed with a three-dimensional parabolized viscous code, which corroborated the results from the Euler code. A translating supersonic diffuser was designed using these same codes.
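
    The inlet-loss motivation can be made concrete with a standard gas-dynamics relation (a textbook result, not one from this paper): the total-pressure recovery across a single normal shock, which bounds what a conventional inlet gives up when diffusing to subsonic speed at supersonic flight Mach numbers. A short Python sketch:

      def normal_shock_recovery(M1, gamma=1.4):
          """Total-pressure ratio p02/p01 across a normal shock at Mach M1 > 1."""
          a = ((gamma + 1) * M1**2 / ((gamma - 1) * M1**2 + 2)) ** (gamma / (gamma - 1))
          b = ((gamma + 1) / (2 * gamma * M1**2 - (gamma - 1))) ** (1 / (gamma - 1))
          return a * b

      for M in (1.5, 2.0, 2.5, 3.0):
          print(f"M = {M}: p02/p01 = {normal_shock_recovery(M):.3f}")
      # Recovery falls steeply with Mach number; this is the class of loss a
      # supersonic throughflow fan avoids by not diffusing to subsonic speed.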

  18. Advanced Computational Modeling of Vapor Deposition in a High-pressure Reactor

    NASA Technical Reports Server (NTRS)

    Cardelino, Beatriz H.; Moore, Craig E.; McCall, Sonya D.; Cardelino, Carlos A.; Dietz, Nikolaus; Bachmann, Klaus

    2004-01-01

    In search of novel approaches to produce new materials for electro-optic technologies, advances have been achieved in the development of computer models for vapor deposition reactors in space. Numerical simulations are invaluable tools for costly and difficult processes, such as experiments designed for high pressures and microgravity conditions. Indium nitride is a candidate compound for high-speed laser diodes and photodiodes for optical communication systems, as well as for semiconductor lasers operating in the blue and ultraviolet regions. However, InN and other nitride compounds exhibit significant thermal decomposition at their optimum growth temperatures. In addition, epitaxy at lower temperatures and subatmospheric pressures incorporates indium droplets into the InN films. Surface stabilization data indicate that InN could be grown at 900 K at high nitrogen pressures, and microgravity could provide laminar flow conditions. Numerical models for chemical vapor deposition have been developed, coupling complex chemical kinetics with fluid dynamic properties.
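
    The coupling the abstract describes hinges on temperature-dependent kinetics. As a hedged illustration, the Python sketch below evaluates a first-order Arrhenius rate at candidate growth temperatures; the pre-exponential factor and activation energy are placeholders, not the fitted InN decomposition parameters.

      import math

      R = 8.314  # universal gas constant, J/(mol K)

      def arrhenius(T, A=1.0e13, Ea=3.0e5):
          """First-order rate constant k = A * exp(-Ea / (R * T)), in 1/s."""
          return A * math.exp(-Ea / (R * T))

      for T in (800.0, 900.0, 1000.0):
          print(f"T = {T:.0f} K: k = {arrhenius(T):.3e} 1/s")
      # The steep growth of k with T is why thermal decomposition competes
      # with deposition near the optimum growth temperature.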

  19. Development of an Advanced Computational Model for OMCVD of Indium Nitride

    NASA Technical Reports Server (NTRS)

    Cardelino, Carlos A.; Moore, Craig E.; Cardelino, Beatriz H.; Zhou, Ning; Lowry, Sam; Krishnan, Anantha; Frazier, Donald O.; Bachmann, Klaus J.

    1999-01-01

    An advanced computational model is being developed to predict the formation of indium nitride (InN) film from the reaction of trimethylindium (In(CH3)3) with ammonia (NH3). The components are introduced into the reactor in the gas phase within a background of molecular nitrogen (N2). Organometallic chemical vapor deposition occurs on a heated sapphire surface. The model simulates heat and mass transport with gas-phase and surface chemistry under steady-state and pulsed conditions. The development and validation of an accurate model of the interactions between the diffusion of gas-phase species and surface kinetics is essential to enable regulation of the process to produce a low-defect material. The validation of the model will be performed in concert with a NASA-North Carolina State University project.
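
    The interaction between gas-phase diffusion and surface kinetics that the model must capture can be sketched with the classic two-resistance picture: precursor transport across a boundary layer in series with a first-order surface reaction. The Python example below uses illustrative parameter values only, not quantities from the model itself.

      def deposition_flux(c_bulk, D, delta, k_s):
          """Steady precursor flux to the surface (mol m^-2 s^-1): gas-phase
          diffusion resistance delta/D in series with kinetic resistance 1/k_s."""
          return c_bulk / (delta / D + 1.0 / k_s)

      c_bulk = 1.0e-3  # bulk TMIn concentration, mol/m^3 (assumed)
      D = 1.0e-4       # gas-phase diffusivity, m^2/s (assumed)
      delta = 5.0e-3   # boundary-layer thickness, m (assumed)

      for k_s in (1.0e-4, 1.0e-2, 1.0):  # slow -> fast surface kinetics, m/s
          print(f"k_s = {k_s:.0e} m/s: flux = {deposition_flux(c_bulk, D, delta, k_s):.3e}")
      # Slow kinetics make growth reaction-limited; fast kinetics make it
      # transport-limited, where reactor fluid dynamics dominates film uniformity.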