Nuclear reactor descriptions for space power systems analysis
NASA Technical Reports Server (NTRS)
Mccauley, E. W.; Brown, N. J.
1972-01-01
For the small, high-performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance; yet, in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for a mission analysis study. It has been found possible, after generating only a few designs of a reactor family in elaborate thermomechanical and nuclear detail, to use simple curve-fitting techniques to assure the desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad, detailed examination of constraints by the mission analyst.
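As a rough illustration of the curve-fitting idea described above (not the report's actual method), the sketch below fits a cheap polynomial surrogate to a handful of detailed neutronics results and then screens a dense design sweep against it; the design variable, k-effective values, and criticality threshold are all hypothetical.

```python
# Illustrative sketch (not from the report): fit a cheap polynomial surrogate
# to a handful of detailed neutronics results, then screen many candidate
# designs against it. All variable names and data are hypothetical.
import numpy as np

# Suppose k_eff was computed in full detail for a few core radii (cm).
radii = np.array([18.0, 20.0, 22.0, 24.0, 26.0])       # detailed designs
k_eff = np.array([0.962, 0.987, 1.008, 1.025, 1.039])  # detailed results

# A quadratic fit stands in for the expensive neutronic calculation.
k_fit = np.poly1d(np.polyfit(radii, k_eff, deg=2))

# Screen a dense sweep of candidate radii for criticality (k_eff >= 1),
# leaving the thermomechanical analysis to run in explicit detail elsewhere.
candidates = np.linspace(18.0, 26.0, 81)
feasible = candidates[k_fit(candidates) >= 1.0]
print(f"smallest radius meeting k_eff >= 1: {feasible.min():.2f} cm")
```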
2010-02-27
investigated in more detail. The intermediate level of fidelity, though more expensive, is then used to refine the analysis, add geometric detail, and...design stage is used to further refine the analysis, narrowing the design to a handful of options. Figure 1. Integrated Hierarchical Framework. In...computational structural and computational fluid modeling. For the structural analysis tool we used McIntosh Structural Dynamics' finite element code CNEVAL
Operation of the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1975-01-01
The ICASE research program is described in detail; it consists of four major categories: (1) efficient use of vector and parallel computers, with particular emphasis on the CDC STAR-100; (2) numerical analysis, with particular emphasis on the development and analysis of basic numerical algorithms; (3) analysis and planning of large-scale software systems; and (4) computational research in engineering and the natural sciences, with particular emphasis on fluid dynamics. The work in each of these areas is described in detail; other activities are discussed, and a prognosis of future activities is included.
Global detailed geoid computation and model analysis
NASA Technical Reports Server (NTRS)
Marsh, J. G.; Vincent, S.
1974-01-01
Comparisons and analyses were carried out through the use of detailed gravimetric geoids which we have computed by combining models with a set of 26,000 1 deg x 1 deg mean free-air gravity anomalies. The accuracy of the detailed gravimetric geoid computed using the most recent Goddard Earth Model (GEM-6) in conjunction with the set of 1 deg x 1 deg mean free-air gravity anomalies is assessed at ± 2 meters on the continents of North America, Europe, and Australia, 2 to 5 meters in the Northeast Pacific and North Atlantic areas, and 5 to 10 meters in other areas where surface gravity data are sparse. The RMS differences between this detailed geoid and the detailed geoids computed using the other satellite gravity fields in conjunction with the same set of surface data range from 3 to 7 meters.
NASA Technical Reports Server (NTRS)
Oman, B. H.
1977-01-01
The NASA Langley Research Center vehicle design evaluation program (VDEP-2) was expanded by (1) incorporating into the program a capability to conduct preliminary design studies on subsonic commercial transport type aircraft using both JP and such alternate fuels as hydrogen and methane; (2) incorporating an aircraft detailed mission and performance analysis capability; and (3) developing and incorporating an external loads analysis capability. The resulting computer program (VDEP-3) provides a preliminary design tool that enables the user to perform integrated sizing, structural analysis, and cost studies on subsonic commercial transport aircraft. Both versions of the program, designated Preliminary Analysis VDEP-3 and Detailed Analysis VDEP, utilize the same vehicle sizing subprogram, which includes a detailed mission analysis capability as well as a geometry and weight analysis for multibodied configurations.
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
Computer Analysis of Air Pollution from Highways, Streets, and Complex Interchanges
DOT National Transportation Integrated Search
1974-03-01
A detailed computer analysis of air quality for a complex highway interchange was prepared, using an in-house version of the Environmental Protection Agency's Gaussian Highway Line Source Model. This analysis showed that the levels of air pollution n...
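For orientation, the general form behind Gaussian line-source models of this kind is sketched below; this is the textbook infinite-line-source result, not the EPA code itself, and the dispersion coefficients and emission inputs are hypothetical illustrative values.

```python
# Minimal sketch of a Gaussian infinite-line-source estimate (the general
# form behind highway line-source models; not the EPA model itself).
# Coefficients and inputs are hypothetical illustrative values.
import math

def sigma_z(x_m, a=0.20, b=0.76):
    """Vertical dispersion (m) from a simple power-law fit, sigma_z = a*x^b."""
    return a * x_m**b

def line_source_conc(q, u, x_m, H=0.0):
    """Ground-level concentration (g/m^3) downwind of an infinite line source
    perpendicular to the wind:
        C = 2q / (sqrt(2*pi) * u * sigma_z) * exp(-H^2 / (2*sigma_z^2))
    q: emission rate per unit road length (g/m/s), u: wind speed (m/s),
    x_m: downwind distance (m), H: effective source height (m)."""
    sz = sigma_z(x_m)
    return (2.0 * q) / (math.sqrt(2.0 * math.pi) * u * sz) * math.exp(-H**2 / (2.0 * sz**2))

# Example: CO at 100 m downwind of a road emitting 0.01 g/m/s in a 3 m/s wind.
print(f"{line_source_conc(q=0.01, u=3.0, x_m=100.0):.2e} g/m^3")
```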
Computer-Generated, Three-Dimensional Character Animation: A Report and Analysis.
ERIC Educational Resources Information Center
Kingsbury, Douglas Lee
This master's thesis details the experience gathered in the production "Snoot and Muttly," a short character animation with 3-D computer generated images, and provides an analysis of the computer-generated 3-D character animation system capabilities. Descriptions are provided of the animation environment at the Ohio State University…
A computer program for the design and analysis of low-speed airfoils, supplement
NASA Technical Reports Server (NTRS)
Eppler, R.; Somers, D. M.
1980-01-01
Three new options were incorporated into an existing computer program for the design and analysis of low speed airfoils. These options permit the analysis of airfoils having variable chord (variable geometry), a boundary layer displacement iteration, and the analysis of the effect of single roughness elements. All three options are described in detail and are included in the FORTRAN IV computer program.
High Speed Cylindrical Roller Bearing Analysis, SKF Computer Program CYBEAN. Volume 1: Analysis
NASA Technical Reports Server (NTRS)
Kleckner, R. J.; Pirvics, J.
1978-01-01
The CYBEAN (CYlindrical BEaring ANalysis) program was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. The models and associated mathematics used within CYBEAN are described. The user is referred to this material for formulation assumptions and algorithm detail.
Modern Computational Techniques for the HMMER Sequence Analysis
2013-01-01
This paper focuses on the latest research and critical reviews on modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
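As a minimal illustration of the dynamic-programming kernel that such tools accelerate, the sketch below decodes a toy two-state HMM with the Viterbi algorithm; the states, probabilities, and sequence are hypothetical and far simpler than HMMER's profile HMMs.

```python
# Toy Viterbi decoder for a two-state HMM, sketching the dynamic-programming
# kernel that HMM sequence-analysis tools accelerate. The model and its
# probabilities are hypothetical.
import numpy as np

states = ["match", "background"]
trans = np.log(np.array([[0.9, 0.1],      # P(next state | match)
                         [0.2, 0.8]]))    # P(next state | background)
emit = {"match":      np.log(np.array([0.4, 0.4, 0.1, 0.1])),     # A C G T
        "background": np.log(np.array([0.25, 0.25, 0.25, 0.25]))}
start = np.log(np.array([0.5, 0.5]))
idx = {"A": 0, "C": 1, "G": 2, "T": 3}

def viterbi(seq):
    obs = [idx[c] for c in seq]
    n, k = len(obs), len(states)
    dp = np.full((n, k), -np.inf)         # best log-probability ending in state j
    back = np.zeros((n, k), dtype=int)    # argmax predecessor for traceback
    for j, s in enumerate(states):
        dp[0, j] = start[j] + emit[s][obs[0]]
    for t in range(1, n):
        for j, s in enumerate(states):
            scores = dp[t - 1] + trans[:, j]
            back[t, j] = scores.argmax()
            dp[t, j] = scores.max() + emit[s][obs[t]]
    path = [int(dp[-1].argmax())]         # trace back the most probable path
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [states[j] for j in reversed(path)]

print(viterbi("ACGTACAC"))
```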
Supplement to the ICRPG turbulent boundary layer nozzle analysis computer program
NASA Technical Reports Server (NTRS)
Omori, S.; Gross, K. W.
1972-01-01
A supplement is presented for a turbulent boundary layer nozzle analysis computer program. It describes the program calculation sequence and presents a detailed documentation of each subroutine. Important equations are derived explicitly, and improvements to the program are discussed.
Use of Computer Simulation for the Analysis of Railroad Operations in the St. Louis Terminal Area
DOT National Transportation Integrated Search
1977-11-01
This report discusses the computer simulation methodology, its uses and limitations, and its applicability to the analysis of alternative railroad terminal restructuring plans. Included is a detailed discussion of the AAR Simulation System, an overvi...
ERIC Educational Resources Information Center
Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter
2012-01-01
Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…
High speed cylindrical roller bearing analysis, SKF computer program CYBEAN. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Kleckner, R. J.; Pirvics, J.
1978-01-01
The CYBEAN (CYlindrical BEaring ANalysis) program was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. Emphasis was placed on detailing the effects of high speed, preload and system thermal coupling. Roller tilt, skew, radial, circumferential and axial displacement as well as flange contact were considered. Variable housing and flexible out-of-round outer ring geometries, and both steady state and time transient temperature calculations were enabled. The complete range of elastohydrodynamic contact considerations, employing full and partial film conditions, was treated in the computation of raceway and flange contacts. Input and output architectures containing guidelines for use and a sample execution are detailed.
Idea Notebook: Wilderness Food Planning in the Computer Age.
ERIC Educational Resources Information Center
Drury, Jack K.
1986-01-01
Explains the use of a computer as a planning and teaching tool in wilderness trip food planning. Details use of master food list and spreadsheet software such as VisiCalc to provide shopping lists for food purchasing, cost analysis, and diet analysis. (NEC)
A users manual for a revised version of the Langley charring ablator program
NASA Technical Reports Server (NTRS)
Stroud, C. W.; Brinkley, K. L.
1975-01-01
A computer program is described that computes the transient response of a thermal protection material to a prescribed heat input at the surface. The program has the capability of analyzing pyrolysis gas chemical kinetics in detail and of treating in-depth pyrolysis reactions. Deposition of solid products produced by chemical reactions in the gas phase is included in the analysis. An outline of the theory is given, and detailed operating instructions for the computer program are included.
Development of an Aerothermoelastic-Acoustics Simulation Capability of Flight Vehicles
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Choi, S. B.; Ibrahim, A.
2010-01-01
A novel numerical, finite element based analysis methodology is presented in this paper suitable for accurate and efficient simulation of practical, complex flight vehicles. An associated computer code, developed in this connection, is also described in some detail. Thermal effects of high speed flow obtained from a heat conduction analysis are incorporated in the modal analysis which in turn affects the unsteady flow arising out of interaction of elastic structures with the air. Numerical examples pertaining to representative problems are given in much detail testifying to the efficacy of the advocated techniques. This is a unique implementation of temperature effects in a finite element CFD based multidisciplinary simulation analysis capability involving large scale computations.
NASA Technical Reports Server (NTRS)
Heinmiller, J. P.
1971-01-01
This document is the programmer's guide for the GNAT computer program developed under MSC/TRW Task 705-2 (Apollo cryogenic storage system analysis), subtask 2. Detailed logic flow charts and compiled program listings are provided for all program elements.
DORCA 2 computer program. Volume 3: Program listing
NASA Technical Reports Server (NTRS)
Carey, J. B.
1972-01-01
A program listing for the Dynamic Operational Requirements and Cost Analysis Program is presented. Detailed instructions for the computer programming involved in space mission planning and project requirements are developed.
Procedures for numerical analysis of circadian rhythms
Refinetti, Roberto; Cornélissen, Germaine; Halberg, Franz
2010-01-01
This article reviews various procedures used in the analysis of circadian rhythms at the populational, organismal, cellular and molecular levels. The procedures range from visual inspection of time plots and actograms to several mathematical methods of time series analysis. Computational steps are described in some detail, and additional bibliographic resources and computer programs are listed. PMID:23710111
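One of the standard procedures in this literature is the single-cosinor fit, which can be written as a linear least-squares problem; the sketch below, using synthetic data and an assumed 24-hour period, shows the basic computation.

```python
# Minimal single-cosinor fit, one of the standard numerical procedures for
# circadian rhythm analysis: y(t) = M + A*cos(2*pi*t/tau + phi), fitted by
# linear least squares. The data here are synthetic, for illustration only.
import numpy as np

tau = 24.0                                  # assumed period (hours)
t = np.arange(0.0, 72.0, 1.0)               # three days of hourly samples
rng = np.random.default_rng(0)
y = 5.0 + 2.0 * np.cos(2 * np.pi * t / tau - 1.0) + rng.normal(0, 0.3, t.size)

# Linearize: y = M + b*cos(w*t) + g*sin(w*t), then solve by least squares.
w = 2 * np.pi / tau
X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
(M, b, g), *_ = np.linalg.lstsq(X, y, rcond=None)

amplitude = np.hypot(b, g)                  # A = sqrt(b^2 + g^2)
acrophase = np.arctan2(-g, b)               # phi, in radians
print(f"MESOR={M:.2f}, amplitude={amplitude:.2f}, acrophase={acrophase:.2f} rad")
```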
Abstractions for DNA circuit design.
Lakin, Matthew R; Youssef, Simon; Cardelli, Luca; Phillips, Andrew
2012-03-07
DNA strand displacement techniques have been used to implement a broad range of information processing devices, from logic gates, to chemical reaction networks, to architectures for universal computation. Strand displacement techniques enable computational devices to be implemented in DNA without the need for additional components, allowing computation to be programmed solely in terms of nucleotide sequences. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. Another challenge has been to design devices capable of sustaining precise reaction kinetics over long periods, without relying on complex experimental equipment to continually replenish depleted species over time. In this paper, we present a programming language for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. The language allows device designs to be programmed using a common syntax and then analysed at varying levels of detail, with or without interference, without needing to modify the program. This allows a trade-off to be made between the level of molecular detail and the computational cost of analysis. We use the language to design a buffered architecture for DNA devices, capable of maintaining precise reaction kinetics for a potentially unbounded period. We test the effectiveness of buffered gates to support long-running computation by designing a DNA strand displacement system capable of sustained oscillations.
Prediction of ball and roller bearing thermal and kinematic performance by computer analysis
NASA Technical Reports Server (NTRS)
Pirvics, J.; Kleckner, R. J.
1983-01-01
Characteristics of good computerized analysis software are suggested. These general remarks and an overview of representative software precede a more detailed discussion of load support system analysis program structure. Particular attention is directed at a recent cylindrical roller bearing analysis as an example of the available design tools. Selected software modules are then examined to reveal the detail inherent in contemporary analysis. This leads to a brief section on current design computation which seeks to suggest when and why computerized analysis is warranted. An example concludes the argument offered for such design methodology. Finally, remarks are made concerning needs for model development to address effects which are now considered to be secondary but are anticipated to emerge to primary status in the near future.
Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.
ERIC Educational Resources Information Center
Scheeline, Alexander; Mork, Brian J.
1988-01-01
Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)
Systematic comparison of the behaviors produced by computational models of epileptic neocortex.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warlaumont, A. S.; Lee, H. C.; Benayoun, M.
2010-12-01
Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified), are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.
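A minimal sketch of the comparison pipeline described above follows: summary metrics are computed per time series, a PCA over the standardized metrics yields derived features, and range and overlap are measured in that behavior space. The metrics and data below are hypothetical placeholders, not the study's.

```python
# Sketch of the model-comparison method: high-level metrics per time series,
# PCA into a shared low-dimensional "behavior space", then range/overlap
# measures there. Metrics and data are hypothetical placeholders.
import numpy as np

def metrics(ts):
    """A few illustrative summary metrics of one output time series."""
    return np.array([ts.mean(), ts.std(), np.abs(np.diff(ts)).mean()])

rng = np.random.default_rng(1)
detailed = np.stack([metrics(rng.normal(0.0, 1 + 0.1 * i, 500)) for i in range(20)])
abstract = np.stack([metrics(rng.normal(0.1, 1 + 0.1 * i, 500)) for i in range(20)])
both = np.vstack([detailed, abstract])

# PCA over the standardized metrics (via SVD of the centered data matrix).
Z = (both - both.mean(0)) / both.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
features = Z @ Vt[:2].T                    # first two derived features

d, a = features[:20], features[20:]
print("behavioral range (detailed model):", np.ptp(d, axis=0))
print("centroid distance between models:", np.linalg.norm(d.mean(0) - a.mean(0)))
```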
A Novel Shape Parameterization Approach
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
1999-01-01
This paper presents a novel parameterization approach for complex shapes suitable for a multidisciplinary design optimization application. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft objects animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in a similar manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminated plate structures) and high-fidelity analysis tools (e.g., nonlinear computational fluid dynamics and detailed finite element modeling). This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, and camber. The results are presented for a multidisciplinary design optimization application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, performance, and a simple propulsion module.
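A minimal sketch of the first concept, parameterizing shape perturbations applied directly to grid points so that CFD and finite element grids are handled identically, is given below; the twist perturbation, wing axis, and design variable are illustrative assumptions, not the paper's implementation.

```python
# Sketch of the core idea: parameterize *perturbations* of an existing grid's
# points rather than the geometry itself, so CFD and FE grids are treated
# alike as point sets. A spanwise twist perturbation is shown; the wing axis,
# span, and design variable are hypothetical.
import numpy as np

def apply_twist(points, twist_tip_deg, span):
    """Rotate each grid point about the y (span) axis by an angle varying
    linearly from 0 at the root (y = 0) to twist_tip_deg at the tip (y = span)."""
    out = points.copy()
    theta = np.deg2rad(twist_tip_deg) * (points[:, 1] / span)
    x, z = points[:, 0], points[:, 2]
    out[:, 0] = np.cos(theta) * x + np.sin(theta) * z
    out[:, 2] = -np.sin(theta) * x + np.cos(theta) * z
    return out

# Works identically on a CFD surface grid or an FE grid: both are (N, 3) arrays.
grid = np.random.default_rng(2).uniform([-1, 0, -0.1], [1, 10, 0.1], (1000, 3))
deformed = apply_twist(grid, twist_tip_deg=3.0, span=10.0)
print("max point displacement:", np.abs(deformed - grid).max())
```

Because the deformation is an explicit function of the design variable, the analytical sensitivity of each grid point with respect to the twist parameter follows by differentiating the rotation directly, which is what makes such parameterizations attractive for gradient-based optimization.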
Multidisciplinary Aerodynamic-Structural Shape Optimization Using Deformation (MASSOUD)
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
2000-01-01
This paper presents a multidisciplinary shape parameterization approach. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft object animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in the same manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminate plate structures) and high-fidelity (e.g., nonlinear computational fluid dynamics and detailed finite element modeling) analysis tools. This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, camber, and free-form surface. Results are presented for a multidisciplinary application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, and a simple performance module.
Microeconomic Analysis with BASIC.
ERIC Educational Resources Information Center
Tom, C. F. Joseph
Computer programs written in BASIC for the study of microeconomic analysis with special emphasis in economic decisions on price, output, and profit of a business firm are described. A very brief overview of the content of each of the 28 computer programs comprising the course is provided; four of the programs are then discussed in greater detail.…
The 30/20 GHz fixed communications systems service demand assessment. Volume 3: Appendices
NASA Technical Reports Server (NTRS)
Gabriszeski, T.; Reiner, P.; Rogers, J.; Terbo, W.
1979-01-01
The market analysis of voice, video, and data 18/30 GHz communications systems services and satellite transmission services is discussed. Detailed calculations, computer displays of traffic, survey questionnaires, and detailed service forecasts are presented.
Computational Analysis of a Prototype Martian Rotorcraft Experiment
NASA Technical Reports Server (NTRS)
Corfeld, Kelly J.; Strawn, Roger C.; Long, Lyle N.
2002-01-01
This paper presents Reynolds-averaged Navier-Stokes calculations for a prototype Martian rotorcraft. The computations are intended for comparison with an ongoing Mars rotor hover test at NASA Ames Research Center. These computational simulations present a new and challenging problem, since rotors that operate on Mars will experience a unique low Reynolds number and high Mach number environment. Computed results for the 3-D rotor differ substantially from 2-D sectional computations in that the 3-D results exhibit a stall delay phenomenon caused by rotational forces along the blade span. Computational results have yet to be compared to experimental data, but computed performance predictions match the experimental design goals fairly well. In addition, the computed results provide a high level of detail in the rotor wake and blade surface aerodynamics. These details provide an important supplement to the expected experimental performance data.
Shuttle Electrical Power Analysis Program (SEPAP); single string circuit analysis report
NASA Technical Reports Server (NTRS)
Murdock, C. R.
1974-01-01
An evaluation is reported of the data obtained from an analysis of the distribution network characteristics of the shuttle during a spacelab mission. A description of the approach utilized in the development of the computer program and data base is provided and conclusions are drawn from the analysis of the data. Data sheets are provided for information to support the detailed discussion on each computer run.
Visual Environments for CFD Research
NASA Technical Reports Server (NTRS)
Watson, Val; George, Michael W. (Technical Monitor)
1994-01-01
This viewgraph presentation gives an overview of visual environments for computational fluid dynamics (CFD) research. It includes details on critical needs for the future computing environment, features needed to attain this environment, prospects for changes in the human-computer interface and the impact of the visualization revolution on it, human processing capabilities, limits of the personal environment, and the extension of that environment with computers. Information is given on the need for more 'visual' thinking (including instances of visual thinking), an evaluation of alternate approaches for and levels of interactive computer graphics, a visual analysis of computational fluid dynamics, and an analysis of visualization software.
ERIC Educational Resources Information Center
Turcotte, Sandrine
2012-01-01
This article describes in detail a conversation analysis of conceptual change in a computer-supported collaborative learning environment. Conceptual change is an essential learning process in science education that has yet to be fully understood. While many models and theories have been developed over the last three decades, empirical data to…
NASA Technical Reports Server (NTRS)
Marvin, J. G.; Horstman, C. C.; Rubesin, M. W.; Coakley, T. J.; Kussoy, M. I.
1975-01-01
An experiment designed to test and guide computations of the interaction of an impinging shock wave with a turbulent boundary layer is described. Detailed mean flow-field and surface data are presented for two shock strengths which resulted in attached and separated flows, respectively. Numerical computations, employing the complete time-averaged Navier-Stokes equations along with algebraic eddy-viscosity and turbulent Prandtl number models to describe shear stress and heat flux, are used to illustrate the dependence of the computations on the particulars of the turbulence models. Models appropriate for zero-pressure-gradient flows predicted the overall features of the flow fields, but were deficient in predicting many of the details of the interaction regions. Improvements to the turbulence model parameters were sought through a combination of detailed data analysis and computer simulations which tested the sensitivity of the solutions to model parameter changes. Computer simulations using these improvements are presented and discussed.
ERIC Educational Resources Information Center
Gilakjani, Abbas Pourhosein
2014-01-01
Computer technology has changed the ways we work, learn, interact and spend our leisure time. Computer technology has changed every aspect of our daily life--how and where we get our news, how we order goods and services, and how we communicate. This study investigates some of the significant issues concerning the use of computer technology…
Multidisciplinary optimization of an HSCT wing using a response surface methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giunta, A.A.; Grossman, B.; Mason, W.H.
1994-12-31
Aerospace vehicle design is traditionally divided into three phases: conceptual, preliminary, and detailed. Each of these design phases entails a particular level of accuracy and computational expense. While there are several computer programs which perform inexpensive conceptual-level aircraft multidisciplinary design optimization (MDO), aircraft MDO remains prohibitively expensive using preliminary- and detailed-level analysis tools. This occurs due to the expense of computational analyses and because gradient-based optimization requires the analysis of hundreds or thousands of aircraft configurations to estimate design sensitivity information. A further hindrance to aircraft MDO is the problem of numerical noise which occurs frequently in engineering computations. Computer models produce numerical noise as a result of the incomplete convergence of iterative processes, round-off errors, and modeling errors. Such numerical noise is typically manifested as a high frequency, low amplitude variation in the results obtained from the computer models. Optimization attempted using noisy computer models may result in the erroneous calculation of design sensitivities and may slow or prevent convergence to an optimal design.
MSFC crack growth analysis computer program, version 2 (users manual)
NASA Technical Reports Server (NTRS)
Creager, M.
1976-01-01
An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.
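The program's stress-intensity library is not reproduced here, but the generic kernel of such a crack-growth analysis can be sketched as a Paris-law cycle integration with a textbook center-crack stress-intensity solution; the material constants and loading below are hypothetical.

```python
# Generic crack-growth kernel of the kind such programs implement (not the
# MSFC code itself): Paris-law cycle integration, da/dN = C*(dK)^m, with the
# center-crack stress-intensity range dK = dS*sqrt(pi*a). Material constants
# and loading are hypothetical.
import math

C, m = 1.0e-11, 3.0          # Paris constants (units: m/cycle, MPa*sqrt(m))
d_stress = 100.0             # applied stress range per cycle (MPa)
a = 0.002                    # initial half crack length (m)
a_crit = 0.02                # half-length taken as failure (m)

cycles = 0
while a < a_crit:
    dK = d_stress * math.sqrt(math.pi * a)   # stress-intensity range
    a += C * dK**m                           # crack growth this cycle
    cycles += 1

print(f"cycles to grow crack from 2 mm to 20 mm: {cycles}")
```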
Decoupled 1D/3D analysis of a hydraulic valve
NASA Astrophysics Data System (ADS)
Mehring, Carsten; Zopeya, Ashok; Latham, Matt; Ihde, Thomas; Massie, Dan
2014-10-01
Analysis approaches during product development of fluid valves and other aircraft fluid delivery components vary greatly depending on the development stage. Traditionally, empirical or simplistic one-dimensional tools are deployed during preliminary design, whereas detailed analysis tools such as CFD (Computational Fluid Dynamics) are used to refine a selected design during the detailed design stage. In recent years, combined 1D/3D co-simulation has been deployed specifically for system-level simulations requiring an increased level of analysis detail for one or more components. The present paper presents a decoupled 1D/3D analysis approach where 3D CFD analysis results are utilized to enhance the fidelity of a dynamic 1D model in the context of an aircraft fuel valve.
The Social Organisation of Help during Young Children's Use of the Computer
ERIC Educational Resources Information Center
Davidson, Christina
2012-01-01
This article examines some of the ways that young children seek and provide help through social interaction during use of the computer in the home. Although social interaction is considered an important aspect of young children's use of computers, there are still few studies that provide detailed analysis of how young children accomplish that…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, R.E.
1983-11-01
The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics; artificial intelligence; and computer systems and architecture. Applications in industry, offices, aerospace, education, health care, and retailing are outlined. An analysis is given of research efforts in the US, Japan, the U.K., and Europe. Fifth-generation programming languages are detailed.
Biomolecular dynamics by computer analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.
1984-01-01
As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well developed study of the hydrogen bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.
Bifilar analysis users manual, volume 2
NASA Technical Reports Server (NTRS)
Cassarino, S. J.
1980-01-01
The digital computer program developed to study the vibration response of a coupled rotor/bifilar/airframe system is described. The theoretical development of the rotor/airframe system equations of motion is provided. The fuselage and bifilar absorber equations of motion are discussed. The modular block approach used in the make-up of this computer program is described. The input data needed to run the rotor and bifilar absorber analyses are described. Sample output formats are presented and discussed. The results for four test cases, which exercise the major logic paths of the computer program, are presented. The overall program structure is discussed in detail. The FORTRAN subroutines are described in detail.
ERIC Educational Resources Information Center
Gray, John S.
1994-01-01
A detailed analysis and computer-based solution to a puzzle addressing the arrangement of dominoes on a grid is presented. The problem is one used in a college-level data structures or algorithms course. The solution uses backtracking to generate all possible answers. Details of the use of backtracking and techniques for mapping abstract problems…
Methods of space radiation dose analysis with applications to manned space systems
NASA Technical Reports Server (NTRS)
Langley, R. W.; Billings, M. P.
1972-01-01
The full potential of state-of-the-art space radiation dose analysis for manned missions has not been exploited. Point doses have been overemphasized, and the critical dose to the bone marrow has been only crudely approximated, despite the existence of detailed man models and computer codes for dose integration in complex geometries. The method presented makes it practical to account for the geometrical detail of the astronaut as well as the vehicle. Discussed are the major assumptions involved and the concept of applying the results of detailed proton dose analysis to the real-time interpretation of on-board dosimetric measurements.
Sacco, Federica; Paun, Bruno; Lehmkuhl, Oriol; Iles, Tinen L; Iaizzo, Paul A; Houzeaux, Guillaume; Vázquez, Mariano; Butakoff, Constantine; Aguado-Sierra, Jazmin
2018-06-11
Computational modelling plays an important role in right ventricular (RV) haemodynamic analysis. However, current approaches employ smoothed ventricular anatomies. The aim of this study is to characterise RV haemodynamics including detailed endocardial structures such as trabeculae, the moderator band, and papillary muscles (PMs). Four paired detailed and smoothed RV endocardium models (two male and two female) were reconstructed from high-resolution magnetic resonance images (MRI) of ex-vivo human hearts. Detailed models include structures with ≥1 mm² cross-sectional area. Haemodynamic characterisation was done by computational fluid dynamics (CFD) simulations with steady and transient inflows, using high performance computing (HPC). The differences between the flows in smoothed and detailed models were assessed using the Q-criterion for vorticity quantification, the pressure drop between inlet and outlet, and the wall shear stress (WSS). Results demonstrated that detailed endocardial structures increase the degree of intra-ventricular pressure drop, decrease the WSS, and disrupt the dominant vortex, creating secondary small vortices. Increasingly turbulent blood flow was observed in the detailed RVs. Female RVs were less trabeculated and presented lower pressure drops than the males. In conclusion, neglecting endocardial structures in RV haemodynamic models may lead to inaccurate conclusions about the pressures, stresses, and blood flow behaviour in the cavity.
Evaluating a Computerized Aid for Conducting a Cognitive Task Analysis
2000-01-01
in conducting a cognitive task analysis. The conduct of a cognitive task analysis is costly and labor intensive. As a result, a few computerized aids...evaluation of a computerized aid, specifically CAT-HCI (Cognitive Analysis Tool - Human Computer Interface), for the conduct of a detailed cognitive task analysis...
NGScloud: RNA-seq analysis of non-model species using cloud computing.
Mora-Márquez, Fernando; Vázquez-Poletti, José Luis; López de Heredia, Unai
2018-05-03
RNA-seq analysis usually requires large computing infrastructures. NGScloud is a bioinformatic system developed to analyze RNA-seq data using the cloud computing services of Amazon, which permit access to ad hoc computing infrastructure scaled according to the complexity of the experiment, so that costs and run times can be optimized. The application provides a user-friendly front-end to operate Amazon's hardware resources and to control a workflow of RNA-seq analysis oriented to non-model species, incorporating the cluster concept, which allows parallel runs of common RNA-seq analysis programs in several virtual machines for faster analysis. NGScloud is freely available at https://github.com/GGFHF/NGScloud/. A manual detailing installation and how-to-use instructions is available with the distribution. unai.lopezdeheredia@upm.es.
Efficient Computation Of Behavior Of Aircraft Tires
NASA Technical Reports Server (NTRS)
Tanner, John A.; Noor, Ahmed K.; Andersen, Carl M.
1989-01-01
NASA technical paper discusses a challenging application of computational structural mechanics to the numerical simulation of the responses of aircraft tires during taxiing, takeoff, and landing. Presents details of the three main elements of the computational strategy: use of special three-field, mixed finite-element models; use of operator splitting; and application of a technique that substantially reduces the number of degrees of freedom. The proposed computational strategy is applied to two quasi-symmetric problems: linear analysis of anisotropic tires through use of two-dimensional shell finite elements and nonlinear analysis of orthotropic tires subjected to unsymmetric loading. Three basic types of symmetry, and combinations thereof, exhibited by the tire response are identified.
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1992-01-01
Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
Geospatial Representation, Analysis and Computing Using Bandlimited Functions
2010-02-19
navigation of aircraft and missiles require detailed representations of gravity and efficient methods for determining orbits and trajectories. However, many...efficient on today’s computers. Under this grant new, computationally efficient, localized representations of gravity have been developed and tested. As a...step in developing a new approach to estimating gravitational potentials, a multiresolution representation for gravity estimation has been proposed
ERIC Educational Resources Information Center
Judd, Terry; Kennedy, Gregor
2011-01-01
Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…
ERIC Educational Resources Information Center
Rogowski, Steve
1982-01-01
A problem is detailed which has a solution that embodies geometry, trigonometry, ballistics, projectile mechanics, vector analysis, and elementary computer graphics. It is felt that the information and sample computer programs can be a useful starting point for a user written code that involves missiles and other projectiles. (MP)
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1995-01-01
This report presents the results of a study to implement convergence acceleration techniques based on the multigrid concept in the two-dimensional and three-dimensional versions of the Proteus computer code. The first section presents a review of the relevant literature on the implementation of the multigrid methods in computer codes for compressible flow analysis. The next two sections present detailed stability analysis of numerical schemes for solving the Euler and Navier-Stokes equations, based on conventional von Neumann analysis and the bi-grid analysis, respectively. The next section presents details of the computational method used in the Proteus computer code. Finally, the multigrid implementation and applications to several two-dimensional and three-dimensional test problems are presented. The results of the present study show that the multigrid method always leads to a reduction in the number of iterations (or time steps) required for convergence. However, there is an overhead associated with the use of multigrid acceleration. The overhead is higher in 2-D problems than in 3-D problems, thus overall multigrid savings in CPU time are in general better in the latter. Savings of about 40-50 percent are typical in 3-D problems, but they are about 20-30 percent in large 2-D problems. The present multigrid method is applicable to steady-state problems and is therefore ineffective in problems with inherently unstable solutions.
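For readers unfamiliar with the multigrid concept being implemented, the sketch below shows a minimal V-cycle for the 1D Poisson problem: a cheap smoother damps high-frequency error, and the smooth remainder is corrected on coarser grids. It is illustrative only; the Proteus implementation for the Euler and Navier-Stokes equations is far more general.

```python
# Minimal multigrid V-cycle for the 1D Poisson problem -u'' = f with zero
# Dirichlet boundaries. Illustrative of the acceleration concept only.
import numpy as np

def jacobi(u, f, h, sweeps, omega=2/3):
    """Damped Jacobi sweeps for (2u_i - u_{i-1} - u_{i+1})/h^2 = f_i."""
    for _ in range(sweeps):
        u[1:-1] += omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1] - 2 * u[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def v_cycle(u, f, h):
    if u.size <= 3:                               # coarsest grid: solve exactly
        u[1] = 0.5 * (u[0] + u[2] + h * h * f[1])
        return u
    u = jacobi(u, f, h, sweeps=3)                 # pre-smooth
    r = residual(u, f, h)
    rc = np.zeros((r.size + 1) // 2)              # restrict by full weighting
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h)    # recurse on residual equation
    e = np.zeros_like(u)
    e[::2] = ec                                   # prolong: inject coarse values...
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])            # ...and interpolate between them
    return jacobi(u + e, f, h, sweeps=3)          # correct, then post-smooth

n = 129
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)                  # exact solution: sin(pi*x)
u = np.zeros(n)
for cycle in range(8):
    u = v_cycle(u, f, h)
    print(f"cycle {cycle}: max residual = {np.abs(residual(u, f, h)).max():.2e}")
```

Each V-cycle costs only a few smoothing sweeps per level yet reduces the residual by a roughly constant factor, which is the source of the iteration-count savings reported above.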
Automatic network coupling analysis for dynamical systems based on detailed kinetic models.
Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich
2005-10-01
We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
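A minimal sketch of the mode-counting idea follows: approximate the sensitivity matrix of the state at the end of a time window with respect to the state at its start, take its SVD, and count singular values above an error tolerance as locally active modes. The Michaelis-Menten-type kinetics, windows, and tolerance are illustrative stand-ins for the paper's error-controlled algorithm.

```python
# Sketch of the mode-counting idea: approximate dy(dt)/dy(0) along a
# trajectory by finite differences, take its SVD, and count singular values
# above a tolerance as locally active dynamical modes. Kinetics, windows,
# and tolerance are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, k1=10.0, km1=1.0, k2=1.0):
    s, c = y                          # substrate and enzyme-substrate complex
    e = 1.0 - c                       # free enzyme from conservation (scaled)
    return [-k1 * e * s + km1 * c, k1 * e * s - (km1 + k2) * c]

def sensitivity(y0, dt, eps=1e-7):
    """Finite-difference approximation of d y(dt) / d y(0)."""
    sol = lambda y: solve_ivp(rhs, (0.0, dt), y, rtol=1e-10, atol=1e-12).y[:, -1]
    base = sol(np.asarray(y0, dtype=float))
    cols = []
    for i in range(len(y0)):
        yp = np.asarray(y0, dtype=float)
        yp[i] += eps
        cols.append((sol(yp) - base) / eps)
    return np.column_stack(cols)

y0 = [1.0, 0.0]
for dt in (0.01, 0.1, 1.0):           # analysis window length
    sv = np.linalg.svd(sensitivity(y0, dt), compute_uv=False)
    active = int(np.sum(sv > 1e-2 * sv[0]))
    print(f"window {dt:5.2f}: singular values {sv.round(5)}, active modes: {active}")
```

Over longer windows the fast mode has fully relaxed, so fewer modes remain active, which is the timescale-separation effect the method exploits for piecewise model reduction.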
Operating Dedicated Data Centers - Is It Cost-Effective?
NASA Astrophysics Data System (ADS)
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta J.; Gimenez, J.; Caubet, J.
2003-01-01
Parallel programming paradigms include process-level parallelism, thread-level parallelism, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms for Shared Memory Architecture (SMA). This analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of a flow of useful computations.
NASA Technical Reports Server (NTRS)
Bailey, David H.; Borwein, Jonathan M.; Borwein, Peter B.; Plouffe, Simon
1996-01-01
This article gives a brief history of the analysis and computation of the mathematical constant Pi=3.14159 ..., including a number of the formulas that have been used to compute Pi through the ages. Recent developments in this area are then discussed in some detail, including the recent computation of Pi to over six billion decimal digits using high-order convergent algorithms, and a newly discovered scheme that permits arbitrary individual hexadecimal digits of Pi to be computed.
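The digit-extraction scheme referred to is the Bailey-Borwein-Plouffe formula, $\pi = \sum_{k\ge 0} 16^{-k}\left(\frac{4}{8k+1} - \frac{2}{8k+4} - \frac{1}{8k+5} - \frac{1}{8k+6}\right)$; a short sketch of how it yields an individual hexadecimal digit without computing the preceding ones is given below.

```python
# Sketch of BBP hexadecimal digit extraction: the fractional part of
# 16^d * pi is obtained with modular exponentiation, so digit d+1 of pi's
# hex expansion is found without computing the earlier digits.

def bbp_series(d, j):
    """Fractional part of 16^d * sum_k 1/(16^k * (8k+j))."""
    s = 0.0
    for k in range(d + 1):                 # left sum via modular exponentiation
        s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
    t, k = 0.0, d + 1                      # right tail: terms shrink by 1/16 each
    while True:
        term = 16.0 ** (d - k) / (8 * k + j)
        if term < 1e-17:
            break
        t += term
        k += 1
    return (s + t) % 1.0

def pi_hex_digit(d):
    """The (d+1)-th hexadecimal digit of pi after the point (d = 0 gives '2')."""
    x = (4 * bbp_series(d, 1) - 2 * bbp_series(d, 4)
         - bbp_series(d, 5) - bbp_series(d, 6)) % 1.0
    return "0123456789ABCDEF"[int(16 * x)]

# pi = 3.243F6A8885A308D3... in hexadecimal
print("".join(pi_hex_digit(d) for d in range(10)))  # expect 243F6A8885
```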
Automation of the aircraft design process
NASA Technical Reports Server (NTRS)
Heldenfels, R. R.
1974-01-01
The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.
An Improved Version of the NASA-Lockheed Multielement Airfoil Analysis Computer Program
NASA Technical Reports Server (NTRS)
Brune, G. W.; Manke, J. W.
1978-01-01
An improved version of the NASA-Lockheed computer program for the analysis of multielement airfoils is described. The predictions of the program are evaluated by comparison with recent experimental high lift data including lift, pitching moment, profile drag, and detailed distributions of surface pressures and boundary layer parameters. The results of the evaluation show that the contract objectives of improving program reliability and accuracy have been met.
Analysis of Compton continuum measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gold, R.; Olson, I. K.
1970-01-01
Five computer programs (COMPSCAT, FEND, GABCO, DOSE, and COMPLOT) have been developed and used for the analysis and subsequent reduction of measured energy distributions of Compton recoil electrons to continuous gamma spectra. In addition to detailed descriptions of these computer programs, the relationship among these codes is stressed. The manner in which these programs function is illustrated by tracing a sample measurement through a complete cycle of the data-reduction process.
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components
NASA Technical Reports Server (NTRS)
1991-01-01
Summarized here is the technical effort and computer code developed during the five year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
Calculation of three-dimensional, inviscid, supersonic, steady flows
NASA Technical Reports Server (NTRS)
Moretti, G.
1981-01-01
A detailed description of a computational program for the evaluation of three-dimensional supersonic, inviscid, steady flow past airplanes is presented. Emphasis was put on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) was carefully coded and described. Results of computations based on sample geometries are also presented and discussed.
A Computer String-Grammar of English.
ERIC Educational Resources Information Center
Sager, Naomi
This volume is the fourth in a series of detailed reports on a working computer program for the syntactic analysis of English sentences into their component strings. The report (1) records the considerations involved in various decisions among alternative grammatical formulations and presents the word-subclasses, the linguistic strings, etc., for…
NASA Technical Reports Server (NTRS)
Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.
1987-01-01
This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
Information Storage and Retrieval: Reports on Evaluation Procedures and Results, 1965-1967.
ERIC Educational Resources Information Center
Salton, Gerald
A detailed analysis of the retrieval evaluation results obtained with the automatic SMART document retrieval system for document collections in the fields of aerodynamics, computer science, and documentation is given in this report. The various components of fully automatic document retrieval systems are discussed in detail, including the forms of…
TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.
Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo
2018-06-15
We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to the fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under the BSD License.
Transportation Analysis and Simulation System Requirements
DOT National Transportation Integrated Search
1973-04-01
This document provides: (a) a brief summary of overall project (PPA OS223) accomplishments during FY 72; (b) a detailed summary of the following two major FY 72 activities: (1) analysis of TSC's computation resources and their utilization; (2) Pr...
Computer code for off-design performance analysis of radial-inflow turbines with rotor blade sweep
NASA Technical Reports Server (NTRS)
Meitner, P. L.; Glassman, A. J.
1983-01-01
The analysis procedure of an existing computer program was extended to include rotor blade sweep, to model the flow more accurately at the rotor exit, and to provide more detail to the loss model. The modeling changes are described and all analysis equations and procedures are presented. Program input and output are described and are illustrated by an example problem. Results obtained from this program and from a previous program are compared with experimental data.
Ayres-de-Campos, Diogo; Rei, Mariana; Nunes, Inês; Sousa, Paulo; Bernardes, João
2017-01-01
SisPorto 4.0 is the most recent version of a program for the computer analysis of cardiotocographic (CTG) signals and ST events, which has been adapted to the 2015 International Federation of Gynaecology and Obstetrics (FIGO) guidelines for intrapartum foetal monitoring. This paper provides a detailed description of the analysis performed by the system, including the signal-processing algorithms involved in identification of basic CTG features and the resulting real-time alerts.
A structural analysis of an ocean going patrol boat subjected to planing loads
NASA Technical Reports Server (NTRS)
Clark, James H.; Lafreniere, Robert; Stoodt, Robert; Wiedenheft, John
1987-01-01
A static structural analysis of an ocean going patrol vessel subjected to hydrodynamic planing loads is discussed. The analysis required the development of a detailed model that included hull plating, five structural bulkheads, longitudinal and transverse stiffeners, and a coarse representation of the superstructure. The finite element model was developed from fabrication drawings using the Navy computer-aided design system. Various stress and displacement contours are shown for the entire hull. Because several critical areas appeared to be overstressed, these areas were remeshed for detail and are presented for completeness.
Signal design study for shuttle/TDRSS Ku-band uplink
NASA Technical Reports Server (NTRS)
1976-01-01
The adequacy of the signal design approach chosen for the TDRSS/orbiter uplink was evaluated. Critical functions and/or components associated with the baseline design were identified, and design alternatives were developed for those areas considered high risk. A detailed set of RF and signal processing performance specifications for the orbiter hardware associated with the TDRSS/orbiter Ku-band uplink was analyzed. The performance of a detailed design of the PN despreader, the PSK carrier synchronization loop, and the symbol synchronizer is identified. The performance of the downlink signal was studied by means of computer simulation to obtain a realistic determination of bit error rate degradations. The three-channel PM downlink signal was detailed by means of analysis and computer simulation.
Adaptation of a Control Center Development Environment for Industrial Process Control
NASA Technical Reports Server (NTRS)
Killough, Ronnie L.; Malik, James M.
1994-01-01
In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.
POLO: a user's guide to Probit Or LOgit analysis.
Jacqueline L. Robertson; Robert M. Russell; N.E. Savin
1980-01-01
This user's guide provides detailed instructions for the use of POLO (Probit Or LOgit), a computer program for the analysis of quantal response data such as that obtained from insecticide bioassays by the techniques of probit or logit analysis. Dosage-response lines may be compared for parallelism or...
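As an illustration of the underlying technique (not of POLO's own implementation), a probit dose-response line can be fitted by maximum likelihood in a few lines; the dose and mortality figures below are invented.

```python
# A minimal sketch of probit analysis of quantal response data, the kind
# of fit POLO automates. Assumes a probit line in log10(dose).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

dose = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # insecticide dose (made up)
n    = np.array([50, 50, 50, 50, 50])         # insects treated
k    = np.array([4, 12, 26, 40, 48])          # insects responding

def neg_log_lik(params):
    a, b = params
    p = norm.cdf(a + b * np.log10(dose))      # probit dose-response line
    p = np.clip(p, 1e-9, 1 - 1e-9)            # guard the log
    return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[0.0, 1.0])
a, b = fit.x
ld50 = 10 ** (-a / b)                         # dose giving 50% response
print(f"intercept={a:.3f} slope={b:.3f} LD50={ld50:.2f}")
```

Comparing lines for parallelism, as the guide describes, amounts to refitting with a shared slope and testing the change in likelihood.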
The contractor will conduct an independent peer review of FEV’s light-duty truck (LDT) mass safety study, “Light-Duty Vehicle Weight Reduction Study with Crash Model, Feasibility and Detailed Cost Analysis – Silverado 1500”, and its corresponding computer-aided engineering (CAE) ...
Finite element analysis of helicopter structures
NASA Technical Reports Server (NTRS)
Rich, M. J.
1978-01-01
Application of finite element analysis is now being expanded to three-dimensional analysis of mechanical components. Examples are presented for airframe, mechanical component, and composite structure calculations. Data are detailed on the increase of model size, computer usage, and the effect on reducing stress analysis costs. Future applications of finite element analysis for helicopter structures are projected.
Computations of Axisymmetric Flows in Hypersonic Shock Tubes
NASA Technical Reports Server (NTRS)
Sharma, Surendra P.; Wilson, Gregory J.
1995-01-01
A time-accurate two-dimensional fluid code is used to compute test times in shock tubes operated at supersonic speeds. Unlike previous studies, this investigation resolves the finer temporal details of the shock-tube flow by making use of modern supercomputers and state-of-the-art computational fluid dynamic solution techniques. The code, besides solving the time-dependent fluid equations, also accounts for the finite rate chemistry in the hypersonic environment. The flowfield solutions are used to estimate relevant shock-tube parameters for laminar flow, such as test times, and to predict density and velocity profiles. Boundary-layer parameters such as $\bar{\delta}_u$, $\bar{\delta}^*$, and $\bar{\tau}_w$, and test-time parameters such as $\bar{\tau}$ and the particle time of flight $t_f$, are computed and compared with those evaluated using Mirels' correlations. This article then discusses in detail the effects of flow nonuniformities on particle time of flight behind the normal shock and, consequently, on the interpretation of shock-tube data. This article concludes that for accurate interpretation of shock-tube data, a detailed analysis of flowfield parameters, using a computer code such as the one used in this study, must be performed.
High speed cylindrical roller bearing analysis. SKF computer program CYBEAN. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Dyba, G. J.; Kleckner, R. J.
1981-01-01
CYBEAN (CYlindrical BEaring ANalysis) was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. Emphasis was placed on detailing the effects of high speed, preload and system thermal coupling. Roller tilt, skew, radial, circumferential and axial displacement as well as flange contact were considered. Variable housing and flexible out-of-round outer ring geometries, and both steady-state and time-transient temperature calculations were enabled. The complete range of elastohydrodynamic contact considerations, employing full and partial film conditions, was treated in the computation of raceway and flange contacts. The practical and correct implementation of CYBEAN is discussed. The capability to execute the program at four different levels of complexity was included. In addition, the program was updated to properly direct roller-to-raceway contact load vectors automatically in those cases where roller or ring profiles have small radii of curvature. Input and output architectures containing guidelines for use and two sample executions are detailed.
A Computational Observer For Performing Contrast-Detail Analysis Of Ultrasound Images
NASA Astrophysics Data System (ADS)
Lopez, H.; Loew, M. H.
1988-06-01
Contrast-Detail (C/D) analysis allows the quantitative determination of an imaging system's ability to display a range of varying-size targets as a function of contrast. Using this technique, a contrast-detail plot is obtained which can, in theory, be used to compare image quality from one imaging system to another. The C/D plot, however, is usually obtained by using data from human observer readings. We have shown earlier [7] that the performance of human observers in the task of threshold detection of simulated lesions embedded in random ultrasound noise is highly inaccurate and non-reproducible for untrained observers. We present an objective, computational method for the determination of the C/D curve for ultrasound images. This method utilizes digital images of the C/D phantom developed at CDRH, and lesion-detection algorithms that simulate the Bayesian approach using the likelihood function for an ideal observer. We present the results of this method, and discuss the relationship to the human observer and to the comparability of image quality between systems.
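A sketch of the ideal-observer idea, under a simplifying additive-white-Gaussian-noise assumption (real ultrasound speckle calls for the appropriate likelihood, and this is not the CDRH phantom algorithm itself):

```python
# For a known lesion signal s in white Gaussian noise of variance sigma^2,
# the ideal observer thresholds the matched-filter statistic
#   x.s/sigma^2 - |s|^2/(2 sigma^2), which equals the log-likelihood ratio.
import numpy as np

rng = np.random.default_rng(0)

def disk_signal(size=32, radius=5.0, contrast=0.4):
    y, x = np.mgrid[:size, :size]
    r = np.hypot(x - size / 2, y - size / 2)
    return contrast * (r <= radius)

def log_likelihood_ratio(image, signal, sigma=1.0):
    return (image * signal).sum() / sigma**2 - (signal**2).sum() / (2 * sigma**2)

s = disk_signal()
present = s + rng.normal(0.0, 1.0, s.shape)   # lesion-present image
absent  = rng.normal(0.0, 1.0, s.shape)       # lesion-absent image
print(log_likelihood_ratio(present, s) > 0, log_likelihood_ratio(absent, s) > 0)
```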
A dc model for power switching transistors suitable for computer-aided design and analysis
NASA Technical Reports Server (NTRS)
Wilson, P. M.; George, R. T., Jr.; Owen, H. A., Jr.; Wilson, T. G.
1979-01-01
The proposed dc model for bipolar junction power switching transistors is based on measurements which may be made with standard laboratory equipment. Those nonlinearities which are of importance to power electronics design are emphasized. Measurement procedures are discussed in detail. A model formulation adapted for use with a computer program is presented, and a comparison between actual and computer-generated results is made.
Mirror neurons and imitation: a computationally guided review.
Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael
2006-04-01
Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.
A computational workflow for designing silicon donor qubits
Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; ...
2016-09-19
Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. In conclusion, the resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huff, Kathryn D.
Component level and system level abstraction of detailed computational geologic repository models have resulted in four rapid computational models of hydrologic radionuclide transport at varying levels of detail. Those models are described, as is their implementation in Cyder, a software library of interchangeable radionuclide transport models appropriate for representing natural and engineered barrier components of generic geology repository concepts. A proof of principle demonstration was also conducted in which these models were used to represent the natural and engineered barrier components of a repository concept in a reducing, homogenous, generic geology. This base case demonstrates integration of the Cyder open source library with the Cyclus computational fuel cycle systems analysis platform to facilitate calculation of repository performance metrics with respect to fuel cycle choices. (authors)
New Mexico district work-effort analysis computer program
Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.
1972-01-01
The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation 6600 computer system. Central processing computer time has seldom exceeded 5 minutes on the longest year-to-date runs.
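The overhead apportionment described above is simple ratio arithmetic; the following sketch, with hypothetical hour totals, shows the computation (it is not the CAN 2 FORTRAN itself).

```python
# Overhead hours are apportioned to each project in proportion to that
# project's share of total direct hours, as the report describes.
direct_hours = {"project_A": 120.0, "project_B": 60.0, "project_C": 20.0}
overhead_hours = 40.0

total_direct = sum(direct_hours.values())
charged = {
    name: hours + overhead_hours * hours / total_direct
    for name, hours in direct_hours.items()
}
print(charged)   # overhead splits 24/12/4 across A/B/C
```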
NASA Technical Reports Server (NTRS)
Heidergott, K. W.
1979-01-01
The computer program known as QR is described. Classical control systems analysis and synthesis (root locus, time response, and frequency response) can be performed using this program. Programming details of the QR program are presented.
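As a sketch of the root-locus computation such a program performs (not QR's own code), the closed-loop poles can be traced by solving the characteristic equation 1 + K·G(s) = 0 over a sweep of gains; the plant below is arbitrary.

```python
# Root-locus sketch: closed-loop poles of 1 + K*G(s) = 0 for
# G(s) = num(s)/den(s), evaluated at several gains K.
import numpy as np

num = np.array([1.0, 2.0])            # G(s) = (s + 2) / (s^2 + 2s + 5)
den = np.array([1.0, 2.0, 5.0])

for K in (0.1, 1.0, 10.0):
    char_poly = den.copy()            # characteristic polynomial: den + K*num
    char_poly[-num.size:] += K * num
    print(K, np.roots(char_poly))
```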
Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.
2009-01-01
One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578
Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Maltach, E. G.
1969-01-01
The design of the next generation of spaceborne digital computers is described, and a possible multiprocessor computer configuration is analyzed. For the analysis, a set of representative space computing tasks was abstracted from the Lunar Module Guidance Computer programs of the Apollo program as executed during the lunar landing. This computer performs about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is critical to the efficiency of the multiprocessor. It is recommended that research into the area of automatic job scheduling be performed.
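A minimal sketch of the kind of Markov process analysis described, using a birth-death chain for the number of busy processors with hypothetical arrival and completion rates (it ignores queueing and is not the thesis's model):

```python
# Steady-state distribution of a birth-death Markov chain for the number
# of busy processors; rates below are made up for illustration.
import numpy as np

n_proc, lam, mu = 4, 8.0, 3.0      # processors, job arrivals/s, completions/s each

# Generator matrix Q over states 0..n_proc busy processors.
Q = np.zeros((n_proc + 1, n_proc + 1))
for i in range(n_proc + 1):
    if i < n_proc:
        Q[i, i + 1] = lam          # an arriving job seizes an idle processor
    if i > 0:
        Q[i, i - 1] = i * mu       # any one of i busy processors finishes
    Q[i, i] = -Q[i].sum()

# Solve pi Q = 0 subject to sum(pi) = 1.
A = np.vstack([Q.T, np.ones(n_proc + 1)])
b = np.zeros(n_proc + 2); b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("P(all busy) =", pi[-1], " mean busy =", pi @ np.arange(n_proc + 1))
```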
2008-03-01
[Extraction residue from the front matter of a recoil analysis report; recoverable items: an appendix MATLAB Cd calculator routine, a FORTRAN subroutine of the variable Cd model, and flowchart figures covering the Benét Labs recoil analysis code, the recoil brake subroutine, the recoil pressure/force calculations, and the variable Cd subroutine.]
NASA Technical Reports Server (NTRS)
Gupta, Kajal K.
1991-01-01
The details of an integrated general-purpose finite element structural analysis computer program, which is also capable of solving complex multidisciplinary problems, are presented. The SOLIDS module of the program possesses an extensive finite element library suitable for modeling most practical problems and is capable of solving statics, vibration, buckling, and dynamic response problems of complex structures, including spinning ones. The aerodynamic module, AERO, enables computation of unsteady aerodynamic forces for both subsonic and supersonic flow for subsequent flutter and divergence analysis of the structure. The associated aeroservoelastic analysis module, ASE, performs aero-structural-control stability analysis, yielding frequency responses as well as damping characteristics of the structure. The program is written in standard FORTRAN to run on a wide variety of computers. Extensive graphics, preprocessing, and postprocessing routines pertaining to a number of terminals are also available.
Utility of fluorescence microscopy in embryonic/fetal topographical analysis.
Zucker, R M; Elstein, K H; Shuey, D L; Ebron-McCoy, M; Rogers, J M
1995-06-01
For topographical analysis of developing embryos, investigators typically rely on scanning electron microscopy (SEM) to provide the surface detail not attainable with light microscopy. SEM is an expensive and time-consuming technique, however, and the preparation procedure may alter morphology and leave the specimen friable. We report that by using a high-resolution compound epifluorescence microscope with inexpensive low-power objectives and the fluorochrome acridine orange, we were able to obtain surface images of fixed or fresh whole rat embryos and fetal palates of considerably greater topographical detail than those obtained using routine light microscopy. Indeed, the resulting high-resolution images afford not only superior qualitative documentation of morphological observations but also the capability for detailed morphometry via digitization and computer-assisted image analysis.
High Temperature Composite Analyzer (HITCAN) demonstration manual, version 1.0
NASA Technical Reports Server (NTRS)
Singhal, S. N.; Lackney, J. J.; Murthy, P. L. N.
1993-01-01
This manual comprises a variety of demonstration cases for the HITCAN (HIgh Temperature Composite ANalyzer) code. HITCAN is a general purpose computer program for predicting nonlinear global structural and local stress-strain response of arbitrarily oriented, multilayered high temperature metal matrix composite structures. HITCAN is written in FORTRAN 77 computer language and has been configured and executed on the NASA Lewis Research Center CRAY XMP and YMP computers. Detailed description of all program variables and terms used in this manual may be found in the User's Manual. The demonstration includes various cases to illustrate the features and analysis capabilities of the HITCAN computer code. These cases include: (1) static analysis, (2) nonlinear quasi-static (incremental) analysis, (3) modal analysis, (4) buckling analysis, (5) fiber degradation effects, (6) fabrication-induced stresses for a variety of structures; namely, beam, plate, ring, shell, and built-up structures. A brief discussion of each demonstration case with the associated input data file is provided. Sample results taken from the actual computer output are also included.
Payload Operations Control Center (POCC) [Spacelab flight operations]
NASA Technical Reports Server (NTRS)
Shipman, D. L.; Noneman, S. R.; Terry, E. S.
1981-01-01
The Spacelab payload operations control center (POCC) timeline analysis program, which is used to provide POCC activity and resource information as a function of mission time, is described. This program is fully automated and interactive, and is equipped with tutorial displays. The tutorial displays are sufficiently detailed for use by a program analyst having no computer experience. The POCC timeline analysis program is designed to operate on the VAX/VMS version V2.1 computer system.
Computer Analysis Of High-Speed Roller Bearings
NASA Technical Reports Server (NTRS)
Coe, H.
1988-01-01
High-speed cylindrical roller-bearing analysis program (CYBEAN) developed to compute behavior of cylindrical rolling-element bearings at high speeds and with misaligned shafts. With program, accurate assessment of geometry-induced roller preload possible for variety of outer-ring and housing configurations and loading conditions. Enables detailed examination of bearing performance and permits exploration of causes and consequences of bearing skew. Provides general capability for assessment of designs of bearings supporting main shafts of engines. Written in FORTRAN IV.
A computer program for modeling non-spherical eclipsing binary star systems
NASA Technical Reports Server (NTRS)
Wood, D. B.
1972-01-01
The accurate analysis of eclipsing binary light curves is fundamental to obtaining information on the physical properties of stars. The model described accounts for the important geometric and photometric distortions such as rotational and tidal distortion, gravity brightening, and reflection effect. This permits a more accurate analysis of interacting eclipsing star systems. The model is designed to be useful to anyone with moderate computing resources. The programs, written in FORTRAN 4 for the IBM 360, consume about 80k bytes of core. The FORTRAN program listings are provided, and the computational aspects are described in some detail.
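For illustration, a toy eclipse model with two uniform spherical disks (omitting the rotational, tidal, gravity-brightening, and reflection effects this program models) computes the flux deficit from the circle-overlap area; all numbers below are arbitrary.

```python
# Toy eclipsing-binary sketch: flux loss equals the projected overlap
# area of two uniform stellar disks (standard two-circle lens formula).
import numpy as np

def overlap_area(d, r1, r2):
    """Area common to two circles of radii r1, r2 with centers d apart."""
    if d >= r1 + r2:
        return 0.0                              # no eclipse
    if d <= abs(r1 - r2):
        return np.pi * min(r1, r2) ** 2         # total/annular eclipse
    a1 = r1**2 * np.arccos((d**2 + r1**2 - r2**2) / (2 * d * r1))
    a2 = r2**2 * np.arccos((d**2 + r2**2 - r1**2) / (2 * d * r2))
    tri = 0.5 * np.sqrt((-d + r1 + r2) * (d + r1 - r2)
                        * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

r1, r2, a_orb = 1.0, 0.5, 5.0                   # radii and orbit size (arbitrary)
for phase in np.linspace(-0.05, 0.05, 5):
    d = a_orb * abs(np.sin(2 * np.pi * phase))  # projected separation
    print(f"{phase:+.3f}  blocked area = {overlap_area(d, r1, r2):.3f}")
```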
BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.
1981-06-01
This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.
The Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
NASA Technical Reports Server (NTRS)
Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.
1974-01-01
The Shuttle Electric Power System Analysis (SEPS) computer program is described; it performs detailed load analysis, including prediction of the energy demands and consumables requirements of the shuttle electric power system, along with parametric and special-case studies of that system. The functional flow diagram of the SEPS program is presented along with data base requirements and formats, procedure and activity definitions, and mission timeline input formats. Distribution circuit input and fixed-data requirements are included. Run procedures and deck setups are described.
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Trefny, C. J.; Steffen, C. J., Jr.
1999-01-01
Design and analysis of the inlet for a rocket-based combined cycle engine is discussed. Computational fluid dynamics was used in both the design and the subsequent analysis. Reynolds-averaged Navier-Stokes simulations were performed using both perfect-gas and real-gas assumptions. An inlet design that operates over the required Mach number range from 0 to 12 was produced. Performance data for cycle analysis were post-processed using a stream-thrust averaging technique. A detailed performance database for cycle analysis is presented. The effect of vehicle forebody compression on air capture is also examined.
NASA Technical Reports Server (NTRS)
1990-01-01
Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.
NASA Technical Reports Server (NTRS)
Nagy, S.
1988-01-01
Due to the extraordinary distances scanned by modern telescopes, optical surfaces in such telescopes must be manufactured to exacting standards of perfection, within a few thousandths of a centimeter. The detection of imperfections of less than 1/20 of a wavelength of light, for application in the building of the mirror for the Space Infrared Telescope Facility, was undertaken. Because the mirror must be kept very cold while in space, another factor comes into effect: cryogenics. The process used to test a specific mirror under cryogenic conditions is described, including the follow-up analysis accomplished through computer work. To better illustrate the process and analysis, a Pyrex Hex-Core mirror is followed through the process, from laser interferometry in the lab to computer analysis via a program called FRINGE. This analysis via FRINGE is detailed.
NASA Technical Reports Server (NTRS)
Mullins, N. E.
1972-01-01
The GEODYN Orbit Determination and Geodetic Parameter Estimation System consists of a set of computer programs designed to determine and analyze definitive satellite orbits and their associated geodetic and measurement parameters. This manual describes the Support Programs used by the GEODYN System. The mathematics and programming descriptions are detailed, and the operational procedures of each program are presented. GEODYN ancillary analysis programs may be grouped into three categories: (1) orbit comparison (DELTA); (2) data analysis using reference orbits (GEORGE); and (3) pass geometry computations (GROUNDTRACK). All three programs use one or more tapes written by the GEODYN program in either a data reduction or orbit generator run.
Flow induction by pressure forces
NASA Technical Reports Server (NTRS)
Garris, C. A.; Toh, K. H.; Amin, S.
1992-01-01
A dual experimental/computational approach to the fluid mechanics of complex interactions that take place in a rotary-jet ejector is presented. The long-range goal is to perform both detailed flow mapping and finite element computational analysis. The described work represents an initial finding on the experimental mapping program. Test results on the hubless rotary-jet are discussed.
Spherical roller bearing analysis. SKF computer program SPHERBEAN. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Kleckner, R. J.; Dyba, G. J.
1980-01-01
The user's guide for the SPHERBEAN computer program for prediction of the thermomechanical performance characteristics of high speed lubricated double row spherical roller bearings is presented. The material presented is structured to guide the user in the practical and correct implementation of SPHERBEAN. Input and output, guidelines for program use, and sample executions are detailed.
NASA Technical Reports Server (NTRS)
Orzechowski, J. A.
1982-01-01
The CMC fluid mechanics program system was developed to translate the theoretical evolution of finite element numerical solution methodology, applied to nonlinear field problems, into a versatile computer code for comprehensive flow-field analysis. A detailed view of the code from the standpoint of a programmer's use is presented. A system macro-flowchart and detailed flowcharts of several routines needed by a theoretician/user to modify the operation of the program are presented. All subroutines and details of usage, primarily for the input and output routines, are described. Integer and real scalars and a cross-reference list denoting subroutine usage for these scalars are outlined. Entry points in the dynamic storage vector IZ and the lengths of each vector accompanying the scalar definitions are described. A listing of the routines peculiar to the standard test case and a listing of the input deck and printout for this case are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.
2014-08-15
Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created — but currently existing — analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe the internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.
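As an illustration of the dose-volume computation mentioned above (independent of DICOMautomaton's actual internals), a cumulative dose-volume histogram can be computed directly from a structure's dose samples; the dose values below are synthetic.

```python
# Cumulative DVH sketch: V(D) is the fraction of the structure receiving
# at least dose D, computed from per-voxel dose samples.
import numpy as np

rng = np.random.default_rng(1)
structure_dose = rng.gamma(shape=9.0, scale=5.0, size=10_000)  # Gy, synthetic

bins = np.arange(0.0, 80.0, 2.0)
vol_frac = [(structure_dose >= d).mean() for d in bins]

for d, v in zip(bins[::5], vol_frac[::5]):
    print(f"V({d:4.1f} Gy) = {100 * v:5.1f}%")
```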
Development of surrogate models for the prediction of the flow around an aircraft propeller
NASA Astrophysics Data System (ADS)
Salpigidou, Christina; Misirlis, Dimitris; Vlahostergios, Zinon; Yakinthos, Kyros
2018-05-01
In the present work, the derivation of two surrogate models (SMs) for modelling the flow around a propeller for small aircraft is presented. Both methodologies use derived functions based on computations with the detailed propeller geometry. The computations were performed using the k-ω shear stress transport model for turbulence. In the SMs, the propeller was modelled in a computational domain of disk-like geometry, where source terms were introduced into the momentum equations. In the first SM, the source terms were polynomial functions of swirl and thrust, mainly related to the propeller radius. In the second SM, regression analysis was used to correlate the source terms with the velocity distribution through the propeller. The proposed SMs achieved faster convergence than the detailed model while also providing results closer to the available operational data. The regression-based model was the most accurate and required less computational time for convergence.
Mori, Yoshikazu; Ogawa, Kazuo; Warabi, Eiji; Yamamoto, Masahiro; Hirokawa, Takatsugu
2016-01-01
Transient receptor potential vanilloid type 1 (TRPV1) is a non-selective cation channel and a multimodal sensor protein. Since the precise structure of TRPV1 was obtained by electron cryo-microscopy, the binding mode of representative agonists such as capsaicin and resiniferatoxin (RTX) has been extensively characterized; however, detailed information on the binding mode of other vanilloids remains lacking. In this study, mutational analysis of human TRPV1 was performed, and four agonists (capsaicin, RTX, [6]-shogaol and [6]-gingerol) were used to identify amino acid residues involved in ligand binding and/or modulation of proton sensitivity. The detailed binding mode of each ligand was then simulated by computational analysis. As a result, three amino acids (L518, F591 and L670) were newly identified as being involved in ligand binding and/or modulation of proton sensitivity. In addition, in silico docking simulation and a subsequent mutational study suggested that [6]-gingerol might bind to and activate TRPV1 in a unique manner. These results provide novel insights into the binding mode of various vanilloids to the channel and will be helpful in developing a TRPV1 modulator. PMID:27606946
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strout, Michelle
Programming parallel machines is fraught with difficulties: the obfuscation of algorithms due to implementation details such as communication and synchronization, the need for transparency between language constructs and performance, the difficulty of performing program analysis to enable automatic parallelization techniques, and the existence of important "dusty deck" codes. The SAIMI project developed abstractions that enable the orthogonal specification of algorithms and implementation details within the context of existing DOE applications. The main idea is to enable the injection of small programming models such as expressions involving transcendental functions, polyhedral iteration spaces with sparse constraints, and task graphs into full programs through the use of pragmas. These smaller, more restricted programming models enable orthogonal specification of many implementation details, such as how to map the computation onto parallel processors, how to schedule the computation, and how to allocate storage for the computation. At the same time, these small programming models enable the expression of the most computationally intense and communication-heavy portions of many scientific simulations. The ability to orthogonally manipulate the implementation for such computations will significantly ease performance-programming efforts and expose transformation possibilities and parameters to automated approaches such as autotuning. At Colorado State University, the SAIMI project was supported through DOE grant DE-SC3956 from April 2010 through August 2015. The SAIMI project has contributed a number of important results to programming abstractions that enable the orthogonal specification of implementation details in scientific codes. This final report summarizes the research that was funded by the SAIMI project.
Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase
NASA Astrophysics Data System (ADS)
Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki
2013-09-01
In manned vehicles, such as the Soyuz and the Space Shuttle, the crew and the computer system cooperate to achieve a safe return to Earth. While computers increase the functionality of the system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics. In some cases, this complexity can produce a serious accident. To prevent such losses, traditional hazard analyses such as FTA have been applied during system development; however, they can be used only after a detailed system design exists, because they focus on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when doing so is most feasible. STAMP/STPA is a new hazard analysis that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made. In essence, the analysis and design decisions are intertwined and go hand-in-hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and performed safety guided design of the vehicle. As a result of this trial, it has been shown that STAMP/STPA can be accepted easily by system engineers, and the design has been made more sophisticated from a safety viewpoint. The result also shows that the consequences of human errors on system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety guided design approach with the system engineering process, based on the results of this experience.
Acoustic radiation from lined, unflanged ducts: Acoustic source distribution program
NASA Technical Reports Server (NTRS)
Beckemeyer, R. J.; Sawdy, D. T.
1971-01-01
An acoustic radiation analysis was developed to predict the far-field characteristics of fan noise radiated from an acoustically lined unflanged duct. This analysis is comprised of three modular digital computer programs which together provide a capability of accounting for the impedance mismatch at the duct exit plane. Admissible duct configurations include circular or annular, with or without an extended centerbody. This variation in duct configurations provides a capability of modeling inlet and fan duct noise radiation. The computer programs are described in detail.
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunn, B. D.; Diamond, S. C.; Bennett, G. A.
1977-10-01
A set of computer programs, called Cal-ERDA, is described that is capable of rapid and detailed analysis of energy consumption in buildings. A new user-oriented input language, named the Building Design Language (BDL), has been written to allow simplified manipulation of the many variables used to describe a building and its operation. This manual provides the user with the information necessary to understand in detail the Cal-ERDA set of computer programs. The new computer programs described include: an EXECUTIVE processor to create computer system control commands; a BDL processor to analyze input instructions, execute computer system control commands, perform assignments and data retrieval, and control the operation of the LOADS, SYSTEMS, PLANT, ECONOMICS, and REPORT programs; a LOADS analysis program that calculates peak (design) zone and hourly loads and the effect of the ambient weather conditions, the internal occupancy, lighting, and equipment within the building, as well as variations in the size, location, orientation, construction, walls, roofs, floors, fenestrations, attachments (awnings, balconies), and shape of a building; and a Heating, Ventilating, and Air-Conditioning (HVAC) SYSTEMS analysis program capable of modeling the operation of HVAC components, including fans, coils, economizers, and humidifiers, in 16 standard configurations operated according to various temperature and humidity control schedules. A PLANT equipment program models the operation of boilers, chillers, electrical generation equipment (diesel or turbines), heat storage apparatus (chilled or heated water), and solar heating and/or cooling systems. An ECONOMICS analysis program calculates life-cycle costs. A REPORT program produces tables of user-selected variables and arranges them according to user-specified formats. A set of WEATHER ANALYSIS programs manipulates, summarizes and plots weather data. Libraries of weather data, schedule data, and building data were prepared.
Analysis and assessment of STES technologies
NASA Astrophysics Data System (ADS)
Brown, D. R.; Blahnik, D. E.; Huber, H. D.
1982-12-01
Technical and economic assessments completed in FY 1982 in support of the Seasonal Thermal Energy Storage (STES) segment of the Underground Energy Storage Program included: (1) a detailed economic investigation of the cost of heat storage in aquifers, (2) documentation for AQUASTOR, a computer model for analyzing aquifer thermal energy storage (ATES) coupled with district heating or cooling, and (3) a technical and economic evaluation of several ice storage concepts. This paper summarizes the research efforts and main results of each of these three activities. In addition, a detailed economic investigation of the cost of chill storage in aquifers is currently in progress. The work parallels that done for ATES heat storage with technical and economic assumptions being varied in a parametric analysis of the cost of ATES delivered chill. The computer model AQUASTOR is the principal analytical tool being employed.
A Primer on Architectural Level Fault Tolerance
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
2008-01-01
This paper introduces the fundamental concepts of fault tolerant computing. Key topics covered are voting, fault detection, clock synchronization, Byzantine Agreement, diagnosis, and reliability analysis. Low level mechanisms such as Hamming codes or low level communications protocols are not covered. The paper is tutorial in nature and does not cover any topic in detail. The focus is on rationale and approach rather than detailed exposition.
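As a concrete instance of the voting concept the paper covers (my sketch, not the paper's code), a bit-wise two-of-three majority voter for triple modular redundancy can be written as:

```python
# TMR majority voter: with three redundant channels, a single faulty
# channel is masked because each output bit follows the two good channels.
def majority3(a: int, b: int, c: int) -> int:
    """Bit-wise 2-of-3 majority of three redundant channel outputs."""
    return (a & b) | (a & c) | (b & c)

good, faulty = 0b1011, 0b0011        # one channel disagrees in one bit
assert majority3(good, good, faulty) == good
print(bin(majority3(good, good, faulty)))
```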
NASA Astrophysics Data System (ADS)
Borah, Mukunda Madhab; Devi, Th. Gomti
2018-06-01
The vibrational spectral analysis of Serotonin and its dimer was carried out using Fourier Transform Infrared (FTIR) and Raman techniques. The equilibrium geometrical parameters, harmonic vibrational wavenumbers, Frontier orbitals, Mulliken atomic charges, Natural Bond Orbitals, first-order hyperpolarizability and some optimized energy parameters were computed by density functional theory with the 6-31G(d,p) basis set. The detailed analysis of the vibrational spectra has been carried out by computing the Potential Energy Distribution (PED, %) with the help of the Vibrational Energy Distribution Analysis (VEDA) program. The second-order delocalization energies E(2) confirm the occurrence of intramolecular charge transfer (ICT) within the molecule. The computed wavenumbers of the Serotonin monomer and dimer were found to be in good agreement with the experimental Raman and IR values.
NASA Technical Reports Server (NTRS)
Cothran, E. K.
1982-01-01
The computer program written in support of a one-dimensional analytical approach to thermal modeling of Bridgman-type crystal growth is presented. The program listing and flow charts are included, along with the complete thermal model. Sample problems include detailed comments on input and output to aid the first-time user.
NASA Technical Reports Server (NTRS)
John, Bonnie E.; Remington, Roger W.; Steier, David M.
1991-01-01
Before all systems are go just prior to the launch of a space shuttle, thousands of operations and tests have been performed to ensure that all shuttle and support subsystems are operational and ready for launch. These steps, which range from activating the orbiter's flight computers to removing the launch pad from the itinerary of the NASA tour buses, are carried out by launch team members at various locations and with highly specialized fields of expertise. The responsibility for coordinating these diverse activities rests with the NASA Test Director (NTD) at NASA-Kennedy. The behavior of the NTD is being studied with the goal of building a detailed computational model of that behavior; the results of that analysis to date are given. The NTD's performance is described in detail, as a team member who must coordinate a complex task through efficient audio communication, as well as an individual taking notes and consulting manuals. A model of the routine cognitive skill used by the NTD to follow the launch countdown procedure manual was implemented using the Soar cognitive architecture. Several examples are given of how such a model could aid in evaluating proposed computer support systems.
Integrated approach for stress analysis of high performance diesel engine cylinder head
NASA Astrophysics Data System (ADS)
Chainov, N. D.; Myagkov, L. L.; Malastowski, N. S.; Blinov, A. S.
2018-03-01
Growing thermal and mechanical loads arising from the development of engines with high mean effective pressure set the requirements for cylinder head durability. In this paper, computational schemes for thermal and mechanical stress analysis of a high-performance diesel engine cylinder head are described. The most important aspects of this approach are accounting for the temperature fields of mating parts (valves and valve seats), modeling heat transfer in the cooling jacket of the cylinder head, and topology optimization of the component's load-bearing structure. Simulation results are shown and analyzed.
Materials requirements for optical processing and computing devices
NASA Technical Reports Server (NTRS)
Tanguay, A. R., Jr.
1985-01-01
Devices for optical processing and computing systems are discussed, with emphasis on the materials requirements imposed by functional constraints. Generalized optical processing and computing systems are described in order to identify principal categories of requisite components for complete system implementation. Three principal device categories are selected for analysis in some detail: spatial light modulators, volume holographic optical elements, and bistable optical devices. The implications for optical processing and computing systems of the materials requirements identified for these device categories are described, and directions for future research are proposed.
A new paradigm for atomically detailed simulations of kinetics in biophysical systems.
Elber, Ron
2017-01-01
The kinetics of biochemical and biophysical events determine the course of life processes and have attracted considerable interest and research. For example, modeling of biological networks and cellular responses relies on the availability of information on rate coefficients. Atomically detailed simulations hold the promise of supplementing experimental data to obtain a more complete kinetic picture. However, simulations at biological time scales are challenging. Typical computer resources are insufficient to provide the ensemble of trajectories, at the required length, needed for straightforward calculations of time scales. In recent years, new technologies have emerged that make atomically detailed simulations of rate coefficients possible. Instead of computing complete trajectories from reactants to products, these approaches launch a large number of short trajectories at different positions. Since the trajectories are short, they are computed trivially in parallel on modern computer architectures. The starting and termination positions of the short trajectories are chosen, following statistical mechanics theory, to enhance efficiency. These trajectories are then analyzed, and the analysis produces accurate estimates of time scales as long as hours. The theory of Milestoning, which exploits the use of short trajectories, is discussed, and several applications are described.
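A minimal sketch of how short-trajectory statistics can yield long time scales, in the spirit of Milestoning: treat milestones as states of an absorbing semi-Markov chain, with transition probabilities and mean lifetimes estimated from the short trajectories. The numbers below are hypothetical.

```python
# Mean first-passage time (MFPT) from milestone statistics via standard
# absorbing-Markov-chain algebra: expected visits N = (I - Q)^-1, and the
# MFPT is the visit-weighted sum of mean lifetimes.
import numpy as np

# K[i, j]: probability a short trajectory started at milestone i hits j next.
K = np.array([[0.0, 0.8, 0.2, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.3, 0.0, 0.7],
              [0.0, 0.0, 0.0, 1.0]])     # milestone 3 = product (absorbing)
t = np.array([2.0e-9, 1.5e-9, 3.0e-9])  # mean lifetimes of transient states (s)

Q = K[:3, :3]                            # transient-to-transient block
N = np.linalg.inv(np.eye(3) - Q)         # expected visits to each milestone
mfpt_from_0 = N[0] @ t                   # mean first-passage time to product
print(f"MFPT reactant -> product: {mfpt_from_0:.3e} s")
```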
GPU-accelerated FDTD modeling of radio-frequency field-tissue interactions in high-field MRI.
Chi, Jieru; Liu, Feng; Weber, Ewald; Li, Yu; Crozier, Stuart
2011-06-01
The analysis of high-field RF field-tissue interactions requires high-performance finite-difference time-domain (FDTD) computing. Conventional CPU-based FDTD calculations offer limited computing performance in a PC environment. This study presents a graphics processing unit (GPU)-based parallel-computing framework, producing substantially boosted computing efficiency (a two-order-of-magnitude speedup) at a PC-level cost. Specific details of implementing the FDTD method on a GPU architecture are presented, and the new computational strategy has been successfully applied to the design of a novel 8-element transceive RF coil system at 9.4 T. Facilitated by the powerful GPU-FDTD computing, the new RF coil array offers optimized fields (averaging 25% improvement in sensitivity and 20% reduction in loop coupling compared with conventional array structures of the same size) for small-animal imaging with a robust RF configuration. The GPU-enabled acceleration paves the way for FDTD to be applied for both detailed forward modeling and inverse design of MRI coils, which were previously impractical.
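The FDTD method's suitability for GPUs comes from its locality: every cell performs the same nearest-neighbor update each time step, so cells map naturally to one thread each. A minimal one-dimensional sketch (illustrative only; the paper's solver is three-dimensional, tissue-loaded, and GPU-accelerated) is:

```python
# 1-D FDTD on a staggered Yee grid: alternate local curl updates of E and H.
import numpy as np

nx, nt = 200, 400
ez = np.zeros(nx)            # electric field
hy = np.zeros(nx - 1)        # magnetic field (staggered half-cells)
c = 0.5                      # Courant number (<= 1 for stability)

for n in range(nt):
    hy += c * np.diff(ez)                          # update H from curl of E
    ez[1:-1] += c * np.diff(hy)                    # update E from curl of H
    ez[nx // 4] += np.exp(-((n - 30) / 10) ** 2)   # soft Gaussian source

print("peak |Ez| =", np.abs(ez).max())
```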
Dashboard Task Monitor for Managing ATLAS User Analysis on the Grid
NASA Astrophysics Data System (ADS)
Sargsyan, L.; Andreeva, J.; Jha, M.; Karavakis, E.; Kokoszkiewicz, L.; Saiz, P.; Schovancova, J.; Tuckett, D.; Atlas Collaboration
2014-06-01
The organization of the distributed user analysis on the Worldwide LHC Computing Grid (WLCG) infrastructure is one of the most challenging tasks among the computing activities at the Large Hadron Collider. The Experiment Dashboard offers a solution that not only monitors but also manages (kill, resubmit) user tasks and jobs via a web interface. The ATLAS Dashboard Task Monitor provides analysis users with a tool that is independent of the operating system and Grid environment. This contribution describes the functionality of the application and its implementation details, in particular authentication, authorization and audit of the management operations.
Pusic, Martin V.; LeBlanc, Vicki; Patel, Vimla L.
2001-01-01
Traditional task analysis for instructional design has emphasized the importance of precisely defining behavioral educational objectives and working back to select objective-appropriate instructional strategies. However, this approach may miss effective strategies. Cognitive task analysis, on the other hand, breaks a process down into its component knowledge representations. Selection of instructional strategies based on all such representations in a domain is likely to lead to optimal instructional design. In this demonstration, using the interpretation of cervical spine x-rays as an educational example, we show how a detailed cognitive task analysis can guide the development of computer-aided instruction.
NASA Astrophysics Data System (ADS)
Sureshkumar, B.; Mary, Y. Sheena; Resmi, K. S.; Panicker, C. Yohannan; Armaković, Stevan; Armaković, Sanja J.; Van Alsenoy, C.; Narayana, B.; Suma, S.
2018-03-01
Two 8-hydroxyquinoline derivatives, 5,7-dichloro-8-hydroxyquinoline (57DC8HQ) and 5-chloro-7-iodo-8-hydroxyquinoline (5CL7I8HQ), have been investigated in detail by means of spectroscopic characterization and computational molecular modelling techniques. FT-IR and FT-Raman experimental spectroscopic approaches were used to obtain detailed spectroscopic signatures of the title compounds, while DFT calculations were used to visualize and assign vibrations. The computed values of dipole moment, polarizability and hyperpolarizability indicate that the title molecules exhibit NLO properties. The evaluated HOMO and LUMO energies demonstrate the chemical stability of the molecules. NBO analysis was performed to study the stability of the molecules arising from hyperconjugative interactions and charge delocalization. DFT calculations were also used jointly with MD simulations to investigate in detail the global and local reactivity properties of the title compounds. Molecular docking was also used to investigate the affinity of the title compounds as decarboxylase inhibitors, suggesting that quinoline derivatives may serve as lead compounds for developing new antiparkinsonian drugs.
Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code
NASA Technical Reports Server (NTRS)
Gaugler, Raymond E.; Lee, Chi-Miag (Technical Monitor)
2001-01-01
For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this paper, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery for space launch vehicle propulsion systems.
Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code
NASA Technical Reports Server (NTRS)
Gaugler, Raymond E.
2002-01-01
For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer has been evolving at the NASA Glenn Research Center. The code is unique in the ability to give a highly detailed representation of the flow field very close to solid surfaces in order to get accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this presentation, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery.
Glenn-HT: The NASA Glenn Research Center General Multi-Block Navier-Stokes Heat Transfer Code
NASA Technical Reports Server (NTRS)
Gaugler, Raymond E.
2002-01-01
For the last several years, Glenn-HT, a three-dimensional (3D) Computational Fluid Dynamics (CFD) computer code for the analysis of gas turbine flow and convective heat transfer, has been evolving at the NASA Glenn Research Center. The code is unique in its ability to give a highly detailed representation of the flow field very close to solid surfaces in order to obtain an accurate representation of fluid heat transfer and viscous shear stresses. The code has been validated and used extensively for both internal cooling passage flow and for hot gas path flows, including detailed film cooling calculations and complex tip clearance gap flow and heat transfer. In its current form, the code has a multiblock grid capability and has been validated for a number of turbine configurations. It has been developed and used primarily as a research tool, but it can be useful for detailed design analysis. In this presentation, the code is described and examples of its validation and use for complex flow calculations are presented, emphasizing the applicability to turbomachinery.
Computer and photogrammetric general land use study of central north Alabama
NASA Technical Reports Server (NTRS)
Jayroe, R. R.; Larsen, P. A.; Campbell, C. W.
1974-01-01
The objective of this report is to acquaint potential users with two computer programs developed at the NASA Marshall Space Flight Center. They were used in producing a land use survey and maps of central north Alabama from Earth Resources Technology Satellite (ERTS) digital data. The report describes in detail the thought processes and analysis procedures used from the initiation of the land use study to its completion, as well as a photogrammetric study that was used in conjunction with the computer analysis to produce similar land use maps. The results of the land use demonstration indicate that, with respect to computer time and cost, such a study may be economically and realistically feasible on a statewide basis.
NASA Technical Reports Server (NTRS)
Nguyen, H. L.; Ying, S.-J.
1990-01-01
Jet-A spray combustion has been evaluated in gas turbine combustion with the use of propane chemical kinetics as a first approximation for the chemical reactions. Here, the numerical solutions are obtained by using the KIVA-2 computer code. The KIVA-2 code is the most developed of the available multidimensional combustion computer programs for application to the in-cylinder combustion dynamics of internal combustion engines. The released version of KIVA-2 assumes that 12 chemical species are present; the code uses an Arrhenius kinetic-controlled combustion model governed by a four-step global chemical reaction and six equilibrium reactions. The researchers' efforts involve the addition of Jet-A thermophysical properties and the implementation of detailed reaction mechanisms for propane oxidation. Three different detailed reaction mechanism models are considered. The first model consists of 131 reactions and 45 species. This is considered the full mechanism, which was developed through the study of the chemical kinetics of propane combustion in an enclosed chamber. The full mechanism is evaluated by comparing calculated ignition delay times with available shock tube data. However, these detailed reactions occupy too much computer memory and CPU time for the computation; the full mechanism therefore serves only as a benchmark case by which to evaluate other simplified models. Two possible simplified models were tested in the existing KIVA-2 code for the same conditions as used with the full mechanism. One model is obtained through a sensitivity analysis using LSENS, the general kinetics and sensitivity analysis program of D. A. Bittker and K. Radhakrishnan. This model consists of 45 chemical reactions and 27 species. The other model is based on the work published by C. K. Westbrook and F. L. Dryer.
Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L
2008-01-15
The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
Control of Technology Transfer at JPL
NASA Technical Reports Server (NTRS)
Oliver, Ronald
2006-01-01
Controlled Technology: 1) Design: preliminary or critical design data, schematics, technical flow charts, SNV code/diagnostics, logic flow diagrams, wirelists, ICDs, detailed specifications or requirements. 2) Development: constraints, computations, configurations, technical analyses, acceptance criteria, anomaly resolution, detailed test plans, detailed technical proposals. 3) Production: process or how-to: assemble, operate, repair, maintain, modify. 4) Manufacturing: technical instructions, specific parts, specific materials, specific qualities, specific processes, specific flow. 5) Operations: how to operate, contingency or standard operating plans, Ops handbooks. 6) Repair: repair instructions, troubleshooting schemes, detailed schematics. 7) Test: specific procedures, data, analysis, detailed test and retest plans, detailed anomaly resolutions, detailed failure causes and corrective actions, troubleshooting, trended test data, flight readiness data. 8) Maintenance: maintenance schedules and plans, methods for regular upkeep, overhaul instructions. 9) Modification: modification instructions, upgrade kit parts, including software.
1989-10-01
REVIEW MENU PROGRAM(S) CHAPS PURPOSE AND OVERVIEW: The Do Review menu allows the user to select which missions to perform detailed analysis on and...input files must be resident on the computer you are running SUPR on. Any interface or file transfer programs must be successfully executed prior to... The computer program was developed by Systems Control Technology for the Deputy Chief of Staff/Operations, HQ USAFE. The use of the computer program is
[Computer simulation by passenger wound analysis of vehicle collision].
Zou, Dong-Hua; Liu, Ning-Guo; Shen, Jie; Zhang, Xiao-Yun; Jin, Xian-Long; Chen, Yi-Jiu
2006-08-15
The aim was to reconstruct the course of a vehicle collision in order to provide a reference for forensic identification and the disposal of traffic accidents. By analyzing evidence left both on the passengers and on the vehicle, a momentum-impulse technique combined with multi-body dynamics was applied to simulate the motion and injury of the passengers as well as the track of the vehicle. The computer simulation model closely reconstructed the phases of the traffic collision, which coincided with details found by the forensic investigation. Computer simulation is helpful and feasible for forensic identification in traffic accidents.
Tomographic assessment of the spine in children with spondylocostal dysostosis syndrome.
Kaissi, Ali Al; Klaushofer, Klaus; Grill, Franz
2010-01-01
The aim of this study was to perform a detailed tomographic analysis of the skull base, craniocervical junction, and the entire spine in seven patients with spondylocostal dysostosis syndrome. Detailed scanning images were organized in accordance with the most prominent clinical pathology. The reasons behind plagiocephaly, torticollis, short immobile neck, scoliosis, and rigid back were detected. Radiographic documentation was an insufficient modality. Detailed computed tomography scans provided excellent delineation of the osseous abnormality pattern in our patients. This article throws light on the most serious osseous manifestations of spondylocostal dysostosis syndrome.
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
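To make the metamodeling idea concrete, here is a minimal response-surface sketch: a quadratic polynomial is fit to a few runs of an expensive analysis code and then queried cheaply thereafter. The expensive_analysis function and the design points are invented stand-ins, not from the paper.

```python
import numpy as np

def expensive_analysis(x):
    # Stand-in for a detailed (expensive) computer analysis code.
    return np.sin(3.0 * x) + 0.5 * x**2

x_doe = np.linspace(0.0, 2.0, 6)                  # design-of-experiments points
y_doe = np.array([expensive_analysis(x) for x in x_doe])

surrogate = np.poly1d(np.polyfit(x_doe, y_doe, deg=2))   # quadratic metamodel

# The surrogate is orders of magnitude cheaper to evaluate than the original
# code and can be handed to an optimizer or used to bridge analysis domains.
print(surrogate(1.3), expensive_analysis(1.3))
```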
NASA Technical Reports Server (NTRS)
Stack, S. H.
1981-01-01
A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.
Host computer software specifications for a zero-g payload manhandling simulator
NASA Technical Reports Server (NTRS)
Wilson, S. W.
1986-01-01
The HP PASCAL source code was developed for the Mission Planning and Analysis Division (MPAD) of NASA/JSC and takes the place of detailed flow charts defining the host computer software specifications for MANHANDLE, a digital/graphical simulator that can be used to analyze the dynamics of on-orbit (zero-g) payload manhandling operations. Input and output data for representative test cases are included.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis
Steele, Joe; Bastola, Dhundy
2014-01-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
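As a concrete illustration of the word-analysis idea, the sketch below builds normalized k-mer frequency profiles for two sequences and compares them with cosine similarity; the sequences, the choice k = 3, and the similarity measure are illustrative, not any specific method from the review.

```python
import math
from collections import Counter

def kmer_profile(seq, k=3):
    # Normalized word (k-mer) frequency profile of a sequence.
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def cosine(p, q):
    # Cosine similarity between two sparse frequency profiles.
    dot = sum(p[w] * q.get(w, 0.0) for w in p)
    norm_p = math.sqrt(sum(v * v for v in p.values()))
    norm_q = math.sqrt(sum(v * v for v in q.values()))
    return dot / (norm_p * norm_q)

print(cosine(kmer_profile("ACGTACGTGACG"), kmer_profile("ACGTTCGTGACG")))
```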
On Target Localization Using Combined RSS and AoA Measurements
Beko, Marko; Dinis, Rui
2018-01-01
This work reviews existing solutions to the problem of target localization in wireless sensor networks (WSNs) utilizing integrated measurements, namely received signal strength (RSS) and angle of arrival (AoA). The problem of RSS/AoA-based target localization has recently become very popular in the research community, owing to its great applicability potential and relatively low implementation cost. A comprehensive study of the state-of-the-art (SoA) solutions and their detailed analysis is therefore presented here. The work begins with the SoA approaches based on convex relaxation techniques (generally more computationally complex) and proceeds to less computationally complex approaches, such as those based on the generalized trust region sub-problems framework and linear least squares. A detailed analysis of the computational complexity of each solution is then reviewed, and an extensive set of simulation results is presented. Finally, the main conclusions are summarized, and a set of aspects and trends that might be interesting for future research in this area is identified. PMID:29671832
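For flavor, a minimal sketch of one of the less complex families mentioned above: a linear least-squares position fix from AoA bearings alone (RSS-derived ranges could supply weights). The anchor layout, noise level, and 2-D setup are invented for the demo.

```python
import numpy as np

np.random.seed(0)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # known anchor positions
target = np.array([4.0, 6.0])                               # unknown in practice

d = target - anchors
phi = np.arctan2(d[:, 1], d[:, 0])            # true bearings, for the demo
phi += 0.01 * np.random.randn(len(anchors))   # AoA measurement noise

# Each bearing phi_i defines the line [-sin(phi_i), cos(phi_i)] . (x - a_i) = 0;
# stacking the lines gives an overdetermined linear system A x = b.
A = np.column_stack([-np.sin(phi), np.cos(phi)])
b = np.sum(A * anchors, axis=1)
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)   # least-squares estimate, close to (4, 6)
```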
High-Dimensional Exploratory Item Factor Analysis by a Metropolis-Hastings Robbins-Monro Algorithm
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…
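The Robbins-Monro half of the algorithm is easy to sketch in isolation. Below, a generic stochastic-approximation iteration drives noisy evaluations of g(x) = 2 - x to its root; in MH-RM the noisy quantities would instead come from Metropolis-Hastings draws of the latent traits. Everything here is a toy illustration, not the paper's implementation.

```python
import random

random.seed(1)

def robbins_monro(x0=0.0, iters=5000):
    # Iterate x_{k+1} = x_k + gamma_k * Y_k with gain gamma_k = 1/k, where Y_k
    # is a noisy evaluation of g(x) = 2 - x; the iterates converge to the
    # root x* = 2 with probability one under the usual step-size conditions.
    x = x0
    for k in range(1, iters + 1):
        y = (2.0 - x) + random.gauss(0.0, 1.0)
        x += y / k
    return x

print(robbins_monro())   # approximately 2
```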
Evaluating Internal Communication: The ICA Communication Audit.
ERIC Educational Resources Information Center
Goldhaber, Gerald M.
1978-01-01
The ICA Communication Audit is described in detail as an effective measurement procedure that can help an academic institution to evaluate its internal communication system. Tools, computer programs, analysis, and feedback procedures are described and illustrated. (JMF)
Hydrodynamic design of generic pump components
NASA Technical Reports Server (NTRS)
Eastland, A. H. J.; Dodson, H. C.
1991-01-01
Inducer and impeller base geometries were defined for a fuel pump for a generic generator cycle. Blade surface data and inlet flowfield definition are available in sufficient detail to allow computational fluid dynamic analysis of the two components.
Computational analysis of Variable Thrust Engine (VTE) performance
NASA Technical Reports Server (NTRS)
Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.
1993-01-01
The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.
Fulop, Sean A; Fitz, Kelly
2006-01-01
A modification of the spectrogram (log magnitude of the short-time Fourier transform) to more accurately show the instantaneous frequencies of signal components was first proposed in 1976 [Kodera et al., Phys. Earth Planet. Inter. 12, 142-150 (1976)], and has been considered or reinvented a few times since but never widely adopted. This paper presents a unified theoretical picture of this time-frequency analysis method, the time-corrected instantaneous frequency spectrogram, together with detailed implementable algorithms comparing three published techniques for its computation. The new representation is evaluated against the conventional spectrogram for its superior ability to track signal components. The lack of a uniform framework for either mathematics or implementation details which has characterized the disparate literature on the schemes has been remedied here. Fruitful application of the method is shown in the realms of speech phonation analysis, whale song pitch tracking, and additive sound modeling.
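A simplified sketch of the idea, assuming a cross-spectral phase-difference estimate of each bin's channelized instantaneous frequency (one of several published routes to the representation): compute an STFT and a one-sample-shifted STFT, and convert the per-bin phase advance into a corrected frequency at which to display the magnitude. The window, sizes, and test tone are arbitrary demo choices.

```python
import numpy as np

def tcif(x, fs, nfft=1024, hop=256):
    # STFT and a one-sample-shifted STFT; the per-bin phase advance between
    # them gives a channelized instantaneous frequency estimate in Hz.
    win = np.hanning(nfft)
    starts = range(0, len(x) - nfft - 1, hop)
    S  = np.array([np.fft.rfft(win * x[i:i + nfft])         for i in starts]).T
    S1 = np.array([np.fft.rfft(win * x[i + 1:i + 1 + nfft]) for i in starts]).T
    cif = np.angle(S1 * np.conj(S)) * fs / (2.0 * np.pi)
    return np.abs(S), cif   # plot |S| at frequencies cif instead of bin centers

fs = 8000.0
t = np.arange(int(fs)) / fs                       # one second of signal
mag, cif = tcif(np.sin(2 * np.pi * 440.0 * t), fs)
frame = 5
print(cif[mag[:, frame].argmax(), frame])         # close to 440.0 Hz
```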
Algorithms in nature: the convergence of systems biology and computational thinking
Navlakha, Saket; Bar-Joseph, Ziv
2011-01-01
Computer science and biology have enjoyed a long and fruitful relationship for decades. Biologists rely on computational methods to analyze and integrate large data sets, while several computational methods were inspired by the high-level design principles of biological systems. Recently, these two directions have been converging. In this review, we argue that thinking computationally about biological processes may lead to more accurate models, which in turn can be used to improve the design of algorithms. We discuss the similar mechanisms and requirements shared by computational and biological processes and then present several recent studies that apply this joint analysis strategy to problems related to coordination, network analysis, and tracking and vision. We also discuss additional biological processes that can be studied in a similar manner and link them to potential computational problems. With the rapid accumulation of data detailing the inner workings of biological systems, we expect this direction of coupling biological and computational studies to greatly expand in the future. PMID:22068329
Solar heating and cooling technical data and systems analysis
NASA Technical Reports Server (NTRS)
Christensen, D. L.
1976-01-01
The accomplishments of a project to study solar heating and air conditioning are outlined. Presentation materials (data packages, slides, charts, and visual aids) were developed. Bibliographies and source materials on materials and coatings, solar water heaters, systems analysis computer models, solar collectors and solar projects were developed. Detailed MIRADS computer formats for primary data parameters were developed and updated. The following data were included: climatic, architectural, topography, heating and cooling equipment, thermal loads, and economics. Data sources in each of these areas were identified as well as solar radiation data stations and instruments.
Automated Parameter Studies Using a Cartesian Method
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Aftosimis, Michael J.; Nemec, Marian
2004-01-01
Computational Fluid Dynamics (CFD) is now routinely used to analyze isolated points in a design space by performing steady-state computations at fixed flight conditions (Mach number, angle of attack, sideslip) for a fixed geometric configuration of interest. This "point analysis" provides detailed information about the flowfield, which aids an engineer in understanding, or correcting, a design. A point analysis is typically performed using high fidelity methods at a handful of critical design points, e.g. a cruise or landing configuration, or a sample of points along a flight trajectory.
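Automating such point analyses over a grid of flight conditions is largely bookkeeping, as the sketch below illustrates; run_case is a hypothetical stand-in for launching the flow solver on the fixed geometry, and the coefficients it returns are placeholders.

```python
import csv
import itertools

def run_case(mach, alpha, beta):
    # Placeholder for a steady-state CFD solve at one flight condition.
    return {"CL": 0.1 * alpha, "CD": 0.02 + 0.001 * alpha**2}

machs, alphas, betas = [0.6, 0.8], [0.0, 2.0, 4.0], [0.0]
with open("sweep.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["mach", "alpha", "beta", "CL", "CD"])
    for m, a, b in itertools.product(machs, alphas, betas):
        r = run_case(m, a, b)
        w.writerow([m, a, b, r["CL"], r["CD"]])
```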
Aeroservoelastic and Flight Dynamics Analysis Using Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Arena, Andrew S., Jr.
1999-01-01
This document is based in large part on the Master's Thesis of Cole Stephens. It encompasses a variety of technical and practical issues that arise when using the STARS codes for aeroservoelastic analysis of vehicles. The document covers in great detail a number of technical issues and the step-by-step procedures involved in the simulation of a system in which aerodynamics, structures, and controls are tightly coupled. Comparisons are made to a benchmark experimental program conducted at NASA Langley. One significant advantage of the methodology detailed is that, as a result of the technique used to accelerate the CFD-based simulation, a systems model is produced which is very useful for developing the control law strategy and for subsequent high-speed simulations.
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1974-01-01
Forestry, geology, and water resource applications were the focus of this study, which involved the use of computer-implemented pattern-recognition techniques to analyze ERTS-1 data. The results have proven the value of computer-aided analysis techniques, even in areas of mountainous terrain. Several analysis capabilities were developed during these ERTS-1 investigations. A procedure to rotate, deskew, and geometrically scale the MSS data results in 1:24,000 scale printouts that can be directly overlaid on 7 1/2-minute U.S.G.S. topographic maps. Several scales of computer-enhanced "false color-infrared" composites of MSS data can be obtained from a digital display unit and emphasize the tremendous detail present in the ERTS-1 data. A grid can also be superimposed on the displayed data to aid in specifying areas of interest.
Garsson, B
1988-01-01
Remember that computer software is designed for accrual accounting, whereas your business operates and reports income on a cash basis. The rules of tax law stipulate that professional practices may use the cash method of accounting, but if accrual accounting is ever used to report taxable income the government may not permit a switch back to cash accounting. Therefore, always consider the computer as a bookkeeper, not a substitute for a qualified accountant. (Your accountant will have readily accessible payroll and general ledger data available for analysis and tax reports, thanks to the magic of computer processing.) Accounts Payable reports are interfaced with the general ledger and are of interest for transaction detail, open invoice and cash flow analysis, and for a record of payments by vendor. Payroll reports, including check register and withholding detail are provided and interfaced with the general ledger. The use of accounting software expands the use of in-office computers to areas beyond professional billing and insurance form generation. It simplifies payroll recordkeeping; maintains payables details; integrates payables, receivables, and payroll with general ledger files; provides instantaneous information on all aspects of the business office; and creates a continuous "audit-trail" following the entering of data. The availability of packaged accounting software allows the professional business office an array of choices. The person(s) responsible for bookkeeping and accounting should choose carefully, ensuring that any system is easy to use, has been thoroughly tested, and provides at least as much control over office records as has been outlined in this article.
NASA Astrophysics Data System (ADS)
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Fazliev, Alexander
2017-04-01
A description and the first results of the Russian Science Foundation project "Virtual computational information environment for analysis, evaluation and prediction of the impacts of global climate change on the environment and climate of a selected region" are presented. The project is aimed at developing an Internet-accessible computation and information environment that provides specialists unskilled in numerical modelling and software design, decision-makers, and stakeholders with reliable and easy-to-use tools for in-depth statistical analysis of climatic characteristics, and with instruments for detailed analysis, assessment and prediction of impacts of global climate change on the environment and climate of the targeted region. In the framework of the project, approaches for "cloud" processing and analysis of large geospatial datasets will be developed on the technical platform of the leading Russian institution involved in research on climate change and its consequences. The anticipated results will create a pathway for the development and deployment of a thematic international virtual research laboratory focused on interdisciplinary environmental studies. The VRE under development will comprise the best features and functionality of the earlier developed information and computing system CLIMATE (http://climate.scert.ru/), which is widely used in Northern Eurasia environment studies. The project includes several major directions of research: 1. Preparation of geo-referenced data sets describing the dynamics of current and possible future climate and environmental changes in detail. 2. Improvement of methods for the analysis of climate change. 3. Enhancement of the functionality of the VRE prototype in order to create a convenient and reliable tool for the study of regional social, economic and political consequences of climate change. 4. Using the output of the first three tasks, compilation of the VRE prototype, its validation, preparation of an applicable detailed description of climate change in Western Siberia, and dissemination of the project results. Results of the first stage of the project implementation are presented. This work is supported by the Russian Science Foundation grant No. 16-19-10257.
The multi-disciplinary design study: A life cycle cost algorithm
NASA Technical Reports Server (NTRS)
Harding, R. R.; Pichi, F. J.
1988-01-01
The approach and results of a Life Cycle Cost (LCC) analysis of the Space Station Solar Dynamic Power Subsystem (SDPS) including gimbal pointing and power output performance are documented. The Multi-Discipline Design Tool (MDDT) computer program developed during the 1986 study has been modified to include the design, performance, and cost algorithms for the SDPS as described. As with the Space Station structural and control subsystems, the LCC of the SDPS can be computed within the MDDT program as a function of the engineering design variables. Two simple examples of MDDT's capability to evaluate cost sensitivity and design based on LCC are included. MDDT was designed to accept NASA's IMAT computer program data as input so that IMAT's detailed structural and controls design capability can be assessed with expected system LCC as computed by MDDT. No changes to IMAT were required. Detailed knowledge of IMAT is not required to perform the LCC analyses as the interface with IMAT is noninteractive.
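In the same spirit, a hedged sketch of a life-cycle-cost roll-up expressed as a function of engineering design variables; the cost categories follow the generic LCC pattern (development, production, launch, discounted operations), and every coefficient below is invented rather than taken from the MDDT study.

```python
def lcc(mass_kg, power_kw, years, rate=0.07):
    # Invented cost model: development scales with power, production and
    # launch with mass, and annual operations are discounted to present value.
    ddte = 2.0e6 * power_kw
    production = 5.0e4 * mass_kg
    launch = 2.0e4 * mass_kg
    ops_pv = sum(1.5e6 / (1.0 + rate) ** t for t in range(1, years + 1))
    return ddte + production + launch + ops_pv

print(f"LCC: ${lcc(mass_kg=3000, power_kw=75, years=15):,.0f}")
```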
Adjoint-Based Sensitivity and Uncertainty Analysis for Density and Composition: A User’s Guide
Favorite, Jeffrey A.; Perko, Zoltan; Kiedrowski, Brian C.; ...
2017-03-01
The ability to perform sensitivity analyses using adjoint-based first-order sensitivity theory has existed for decades. This paper provides guidance on how adjoint sensitivity methods can be used to predict the effect of material density and composition uncertainties in critical experiments, including when these uncertain parameters are correlated or constrained. Two widely used Monte Carlo codes, MCNP6 (Ref. 2) and SCALE 6.2 (Ref. 3), are both capable of computing isotopic density sensitivities in continuous energy and angle. Additionally, Perkó et al. have shown how individual isotope density sensitivities, easily computed using adjoint methods, can be combined to compute constrained first-order sensitivities that may be used in the uncertainty analysis. This paper provides details on how the codes are used to compute first-order sensitivities and how the sensitivities are used in an uncertainty analysis. Constrained first-order sensitivities are computed in a simple example problem.
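A minimal sketch of the first-order propagation such sensitivities feed, using the standard sandwich rule var(k) = s^T C s, plus one simple way to respect a sum-to-zero density constraint by projecting the sensitivity vector onto the constraint plane. The numbers are invented, and this is not the MCNP6/SCALE workflow itself.

```python
import numpy as np

def response_variance(s, C):
    # First-order "sandwich" rule: var(k) = s^T C s.
    s = np.asarray(s, dtype=float)
    return float(s @ C @ s)

def constrain_sum_zero(s):
    # Project the sensitivity vector onto the plane of perturbations that
    # sum to zero (one simple normalization-constraint treatment).
    s = np.asarray(s, dtype=float)
    return s - s.mean()

s = np.array([0.12, -0.05, 0.30])        # dk/d(rho_i), e.g. from an adjoint run
C = np.diag([0.02, 0.02, 0.01]) ** 2     # invented density covariance
print(response_variance(constrain_sum_zero(s), C))
```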
Computational analysis of forebody tangential slot blowing
NASA Technical Reports Server (NTRS)
Gee, Ken; Agosta-Greenman, Roxana M.; Rizk, Yehia M.; Schiff, Lewis B.; Cummings, Russell M.
1994-01-01
An overview of the computational effort to analyze forebody tangential slot blowing is presented. Tangential slot blowing generates side force and yawing moment, which may be used to control an aircraft flying at high angles of attack. Two different geometries are used in the analysis: (1) the High Alpha Research Vehicle; and (2) a generic chined forebody. Computations using the isolated F/A-18 forebody are obtained at full-scale wind tunnel test conditions for direct comparison with available experimental data. The effects of over- and under-blowing on force and moment production are analyzed. Time-accurate solutions using the isolated forebody are obtained to study the force onset time lag of tangential slot blowing. Computations using the generic chined forebody are obtained at experimental wind tunnel conditions, and the results are compared with available experimental data. This computational analysis complements the experimental results and provides a detailed understanding of the effects of tangential slot blowing on the flow field about simple and complex geometries.
Prototype for Meta-Algorithmic, Content-Aware Image Analysis
2015-03-01
Final technical report, University of Virginia, March 2015 (contract FA8750-12-C-0181). The approaches were studied in detail and their results on a sample dataset are presented. Subject terms: image analysis, computer vision, content
Gottschalk, Louis A; DeFrancisco, Don; Bechtel, Robert J
2002-08-01
The aim of this study was to test the validity of a computer software program previously demonstrated to be capable of making DSM-IV neuropsychiatric diagnoses from the content analysis of speech or verbal texts. In this report, the computer program was applied to three personal writings of Napoleon Bonaparte when he was 12 to 16 years of age. The accuracy of the neuropsychiatric evaluations derived from the computerized content analysis of these writings of Napoleon was independently corroborated by two biographers who have described pertinent details concerning his life situations, moods, and other emotional reactions during this adolescent period of his life. The relevance of this type of computer technology to psychohistorical research and clinical psychiatry is suggested.
Multiphysics Analysis of a Solid-Core Nuclear Thermal Engine Thrust Chamber
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Canabal, Francisco; Cheng, Gary; Chen, Yen-Sen
2006-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics methodology. Formulations for heat transfer in solids and porous media were implemented and anchored. A two-pronged approach was employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of hydrogen dissociation and recombination on heat transfer and thrust performance. The formulations and preliminary results on both aspects are presented.
NEMAR plotting computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.
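To illustrate the fixed-record idea, the sketch below writes and reads trajectory records of 102 numbers each; the element type (64-bit floats) and the file layout are assumptions for the demo, not the actual NEMAR format.

```python
import numpy as np

RECORD_LEN = 102   # numbers per record, as described above

def write_trajectory(path, records):
    # Flatten to whole records and write as a raw binary file.
    np.asarray(records, dtype=np.float64).reshape(-1, RECORD_LEN).tofile(path)

def read_record(path, index):
    # Random access by record index after reshaping the raw file.
    data = np.fromfile(path, dtype=np.float64).reshape(-1, RECORD_LEN)
    return data[index]

write_trajectory("traj.dat", np.arange(3 * RECORD_LEN, dtype=float))
print(read_record("traj.dat", 1)[:5])
```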
Design and Analysis of a Subcritical Airfoil for High Altitude, Long Endurance Missions.
1982-12-01
Airfoil Design and Analysis Method; Appendix D: Boundary Layer Analysis Method; Appendix E: Detailed Results of...attack. Computer codes designed by Richard Eppler were used for this study. The airfoil was analyzed by using a viscous effects analysis program...inverse program designed by Eppler (Ref 5) was used in this study to accomplish this part. The second step involved the analysis of the airfoil under
CFD Analysis and Design Optimization Using Parallel Computers
NASA Technical Reports Server (NTRS)
Martinelli, Luigi; Alonso, Juan Jose; Jameson, Antony; Reuther, James
1997-01-01
A versatile and efficient multi-block method is presented for the simulation of both steady and unsteady flow, as well as aerodynamic design optimization of complete aircraft configurations. The compressible Euler and Reynolds Averaged Navier-Stokes (RANS) equations are discretized using a high resolution scheme on body-fitted structured meshes. An efficient multigrid implicit scheme is implemented for time-accurate flow calculations. Optimum aerodynamic shape design is achieved at very low cost using an adjoint formulation. The method is implemented on parallel computing systems using the MPI message passing interface standard to ensure portability. The results demonstrate that, by combining highly efficient algorithms with parallel computing, it is possible to perform detailed steady and unsteady analysis as well as automatic design for complex configurations using the present generation of parallel computers.
Clinical applications of biomechanics cinematography.
Woodle, A S
1986-10-01
Biomechanics cinematography is the analysis of movement of living organisms through the use of cameras, image projection systems, electronic digitizers, and computers. This article is a comparison of cinematographic systems and details practical uses of the modality in research and education.
Cortijo, Sandra; Charoensawan, Varodom; Roudier, François; Wigge, Philip A
2018-01-01
Chromatin immunoprecipitation combined with next-generation sequencing (ChIP-seq) is a powerful technique to investigate in vivo transcription factor (TF) binding to DNA, as well as chromatin marks. Here we provide a detailed protocol for all the key steps needed to perform ChIP-seq in Arabidopsis thaliana roots; the protocol also works on other A. thaliana tissues and in most non-ligneous plants. We detail all steps from material collection, fixation, chromatin preparation, immunoprecipitation, and library preparation through to the final computational analysis based on a combination of publicly available tools.
Guide to the economic analysis of community energy systems
NASA Astrophysics Data System (ADS)
Pferdehirt, W. P.; Croke, K. G.; Hurter, A. P.; Kennedy, A. S.; Lee, C.
1981-08-01
This guidebook provides a framework for the economic analysis of community energy systems. The analysis facilitates a comparison of competing configurations in community energy systems, as well as a comparison with conventional energy systems. Various components of costs and revenues to be considered are discussed in detail. Computational procedures and accompanying worksheets are provided for calculating the net present value, straight and discounted payback periods, the rate of return, and the savings to investment ratio for the proposed energy system alternatives. These computations are based on a projection of the system's costs and revenues over its economic lifetimes. The guidebook also discusses the sensitivity of the results of this economic analysis to changes in various parameters and assumptions.
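The computational procedures reduce to standard discounted-cash-flow arithmetic, sketched below for net present value, savings-to-investment ratio, and discounted payback; the cash-flow figures are invented, and the guidebook's worksheets include cost components not modeled here.

```python
def npv(rate, cashflows):
    # Net present value; cashflows[0] is the year-0 flow (undiscounted).
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def sir(rate, investment, annual_savings):
    # Savings-to-investment ratio: discounted savings over initial cost.
    return npv(rate, [0] + annual_savings) / investment

def discounted_payback(rate, investment, annual_savings):
    total = 0.0
    for t, s in enumerate(annual_savings, start=1):
        total += s / (1 + rate) ** t
        if total >= investment:
            return t
    return None   # never pays back within the horizon

flows = [-500_000] + [60_000] * 20   # year-0 cost, then 20 years of savings
print(npv(0.05, flows),
      sir(0.05, 500_000, [60_000] * 20),
      discounted_payback(0.05, 500_000, [60_000] * 20))
```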
Spherical roller bearing analysis. SKF computer program SPHERBEAN. Volume 1: Analysis
NASA Technical Reports Server (NTRS)
Kleckner, R. J.; Pirvics, J.
1980-01-01
The models and associated mathematics used within the SPHERBEAN computer program for prediction of the thermomechanical performance characteristics of high speed lubricated double row spherical roller bearings are presented. The analysis allows six degrees of freedom for each roller and three for each half of an optionally split cage. Roller skew, free lubricant, inertial loads, appropriate elastic and friction forces, and flexible outer ring are considered. Roller quasidynamic equilibrium is calculated for a bearing with up to 30 rollers per row, and distinct roller and flange geometries are specifiable. The user is referred to the material contained here for formulation assumptions and algorithm detail.
COST FUNCTION STUDIES FOR POWER REACTORS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heestand, J.; Wos, L.T.
1961-11-01
A function to evaluate the cost of electricity produced by a nuclear power reactor was developed. The basic equation, revenue = capital charges + profit + operating expenses, was expanded in terms of various cost parameters to enable analysis of multiregion nuclear reactors with uranium and/or plutonium for fuel. A corresponding IBM 704 computer program, which will compute either the price of electricity or the value of plutonium, is presented in detail. (auth)
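A minimal sketch of the basic revenue balance quoted above, rearranged to compute the electricity price; the conversion to mills/kWh and all input values are illustrative, and the report's expansion into multiregion fuel-cycle parameters is not reproduced.

```python
# revenue = capital charges + profit + operating expenses, solved for price.
def electricity_price(capital_charges, profit, operating_expenses, kwh_sold):
    revenue = capital_charges + profit + operating_expenses
    return 1000.0 * revenue / kwh_sold   # mills per kWh

print(electricity_price(12e6, 3e6, 9e6, kwh_sold=2.5e9))  # ~9.6 mills/kWh
```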
Communication: Symmetrical quasi-classical analysis of linear optical spectroscopy
NASA Astrophysics Data System (ADS)
Provazza, Justin; Coker, David F.
2018-05-01
The symmetrical quasi-classical approach for propagation of a many-degree-of-freedom density matrix is explored in the context of computing linear spectra. Calculations on a simple two-state model for which exact results are available suggest that the approach gives a qualitative description of peak positions, relative amplitudes, and line broadening. Short-time details in the computed dipole autocorrelation function result in exaggerated tails in the spectrum.
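As a worked illustration of the last step, the sketch below converts a dipole autocorrelation function into a linear lineshape via I(w) proportional to Re of the half-Fourier transform of C(t), evaluated by direct quadrature; the damped-oscillator C(t) is a stand-in for an SQC-propagated correlation function.

```python
import numpy as np

def lineshape(C, dt, omegas):
    # I(w) ~ Re sum_j C(t_j) exp(i w t_j) dt on a uniform time grid.
    t = np.arange(len(C)) * dt
    return np.array([np.sum(np.real(C * np.exp(1j * w * t))) * dt
                     for w in omegas])

dt = 0.05
t = np.arange(2000) * dt
C = np.exp(-0.1 * t) * np.exp(-1j * 2.0 * t)   # damped oscillator, w0 = 2
w = np.linspace(0.0, 4.0, 200)
I = lineshape(C, dt, w)
print(w[I.argmax()])                            # Lorentzian peak near w0 = 2
```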
Unified Engineering Software System
NASA Technical Reports Server (NTRS)
Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.
1989-01-01
Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.
Fluid-Structure Interaction Analysis of Ruptured Mitral Chordae Tendineae.
Toma, Milan; Bloodworth, Charles H; Pierce, Eric L; Einstein, Daniel R; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S
2017-03-01
The chordal structure is a part of mitral valve geometry that has been commonly neglected or simplified in computational modeling due to its complexity. However, these simplifications cannot be used when investigating the roles of individual chordae tendineae in mitral valve closure. For the first time, advancements in imaging, computational techniques, and hardware technology make it possible to create models of the mitral valve without simplifications to its complex geometry, and to quickly run validated computer simulations that more realistically capture its function. Such simulations can then be used for a detailed analysis of chordae-related diseases. In this work, a comprehensive model of a subject-specific mitral valve with detailed chordal structure is used to analyze the distinct role played by individual chordae in closure of the mitral valve leaflets. Mitral closure was simulated for 51 possible chordal rupture points. Resultant regurgitant orifice area and strain change in the chordae at the papillary muscle tips were then calculated to examine the role of each ruptured chorda in the mitral valve closure. For certain subclassifications of chordae, regurgitant orifice area was found to trend positively with ruptured chordal diameter, and strain changes correlated negatively with regurgitant orifice area. Further advancements in clinical imaging modalities, coupled with the next generation of computational techniques will enable more physiologically realistic simulations.
Faust, Oliver; Yu, Wenwei; Rajendra Acharya, U
2015-03-01
The concept of real-time is very important, as it deals with the realizability of computer based health care systems. In this paper we review biomedical real-time systems with a meta-analysis on computational complexity (CC), delay (Δ) and speedup (Sp). During the review we found that, in the majority of papers, the term real-time is part of the thesis indicating that a proposed system or algorithm is practical. However, these papers were not considered for detailed scrutiny. Our detailed analysis focused on papers which support their claim of achieving real-time, with a discussion on CC or Sp. These papers were analyzed in terms of processing system used, application area (AA), CC, Δ, Sp, implementation/algorithm (I/A) and competition. The results show that the ideas of parallel processing and algorithm delay were only recently introduced and journal papers focus more on Algorithm (A) development than on implementation (I). Most authors compete on big O notation (O) and processing time (PT). Based on these results, we adopt the position that the concept of real-time will continue to play an important role in biomedical systems design. We predict that parallel processing considerations, such as Sp and algorithm scaling, will become more important.
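The metrics compared in the review are simple to state; a small sketch, with placeholder timings, of speedup Sp = T_serial/T_parallel, parallel efficiency, and Amdahl's bound for a serial fraction f on p processors:

```python
def speedup(t_serial, t_parallel):
    # Sp = T_serial / T_parallel
    return t_serial / t_parallel

def efficiency(sp, p):
    # Parallel efficiency E = Sp / p
    return sp / p

def amdahl_bound(f, p):
    # Amdahl's law: Sp <= 1 / (f + (1 - f) / p) for serial fraction f.
    return 1.0 / (f + (1.0 - f) / p)

sp = speedup(t_serial=120.0, t_parallel=18.0)
print(sp, efficiency(sp, p=8), amdahl_bound(f=0.05, p=8))
```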
Current Lewis Turbomachinery Research: Building on our Legacy of Excellence
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.
1997-01-01
This Wu Chang-Hua lecture is concerned with the development of analysis and computational capability for turbomachinery flows which is based on detailed flow field physics. A brief review of the work of Professor Wu is presented as well as a summary of the current NASA aeropropulsion programs. Two major areas of research are described in order to determine our predictive capabilities using modern day computational tools evolved from the work of Professor Wu. In one of these areas, namely transonic rotor flow, it is demonstrated that a high level of accuracy is obtainable provided sufficient geometric detail is simulated. In the second case, namely turbine heat transfer, our capability is lacking for rotating blade rows and experimental correlations will provide needed information in the near term. It is believed that continuing progress will allow us to realize the full computational potential and its impact on design time and cost.
The Construction of 3-d Neutral Density for Arbitrary Data Sets
NASA Astrophysics Data System (ADS)
Riha, S.; McDougall, T. J.; Barker, P. M.
2014-12-01
The Neutral Density variable allows inference of water pathways from thermodynamic properties in the global ocean, and is therefore an essential component of global ocean circulation analysis. The widely used algorithm for the computation of Neutral Density yields accurate results for data sets which are close to the observed climatological ocean. Long-term numerical climate simulations, however, often generate a significant drift from present-day climate, which renders the existing algorithm inaccurate. To remedy this problem, new algorithms which operate on arbitrary data have been developed, which may potentially be used to compute Neutral Density during runtime of a numerical model. We review existing approaches for the construction of Neutral Density in arbitrary data sets, detail their algorithmic structure, and present an analysis of the computational cost for implementations on a single-CPU computer. We discuss possible strategies for the implementation in state-of-the-art numerical models, with a focus on distributed computing environments.
Spacelab experiment computer study. Volume 1: Executive summary (presentation)
NASA Technical Reports Server (NTRS)
Lewis, J. L.; Hodges, B. C.; Christy, J. O.
1976-01-01
A quantitative cost for various Spacelab flight hardware configurations is provided, along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented, based on the utilization of a central experiment computer with optional auxiliary equipment. Groundrules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented and analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.
Global/local stress analysis of composite panels
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Knight, Norman F., Jr.
1989-01-01
A method for performing a global/local stress analysis is described, and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
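A rough sketch of the global/local handoff, assuming a generic bivariate spline in place of the paper's plate-bending interpolation functions: displacements from a coarse global model are interpolated onto the boundary nodes of a refined local patch and imposed there as boundary conditions. The global displacement field here is a made-up smooth surface.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Coarse "global" grid of out-of-plane displacements w(x, y).
xg = np.linspace(0.0, 1.0, 11)
yg = np.linspace(0.0, 1.0, 11)
W = np.sin(np.pi * xg)[:, None] * np.sin(np.pi * yg)[None, :]

spline = RectBivariateSpline(xg, yg, W)

# Boundary nodes of a refined local patch around a stress-concentration region;
# the interpolated values become the local model's boundary conditions.
xb = np.linspace(0.40, 0.60, 41)
w_bc = spline(xb, np.full_like(xb, 0.40), grid=False)   # lower edge of the patch
print(w_bc[:3])
```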
Global/local stress analysis of composite structures. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
1989-01-01
A method for performing a global/local stress analysis is described and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
NASA Technical Reports Server (NTRS)
Gale, R. L.; Nease, A. W.; Nelson, D. J.
1978-01-01
Computer program mathematically describes complete hydraulic systems to study their dynamic performance. Program employs subroutines that simulate components of hydraulic system, which are then controlled by main program. Program is useful to engineers working with detailed performance results of aircraft, spacecraft, or similar hydraulic systems.
Live load testing and load rating of five reinforced concrete bridges.
DOT National Transportation Integrated Search
2014-10-01
Five cast-in-place concrete T-beam bridges (Eustis #5341, Whitefield #3831, Cambridge #3291, Eddington #5107, and Albion #2832) were live load tested. Revised load ratings were computed either using test data or detailed analysis when possi...
Analyzing how we do Analysis and Consume Data, Results from the SciDAC-Data Project
NASA Astrophysics Data System (ADS)
Ding, P.; Aliaga, L.; Mubarak, M.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.
2017-10-01
One of the main goals of the Dept. of Energy funded SciDAC-Data project is to analyze the more than 410,000 high energy physics datasets that have been collected, generated and defined over the past two decades by experiments using the Fermilab storage facilities. These datasets have been used as the input to over 5.6 million recorded analysis projects, for which detailed analytics have been gathered. The analytics and meta information for these datasets and analysis projects are being combined with knowledge of their part of the HEP analysis chains for major experiments to understand how modern computing and data delivery is being used. We present the first results of this project, which examine in detail how the CDF, D0, NOvA, MINERvA and MicroBooNE experiments have organized, classified and consumed petascale datasets to produce their physics results. The results include analysis of the correlations in dataset/file overlap, data usage patterns, data popularity, dataset dependency and temporary dataset consumption. The results provide critical insight into how workflows and data delivery schemes can be combined with different caching strategies to more efficiently perform the work required to mine these large HEP data volumes and to understand the physics analysis requirements for the next generation of HEP computing facilities. In particular we present a detailed analysis of the NOvA data organization and consumption model corresponding to their first and second oscillation results (2014-2016) and the first look at the analysis of the Tevatron Run II experiments. We present statistical distributions for the characterization of these data and data driven models describing their consumption.
NASA Technical Reports Server (NTRS)
Razzaq, Zia; Prasad, Venkatesh
1988-01-01
The results of a detailed investigation of the distribution of stresses in aluminum and composite panels subjected to uniform end shortening are presented. The focus problem is a rectangular panel with two longitudinal stiffeners, and an inner stiffener discontinuous at a central hole in the panel. The influence of the stiffeners on the stresses is evaluated through a two-dimensional global finite element analysis in the absence or presence of the hole. Contrary to physical intuition, it is found that the maximum stresses from the global analysis for both stiffened aluminum and composite panels are greater than the corresponding stresses for the unstiffened panels. The inner discontinuous stiffener causes a greater increase in stresses than the reduction provided by the two outer stiffeners. A detailed layer-by-layer study of stresses around the hole is also presented for both unstiffened and stiffened composite panels. A parallel equation solver is used for the global system of equations since the computational time is far less than that using a sequential scheme. A parallel Choleski method with up to 16 processors is used on the Flex/32 Multicomputer at NASA Langley Research Center. The parallel computing results are summarized and include the computational times, speedups, bandwidths, and their inter-relationships for the panel problems. It is found that the computational time for the Choleski method decreases with a decrease in bandwidth, and better speedups result as the bandwidth increases.
Flood damage assessment using computer-assisted analysis of color infrared photography
Anderson, William H.
1978-01-01
Use of digitized aerial photographs for flood damage assessment in agriculture is new and largely untested. However, under flooding circumstances similar to the 1975 Red River Valley flood, computer-assisted techniques can be extremely useful, especially if detailed crop damage estimates are needed within a relatively short period of time. Airphoto interpretation techniques, manual or computer-assisted, are not intended to replace conventional ground survey and sampling procedures. But their use should be considered a valuable addition to the tools currently available for assessing agricultural flood damage.
Multiphysics Thrust Chamber Modeling for Nuclear Thermal Propulsion
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Cheng, Gary; Chen, Yen-Sen
2006-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation. A two-pronged approach is employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of heat transfer on thrust performance. Preliminary results on both aspects are presented.
NASA Technical Reports Server (NTRS)
1973-01-01
An improved method for estimating aircraft weight and cost using a unique and fundamental approach was developed. The results of this study were integrated into a comprehensive digital computer program, which is intended for use at the preliminary design stage of aircraft development. The program provides a means of computing absolute values for weight and cost, and enables the user to perform trade studies with a sensitivity to detail design and overall structural arrangement. Both batch and interactive graphics modes of program operation are available.
NASA Technical Reports Server (NTRS)
Gerstle, Walter
1989-01-01
Engineering problems sometimes involve the numerical solution of boundary value problems over domains containing geometric features with widely varying scales. Often, a detailed solution is required at one or more of these features. Small details in large structures may have profound effects upon global performance. Conversely, large-scale conditions may affect local performance. Many man-hours and CPU-hours are currently spent in modeling such problems. With the structural zooming technique, it is now possible to design an integrated program which allows the analyst to interactively focus upon a small region of interest, to modify the local geometry, and then to obtain highly accurate responses in that region which reflect both the properties of the overall structure and the local detail. A boundary integral equation analysis program, called BOAST, was recently developed for the stress analysis of cracks. This program can accurately analyze two-dimensional linear elastic fracture mechanics problems with far less computational effort than existing finite element codes. An interactive computer graphical interface to BOAST was written. The graphical interface would have several requirements: it would be menu-driven, with mouse input; all aspects of input would be entered graphically; the results of a BOAST analysis would be displayed pictorially, and the user would also be able to probe interactively to get numerical values of displacement and stress at desired locations within the analysis domain; the entire procedure would be integrated into a single, easy-to-use package; and it would be written using calls to the graphics package called HOOPS. The program is nearing completion. All of the preprocessing features are working satisfactorily and have been debugged. The postprocessing features are under development, and rudimentary postprocessing should be available by the end of the summer. The program was developed and run on a VAX workstation, and must be ported to the SUN workstation. This activity is currently underway.
Wheeze sound analysis using computer-based techniques: a systematic review.
Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian
2017-10-31
Wheezes are high-pitched continuous respiratory sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques on the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction during normal breathing, and 3) analysis using combinations of features and on subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.
Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.
Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca
2018-05-01
CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.
NASA Technical Reports Server (NTRS)
Deckman, G.; Rousseau, J. (Editor)
1973-01-01
The Wash Water Recovery System (WWRS) is intended for use in processing shower bath water onboard a spacecraft. The WWRS utilizes flash evaporation, vapor compression, and pyrolytic reaction to process the wash water to allow recovery of potable water. Wash water flashing and foaming characteristics are evaluated, physical properties of concentrated wash water are determined, and a long term feasibility study on the system is performed. In addition, a computer analysis of the system and a detail design of a 10 lb/hr vortex-type water vapor compressor were completed. The computer analysis also sized the remaining system components on the basis of the new vortex compressor design.
Duct flow nonuniformities: Effect of struts in SSME HGM 2+
NASA Technical Reports Server (NTRS)
Burke, Roger
1988-01-01
This study consists of an analysis of flow through the Space Shuttle Main Engine (SSME) Hot Gas Manifold (HGM) for the purpose of understanding and quantifying the flow environment and, in particular, the flow through a region of structural supports located between the inner and outer walls of the HGM. The primary task of the study, as defined by NASA-MSFC, is to assess and develop the computational capability for analyzing detailed three-dimensional flow through the HGM support strut region to be incorporated into a full fuelside HGM analysis. Secondarily, computed results are to be compared with available experimental results.
Vectorized Monte Carlo methods for reactor lattice analysis
NASA Technical Reports Server (NTRS)
Brown, F. B.
1984-01-01
Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for use on the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.
Computational Aerodynamic Analysis of Offshore Upwind and Downwind Turbines
Zhao, Qiuying; Sheng, Chunhua; Afjeh, Abdollah
2014-01-01
Aerodynamic interactions of the model NREL 5 MW offshore horizontal axis wind turbines (HAWT) are investigated using a high-fidelity computational fluid dynamics (CFD) analysis. Four wind turbine configurations are considered; three-bladed upwind and downwind and two-bladed upwind and downwind configurations, which operate at two different rotor speeds of 12.1 and 16 RPM. In the present study, both steady and unsteady aerodynamic loads, such as the rotor torque, blade hub bending moment, and tower base bending moment, are evaluated in detail to provide an overall assessment of different wind turbine configurations. Aerodynamic interactions between the rotor and tower are analyzed, including the rotor wake development downstream. The computational analysis provides insight into the aerodynamic performance of the upwind and downwind, two- and three-bladed horizontal axis wind turbines.
Computational analysis of a multistage axial compressor
NASA Astrophysics Data System (ADS)
Mamidoju, Chaithanya
Turbomachines are used extensively in Aerospace, Power Generation, and Oil & Gas Industries. Efficiency of these machines is often an important factor and has led to the continuous effort to improve the design to achieve better efficiency. The axial flow compressor is a major component in a gas turbine with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, Quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. Methodology is described for steady state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, tip clearance and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.
NASA Technical Reports Server (NTRS)
Davidson, Frederic M.; Sun, Xiaoli; Field, Christopher T.
1994-01-01
This interim report consists of two reports: 'Space Radiation Effects on Si APDs for GLAS' and 'Computer Simulation of Avalanche Photodiode and Preamplifier Output for Laser Altimeters.' The former contains a detailed description of our proton radiation test of Si APD's performed at the Brookhaven National Laboratory. The latter documents the computer program subroutines which were written for the upgrade of NASA's GLAS simulator.
Computer Simulation For Design Of TWT's
NASA Technical Reports Server (NTRS)
Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard
1992-01-01
A three-dimensional finite-element analytical technique facilitates design and fabrication of traveling-wave-tube (TWT) slow-wave structures. Used to perform thermal and mechanical analyses of TWT designed with variety of configurations, geometries, and materials. Using three-dimensional computer analysis, designer able to simulate building and testing of TWT, with consequent substantial saving of time and money. Technique enables detailed look into operation of traveling-wave tubes to help improve performance for future communications systems.
NOSC Program Managers Handbook. Revision 1
1988-02-01
cost. The effects of application of life-cycle cost analysis through the planning and RDT&E phases of a program, and the "design to cost" concept on...is the plan for assuring the quality of the design, design documentation, and fabricated/assembled hardware and associated computer software. 13.5.3.2...listings and printouts, which document the requirements, design, or details of computer software; explain the capabilities and limitations of the
NASA Technical Reports Server (NTRS)
Staveland, Lowell
1994-01-01
This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aide engineers in the conceptual design of aircraft crewstations. This report describes TLM and the results of a series of experiments which were run this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both stand alone tool and part of a complete human operator simulation, and a brief introduction to the TLM software design.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.
Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy
2014-11-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
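As a concrete illustration of the word-analysis idea surveyed above (a generic sketch, not any one tool from the review), the following Python snippet builds normalized 3-mer frequency profiles for two sequences and compares them with a cosine distance. The sequences and the choice of k are hypothetical.

```python
import itertools
from math import sqrt

def kmer_profile(seq, k=3):
    """Count k-mer (word) frequencies, normalized to a probability vector."""
    counts = {"".join(p): 0 for p in itertools.product("ACGT", repeat=k)}
    for i in range(len(seq) - k + 1):
        word = seq[i:i + k]
        if word in counts:        # skip words with ambiguous bases
            counts[word] += 1
    total = sum(counts.values()) or 1
    return [counts[w] / total for w in sorted(counts)]

def cosine_distance(p, q):
    """Alignment-free distance between two word-frequency profiles."""
    dot = sum(a * b for a, b in zip(p, q))
    norm = sqrt(sum(a * a for a in p)) * sqrt(sum(b * b for b in q))
    return 1.0 - dot / norm if norm else 1.0

seq1 = "ACGTACGTGGTACCAGT"   # hypothetical sequences
seq2 = "ACGTACCTGGAACCAGT"
d = cosine_distance(kmer_profile(seq1), kmer_profile(seq2))
print(f"3-mer cosine distance: {d:.4f}")
```

Methods such as feature frequency profiles, compositional vectors and the D2 statistic reviewed in the article are refinements of this same word-counting core, differing mainly in how the frequency vectors are normalized and compared.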
IDENTIFICATION OF AN IDEAL REACTOR MODEL IN A SECONDARY COMBUSTION CHAMBER
Tracer analysis was applied to a secondary combustion chamber of a rotary kiln incinerator simulator to develop a computationally inexpensive networked ideal reactor model and allow for the later incorporation of detailed reaction mechanisms. Tracer data from sulfur dioxide trace...
Computer-based analysis of microvascular alterations in a mouse model for Alzheimer's disease
NASA Astrophysics Data System (ADS)
Heinzer, Stefan; Müller, Ralph; Stampanoni, Marco; Abela, Rafael; Meyer, Eric P.; Ulmann-Schuler, Alexandra; Krucker, Thomas
2007-03-01
Vascular factors associated with Alzheimer's disease (AD) have recently gained increased attention. To investigate changes in vascular, particularly microvascular architecture, we developed a hierarchical imaging framework to obtain large-volume, high-resolution 3D images from brains of transgenic mice modeling AD. In this paper, we present imaging and data analysis methods which allow compiling unique characteristics from several hundred gigabytes of image data. Image acquisition is based on desktop micro-computed tomography (µCT) and local synchrotron-radiation µCT (SRµCT) scanning with a nominal voxel size of 16 µm and 1.4 µm, respectively. Two visualization approaches were implemented: stacks of Z-buffer projections for fast data browsing, and progressive-mesh based surface rendering for detailed 3D visualization of the large datasets. In a first step, image data was assessed visually via a Java client connected to a central database. Identified characteristics of interest were subsequently quantified using global morphometry software. To obtain even deeper insight into microvascular alterations, tree analysis software was developed providing local morphometric parameters such as number of vessel segments or vessel tortuosity. In the context of ever increasing image resolution and large datasets, computer-aided analysis has proven both powerful and indispensable. The hierarchical approach maintains the context of local phenomena, while proper visualization and morphometry provide the basis for detailed analysis of the pathology related to structure. Beyond analysis of microvascular changes in AD this framework will have significant impact considering that vascular changes are involved in other neurodegenerative diseases as well as in cancer, cardiovascular disease, asthma, and arthritis.
Analysis and selection of optimal function implementations in massively parallel computer
Archer, Charles Jens [Rochester, MN; Peters, Amanda [Rochester, MN; Ratterman, Joseph D [Rochester, MN
2011-05-31
An apparatus, program product and method optimize the operation of a parallel computer system by, in part, collecting performance data for a set of implementations of a function capable of being executed on the parallel computer system based upon the execution of the set of implementations under varying input parameters in a plurality of input dimensions. The collected performance data may be used to generate selection program code that is configured to call selected implementations of the function in response to a call to the function under varying input parameters. The collected performance data may be used to perform more detailed analysis to ascertain the comparative performance of the set of implementations of the function under the varying input parameters.
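The patent abstract describes the approach only at a high level; the sketch below is a much-simplified, single-machine reading of the same idea, assuming a single input dimension (problem size). It benchmarks a set of implementations on sample inputs and returns a dispatcher that calls whichever implementation was fastest at the nearest measured size. All function names are hypothetical.

```python
import time

def select_implementation(implementations, sample_inputs):
    """Collect performance data for each implementation over sample inputs,
    then build a dispatch table keyed by input size."""
    table = {}
    for x in sample_inputs:
        timings = []
        for impl in implementations:
            t0 = time.perf_counter()
            impl(x)
            timings.append((time.perf_counter() - t0, impl))
        table[len(x)] = min(timings, key=lambda t: t[0])[1]

    def dispatch(x):
        # call the implementation that was fastest at the nearest measured size
        size = min(table, key=lambda s: abs(s - len(x)))
        return table[size](x)

    return dispatch

def sum_loop(x):
    """A second implementation of the same function, for comparison."""
    total = 0
    for v in x:
        total += v
    return total

dispatch = select_implementation([sum, sum_loop],
                                 [list(range(n)) for n in (10, 10000)])
print(dispatch(list(range(500))))   # 124750, via the selected implementation
```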
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goltz, G.; Weiner, H.
A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document provides a detailed description of the DSPA Computer Program system and its subprograms. This manual will assist the programmer in revising or updating the several subprograms.
Validation of a computerized algorithm to quantify fetal heart rate deceleration area.
Gyllencreutz, Erika; Lu, Ke; Lindecrantz, Kaj; Lindqvist, Pelle G; Nordstrom, Lennart; Holzmann, Malin; Abtahi, Farhad
2018-05-16
Reliability in visual cardiotocography interpretation is unsatisfactory, which has led to the development of computerized cardiotocography. Computerized analysis is well established for antenatal fetal surveillance, but has not yet performed sufficiently during labor. We aimed to investigate the capacity of a new computerized algorithm compared to visual assessment in identifying intrapartum fetal heart rate baseline and decelerations. Three hundred and twelve intrapartum cardiotocography tracings with variable decelerations were analysed by the computerized algorithm and visually examined by two observers, blinded to each other and to the computer analysis. The width, depth and area of each deceleration were measured. Four cases (>100 variable decelerations) were subject to in-depth detailed analysis. The outcome measures were bias in seconds (width), beats per minute (depth), and beats (area) between computer and observers, using Bland-Altman analysis. Interobserver reliability was determined by calculating intraclass correlation and Spearman rank analysis. The analysis (312 cases) showed excellent intraclass correlation (0.89-0.95) and very strong Spearman correlation (0.82-0.91). The detailed analysis of >100 decelerations in 4 cases revealed low bias between the computer and the two observers: width 1.4 and 1.4 seconds, depth 5.1 and 0.7 beats per minute, and area 0.1 and -1.7 beats. This was comparable to the bias between the two observers: 0.3 seconds (width), 4.4 beats per minute (depth), and 1.7 beats (area). The intraclass correlation was excellent (0.90-0.98). A novel computerized algorithm for intrapartum cardiotocography analysis is as accurate as gold-standard visual assessment, with high correlation and low bias. This article is protected by copyright. All rights reserved.
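The outcome measure named in the abstract is Bland-Altman bias between computer and observer measurements. A minimal sketch of that computation follows, with made-up paired deceleration-width values standing in for study data:

```python
import numpy as np

def bland_altman_bias(computer, observer):
    """Bias and 95% limits of agreement between paired measurements,
    e.g. deceleration width (seconds) from the algorithm vs. an observer."""
    diff = np.asarray(computer, dtype=float) - np.asarray(observer, dtype=float)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)       # half-width of the limits of agreement
    return bias, bias - loa, bias + loa

# hypothetical paired width measurements (seconds) for six decelerations
widths_algo = [42.0, 55.5, 38.0, 61.0, 47.5, 52.0]
widths_obs  = [41.0, 54.0, 39.5, 60.0, 46.0, 53.5]
bias, lo, hi = bland_altman_bias(widths_algo, widths_obs)
print(f"bias = {bias:+.2f} s, limits of agreement [{lo:.2f}, {hi:.2f}] s")
```

The same routine applies unchanged to the depth (beats per minute) and area (beats) measures reported in the study.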
Potential Flow Theory and Operation Guide for the Panel Code PMARC. Version 14
NASA Technical Reports Server (NTRS)
Ashby, Dale L.
1999-01-01
The theoretical basis for PMARC, a low-order panel code for modeling complex three-dimensional bodies in potential flow, is outlined. PMARC can be run on a wide variety of computer platforms, including desktop machines, workstations, and supercomputers. Execution times for PMARC vary tremendously depending on the computer resources used, but typically range from several minutes for simple or moderately complex cases to several hours for very large complex cases. Several of the advanced features currently included in the code, such as internal flow modeling, boundary layer analysis, and time-dependent flow analysis, including problems involving relative motion, are discussed in some detail. The code is written in Fortran77, using adjustable-size arrays so that it can be easily redimensioned to match problem requirements and computer hardware constraints. An overview of the program input is presented. A detailed description of the input parameters is provided in the appendices. PMARC results for several test cases are presented along with analytic or experimental data, where available. The input files for these test cases are given in the appendices. PMARC currently supports plotfile output formats for several commercially available graphics packages. The supported graphics packages are Plot3D, Tecplot, and PmarcViewer.
NASA Technical Reports Server (NTRS)
Anderson, O. L.
1974-01-01
A finite-difference procedure for computing the turbulent, swirling, compressible flow in axisymmetric ducts is described. Arbitrary distributions of heat and mass transfer at the boundaries can be treated, and the effects of struts, inlet guide vanes, and flow straightening vanes can be calculated. The calculation procedure is programmed in FORTRAN 4 and has operated successfully on the UNIVAC 1108, IBM 360, and CDC 6600 computers. The analysis which forms the basis of the procedure, a detailed description of the computer program, and the input/output formats are presented. The results of sample calculations performed with the computer program are compared with experimental data.
NASA Technical Reports Server (NTRS)
Gibson, S. G.
1983-01-01
A system of computer programs was developed to model general three dimensional surfaces. Surfaces are modeled as sets of parametric bicubic patches. There are also capabilities to transform coordinates, to compute mesh/surface intersection normals, and to format input data for a transonic potential flow analysis. A graphical display of surface models and intersection normals is available. There are additional capabilities to regulate point spacing on input curves and to compute surface/surface intersection curves. Input and output data formats are described; detailed suggestions are given for user input. Instructions for execution are given, and examples are shown.
Deal, Samantha; Wambaugh, John; Judson, Richard; Mosher, Shad; Radio, Nick; Houck, Keith; Padilla, Stephanie
2016-09-01
One of the rate-limiting procedures in a developmental zebrafish screen is the morphological assessment of each larva. Most researchers opt for a time-consuming, structured visual assessment by trained human observer(s). The present studies were designed to develop a more objective, accurate and rapid method for screening zebrafish for dysmorphology. Instead of the very detailed human assessment, we have developed the computational malformation index, which combines the use of high-content imaging with a very brief human visual assessment. Each larva was quickly assessed by a human observer (basic visual assessment), killed, fixed and assessed for dysmorphology with the Zebratox V4 BioApplication using the Cellomics® ArrayScan® V(TI) high-content image analysis platform. The basic visual assessment adds in-life parameters, and the high-content analysis assesses each individual larva for various features (total area, width, spine length, head-tail length, length-width ratio, perimeter-area ratio). In developing the computational malformation index, a training set of hundreds of embryos treated with hundreds of chemicals was visually assessed using the basic or detailed method. In the second phase, we assessed both the stability of these high-content measurements and their performance using a test set of zebrafish treated with a dose range of two reference chemicals (trans-retinoic acid or cadmium). We found the measures were stable for at least 1 week, and comparison of these automated measures to detailed visual inspection of the larvae showed excellent congruence. Our computational malformation index provides an objective means for rapid phenotypic brightfield assessment of individual larvae in a developmental zebrafish assay. Copyright © 2016 John Wiley & Sons, Ltd.
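The abstract does not spell out how the morphometric features are combined into the computational malformation index, so the following is only a hypothetical composite-score sketch: each feature is z-scored against control-larvae statistics and the absolute deviations are averaged. The aggregation rule and all numbers are assumptions, not the published index.

```python
import numpy as np

def malformation_index(features, control_mean, control_std):
    """Hypothetical composite score: mean absolute z-score of each
    morphometric feature relative to control-larvae statistics. The
    published index also folds in the brief human visual assessment,
    which is omitted here."""
    z = (np.asarray(features, dtype=float) - control_mean) / control_std
    return np.mean(np.abs(z))

# feature order: total area, width, spine length, head-tail length,
# length-width ratio, perimeter-area ratio (all values hypothetical)
control_mean = np.array([1.00, 0.20, 3.1, 3.5, 17.5, 8.0])
control_std  = np.array([0.08, 0.02, 0.2, 0.2, 1.5, 0.6])
larva = [0.78, 0.26, 2.5, 2.9, 11.2, 10.4]   # a dysmorphic-looking larva
print(f"index: {malformation_index(larva, control_mean, control_std):.2f}")
```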
1985-04-01
characteristics of targets Tank 9.1 m (30 ft) in diameter by 6.7 m (22 ft) deep, automated with computer control and analysis for detailed studies of acoustic...structures; and conducts experiments in the deep ocean, in acoustically shallow water, and in the Arctic. The Division carries out theoretical and...Laser Materials-Application Center Failure Analysis and Fractography Staff Research Activity Areas Environmental Effects Microstructural characterization
Documentation of the data analysis system for the gamma ray monitor aboard OSO-H
NASA Technical Reports Server (NTRS)
Croteau, S.; Buck, A.; Higbie, P.; Kantauskis, J.; Foss, S.; Chupp, D.; Forrest, D. J.; Suri, A.; Gleske, I.
1973-01-01
The programming system which was developed to prepare the data from the gamma ray monitor on OSO-7 for scientific analysis is presented. The detector, data, and objectives are described in detail. Programs presented include: FEEDER, PASS-1, CAL1, CAL2, PASS-3, the Van Allen Belt Predict Program, the Computation Center Plot Routine, and the Response Function Programs.
Occupational Analysis Products: Operations Management- AFSC 3E6X1 (CD-ROM)
computer laser optical disc (CD-ROM); 4 3/4 in.; 23.4 MB. SYSTEMS DETAIL NOTE: ABSTRACT: This is a report of an occupational survey of the Operations Management (AFSC 3E6X1, OSSN 2560, Feb 04) career ladder, conducted by the Occupational Analysis Flight, AFOMS. The OSR reports the findings of current
Turbulent Mixing of Primary and Secondary Flow Streams in a Rocket-Based Combined Cycle Engine
NASA Technical Reports Server (NTRS)
Cramer, J. M.; Greene, M. U.; Pal, S.; Santoro, R. J.; Turner, Jim (Technical Monitor)
2002-01-01
This viewgraph presentation gives an overview of the turbulent mixing of primary and secondary flow streams in a rocket-based combined cycle (RBCC) engine. A significant RBCC ejector mode database has been generated, detailing single and twin thruster configurations and global and local measurements. On-going analysis and correlation efforts include Marshall Space Flight Center computational fluid dynamics modeling and turbulent shear layer analysis. Potential follow-on activities include detailed measurements of air flow static pressure and velocity profiles, investigations into other thruster spacing configurations, performing a fundamental shear layer mixing study, and demonstrating single-shot Raman measurements.
On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment
NASA Astrophysics Data System (ADS)
Guterres, Rui M.
The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. An analysis of the existing tools, their strengths and limitations, is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three-dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two- and three-dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post-processing and data-conditioning tools and precision analyses were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible, and the accuracy of the computed results is seen to improve significantly. When post-processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is studied qualitatively and quantitatively. Its impact on the accuracy of the computations, as well as the incompatibility of the wall drag with the theoretical model followed, are discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed to within one percent of the balance-measured value for the best cases.
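The standard starting point for this kind of wake-based drag quantification is the incompressible momentum-deficit integral over the survey plane, D = ρ ∬ u(U∞ − u) dA; the full three-dimensional formulation derived in the dissertation adds pressure and crossflow terms not shown here. A minimal numerical sketch with a synthetic Gaussian wake deficit (all numbers hypothetical):

```python
import numpy as np

rho, U_inf = 1.225, 30.0           # air density (kg/m^3), tunnel speed (m/s)
y = np.linspace(-0.5, 0.5, 81)     # survey-plane coordinates (m)
z = np.linspace(-0.3, 0.3, 61)
Y, Z = np.meshgrid(y, z)

# synthetic measured axial velocity with a Gaussian wake deficit
u = U_inf * (1.0 - 0.25 * np.exp(-(Y**2 + Z**2) / 0.01))

# momentum-deficit integrand, summed cell-by-cell over the survey plane
integrand = rho * u * (U_inf - u)
dy, dz = y[1] - y[0], z[1] - z[0]
drag = integrand.sum() * dy * dz

A_ref = 0.05                        # hypothetical frontal area (m^2)
C_D = drag / (0.5 * rho * U_inf**2 * A_ref)
print(f"wake-integrated drag = {drag:.2f} N, C_D = {C_D:.3f}")
```

The data-conditioning emphasized in the dissertation matters precisely because this integral is evaluated from measured velocities: noise outside the wake region contributes directly to the computed drag.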
Exploiting symmetries in the modeling and analysis of tires
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Andersen, Carl M.; Tanner, John A.
1987-01-01
A simple and efficient computational strategy for reducing both the size of a tire model and the cost of the analysis of tires in the presence of symmetry-breaking conditions (unsymmetry in the tire material, geometry, or loading) is presented. The strategy is based on approximating the unsymmetric response of the tire with a linear combination of symmetric and antisymmetric global approximation vectors (or modes). Details are presented for the three main elements of the computational strategy, which include: use of special three-field mixed finite-element models, use of operator splitting, and substantial reduction in the number of degrees of freedom. The proposed computational strategy is applied to three quasi-symmetric problems of tires: linear analysis of anisotropic tires, through use of semianalytic finite elements, nonlinear analysis of anisotropic tires through use of two-dimensional shell finite elements, and nonlinear analysis of orthotropic tires subjected to unsymmetric loading. Three basic types of symmetry (and their combinations) exhibited by the tire response are identified.
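The core of the strategy is approximating an unsymmetric response by symmetric and antisymmetric parts. As a minimal illustration of that decomposition alone (not the mixed finite-element machinery), the sketch below splits a sampled circumferential load f(θ) into f_s(θ) = [f(θ) + f(−θ)]/2 and f_a(θ) = [f(θ) − f(−θ)]/2, assuming samples taken symmetrically about θ = 0:

```python
import numpy as np

def sym_antisym_split(load):
    """Split samples f(theta_i), taken on a grid symmetric about theta = 0,
    into symmetric and antisymmetric parts."""
    f = np.asarray(load, dtype=float)
    f_ref = f[::-1]                       # reflected samples, f(-theta_i)
    return 0.5 * (f + f_ref), 0.5 * (f - f_ref)

theta = np.linspace(-np.pi, np.pi, 9)
f = 1.0 + 0.4 * np.cos(theta) + 0.2 * np.sin(theta)   # unsymmetric load
f_s, f_a = sym_antisym_split(f)
assert np.allclose(f_s, f_s[::-1])       # symmetric part is even
assert np.allclose(f_a, -f_a[::-1])      # antisymmetric part is odd
assert np.allclose(f_s + f_a, f)         # the two parts reconstruct the load
```

Each part can then be analyzed on a reduced (half or sector) model, which is the source of the size and cost reduction the paper reports.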
Computer-aided roll pass design in rolling of airfoil shapes
NASA Technical Reports Server (NTRS)
Akgerman, N.; Lahoti, G. D.; Altan, T.
1980-01-01
This paper describes two computer-aided design (CAD) programs developed for modeling the shape rolling process for airfoil sections. The first program, SHPROL, uses a modular upper-bound method of analysis and predicts the lateral spread, elongation, and roll torque. The second program, ROLPAS, predicts the stresses, roll separating force, the roll torque and the details of metal flow by simulating the rolling process, using the slab method of analysis. ROLPAS is an interactive program; it offers graphic display capabilities and allows the user to interact with the computer via a keyboard, CRT, and a light pen. The accuracy of the computerized models was evaluated by (a) rolling a selected airfoil shape at room temperature from 1018 steel and isothermally at high temperature from Ti-6Al-4V, and (b) comparing the experimental results with computer predictions. The comparisons indicated that the CAD systems, described here, are useful for practical engineering purposes and can be utilized in roll pass design and analysis for airfoil and similar shapes.
NASA Technical Reports Server (NTRS)
Kapoor, Kamlesh; Anderson, Bernhard H.; Shaw, Robert J.
1994-01-01
A full Navier-Stokes analysis was performed to evaluate the performance of the subsonic diffuser of a NASA Lewis Research Center 70/30 mixed-compression bifurcated supersonic inlet for high speed civil transport application. The PARC3D code was used in the present study. The computations were also performed when approximately 2.5 percent of the engine mass flow was allowed to bypass through the engine bypass doors. The computational results were compared with the available experimental data which consisted of detailed Mach number and total pressure distribution along the entire length of the subsonic diffuser. The total pressure recovery, flow distortion, and crossflow velocity at the engine face were also calculated. The computed surface ramp and cowl pressure distributions were compared with experiments. Overall, the computational results compared well with experimental data. The present CFD analysis demonstrated that the bypass flow improves the total pressure recovery and lessens flow distortions at the engine face.
David A. Marquis; Richard L. Ernst
1992-01-01
Describes the purpose and function of the SILVAH computer program in general terms; provides detailed instructions on use of the program; and provides information on program organization, data formats, and the basis of processing algorithms.
Electric/Hybrid Vehicle Simulation
NASA Technical Reports Server (NTRS)
Slusser, R. A.; Chapman, C. P.; Brennand, J. P.
1985-01-01
The ELVEC computer program provides the vehicle designer with a simulation tool for detailed studies of electric and hybrid vehicle performance and cost. ELVEC simulates the performance of a user-specified electric or hybrid vehicle under a user-specified driving schedule profile or operating schedule. ELVEC performs vehicle design and life cycle cost analysis.
The COREL and W12SC3 computer programs for supersonic wing design and analysis
NASA Technical Reports Server (NTRS)
Mason, W. H.; Rosen, B. S.
1983-01-01
Two computer codes useful in the supersonic aerodynamic design of wings, including the supersonic maneuver case, are described. COREL, a nonlinear full-potential-equation code, performs an analysis of a spanwise section of the wing in the crossflow plane by assuming conical flow over the section. A subsequent approximate correction to the solution can be made in order to account for nonconical effects. In COREL, the flowfield is assumed to be irrotational (Mach numbers normal to shock waves less than about 1.3) and the full potential equation is solved to obtain detailed results for the leading edge expansion, supercritical crossflow, and any crossflow shockwaves. W12SC3 is a linear theory panel method which combines and extends elements of several of Woodward's codes, with emphasis on fighter applications. After a brief review of the aerodynamic theory used by each method, the use of the codes is illustrated with several examples, detailed input instructions, and a sample case.
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment.
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-08-30
Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collecting and analysis models, e.g., Internet of Things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were collected, and a detailed analysis report was produced. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
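The abstract does not detail the internals of the TPR method, so the sketch below is only one plausible reading of a two-phase regression: fit two linear segments to a task's (progress, elapsed-time) samples, choosing the breakpoint that minimizes total squared error, then extrapolate the second segment to full progress to predict the finishing time. All data are synthetic.

```python
import numpy as np

def two_phase_fit(progress, elapsed):
    """Fit elapsed = f(progress) with two linear segments; the breakpoint
    index k is chosen to minimize the total squared error."""
    best = None
    for k in range(2, len(progress) - 2):
        c1 = np.polyfit(progress[:k], elapsed[:k], 1)
        c2 = np.polyfit(progress[k:], elapsed[k:], 1)
        sse = (np.sum((np.polyval(c1, progress[:k]) - elapsed[:k]) ** 2)
               + np.sum((np.polyval(c2, progress[k:]) - elapsed[k:]) ** 2))
        if best is None or sse < best[0]:
            best = (sse, c1, c2, k)
    return best[1], best[2], best[3]

# progress fraction reported by a running task, and elapsed seconds
progress = np.array([0.05, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8])
elapsed  = np.array([4.0, 8.0, 16.0, 24.0, 30.0, 35.0, 40.0, 45.0, 50.0])
_, c2, _ = two_phase_fit(progress, elapsed)
print(f"predicted finish time: {np.polyval(c2, 1.0):.1f} s")
```

A single-line fit over the same data would be pulled off by the early, faster phase; splitting the fit lets the extrapolation follow only the task's current regime.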
Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...
2012-01-01
A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.
Predictive Software Cost Model Study. Volume II. Software Package Detailed Data.
1980-06-01
will not be limited to: a. ASN-91 NWDS Computer b. Armament System Control Unit (ASCU) c. AN/ASN-90 IMS 6. CONFIGURATION CONTROL. OFP/OTP...planned approach. 3. Detailed analysis and study; impacts on hardware, manuals, data, AGE, etc.; alternatives with pros and cons; cost estimates; ECP...WAIT UNTIL RESOURCE REQUEST FOR MAG TAPE HAS BEEN FULFILLED...RESOURCE MAG TAPE (SHORT FORM)...TEST MAG TAPE RESOURCE
TAIR: A transonic airfoil analysis computer code
NASA Technical Reports Server (NTRS)
Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.
1981-01-01
The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.
Aerothermodynamic Analysis of Commercial Experiment Transporter (COMET) Reentry Capsule
NASA Technical Reports Server (NTRS)
Wood, William A.; Gnoffo, Peter A.; Rault, Didier F. G.
1996-01-01
An aerothermodynamic analysis of the Commercial Experiment Transporter (COMET) reentry capsule has been performed using the laminar thin-layer Navier-Stokes solver Langley Aerothermodynamic Upwind Relaxation Algorithm. Flowfield solutions were obtained at Mach numbers 1.5, 2, 5, 10, 15, 20, 25, and 27.5. Axisymmetric and 5, 10, and 20 degree angles of attack were considered across the Mach-number range, with the Mach 25 conditions taken to 90 degrees angle of attack and the Mach 27.5 cases taken to 60 degrees angle of attack. Detailed surface heat-transfer rates were computed at Mach 20 and 25, revealing that heating rates on the heat-shield shoulder can exceed the stagnation-point heating by 230 percent. Finite-rate chemistry solutions were performed above Mach 10; otherwise perfect gas computations were made. Drag, lift, and pitching moment coefficients are computed and details of a wake flow are presented. The effect of including the wake in the solution domain was investigated, and base pressure corrections to forebody drag coefficients were numerically determined for the lower Mach numbers. Pitching moment comparisons are made with direct simulation Monte Carlo results in the more rarefied flow at the highest Mach numbers, showing agreement within two percent. Thin-layer Navier-Stokes computations of the axial force are found to be 15 percent higher across the speed range than the empirical/Newtonian based results used during the initial trajectory analyses.
NASA Technical Reports Server (NTRS)
Rose, Cheryl A.; Starnes, James H., Jr.
1996-01-01
An efficient, approximate analysis for calculating complete three-dimensional stress fields near regions of geometric discontinuities in laminated composite structures is presented. An approximate three-dimensional local analysis is used to determine the detailed local response due to far-field stresses obtained from a global two-dimensional analysis. The stress results from the global analysis are used as traction boundary conditions for the local analysis. A generalized plane deformation assumption is made in the local analysis to reduce the solution domain to two dimensions. This assumption allows out-of-plane deformation to occur. The local analysis is based on the principle of minimum complementary energy and uses statically admissible stress functions that have an assumed through-the-thickness distribution. Examples are presented to illustrate the accuracy and computational efficiency of the local analysis. Comparisons of the results of the present local analysis with the corresponding results obtained from a finite element analysis and from an elasticity solution are presented. These results indicate that the present local analysis predicts the stress field accurately. Computer execution-times are also presented. The demonstrated accuracy and computational efficiency of the analysis make it well suited for parametric and design studies.
NASA Astrophysics Data System (ADS)
Cara, Javier
2016-05-01
Modal parameters comprise natural frequencies, damping ratios, modal vectors and modal masses. In a theoretic framework, these parameters are the basis for the solution of vibration problems using the theory of modal superposition. In practice, they can be computed from input-output vibration data: the usual procedure is to estimate a mathematical model from the data and then to compute the modal parameters from the estimated model. The most popular models for input-output data are based on the frequency response function, but in recent years the state space model in the time domain has become popular among researchers and practitioners of modal analysis with experimental data. In this work, the equations to compute the modal parameters from the state space model when input and output data are available (like in combined experimental-operational modal analysis) are derived in detail using invariants of the state space model: the equations needed to compute natural frequencies, damping ratios and modal vectors are well known in the operational modal analysis framework, but the equation needed to compute the modal masses has not generated much interest in technical literature. These equations are applied to both a numerical simulation and an experimental study in the last part of the work.
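The well-established part of this computation, extracting natural frequencies and damping ratios from invariants (eigenvalues) of the state matrix, can be sketched as follows; the modal-mass equation the paper derives additionally involves the input and output matrices and is not reproduced here. A discrete-time state matrix sampled at interval dt is assumed, with the 1-DOF test system below constructed only as a check.

```python
import numpy as np
import scipy.linalg

def modal_parameters(A, dt):
    """Natural frequencies (Hz), damping ratios, and mode shapes from the
    eigenvalues/eigenvectors of a discrete-time state matrix A."""
    mu, phi = np.linalg.eig(A)
    s = np.log(mu) / dt              # map to continuous-time poles
    wn = np.abs(s)                   # natural frequencies (rad/s)
    zeta = -s.real / wn              # damping ratios
    return wn / (2.0 * np.pi), zeta, phi

# single-DOF check: a 1 Hz mode with 5% damping, sampled at 100 Hz
wn, z, dt = 2.0 * np.pi, 0.05, 0.01
Ac = np.array([[0.0, 1.0], [-wn**2, -2.0 * z * wn]])
A = scipy.linalg.expm(Ac * dt)       # exact discretization of the system
f_hz, zeta, _ = modal_parameters(A, dt)
print(f_hz, zeta)                    # complex-conjugate pair: ~1.0 Hz, ~0.05
```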
Computational Biochemistry-Enzyme Mechanisms Explored.
Culka, Martin; Gisdon, Florian J; Ullmann, G Matthias
2017-01-01
Understanding enzyme mechanisms is a major task to achieve in order to comprehend how living cells work. Recent advances in biomolecular research provide a huge amount of data on enzyme kinetics and structure. The analysis of diverse experimental results and their combination into an overall picture is, however, often challenging. Microscopic details of the enzymatic processes are often anticipated based on several hints from macroscopic experimental data. Computational biochemistry aims at the creation of a computational model of an enzyme in order to explain microscopic details of the catalytic process and reproduce or predict macroscopic experimental findings. Results of such computations are in part complementary to experimental data and provide an explanation of a biochemical process at the microscopic level. In order to evaluate the mechanism of an enzyme, a structural model is constructed which can be analyzed by several theoretical approaches. Several simulation methods can and should be combined to get a reliable picture of the process of interest. Furthermore, abstract models of biological systems can be constructed combining computational and experimental data. In this review, we discuss structural computational models of enzymatic systems. We first discuss various models to simulate enzyme catalysis. Furthermore, we review various approaches for characterizing the enzyme mechanism both qualitatively and quantitatively using different modeling approaches. © 2017 Elsevier Inc. All rights reserved.
A micro-hydrology computation ordering algorithm
NASA Astrophysics Data System (ADS)
Croley, Thomas E.
1980-11-01
Discrete-distributed-parameter models are essential for watershed modelling where practical consideration of spatial variations in watershed properties and inputs is desired. Such modelling is necessary for analysis of detailed hydrologic impacts from management strategies and land-use effects. Trade-offs between model validity and model complexity exist in resolution of the watershed. Once these are determined, the watershed is then broken into sub-areas which each have essentially spatially-uniform properties. Lumped-parameter (micro-hydrology) models are applied to these sub-areas and their outputs are combined through the use of a computation ordering technique, as illustrated by many discrete-distributed-parameter hydrology models. Manual ordering of these computations requires forethought, and is tedious, error prone, sometimes storage intensive and least adaptable to changes in watershed resolution. A programmable algorithm for ordering micro-hydrology computations is presented that enables automatic ordering of computations within the computer via an easily understood and easily implemented "node" definition, numbering and coding scheme. This scheme and the algorithm are detailed in logic flow-charts and an example application is presented. Extensions and modifications of the algorithm are easily made for complex geometries or differing micro-hydrology models. The algorithm is shown to be superior to manual ordering techniques and has potential use in high-resolution studies.
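Croley's specific node definition, numbering and coding scheme is not reproduced in the abstract; the sketch below shows an equivalent automatic ordering using Kahn's topological sort on a drainage graph, which captures the same requirement that each sub-area is computed only after every sub-area draining into it. The example watershed is hypothetical.

```python
from collections import deque

def order_computations(drains_to):
    """Order sub-area computations so each node is processed only after all
    nodes draining into it; drains_to maps node -> downstream node (or None
    for the watershed outlet)."""
    nodes = set(drains_to) | {v for v in drains_to.values() if v is not None}
    indegree = {n: 0 for n in nodes}
    for down in drains_to.values():
        if down is not None:
            indegree[down] += 1
    ready = deque(n for n in nodes if indegree[n] == 0)  # headwater sub-areas
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        down = drains_to.get(n)
        if down is not None:
            indegree[down] -= 1
            if indegree[down] == 0:   # all upstream outputs now available
                ready.append(down)
    return order

# hypothetical watershed: sub-areas 1 and 2 drain to 3; 3 and 4 drain to
# the outlet 5
print(order_computations({1: 3, 2: 3, 3: 5, 4: 5, 5: None}))
# -> [1, 2, 4, 3, 5]: every node appears after all of its upstream nodes
```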
Interoperability of GADU in using heterogeneous Grid resources for bioinformatics applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sulakhe, D.; Rodriguez, A.; Wilde, M.
2008-03-01
Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The genome analysis and database update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing the Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated scalable system such as GADU that can run jobs on different Grids. The paper describes the resource-independent configuration of GADU using the Pegasus-based virtual data system that makes high-throughput computational tools interoperable on heterogeneous Grid resources. The paper also highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper will not go into the details of problems involved or the lessons learned in using individual Grid resources as it has already been published in our paper on the genome analysis research environment (GNARE) and will focus primarily on the architecture that makes GADU resource independent and interoperable across heterogeneous Grid resources.
Algorithm implementation on the Navier-Stokes computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krist, S.E.; Zang, T.A.
1987-03-01
The Navier-Stokes Computer is a multi-purpose parallel-processing supercomputer which is currently under development at Princeton University. It consists of multiple local memory parallel processors, called Nodes, which are interconnected in a hypercube network. Details of the procedures involved in implementing an algorithm on the Navier-Stokes computer are presented. The particular finite difference algorithm considered in this analysis was developed for simulation of laminar-turbulent transition in wall bounded shear flows. Projected timing results for implementing this algorithm indicate that operation rates in excess of 42 GFLOPS are feasible on a 128 Node machine.
Algorithm implementation on the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Krist, Steven E.; Zang, Thomas A.
1987-01-01
The Navier-Stokes Computer is a multi-purpose parallel-processing supercomputer which is currently under development at Princeton University. It consists of multiple local memory parallel processors, called Nodes, which are interconnected in a hypercube network. Details of the procedures involved in implementing an algorithm on the Navier-Stokes computer are presented. The particular finite difference algorithm considered in this analysis was developed for simulation of laminar-turbulent transition in wall bounded shear flows. Projected timing results for implementing this algorithm indicate that operation rates in excess of 42 GFLOPS are feasible on a 128 Node machine.
NASA Technical Reports Server (NTRS)
Gibson, A. F.
1983-01-01
A system of computer programs has been developed to model general three-dimensional surfaces. Surfaces are modeled as sets of parametric bicubic patches. There are also capabilities to transform coordinates, to compute mesh/surface intersection normals, and to format input data for a transonic potential flow analysis. A graphical display of surface models and intersection normals is available. There are additional capabilities to regulate point spacing on input curves and to compute surface intersection curves. Internal details of the implementation of this system are explained, and maintenance procedures are specified.
Performance analysis of SA-3 missile second stage
NASA Technical Reports Server (NTRS)
Helmy, A. M.
1981-01-01
One SA-3 missile was disassembled. The constituents of the second stage were thoroughly investigated for geometrical details. The second stage slotted composite propellant grain was subjected to mechanical properties testing, physiochemical analyses, and burning rate measurements at different conditions. To determine the propellant performance parameters, the slotted composite propellant grain was machined into a set of small-size tubular grains. These grains were fired in a small size rocket motor with a set of interchangeable nozzles with different throat diameters. The firings were carried out at three different conditions. The data from test motor firings, physiochemical properties of the propellant, burning rate measurement results and geometrical details of the second stage motor, were used as input data in a computer program to compute the internal ballistic characteristics of the second stage.
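A typical core step in such an internal-ballistics program is the steady-state chamber-pressure balance between propellant mass generation under the Saint-Robert burning law, r = aP^n, and nozzle mass flow, which gives P_c = (a ρ_p c* A_b/A_t)^(1/(1−n)). The sketch below evaluates this balance with hypothetical numbers; it is not the referenced program or the measured SA-3 data.

```python
# Steady-state chamber pressure from the Saint-Robert law r = a * P**n,
# equating propellant mass generation (rho * Ab * r) with nozzle mass flow
# (P * At / cstar). All values below are hypothetical placeholders.
a     = 5.0e-5    # burn-rate coefficient (m/s per Pa**n)
n     = 0.35      # pressure exponent (from burning-rate measurements)
rho   = 1750.0    # propellant density (kg/m^3)
cstar = 1500.0    # characteristic velocity (m/s)
Ab    = 0.80      # instantaneous burning surface area (m^2)
At    = 0.004     # nozzle throat area (m^2)

Pc = (a * rho * cstar * Ab / At) ** (1.0 / (1.0 - n))
print(f"chamber pressure ~ {Pc / 1e6:.2f} MPa")   # ~6.3 MPa for these inputs
```

In a full program this balance is marched in time as the slotted grain geometry (and hence Ab) evolves, which is where the measured geometrical details of the second stage enter.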
Non-standard analysis and embedded software
NASA Technical Reports Server (NTRS)
Platek, Richard
1995-01-01
One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems which have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be carried out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack which we believe will be fruitful.
A Gateway for Phylogenetic Analysis Powered by Grid Computing Featuring GARLI 2.0
Bazinet, Adam L.; Zwickl, Derrick J.; Cummings, Michael P.
2014-01-01
We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. [garli, gateway, grid computing, maximum likelihood, molecular evolution portal, phylogenetics, web service.] PMID:24789072
DOT National Transportation Integrated Search
2016-02-01
In this study, a computational approach for conducting durability analysis of bridges using detailed finite element models is developed. The underlying approach adopted is based on the hypothesis that the two main factors affecting the life of a brid...
DORCA 2 computer program. Volume 2: Programmer's manual
NASA Technical Reports Server (NTRS)
Gold, B. J.
1972-01-01
A guide for coding the Dynamic Operational Requirements and Cost Analysis Program (DORCA 2) is presented. The manual provides a detailed description of the operation of every subroutine, the layout in core of the major matrices and arrays, and the meaning of all program values. Flow charts are included.
CALL--Past, Present, and Future.
ERIC Educational Resources Information Center
Bax, Stephen
2003-01-01
Provides a critical examination and reassessment of the history of computer assisted language learning (CALL), and argues for three new strategies--restricted, open, and integrated. Offers definitions and descriptions of the three approaches and argues that they allow a more detailed analysis of institutions and classrooms than earlier analyses.…
Computational Flow Analysis of a Left Ventricular Assist Device
NASA Technical Reports Server (NTRS)
Kiris, Cetin; Kwak, Dochan; Benkowski, Robert
1995-01-01
Computational fluid dynamics has been developed to a level where it has become an indispensable part of aerospace research and design. Technology developed for aerospace applications can also be utilized for the benefit of human health. For example, a flange-to-flange rocket engine fuel-pump simulation includes the rotating and non-rotating components: the flow straighteners, the impeller, and the diffusers. A Ventricular Assist Device developed by NASA Johnson Space Center and Baylor College of Medicine has a design similar to a rocket engine fuel pump in that it also consists of a flow straightener, an impeller, and a diffuser. Accurate and detailed knowledge of the flowfield obtained by incompressible flow calculations can be greatly beneficial to designers in their effort to reduce the cost and improve the reliability of these devices. In addition to the geometric complexities, a variety of flow phenomena are encountered in biofluids. These include turbulent boundary layer separation, wakes, transition, tip vortex resolution, three-dimensional effects, and Reynolds number effects. In order to increase the role of Computational Fluid Dynamics (CFD) in the design process, the CFD analysis tools must be evaluated and validated so that designers gain confidence in their use. The incompressible flow solver INS3D has been applied to the flow inside liquid rocket engine turbopump components and extensively validated. This paper details how the computational flow simulation capability developed for liquid rocket engine pump component analysis has been applied to the Left Ventricular Assist Device being developed jointly by NASA JSC and Baylor College of Medicine.
Wildlife software: procedures for publication of computer software
Samuel, M.D.
1990-01-01
Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.
MOD-1 Wind Turbine Generator Analysis and Design Report, Volume 2
NASA Technical Reports Server (NTRS)
1979-01-01
The MOD-1 detail design is appended. The supporting analyses presented include a parametric system trade study, a verification of the computer codes used for rotor loads analysis, a metal blade study, and a definition of the design loads at each principal wind turbine generator interface for critical loading conditions. Shipping and assembly requirements, composite blade development, and electrical stability are also discussed.
Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis
Hauser, Tobias U.; Fiore, Vincenzo G.; Moutoussis, Michael; Dolan, Raymond J.
2016-01-01
Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in systems that generate this phenotypic increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. PMID:26787097
Digital image processing for information extraction.
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1973-01-01
The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.
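One of the detail-emphasis operations referred to above can be sketched compactly. The following is a generic unsharp-masking example written for illustration, not the code used in this work:

    import numpy as np

    def unsharp_mask(img: np.ndarray, amount: float = 1.0) -> np.ndarray:
        # Emphasize detail by adding back the difference between the image
        # and a blurred (3x3 box-averaged) copy of itself.
        h, w = img.shape
        p = np.pad(img.astype(float), 1, mode="edge")
        blur = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
        return img + amount * (img - blur)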
A Combined Experimental/Computational Investigation of a Rocket Based Combined Cycle Inlet
NASA Technical Reports Server (NTRS)
Smart, Michael K.; Trexler, Carl A.; Goldman, Allen L.
2001-01-01
A rocket based combined cycle inlet geometry has undergone wind tunnel testing and computational analysis with Mach 4 flow at the inlet face. Performance parameters obtained from the wind tunnel tests were the mass capture, the maximum back-pressure, and the self-starting characteristics of the inlet. The CFD analysis supplied a confirmation of the mass capture, the inlet efficiency and the details of the flowfield structure. Physical parameters varied during the test program were cowl geometry, cowl position, body-side bleed magnitude and ingested boundary layer thickness. An optimum configuration was determined for the inlet as a result of this work.
NASA Technical Reports Server (NTRS)
Williams, F. W.; Anderson, M. S.; Kennedy, D.; Butler, R.; Aston, G.
1990-01-01
A computer program which is designed for efficient, accurate buckling and vibration analysis and optimum design of composite panels is described. The capabilities of the program are given along with detailed user instructions. It is written in FORTRAN 77, is operational on VAX, IBM, and CDC computers, and should be readily adapted to others. Several illustrations of the various aspects of the input are given, along with example problems illustrating the use and application of the program.
NASA Astrophysics Data System (ADS)
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA, and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for their enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch Dr Liliana Teodorescu, Brunel University. The PDF also contains details of the workshop's committees and sponsors.
Computer program to perform cost and weight analysis of transport aircraft. Volume 1: Summary
NASA Technical Reports Server (NTRS)
1973-01-01
A digital computer program for evaluating the weight and costs of advanced transport designs was developed. The resultant program, intended for use at the preliminary design level, incorporates both batch mode and interactive graphics run capability. The basis of the weight and cost estimation method developed is a unique way of predicting the physical design of each detail part of a vehicle structure at a time when only configuration concept drawings are available. In addition, the technique relies on methods to predict the precise manufacturing processes and the associated material required to produce each detail part. Weight data are generated in four areas of the program. Overall vehicle system weights are derived on a statistical basis as part of the vehicle sizing process. Theoretical weights, actual weights, and the weight of the raw material to be purchased are derived as part of the structural synthesis and part definition processes based on the computed part geometry.
Ruppin, Eytan; Papin, Jason A; de Figueiredo, Luis F; Schuster, Stefan
2010-08-01
With the advent of modern omics technologies, it has become feasible to reconstruct (quasi-) whole-cell metabolic networks and characterize them in more and more detail. Computer simulations of the dynamic behavior of such networks are difficult due to a lack of kinetic data and to computational limitations. In contrast, network analysis based on appropriate constraints such as the steady-state condition (constraint-based analysis) is feasible and allows one to derive conclusions about the system's metabolic capabilities. Here, we review methods for the reconstruction of metabolic networks, modeling techniques such as flux balance analysis and elementary flux modes and current progress in their development and applications. Game-theoretical methods for studying metabolic networks are discussed as well.
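Flux balance analysis, one of the techniques reviewed here, reduces to a linear program: maximize an objective flux subject to the steady-state constraint S v = 0 and capacity bounds on each reaction. The sketch below uses a toy three-reaction network invented for illustration, not a model from the review:

    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometry: R1 imports A, R2 converts A -> B, R3 drains B
    # ("biomass"). Rows are metabolites A and B; columns are R1, R2, R3.
    S = np.array([[1.0, -1.0,  0.0],
                  [0.0,  1.0, -1.0]])
    c = np.array([0.0, 0.0, -1.0])          # linprog minimizes, so negate v3
    bounds = [(0, 5), (0, 10), (0, 10)]     # uptake capped at 5 units
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print(res.x)                            # optimal fluxes, here [5, 5, 5]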
Large-Scale Computation of Nuclear Magnetic Resonance Shifts for Paramagnetic Solids Using CP2K.
Mondal, Arobendo; Gaultois, Michael W; Pell, Andrew J; Iannuzzi, Marcella; Grey, Clare P; Hutter, Jürg; Kaupp, Martin
2018-01-09
Large-scale computations of nuclear magnetic resonance (NMR) shifts for extended paramagnetic solids (pNMR) are reported using the highly efficient Gaussian-augmented plane-wave implementation of the CP2K code. Combining hyperfine couplings obtained with hybrid functionals with g-tensors and orbital shieldings computed using gradient-corrected functionals, contact, pseudocontact, and orbital-shift contributions to pNMR shifts are accessible. Due to the efficient and highly parallel performance of CP2K, a wide variety of materials with large unit cells can be studied with extended Gaussian basis sets. Validation of various approaches for the different contributions to pNMR shifts is done first for molecules in a large supercell in comparison with typical quantum-chemical codes. This is then extended to a detailed study of g-tensors for extended solid transition-metal fluorides and for a series of complex lithium vanadium phosphates. Finally, lithium pNMR shifts are computed for Li3V2(PO4)3, for which detailed experimental data are available. This has allowed an in-depth study of different approaches (e.g., full periodic versus incremental cluster computations of g-tensors and different functionals and basis sets for hyperfine computations) as well as a thorough analysis of the different contributions to the pNMR shifts. This study paves the way for a more widespread computational treatment of NMR shifts for paramagnetic materials.
NASA Technical Reports Server (NTRS)
Stoll, Frederick
1993-01-01
The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.
NASA Technical Reports Server (NTRS)
Tennille, Geoffrey M.; Howser, Lona M.
1993-01-01
The use of the CONVEX computers that are an integral part of the Supercomputing Network Subsystems (SNS) of the Central Scientific Computing Complex of LaRC is briefly described. Features of the CONVEX computers that are significantly different from the CRAY supercomputers are covered, including: FORTRAN, C, the architecture of the CONVEX computers, the CONVEX environment, batch job submittal, debugging, performance analysis, utilities unique to CONVEX, and documentation. This revision reflects the addition of the Applications Compiler and the X-based debugger, CXdb. The document is intended for all CONVEX users as a ready reference to frequently asked questions and to more detailed information contained within the vendor manuals. It is appropriate for both the novice and the experienced user.
Toward Theory-Based Instruction in Scientific Problem Solving.
ERIC Educational Resources Information Center
Heller, Joan I.; And Others
Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…
Implementation of radiation shielding calculation methods. Volume 2: Seminar/Workshop notes
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
Detailed descriptions are presented of the input data for each of the MSFC computer codes applied to the analysis of a realistic nuclear-propelled vehicle. The analytical techniques employed include cross-section data preparation, one- and two-dimensional discrete ordinates transport, point kernel, and single-scatter methods.
Application of computer-aided dispatch in law enforcement: An introductory planning guide
NASA Technical Reports Server (NTRS)
Sohn, R. L.; Gurfield, R. M.; Garcia, E. A.; Fielding, J. E.
1975-01-01
A set of planning guidelines for the application of computer-aided dispatching (CAD) to law enforcement is presented. Some essential characteristics and applications of CAD are outlined; the results of a survey of systems in the operational or planning phases are summarized. Requirements analysis, system concept design, implementation planning, and performance and cost modeling are described and demonstrated with numerous examples. Detailed descriptions of typical law enforcement CAD systems, and a list of vendor sources, are given in appendixes.
NASA Technical Reports Server (NTRS)
Makivic, Miloje S.
1996-01-01
This is the final technical report for the project entitled: "High-Performance Computing and Four-Dimensional Data Assimilation: The Impact on Future and Current Problems", funded at NPAC by the DAO at NASA/GSFC. First, the motivation for the project is given in the introductory section, followed by the executive summary of major accomplishments and the list of project-related publications. Detailed analysis and description of research results is given in subsequent chapters and in the Appendix.
Signal-processing analysis of the MC2823 radar fuze: an addendum concerning clutter effects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jelinek, D.A.
1978-07-01
A detailed analysis of the signal processing of the MC2823 radar fuze was published by Thompson in 1976 which enabled the computation of dud probability versus signal-to-noise ratio, where the noise was receiver noise. An addendum to Thompson's work was published by Williams in 1978 that modified the weighting function used by Thompson. The analysis presented herein extends the work of Thompson to include the effects of clutter (the non-signal portion of the echo from a terrain) using the new weighting function. This extension enables computation of dud probability versus signal-to-total-noise ratio, where total noise is the sum of the receiver-noise power and the clutter power.
Preliminary Design of a Manned Nuclear Electric Propulsion Vehicle Using Genetic Algorithms
NASA Technical Reports Server (NTRS)
Irwin, Ryan W.; Tinker, Michael L.
2005-01-01
Nuclear electric propulsion (NEP) vehicles will be needed for future manned missions to Mars and beyond. Candidate designs must be identified for further detailed design from a large array of possibilities. Genetic algorithms have proven their utility in conceptual design studies by effectively searching a large design space to pinpoint unique optimal designs. This research combined analysis codes for NEP subsystems with a genetic algorithm. The use of penalty functions with scaling ratios was investigated to increase computational efficiency. Also, the selection of design variables for optimization was considered to reduce computation time without losing beneficial design search space. Finally, trend analysis of a reference mission to the asteroids yielded a group of candidate designs for further analysis.
Bayesian Latent Class Analysis Tutorial.
Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca
2018-01-01
This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes' theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied in a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
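The decomposition into simpler calculations that the tutorial describes can be outlined in code. The sketch below is a minimal NumPy Gibbs sampler for a binary-item latent class model, written for this summary rather than taken from the tutorial (which works in R); the three conditional draws are the standard ones for this model:

    import numpy as np
    rng = np.random.default_rng(0)

    def gibbs_lca(Y, C=2, iters=1000, alpha=1.0, a=1.0, b=1.0):
        # Y: (N, J) binary item-response matrix; C latent classes.
        N, J = Y.shape
        z = rng.integers(C, size=N)                  # latent class labels
        pi = np.full(C, 1.0 / C)                     # class prevalences
        theta = rng.uniform(0.2, 0.8, size=(C, J))   # item endorsement probs
        for _ in range(iters):
            # 1) draw each z_i given pi and theta
            logp = np.log(pi) + Y @ np.log(theta).T + (1 - Y) @ np.log(1 - theta).T
            p = np.exp(logp - logp.max(axis=1, keepdims=True))
            p /= p.sum(axis=1, keepdims=True)
            z = np.array([rng.choice(C, p=row) for row in p])
            # 2) draw pi given z: Dirichlet with updated class counts
            pi = rng.dirichlet(alpha + np.bincount(z, minlength=C))
            # 3) draw theta given z and Y: Beta with updated success counts
            for c in range(C):
                Yc = Y[z == c]
                theta[c] = rng.beta(a + Yc.sum(0), b + len(Yc) - Yc.sum(0))
        return z, pi, theta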
Numerical Propulsion System Simulation (NPSS) 1999 Industry Review
NASA Technical Reports Server (NTRS)
Lytle, John; Follen, Greg; Naiman, Cynthia; Evans, Austin
2000-01-01
The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. In addition, the paper contains a summary of the feedback received from industry partners in the development effort and the actions taken over the past year to respond to that feedback. The NPSS development was supported in FY99 by the High Performance Computing and Communications Program.
Stream network analysis from orbital and suborbital imagery, Colorado River Basin, Texas
NASA Technical Reports Server (NTRS)
Baker, V. R. (Principal Investigator)
1973-01-01
The author has identified the following significant results. Orbital SL-2 imagery (earth terrain camera S-190B), received September 5, 1973, was subjected to quantitative network analysis and compared to 7.5-minute topographic mapping (scale: 1/24,000) and U.S.D.A. conventional black and white aerial photography (scale: 1/22,200). Results can only be considered suggestive because detail on the SL-2 imagery was badly obscured by heavy cloud cover. The upper Bee Creek basin was chosen for analysis because it appeared in a relatively cloud-free portion of the orbital imagery. Drainage maps were drawn from the three sources, digitized into a computer-compatible format, and analyzed by the WATER system computer program. Even at its small scale (1/172,000) and with bad haze, the orbital photo showed much drainage detail. The contour-like character of the Glen Rose Formation's resistant limestone units allowed channel definition. The errors in pattern recognition can be attributed to local areas of dense vegetation and to other areas of very high albedo caused by surficial exposure of caliche. The latter effect caused particular difficulty in the determination of drainage divides.
Ford, Patrick; Santos, Eduardo; Ferrão, Paulo; Margarido, Fernanda; Van Vliet, Krystyn J; Olivetti, Elsa
2016-05-03
The challenges brought on by the increasing complexity of electronic products, and the criticality of the materials these devices contain, present an opportunity for maximizing the economic and societal benefits derived from recovery and recycling. Small appliances and computer devices (SACD), including mobile phones, contain significant amounts of precious metals including gold and platinum, the present value of which should serve as a key economic driver for many recycling decisions. However, a detailed analysis is required to estimate the economic value that is unrealized by incomplete recovery of these and other materials, and to ascertain how such value could be reinvested to improve recovery processes. We present a dynamic product flow analysis for SACD throughout Portugal, a European Union member, including annual data detailing product sales and industrial-scale preprocessing data for recovery of specific materials from devices. We employ preprocessing facility and metals pricing data to identify losses, and develop an economic framework around the value of recycling including uncertainty. We show that significant economic losses occur during preprocessing (over $70 M USD unrecovered in computers and mobile phones, 2006-2014) due to operations that fail to target high value materials, and characterize preprocessing operations according to material recovery and total costs.
A new predictive multi-zone model for HCCI engine combustion
Bissoli, Mattia; Frassoldati, Alessio; Cuoci, Alberto; ...
2016-06-30
Here, this work introduces a new predictive multi-zone model for the description of combustion in Homogeneous Charge Compression Ignition (HCCI) engines. The model exploits the existing OpenSMOKE++ computational suite to handle detailed kinetic mechanisms, providing reliable predictions of the in-cylinder auto-ignition processes. All the elements with a significant impact on the combustion performance and emissions, like turbulence, heat and mass exchanges, crevices, residual burned gases, and thermal and feed stratification, are taken into account. Compared to other computational approaches, this model improves the description of mixture stratification phenomena by coupling a wall heat transfer model derived from CFD applications with a proper turbulence model. Furthermore, the calibration of this multi-zone model requires only three parameters, which can be derived from a non-reactive CFD simulation: these adaptive variables depend only on the engine geometry and remain fixed across a wide range of operating conditions, allowing the prediction of auto-ignition, pressure traces and pollutants. This computational framework enables the use of detailed kinetic mechanisms, as well as Rate of Production Analysis (RoPA) and Sensitivity Analysis (SA), to investigate the complex chemistry involved in the auto-ignition and pollutant formation processes. In the final sections of the paper, these capabilities are demonstrated through comparison with experimental data.
Structure-guided Protein Transition Modeling with a Probabilistic Roadmap Algorithm.
Maximova, Tatiana; Plaku, Erion; Shehu, Amarda
2016-07-07
Proteins are macromolecules in perpetual motion, switching between structural states to modulate their function. A detailed characterization of the precise yet complex relationship between protein structure, dynamics, and function requires elucidating transitions between functionally-relevant states. Doing so challenges both wet and dry laboratories, as protein dynamics involves disparate temporal scales. In this paper we present a novel, sampling-based algorithm to compute transition paths. The algorithm exploits two main ideas. First, it leverages known structures to initialize its search and define a reduced conformation space for rapid sampling. This is key to addressing the insufficient sampling issue suffered by sampling-based algorithms. Second, the algorithm embeds samples in a nearest-neighbor graph where transition paths can be efficiently computed via queries. The algorithm adapts the probabilistic roadmap framework that is popular in robot motion planning. In addition to efficiently computing lowest-cost paths between any given structures, the algorithm allows investigating hypotheses regarding the order of experimentally-known structures in a transition event. This novel contribution is likely to open up new avenues of research. Detailed analysis is presented on multiple-basin proteins of relevance to human disease. Multiscaling and the AMBER ff14SB force field are used to obtain energetically-credible paths at atomistic detail.
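The roadmap idea, sample states, connect nearest neighbors, then answer path queries, can be sketched generically. The code below illustrates a plain probabilistic roadmap with a Euclidean stand-in for an energy-based edge cost; it is not the authors' structure-guided algorithm:

    import heapq
    import numpy as np

    rng = np.random.default_rng(1)
    pts = rng.uniform(-1, 1, size=(200, 2))   # sampled "conformations"
    k = 8                                     # neighbors per node

    # Build a k-nearest-neighbor graph with distances as edge costs.
    graph = {i: [] for i in range(len(pts))}
    for i, p in enumerate(pts):
        d = np.linalg.norm(pts - p, axis=1)
        for j in np.argsort(d)[1:k + 1]:
            graph[i].append((int(j), float(d[j])))

    def dijkstra(graph, src, dst):
        # Lowest-cost path query over the roadmap (assumes dst is reachable).
        dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == dst:
                break
            if d > dist.get(u, np.inf):
                continue
            for v, w in graph[u]:
                if d + w < dist.get(v, np.inf):
                    dist[v], prev[v] = d + w, u
                    heapq.heappush(pq, (d + w, v))
        path = [dst]
        while path[-1] != src:
            path.append(prev[path[-1]])
        return path[::-1]

    print(dijkstra(graph, 0, 1))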
2000 Numerical Propulsion System Simulation Review
NASA Technical Reports Server (NTRS)
Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac
2001-01-01
The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2000 by the High Performance Computing and Communications Program.
2001 Numerical Propulsion System Simulation Review
NASA Technical Reports Server (NTRS)
Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac
2002-01-01
The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2001 by the High Performance Computing and Communications Program.
Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See
2005-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer for a hypothetical solid-core nuclear thermal engine, including the thrust chamber and nozzle. The multiphysics aspects of the model include real fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow element failures demonstrated in the Rover/NERVA testing, investigate performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: a detailed analysis of a multi-channel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element will provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis will promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible by testing performed by other related efforts.
Improved numerical solutions for chaotic-cancer-model
NASA Astrophysics Data System (ADS)
Yasir, Muhammad; Ahmad, Salman; Ahmed, Faizan; Aqeel, Muhammad; Akbar, Muhammad Zubair
2017-01-01
In the biological sciences, the dynamical system of the cancer model is well known for its sensitivity and chaoticity. The present work provides a detailed computational study of the cancer model that counterbalances its sensitive dependence on initial conditions and parameter values. The chaotic cancer model is discretized into a system of nonlinear equations that are solved using the well-known Successive-Over-Relaxation (SOR) method with proven convergence. This technique makes it possible to solve large systems and provides a more accurate approximation, which is illustrated through tables, time history maps and phase portraits with detailed analysis.
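For reference, the classic SOR iteration has the form sketched below; this is a generic illustration on a linear system A x = b, whereas the paper applies the iteration to the discretized nonlinear cancer model:

    import numpy as np

    def sor(A, b, omega=1.5, tol=1e-10, max_iter=10_000):
        # Successive-Over-Relaxation: sweep through the unknowns, blending
        # each Gauss-Seidel update with the previous iterate via omega.
        n = len(b)
        x = np.zeros(n)
        for _ in range(max_iter):
            x_old = x.copy()
            for i in range(n):
                sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
            if np.linalg.norm(x - x_old, ord=np.inf) < tol:
                break
        return x

    # Small symmetric positive definite example (converges for 0 < omega < 2).
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(sor(A, b))   # approx [0.0909, 0.6364]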
Detailed mechanism of benzene oxidation
NASA Technical Reports Server (NTRS)
Bittker, David A.
1987-01-01
A detailed quantitative mechanism for the oxidation of benzene in both argon- and nitrogen-diluted systems is presented. Computed ignition delay times for argon-diluted mixtures are in satisfactory agreement with experimental results for a wide range of initial conditions. An experimental temperature-versus-time profile for a nitrogen-diluted oxidation was accurately matched, and several concentration profiles were matched qualitatively. Application of sensitivity analysis has given approximate rate constant expressions for the two dominant heat release reactions, the oxidation of C6H5 and C5H5 radicals by molecular oxygen.
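Sensitivity analysis of the kind applied here can be illustrated with a brute-force finite-difference sketch: perturb a rate constant and observe the change in a model output. The toy first-order fuel-decay example below is for illustration only and is not part of the benzene mechanism:

    import numpy as np
    from scipy.integrate import solve_ivp

    def half_life(k):
        # Time at which a first-order species dy/dt = -k*y falls to 50%.
        sol = solve_ivp(lambda t, y: [-k * y[0]], (0, 10), [1.0],
                        dense_output=True)
        ts = np.linspace(0, 10, 10_000)
        return ts[np.argmin(np.abs(sol.sol(ts)[0] - 0.5))]

    k = 1.0
    s = (half_life(1.01 * k) - half_life(k)) / (0.01 * k)
    print(s)   # finite-difference sensitivity, approx -ln(2)/k**2 = -0.693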
NASA Astrophysics Data System (ADS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D’Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. 
S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fong, H.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, Haris; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.-M.; King, E. J.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krishnan, B.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lück, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magee, R. 
M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R. M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O’Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O’Reilly, B.; O’Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Porter, E. K.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sampson, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. 
S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shahriar, M. S.; Shaltev, M.; Shao, Z.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, A. D.; Simakov, D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stevenson, S.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. V.; Torrie, C. I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; Vallisneri, M.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wesels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J. L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrożny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration
2016-12-01
This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
Monte Carlo track-structure calculations for aqueous solutions containing biomolecules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, J.E.; Hamm, R.N.; Ritchie, R.H.
1993-10-01
Detailed Monte Carlo calculations provide a powerful tool for understanding mechanisms of radiation damage to biological molecules irradiated in aqueous solution. This paper describes the computer codes, OREC and RADLYS, which have been developed for this purpose over a number of years. Some results are given for calculations of the irradiation of pure water. Comparisons are presented between computations for liquid water and water vapor. Detailed calculations of the chemical yields of several products from X-irradiated, oxygen-free glycylglycine solutions have been performed as a function of solute concentration. Excellent agreement is obtained between calculated and measured yields. The Monte Carlo analysis provides a complete mechanistic picture of pathways to observed radiolytic products. This approach, successful with glycylglycine, will be extended to study the irradiation of oligonucleotides in aqueous solution.
A general concept for consistent documentation of computational analyses
Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.
2015-01-01
The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
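A minimal sketch of the two-document idea follows; the element and attribute names are placeholders invented for illustration, not the published schema:

    import xml.etree.ElementTree as ET

    # Process document: the reusable definition of a computational step.
    process = ET.Element("process", name="read_alignment", version="1.0")
    step = ET.SubElement(process, "step", tool="bwa", toolVersion="0.7.17")
    ET.SubElement(step, "parameter", name="threads")

    # Analysis document: one concrete run against that process definition.
    analysis = ET.Element("analysis", process="read_alignment")
    ET.SubElement(analysis, "input", path="sample1.fastq")
    ET.SubElement(analysis, "setting", name="threads", value="8")

    print(ET.tostring(process, encoding="unicode"))
    print(ET.tostring(analysis, encoding="unicode"))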
Efficient and Flexible Computation of Many-Electron Wave Function Overlaps.
Plasser, Felix; Ruckenbauer, Matthias; Mai, Sebastian; Oppel, Markus; Marquetand, Philipp; González, Leticia
2016-03-08
A new algorithm for the computation of the overlap between many-electron wave functions is described. This algorithm allows for the extensive use of recurring intermediates and thus provides high computational efficiency. Because of the general formalism employed, overlaps can be computed for varying wave function types, molecular orbitals, basis sets, and molecular geometries. This paves the way for efficiently computing nonadiabatic interaction terms for dynamics simulations. In addition, other application areas can be envisaged, such as the comparison of wave functions constructed at different levels of theory. Aside from explaining the algorithm and evaluating the performance, a detailed analysis of the numerical stability of wave function overlaps is carried out, and strategies for overcoming potential severe pitfalls due to displaced atoms and truncated wave functions are presented.
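In the simplest case of two single Slater determinants, the wave function overlap reduces to the determinant of the occupied-orbital overlap matrix; the algorithm described here generalizes this to multi-determinant wave functions by reusing intermediates. A minimal sketch with hypothetical inputs:

    import numpy as np

    def determinant_overlap(C1, S_ao, C2):
        # <Psi1|Psi2> for single determinants with occupied MO coefficient
        # matrices C1, C2 (n_AO x n_occ) and AO overlap matrix S_ao.
        return np.linalg.det(C1.T @ S_ao @ C2)

    # Toy numbers: 4 AOs (assumed orthonormal), 2 occupied orbitals each.
    rng = np.random.default_rng(0)
    C1, C2 = rng.normal(size=(4, 2)), rng.normal(size=(4, 2))
    print(determinant_overlap(C1, np.eye(4), C2))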
Preferred computer activities among individuals with dementia: a pilot study.
Tak, Sunghee H; Zhang, Hongmei; Hong, Song Hee
2015-03-01
Computers offer new activities that are easily accessible, cognitively stimulating, and enjoyable for individuals with dementia. The current descriptive study examined preferred computer activities among nursing home residents with different severity levels of dementia. A secondary data analysis was conducted using activity observation logs from 15 study participants with dementia (severe = 115 logs, moderate = 234 logs, and mild = 124 logs) who participated in a computer activity program. Significant differences existed in preferred computer activities among groups with different severity levels of dementia. Participants with severe dementia spent significantly more time watching slide shows with music than those with both mild and moderate dementia (F [2,12] = 9.72, p = 0.003). Preference in playing games also differed significantly across the three groups. It is critical to consider individuals' interests and functional abilities when computer activities are provided for individuals with dementia. A practice guideline for tailoring computer activities is detailed.
Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco
2013-05-01
During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.
NASA Technical Reports Server (NTRS)
Coulam, C. M.; Dunnette, W. H.; Wood, E. H.
1970-01-01
Two methods whereby a digital computer may be used to regulate a scintiscanning process are discussed from the viewpoint of computer input-output software. The computer's function, in this case, is to govern the data acquisition and storage, and to display the results to the investigator in a meaningful manner, both during and subsequent to the scanning process. Several methods (such as three-dimensional maps, contour plots, and wall-reflection maps) have been developed by means of which the computer can graphically display the data on-line, for real-time monitoring purposes, during the scanning procedure and subsequently for detailed analysis of the data obtained. A computer-governed method for converting scintiscan data recorded over the dorsal or ventral surfaces of the thorax into fractions of pulmonary blood flow traversing the right and left lungs is presented.
Tools and techniques for computational reproducibility.
Piccolo, Stephen R; Frampton, Michael B
2016-07-11
When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.
GPS synchronized power system phase angle measurements
NASA Astrophysics Data System (ADS)
Wilson, Robert E.; Sterlina, Patrick S.
1994-09-01
This paper discusses the use of Global Positioning System (GPS) synchronized equipment for the measurement and analysis of key power system quantities. Two GPS synchronized phasor measurement units (PMUs) were installed before testing; they recorded the dynamic response of the power system phase angles when the northern California power grid was excited by artificial short circuits. Power system planning engineers perform detailed computer generated simulations of the dynamic response of the power system to naturally occurring short circuits. The computer simulations use models of transmission lines, transformers, circuit breakers, and other high voltage components. This work compares computer simulations of the same event with the field measurements.
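For readers unfamiliar with what a PMU reports, a synchrophasor can be illustrated as a one-cycle discrete Fourier transform of GPS-time-tagged waveform samples. The sketch below uses illustrative sampling values and is in no way the instrument's firmware:

```python
# Estimate a synchrophasor (RMS magnitude and absolute phase angle) from one
# cycle of time-tagged samples via the DFT at the fundamental frequency.
import numpy as np

F_NOM = 60.0                      # nominal grid frequency, Hz
FS = 1440.0                       # sample rate, Hz (24 samples per cycle)
N = int(FS / F_NOM)               # samples in one cycle

t = np.arange(N) / FS
v = 100.0 * np.cos(2 * np.pi * F_NOM * t + np.deg2rad(30.0))  # test waveform

# One-cycle DFT at the fundamental; the sqrt(2)/N scaling yields RMS.
phasor = np.sqrt(2) / N * np.sum(v * np.exp(-2j * np.pi * F_NOM * t))
print(abs(phasor), np.rad2deg(np.angle(phasor)))  # ~70.7 V rms at ~30 deg
```

Because all units sample against the same GPS time base, angles computed this way at distant substations are directly comparable, which is what makes the phase-angle measurements in the study meaningful.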
An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.
Undrill, P E; Frazer, S C
1979-01-01
A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
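The two quality-control ingredients named above, fixed-level control sera and cumulative sum methods, can be illustrated with a short sketch; the decision thresholds are generic textbook choices, not the laboratory's settings:

```python
# Fixed-level control check plus a tabular one-sided CUSUM drift test,
# both expressed in units of the assay's standard deviation.
def control_check(value, target, sd, limit=2.0):
    """Flag a control serum result outside target +/- limit*SD."""
    return abs(value - target) > limit * sd

def cusum_alarm(values, target, sd, k=0.5, h=4.0):
    """True when either one-sided CUSUM exceeds the decision limit h."""
    hi = lo = 0.0
    for v in values:
        z = (v - target) / sd
        hi = max(0.0, hi + z - k)   # accumulates upward drift
        lo = max(0.0, lo - z - k)   # accumulates downward drift
        if hi > h or lo > h:
            return True
    return False

runs = [5.1, 5.2, 5.3, 5.3, 5.4, 5.5, 5.6]   # slowly drifting control results
print(cusum_alarm(runs, target=5.0, sd=0.2))  # True: drift caught by CUSUM
```

The point of combining both, as the abstract suggests, is that fixed limits catch gross errors quickly while the cumulative sum detects slow drift long before any single result violates the limits.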
Improvements to the fastex flutter analysis computer code
NASA Technical Reports Server (NTRS)
Taylor, Ronald F.
1987-01-01
Modifications to the FASTEX flutter analysis computer code (UDFASTEX) are described. The objectives were to increase the problem size capacity of FASTEX, reduce run times by modification of the modal interpolation procedure, and add new user features. All modifications to the program are operable on the VAX 11/700 series computers under the VAX operating system. Interfaces were provided to aid in the inclusion of alternate aerodynamic and flutter eigenvalue calculations. Plots can be made of the flutter velocity, damping, and frequency data. A preliminary capability was also developed to plot contours of unsteady pressure amplitude and phase. The relevant equations of motion, modal interpolation procedures, and control system considerations are described and software developments are summarized. Additional information documenting input instructions, procedures, and details of the plate spline algorithm is found in the appendices.
NASA Technical Reports Server (NTRS)
Benyo, Theresa L.
2002-01-01
Integration of a supersonic inlet simulation with a computer aided design (CAD) system is demonstrated. The integration is performed using the Project Integration Architecture (PIA). PIA provides a common environment for wrapping many types of applications. Accessing geometry data from CAD files is accomplished by incorporating appropriate function calls from the Computational Analysis Programming Interface (CAPRI). CAPRI is a CAD vendor neutral programming interface that aids in acquiring geometry data directly from CAD files. The benefits of wrapping a supersonic inlet simulation into PIA using CAPRI are: direct access to geometry data, accurate capture of geometry data, automatic conversion of data units, CAD vendor neutral operation, and on-line interactive history capture. This paper describes the PIA and the CAPRI wrapper and details the supersonic inlet simulation demonstration.
Subsonic Analysis of 0.04-Scale F-16XL Models Using an Unstructured Euler Code
NASA Technical Reports Server (NTRS)
Lessard, Wendy B.
1996-01-01
The subsonic flow field about an F-16XL airplane model configuration was investigated with an inviscid unstructured grid technique. The computed surface pressures were compared to wind-tunnel test results at Mach 0.148 for a range of angles of attack from 0 deg to 20 deg. To evaluate the effect of grid dependency on the solution, a grid study was performed in which fine, medium, and coarse grid meshes were generated. The off-surface vortical flow field was locally adapted and showed improved correlation to the wind-tunnel data when compared to the nonadapted flow field. Computational results are also compared to experimental five-hole pressure probe data. A detailed analysis of the off-body computed pressure contours, velocity vectors, and particle traces is presented and discussed.
1990-07-01
replacing "logic diagrams" or "flow charts") to aid in coordinating the functions to be performed by a computer program and its associated Inputs...ADDRESS (City, State, and ZIP Code) 10. SOURCE OF FUNDING NUMBERS PROGRAM PROJECT ITASK IWORK UNIT ELEMENT NO. NO. NO. ACCESSION NO. 11. TITLE...the analysis. Both the logical model and detailed procedures are used to develop the application software programs which will be provided to Government
The application of digital techniques to the analysis of metallurgical experiments
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1977-01-01
The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.
Surface electrical properties experiment, Part 3
NASA Technical Reports Server (NTRS)
1974-01-01
A complete unified discussion of the electromagnetic response of a plane stratified structure is reported. A detailed and comprehensive analysis of the theoretical aspects of the electromagnetic problem is given. The numerical problem of computing values of the electromagnetic field strengths is discussed. It is shown that the analysis of conductive media is not very far removed from the theoretical analysis, and that the numerical difficulties are not as acute as for the low-loss problem. For Vol. 1, see N75-15570; for Vol. 2, see N75-15571.
Extensible Computational Chemistry Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-09
ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-a-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.
CFD Simulations in Support of Shuttle Orbiter Contingency Abort Aerodynamic Database Enhancement
NASA Technical Reports Server (NTRS)
Papadopoulos, Periklis E.; Prabhu, Dinesh; Wright, Michael; Davies, Carol; McDaniel, Ryan; Venkatapathy, E.; Wercinski, Paul; Gomez, R. J.
2001-01-01
Modern Computational Fluid Dynamics (CFD) techniques were used to compute aerodynamic forces and moments of the Space Shuttle Orbiter in specific portions of contingency abort trajectory space. The trajectory space covers a Mach number range of 3.5-15, an angle-of-attack range of 20deg-60deg, an altitude range of 100-190 kft, and several different settings of the control surfaces (elevons, body flap, and speed brake). Presented here are details of the methodology and comparisons of computed aerodynamic coefficients against the values in the current Orbiter Operational Aerodynamic Data Book (OADB). While approximately 40 cases have been computed, only a sampling of the results is provided here. The computed results, in general, are in good agreement with the OADB data (i.e., within the uncertainty bands) for almost all the cases. However, in a limited number of high angle-of-attack cases (at Mach 15), there are significant differences between the computed results, especially the vehicle pitching moment, and the OADB data. A preliminary analysis of the data from the CFD simulations at Mach 15 shows that these differences can be attributed to real-gas/Mach number effects. The aerodynamic coefficients and detailed surface pressure distributions of the present simulations are being used by the Shuttle Program in the evaluation of the capabilities of the Orbiter in contingency abort scenarios.
Shortcomings of low-cost imaging systems for viewing computed radiographs.
Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N
2000-01-01
To assess potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (PC) with a common display card and color monitor, and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing improved the perception of low-contrast details significantly irrespective of the monitor used. No significant difference in perception was observed between monochrome and color monitors. The review at the radiological workstation was superior to the review done using the PC with image processing. Lower quality hardware (graphics card and monitor) used in low-cost PCs negatively affects perception of low-contrast details in computed radiographs. In this situation, it is highly recommended to use spatial frequency and contrast processing. No significant quality gain was observed for the high-end monochrome monitor compared to the color display. However, the color monitor was more strongly affected by high ambient illumination.
NASA Technical Reports Server (NTRS)
Marchese, Anthony J.; Dryer, Frederick L.
1997-01-01
This program supports the engineering design, data analysis, and data interpretation requirements for the study of initially single component, spherically symmetric, isolated droplet combustion. Experimental emphasis is on the study of simple alcohols (methanol, ethanol) and alkanes (n-heptane, n-decane) as fuels, with time dependent measurements of drop size, flame stand-off, liquid-phase composition, and finally, extinction. Experiments have included bench-scale studies at Princeton, studies in the 2.2- and 5.18-second drop towers at NASA-LeRC, and both the Fiber Supported Droplet Combustion (FSDC-1, FSDC-2) and the free Droplet Combustion Experiment (DCE) studies aboard the shuttle. Test matrix definition and data interpretation are performed through spherically-symmetric, time-dependent numerical computations which embody detailed sub-models for physical and chemical processes. The computed burning rate, flame stand-off, and extinction diameter are compared with the respective measurements for each individual experiment. In particular, the data from FSDC-1 and subsequent space-based experiments provide the opportunity to compare all three types of data simultaneously with the computed parameters. Recent numerical efforts are extending the computational tools to consider time dependent, axisymmetric 2-dimensional reactive flow situations.
Concurrent Probabilistic Simulation of High Temperature Composite Structural Response
NASA Technical Reports Server (NTRS)
Abdi, Frank
1996-01-01
A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high speed analysis to perform probabilistic evaluation of high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives which were achieved in performing the development were: (1) utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material, and processing of high temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics, and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of the framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed workstation types of computers; and (5) market evaluation. The results of the Phase-2 effort provide a good basis for continuation and warrant a Phase-3 government and industry partnership.
Chapter 13: Tools for analysis
William Elliot; Kevin Hyde; Lee MacDonald; James McKean
2007-01-01
This chapter presents a synthesis of current computer modeling tools that are, or could be, adopted for use in evaluating the cumulative watershed effects of fuel management. The chapter focuses on runoff, soil erosion and slope stability predictive tools. Readers should refer to chapters on soil erosion and stability for more detailed information on the physical...
The Use of Images in Intelligent Advisor Systems.
ERIC Educational Resources Information Center
Boulet, Marie-Michele
This paper describes the intelligent advisor system, named CODAMA, used in teaching a university-level systems analysis and design course. The paper discusses: (1) the use of CODAMA to assist students to transfer theoretical knowledge to the practical; (2) details of how CODAMA is applied in conjunction with a computer-aided software engineering…
Negotiating Power in L2 Synchronous Online Peer Response Groups
ERIC Educational Resources Information Center
Tsai, Mei-Hsing
2017-01-01
Many synchronous computer-mediated communication (SCMC) studies have been conducted on the nature of online interaction across a range of pragmatic issues. However, the detailed analyses of resistance to advice have received less attention. Using the methodology of conversation analysis (CA), the present study focuses on L2 peer review activities…
Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment
Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel
2016-01-01
Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., Internet of Things, Cyber-Physical Systems, Big Data analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were collected and analyzed in a detailed report. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs. PMID:27589753
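The internals of the TPR method are not given in the abstract; the sketch below illustrates the general idea under stated assumptions: fit two linear segments to a task's progress-versus-time samples, then extrapolate the second segment to completion to estimate the finishing time.

```python
# Illustrative two-segment ("two-phase") regression: choose the breakpoint
# that minimizes total squared error, then extrapolate the later segment
# to progress = 1.0. This is a sketch of the idea, not the paper's method.
import numpy as np

def predict_finish(times, progress):
    best = None
    for b in range(2, len(times) - 2):           # candidate breakpoints
        p1 = np.polyfit(times[:b], progress[:b], 1)
        p2 = np.polyfit(times[b:], progress[b:], 1)
        sse = (np.sum((np.polyval(p1, times[:b]) - progress[:b]) ** 2)
               + np.sum((np.polyval(p2, times[b:]) - progress[b:]) ** 2))
        if best is None or sse < best[0]:
            best = (sse, p2)
    slope, intercept = best[1]
    return (1.0 - intercept) / slope             # time at 100% progress

t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])   # seconds
p = np.array([0.00, 0.05, 0.10, 0.30, 0.50, 0.70])  # fraction complete
print(predict_finish(t, p))                          # ~65 s for these samples
```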
Job Superscheduler Architecture and Performance in Computational Grid Environments
NASA Technical Reports Server (NTRS)
Shan, Hongzhang; Oliker, Leonid; Biswas, Rupak
2003-01-01
Computational grids hold great promise in utilizing geographically separated heterogeneous resources to solve large-scale complex scientific problems. However, a number of major technical hurdles, including distributed resource management and effective job scheduling, stand in the way of realizing these gains. In this paper, we propose a novel grid superscheduler architecture and three distributed job migration algorithms. We also model the critical interaction between the superscheduler and autonomous local schedulers. Extensive performance comparisons with ideal, central, and local schemes using real workloads from leading computational centers are conducted in a simulation environment. Additionally, synthetic workloads are used to perform a detailed sensitivity analysis of our superscheduler. Several key metrics demonstrate that substantial performance gains can be achieved via smart superscheduling in distributed computational grids.
[The characteristics of computer simulation of traffic accidents].
Zou, Dong-Hua; Liu, Ning-Guo; Chen, Jian-Guo; Jin, Xian-Long; Zhang, Xiao-Yun; Zhang, Jian-Hua; Chen, Yi-Jiu
2008-12-01
To reconstruct the collision process of a traffic accident and the injury mode of the victim by computer simulation technology in forensic assessment of traffic accidents. Forty actual accidents were reconstructed with simulation software and a high performance computer based on analysis of the trace evidence at the scene, damage to the vehicles, and injury of the victims, with 2 cases discussed in detail. The reconstruction correlated very well with the above parameters in 28 cases, well in 9 cases, and suboptimally in 3 cases. Accurate reconstruction of the accident would be helpful for assessment of the injury mechanism of the victims. Reconstruction of the collision process of a traffic accident and the injury mechanism of the victim by computer simulation is useful in traffic accident assessment.
The vehicle design evaluation program - A computer-aided design procedure for transport aircraft
NASA Technical Reports Server (NTRS)
Oman, B. H.; Kruse, G. S.; Schrader, O. E.
1977-01-01
The vehicle design evaluation program is described. This program is a computer-aided design procedure that provides a vehicle synthesis capability for vehicle sizing, external load analysis, structural analysis, and cost evaluation. The vehicle sizing subprogram provides geometry, weight, and balance data for aircraft using JP, hydrogen, or methane fuels. The structural synthesis subprogram uses a multistation analysis for aerodynamic surfaces and fuselages to develop theoretical weights and geometric dimensions. The parts definition subprogram uses the geometric data from the structural analysis and develops the predicted fabrication dimensions, parts material raw stock buy requirements, and predicted actual weights. The cost analysis subprogram uses detail part data in conjunction with standard hours, realization factors, labor rates, and material data to develop the manufacturing costs. The program is used to evaluate overall design effects on subsonic commercial type aircraft due to parameter variations.
Predictive Control of Networked Multiagent Systems via Cloud Computing.
Liu, Guo-Ping
2017-01-18
This paper studies the design and analysis of networked multiagent predictive control systems via cloud computing. A cloud predictive control scheme for networked multiagent systems (NMASs) is proposed to achieve consensus and stability simultaneously and to compensate for network delays actively. The design of the cloud predictive controller for NMASs is detailed. The analysis of the cloud predictive control scheme gives the necessary and sufficient conditions of stability and consensus of closed-loop networked multiagent control systems. The proposed scheme is verified to characterize the dynamical behavior and control performance of NMASs through simulations. The outcome provides a foundation for the development of cooperative and coordinative control of NMASs and its applications.
Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis
NASA Technical Reports Server (NTRS)
Ingraham, Daniel; Hixon, Ray
2015-01-01
The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
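EVA's algorithms are given in the paper itself; as background, the Method of Manufactured Solutions that it is compared against can be sketched on a one-dimensional heat equation, where a chosen exact solution manufactures a source term that is added to the solver:

```python
# EVA itself is not reproduced here; this sketches the Method of Manufactured
# Solutions it is compared against. For u_t = u_xx with the manufactured exact
# solution u = exp(-t) sin(2x), the required source is s = u_t - u_xx = 3u.
import numpy as np

nx, dt, steps = 201, 1.0e-5, 2000
x = np.linspace(0.0, np.pi, nx)
dx = x[1] - x[0]

def exact(t):
    return np.exp(-t) * np.sin(2.0 * x)

u = exact(0.0)
for n in range(steps):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    u = u + dt * (lap + 3.0 * exact(n * dt))   # add the manufactured source
    u[0] = u[-1] = 0.0                         # exact boundary values

# What remains is pure discretization error, which should shrink at the
# scheme's design order as dx and dt are refined.
print(np.max(np.abs(u - exact(steps * dt))))
```

The contrast drawn in the abstract is that EVA achieves a comparable verification without modifying the code under test with MMS-style source terms.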
Second-order shaped pulses for solid-state quantum computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Pinaki
2008-01-01
We present the construction and detailed analysis of highly optimized self-refocusing pulse shapes for several rotation angles. We characterize the constructed pulses by the coefficients appearing in the Magnus expansion up to second order. This allows a semianalytical analysis of the performance of the constructed shapes in sequences and composite pulses by computing the corresponding leading-order error operators. Higher orders can be analyzed with the numerical technique suggested by us previously. We illustrate the technique by analyzing several composite pulses designed to protect against pulse amplitude errors, and decoupling sequences for potentially long chains of qubits with on-site and nearest-neighbor couplings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayes, D.F.; Schroeder, P.R.; Engler, R.M.
This technical note describes procedures for determining mean hydraulic retention time and efficiency of a confined disposal facility (CDF) from a dye tracer slug test. These parameters are required to properly design a CDF for solids retention and for effluent quality considerations. Detailed information on conduct and analysis of dye tracer studies can be found in Engineer Manual 1110-2-5027, Confined Dredged Material Disposal. This technical note documents the DYECON computer program which facilitates the analysis of dye tracer concentration data and computes the hydraulic efficiency of a CDF as part of the Automated Dredging and Disposal Alternatives Management System (ADDAMS).
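Under the usual residence-time definitions (mean retention time as the first moment of the dye concentration curve, and hydraulic efficiency as its ratio to the theoretical retention time, i.e., ponded volume over flow rate), the reduction of slug-test data takes only a few lines. The numbers below are illustrative and this is not DYECON's code:

```python
# Mean hydraulic retention time from a dye tracer slug test:
#   t_mean = integral(t * C dt) / integral(C dt)
# Hydraulic efficiency = t_mean / theoretical retention time.
import numpy as np

t = np.array([0, 2, 4, 6, 8, 10, 12, 16, 20.0])   # hours since dye slug
c = np.array([0, 1, 6, 9, 7, 4, 2, 1, 0.0])       # effluent dye concentration

t_mean = np.trapz(t * c, t) / np.trapz(c, t)       # first moment of the curve
theoretical = 12.0                                 # volume / flow rate, hours
print(t_mean, t_mean / theoretical)                # retention time, efficiency
```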
Time-Accurate Simulations and Acoustic Analysis of Slat Free-Shear Layer
NASA Technical Reports Server (NTRS)
Khorrami, Mehdi R.; Singer, Bart A.; Berkman, Mert E.
2001-01-01
A detailed computational aeroacoustic analysis of a high-lift flow field is performed. Time-accurate Reynolds Averaged Navier-Stokes (RANS) computations simulate the free shear layer that originates from the slat cusp. Both unforced and forced cases are studied. Preliminary results show that the shear layer is a good amplifier of disturbances in the low to mid-frequency range. The Ffowcs-Williams and Hawkings equation is solved to determine the acoustic field using the unsteady flow data from the RANS calculations. The noise radiated from the excited shear layer has a spectral shape qualitatively similar to that obtained from measurements in a corresponding experimental study of the high-lift system.
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1974-01-01
A computer program to analyze power systems having any number of shafts up to a maximum of five is presented. On each shaft there can be as many as five compressors and five turbines, along with any specified number of intervening intercoolers and reheaters. A recuperator can be included. Turbine coolant flow can be accounted for. Any fuel consisting entirely of hydrogen and/or carbon can be used. The program is valid for maximum temperatures up to about 2000 K (3600 R). The system description, the analysis method, a detailed explanation of program input and output including an illustrative example, a dictionary of program variables, and the program listing are provided.
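A hedged sketch of the kind of per-shaft bookkeeping such a program performs, using ideal-gas relations with constant specific heat and isentropic component efficiencies; the numbers are illustrative, not program defaults:

```python
# Compressor and turbine specific work for one shaft of a gas turbine cycle,
# assuming an ideal gas with constant cp and given isentropic efficiencies.
CP, GAMMA = 1005.0, 1.4                  # air, J/(kg K)

def compressor_work(t_in, pr, eta):
    """Specific work (J/kg) to compress across pressure ratio pr."""
    t_ideal = t_in * pr ** ((GAMMA - 1) / GAMMA)
    return CP * (t_ideal - t_in) / eta

def turbine_work(t_in, pr, eta):
    """Specific work (J/kg) extracted expanding across pressure ratio pr."""
    t_ideal = t_in * pr ** (-(GAMMA - 1) / GAMMA)
    return CP * (t_in - t_ideal) * eta

w_c = compressor_work(300.0, 10.0, 0.85)   # compressor inlet 300 K
w_t = turbine_work(1400.0, 10.0, 0.90)     # turbine inlet 1400 K
print(w_c, w_t, w_t - w_c)                 # net specific work on the shaft
```

Chaining several such components per shaft, with intercoolers and reheaters resetting temperatures between stages, is essentially the balance the program automates.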
NASA Astrophysics Data System (ADS)
Wang, Jianxiong
2014-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013) which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in computer, computing in Earth sciences, multivariate data analysis, and automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing, and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF
NASA Technical Reports Server (NTRS)
1974-01-01
The feasibility of an evolutionary development of a single-axis gimbal star tracker from prior two-axis gimbal star tracker based system applications is evaluated. Detailed evaluation of the star tracker gimbal encoder is considered. A brief system description is given, including the aspects of tracker evolution and encoder evaluation. System analysis includes evaluation of star availability and mounting constraints for the geosynchronous orbit application, and a covariance simulation analysis to evaluate performance potential. Star availability and covariance analysis digital computer programs are included.
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.
LeRC-HT: NASA Lewis Research Center General Multiblock Navier-Stokes Heat Transfer Code Developed
NASA Technical Reports Server (NTRS)
Heidmann, James D.; Gaugler, Raymond E.
1999-01-01
For the last several years, LeRC-HT, a three-dimensional computational fluid dynamics (CFD) computer code for analyzing gas turbine flow and convective heat transfer, has been evolving at the NASA Lewis Research Center. The code is unique in its ability to give a highly detailed representation of the flow field very close to solid surfaces. This is necessary for an accurate representation of fluid heat transfer and viscous shear stresses. The code has been used extensively for both internal cooling passage flows and hot gas path flows--including detailed film cooling calculations, complex tip-clearance gap flows, and heat transfer. In its current form, this code has a multiblock grid capability and has been validated for a number of turbine configurations. The code has been developed and used primarily as a research tool (at least 35 technical papers have been published relative to the code and its application), but it should be useful for detailed design analysis. We now plan to make this code available to selected users for further evaluation.
Post-processing interstitialcy diffusion from molecular dynamics simulations
NASA Astrophysics Data System (ADS)
Bhardwaj, U.; Bukkuru, S.; Warrier, M.
2016-01-01
An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization which obviate the need to input extra domain specific information depending on crystal or temperature of the simulation. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion into known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights for the interstitialcy diffusion mechanism. The algorithm along with the help of supporting visualizations and analysis gives convincing details and a new approach to quantifying diffusion jumps, jump-lengths, time between jumps and to identify interstitials from lattice atoms.
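The framework's diffusion-coefficient functionality presumably rests on the standard Einstein relation, D = MSD/(6t) for three-dimensional diffusion. A minimal sketch of that post-processing step, not the authors' code:

```python
# Estimate a diffusion coefficient from the slope of the mean-squared
# displacement (MSD) of traced atoms: MSD(t) = 6 D t in three dimensions.
import numpy as np

def diffusion_coefficient(positions, dt):
    """positions: (n_frames, n_atoms, 3) unwrapped trajectories; dt per frame."""
    disp = positions - positions[0]                    # displacement from start
    msd = np.mean(np.sum(disp ** 2, axis=2), axis=1)   # average over atoms
    t = np.arange(len(msd)) * dt
    slope = np.polyfit(t[1:], msd[1:], 1)[0]           # linear fit MSD = 6 D t
    return slope / 6.0

# Synthetic random-walk check with an expected D of about 5 (length^2/time).
rng = np.random.default_rng(1)
traj = np.cumsum(rng.standard_normal((1000, 50, 3)), axis=0) * 0.1
print(diffusion_coefficient(traj, dt=1e-3))
```

The hard part the paper addresses, identifying which atom is the interstitial at each instant so its trajectory can be unwrapped at all, sits upstream of this final step.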
Design and Analysis of a Turbopump for a Conceptual Expander Cycle Upper-Stage Engine
NASA Technical Reports Server (NTRS)
Dorney, Daniel J.; Rothermel, Jeffry; Griffin, Lisa W.; Thornton, Randall J.; Forbes, John C.; Skelly, Stephen E.; Huber, Frank W.
2006-01-01
As part of the development of technologies for rocket engines that will power spacecraft to the Moon and Mars, a program was initiated to develop a conceptual upper stage engine with wide flow range capability. The resulting expander cycle engine design employs a radial turbine to allow higher pump speeds and efficiencies. In this paper, the design and analysis of the pump section of the engine are discussed. One-dimensional meanline analyses and three-dimensional unsteady computational fluid dynamics simulations were performed for the pump stage. Configurations with both vaneless and vaned diffusers were investigated. Both the meanline analysis and computational predictions show that the pump will meet the performance objectives. Additional details describing the development of a water flow facility test are also presented.
Finite element solution of torsion and other 2-D Poisson equations
NASA Technical Reports Server (NTRS)
Everstine, G. C.
1982-01-01
The NASTRAN structural analysis computer program may be used, without modification, to solve two dimensional Poisson equations such as arise in the classical Saint Venant torsion problem. The nonhomogeneous term (the right-hand side) in the Poisson equation can be handled conveniently by specifying a gravitational load in a "structural" analysis. The use of an analogy between the equations of elasticity and those of classical mathematical physics is summarized in detail.
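For reference, the analogy summarized above can be written out: the Prandtl stress function phi for a shaft twisted at rate theta per unit length (shear modulus G) satisfies a Poisson equation whose constant right-hand side NASTRAN can represent as a uniform "gravitational" load on a two-dimensional "structural" mesh:

```latex
% Saint-Venant torsion in stress-function form: a 2-D Poisson problem.
\nabla^{2}\phi
  = \frac{\partial^{2}\phi}{\partial x^{2}}
  + \frac{\partial^{2}\phi}{\partial y^{2}}
  = -\,2\,G\,\theta ,
\qquad \phi = 0 \ \text{on the boundary},
\qquad \tau_{zx} = \frac{\partial \phi}{\partial y}, \quad
       \tau_{zy} = -\,\frac{\partial \phi}{\partial x}.
```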
A method for identifying EMI critical circuits during development of a large C3
NASA Astrophysics Data System (ADS)
Barr, Douglas H.
The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
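The screening arithmetic behind such an analysis reduces to decibel bookkeeping: a circuit's safety margin is its susceptibility threshold minus the predicted induced level, compared against the required minimum for its criticality category. A minimal sketch with illustrative numbers, not the Boeing values:

```python
# EMI safety-margin screening: margin (dB) = susceptibility threshold minus
# predicted induced level; circuits below the required margin are "critical".
def safety_margin_db(threshold_db, induced_db):
    return threshold_db - induced_db

REQUIRED_MARGIN_DB = 6.0          # example category requirement, illustrative

# name: (susceptibility threshold, worst-case induced level), both in dBuV
circuits = {
    "ordnance_fire":    (60.0, 40.0),
    "status_telemetry": (80.0, 77.0),
}
for name, (threshold, induced) in circuits.items():
    margin = safety_margin_db(threshold, induced)
    if margin < REQUIRED_MARGIN_DB:
        print(f"{name}: margin {margin:.1f} dB -> EMI critical")
```

Running a conservative worst-case pass first, then repeating with detailed circuit parameters only for the survivors, mirrors the two-stage process described above.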
NOAA/DOE CWP structural analysis package. [CWPFLY, CWPEXT, COTEC, and XOTEC codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pompa, J.A.; Lunz, D.F.
1979-09-01
The theoretical development and computer code user's manual for analysis of the Ocean Thermal Energy Conversion (OTEC) plant cold water pipe (CWP) are presented. The analysis of the CWP includes coupled platform/CWP loadings and dynamic responses. This report, with the exception of the Introduction and Appendix F, was originally published as Hydronautics, Inc., Technical Report No. 7825-2 (by Barr, Chang, and Thasanatorn) in November 1978. A detailed theoretical development of the equations describing the coupled platform/CWP system and preliminary validation efforts are described. The appendices encompass a complete user's manual, describing the inputs, outputs and operation of the four component programs, and detail changes and updates implemented since the original release of the code by Hydronautics. The code itself is available through NOAA's Office of Ocean Technology and Engineering Services.
Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus
2016-05-01
Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.
DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Williams, C. H.; Spurlock, O. F.
2014-01-01
From the late 1960's through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960's is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments. Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.
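The integration scheme named above is standard; as a sketch (not DUKSUP's code), a classical 4th-order Runge-Kutta step applied to point-mass equations of motion in an inverse-square gravity field looks like this:

```python
# Classical RK4 step for the state y = (position, velocity) of a point mass
# in an inverse-square gravity field. Constants are illustrative.
import numpy as np

MU = 3.986004418e14                      # Earth GM, m^3/s^2

def deriv(y):
    r, v = y[:3], y[3:]
    a = -MU * r / np.linalg.norm(r) ** 3  # gravitational acceleration
    return np.concatenate([v, a])

def rk4_step(y, h):
    k1 = deriv(y)
    k2 = deriv(y + 0.5 * h * k1)
    k3 = deriv(y + 0.5 * h * k2)
    k4 = deriv(y + h * k3)
    return y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

y = np.array([7.0e6, 0.0, 0.0, 0.0, 7.546e3, 0.0])   # near-circular low orbit
for _ in range(100):
    y = rk4_step(y, 1.0)
print(np.linalg.norm(y[:3]))             # radius stays near 7.0e6 m over 100 s
```

In the program described above, the same integrator also propagates the variational equations, whose sensitivities feed the Newton-Raphson iteration on the two point boundary value problem.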
DUKSUP: A Computer Program for High Thrust Launch Vehicle Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Spurlock, O. Frank; Williams, Craig H.
2015-01-01
From the late 1960s through 1997, the leadership of NASA's Intermediate and Large class unmanned expendable launch vehicle projects resided at the NASA Lewis (now Glenn) Research Center (LeRC). One of LeRC's primary responsibilities --- trajectory design and performance analysis --- was accomplished by an internally-developed analytic three dimensional computer program called DUKSUP. Because of its Calculus of Variations-based optimization routine, this code was generally more capable of finding optimal solutions than its contemporaries. A derivation of optimal control using the Calculus of Variations is summarized including transversality, intermediate, and final conditions. The two point boundary value problem is explained. A brief summary of the code's operation is provided, including iteration via the Newton-Raphson scheme and integration of variational and motion equations via a 4th order Runge-Kutta scheme. Main subroutines are discussed. The history of the LeRC trajectory design efforts in the early 1960s is explained within the context of supporting the Centaur upper stage program. How the code was constructed based on the operation of the Atlas/Centaur launch vehicle, the limits of the computers of that era, the limits of the computer programming languages, and the missions it supported are discussed. The vehicles DUKSUP supported (Atlas/Centaur, Titan/Centaur, and Shuttle/Centaur) are briefly described. The types of missions, including Earth orbital and interplanetary, are described. The roles of flight constraints and their impact on launch operations are detailed (such as jettisoning hardware on heating, Range Safety, ground station tracking, and elliptical parking orbits). The computer main frames on which the code was hosted are described. The applications of the code are detailed, including independent check of contractor analysis, benchmarking, leading edge analysis, and vehicle performance improvement assessments. Several of DUKSUP's many major impacts on launches are discussed including Intelsat, Voyager, Pioneer Venus, HEAO, Galileo, and Cassini.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The worldwide semisubmersible drilling rig fleet is approaching retirement. But replacement is not an attractive option even though dayrates are reaching record highs. In 1991, Schlumberger Sedco Forex managers decided that an alternative might exist if regulators and insurers could be convinced to extend rig life expectancy through restoration. Sedco Forex chose their No. 704 semisubmersible, an 18-year North Sea veteran, to test their process. The first step was to determine what required restoration, which meant a fatigue life analysis of each weld on the huge vessel. Inspecting every weld would have been unacceptably time-consuming and of questionable accuracy. Instead, a suite of computer programs modeled the stress seen by each weld, statistically estimated the sea states seen by the rig throughout its North Sea service, and calibrated a beam-element model on which to run the computer simulations. The elastic stiffness of the structure and detailed stress analysis of each weld was performed with ANSYS, a commercially available finite-element analysis program. The use of computer codes to evaluate service life extension is described.
PlantCV v2: Image analysis software for high-throughput plant phenotyping
Abbasi, Arash; Berry, Jeffrey C.; Callen, Steven T.; Chavez, Leonardo; Doust, Andrew N.; Feldman, Max J.; Gilbert, Kerrigan B.; Hodge, John G.; Hoyer, J. Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony
2017-01-01
Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning. PMID:29209576
Evangelopoulos, Nicholas E
2013-11-01
This article reviews latent semantic analysis (LSA), a theory of meaning as well as a method for extracting that meaning from passages of text, based on statistical computations over a collection of documents. LSA as a theory of meaning defines a latent semantic space where documents and individual words are represented as vectors. LSA as a computational technique uses linear algebra to extract dimensions that represent that space. This representation enables the computation of similarity among terms and documents, categorization of terms and documents, and summarization of large collections of documents using automated procedures that mimic the way humans perform similar cognitive tasks. We present some technical details, various illustrative examples, and discuss a number of applications from linguistics, psychology, cognitive science, education, information science, and analysis of textual data in general. WIREs Cogn Sci 2013, 4:683-692. doi: 10.1002/wcs.1254
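The computational pipeline the review describes can be condensed to a few lines of linear algebra: factor a term-document matrix with a truncated SVD and compare documents in the resulting latent space. A toy sketch with an invented matrix:

```python
# LSA in miniature: truncated SVD of a term-by-document count matrix, then
# cosine similarity between documents in the k-dimensional latent space.
import numpy as np

terms = ["computer", "software", "analysis", "plant", "leaf"]
X = np.array([[3, 2, 0, 0],        # rows: terms; columns: documents
              [2, 3, 1, 0],
              [1, 1, 2, 1],
              [0, 0, 2, 3],
              [0, 0, 1, 3.0]])

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                               # latent dimensions to keep
docs = (np.diag(s[:k]) @ Vt[:k]).T  # document vectors in latent space

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Documents 0 and 1 (computing-themed) score high; 0 and 3 (plant-themed) low.
print(cosine(docs[0], docs[1]), cosine(docs[0], docs[3]))
```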
NASA Technical Reports Server (NTRS)
Faller, K. H.
1976-01-01
A technique for the detection and measurement of surface feature interfaces in remotely acquired data was developed and evaluated. A computer implementation of this technique was effected to automatically process classified data derived from various sources such as the LANDSAT multispectral scanner and other scanning sensors. The basic elements of the operational theory of the technique are described, followed by the details of the procedure. An example of an application of the technique to the analysis of tidal shoreline length is given with a breakdown of manpower requirements.
Investigations into the triggered lightning response of the F106B thunderstorm research aircraft
NASA Technical Reports Server (NTRS)
Rudolph, Terence H.; Perala, Rodney A.; Mckenna, Paul M.; Parker, Steven L.
1985-01-01
An investigation has been conducted into the lightning characteristics of the NASA F106B thunderstorm research aircraft. The investigation includes analysis of measured data from the aircraft in the time and frequency domains. Linear and nonlinear computer modelling has also been performed. In addition, new computer tools have been developed, including a new enhanced nonlinear air breakdown model, and a subgrid model useful for analyzing fine details of the aircraft's geometry. Comparison of measured and calculated electromagnetic responses of the aircraft to a triggered lightning environment are presented.
A survey of computational aerodynamics in the United States
NASA Technical Reports Server (NTRS)
Gessow, A.; Morris, D. J.
1977-01-01
Programs in theoretical and computational aerodynamics in the United States are described. Those aspects of programs that relate to aeronautics are detailed. The role of analysis at various levels of sophistication is discussed as well as the inverse solution techniques that are of primary importance in design methodology. The research is divided into the broad categories of application for boundary layer flow, Navier-Stokes turbulence modeling, internal flows, two-dimensional configurations, subsonic and supersonic aircraft, transonic aircraft, and the space shuttle. A survey of representative work in each area is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elnabawy, Ahmed O.; Rangarajan, Srinivas; Mavrikakis, Manos
2015-06-05
Computational chemistry, especially density functional theory, has experienced a remarkable growth in terms of application over the last few decades. This is attributed to the improvements in theory and computing infrastructure that enable the analysis of systems of unprecedented size and detail at an affordable computational expense. In this perspective, we discuss recent progress and current challenges facing electronic structure theory in the context of heterogeneous catalysis. We specifically focus on the impact of computational chemistry in elucidating and designing catalytic systems in three topics of interest to Haldor Topsøe: ammonia synthesis, hydrotreating, and NOx reduction. Furthermore, we discuss the common tools and concepts in computational catalysis that underlie these topics and provide a perspective on the challenges and future directions of this area of catalysis research.
NASA Technical Reports Server (NTRS)
Greathouse, James S.; Schwing, Alan M.
2015-01-01
This paper explores use of computational fluid dynamics to study the effect of geometric porosity on static stability and drag for NASA's Multi-Purpose Crew Vehicle main parachute. Both of these aerodynamic characteristics are of interest in parachute design, and computational methods promise designers the ability to perform detailed parametric studies and other design iterations with a level of control previously unobtainable using ground or flight testing. The approach presented here uses a canopy structural analysis code to define the inflated parachute shapes on which structured computational grids are generated. These grids are used by the computational fluid dynamics code OVERFLOW and are modeled as rigid, impermeable bodies for this analysis. Comparisons to Apollo drop test data are shown as preliminary validation of the technique. Results include several parametric sweeps through design variables in order to better understand the trade between static stability and drag. Finally, designs that maximize static stability with a minimal loss in drag are suggested for further study in subscale ground and flight testing.
High temperature composite analyzer (HITCAN) user's manual, version 1.0
NASA Technical Reports Server (NTRS)
Lackney, J. J.; Singhal, S. N.; Murthy, P. L. N.; Gotsis, P.
1993-01-01
This manual describes how to use the computer code HITCAN (HIgh Temperature Composite ANalyzer). HITCAN is a general purpose computer program for predicting nonlinear global structural and local stress-strain response of arbitrarily oriented, multilayered high temperature metal matrix composite structures. This code combines composite mechanics and laminate theory with an internal data base for material properties of the constituents (matrix, fiber and interphase). The thermo-mechanical properties of the constituents are considered to be nonlinearly dependent on several parameters including temperature, stress and stress rate. The computation procedure for the analysis of the composite structures uses the finite element method. HITCAN is written in FORTRAN 77 computer language and at present has been configured and executed on the NASA Lewis Research Center CRAY XMP and YMP computers. This manual describes HITCAN's capabilities and limitations followed by input/execution/output descriptions and example problems. The input is described in detail including (1) geometry modeling, (2) types of finite elements, (3) types of analysis, (4) material data, (5) types of loading, (6) boundary conditions, (7) output control, (8) program options, and (9) data bank.
NASA Technical Reports Server (NTRS)
Foley, Michael J.
1989-01-01
The primary nozzle diffuser routes fuel from the main fuel valve on the Space Shuttle Main Engine (SSME) to the nozzle coolant inlet manifold, main combustion chamber coolant inlet manifold, chamber coolant valve, and the augmented spark igniters. The diffuser also includes the fuel system purge check valve connection. A static stress analysis was performed on the diffuser because no detailed analysis was done on this part in the past. Structural concerns were in the area of the welds because approximately 10 percent are in areas inaccessible by X-ray testing devices. Flow dynamics and thermodynamics were not included in the analysis load case. Constant internal pressure at maximum SSME power was used instead. A three-dimensional finite element model was generated using ANSYS version 4.3A on the Lockheed VAX 11/785 computer to perform the stress computations. IDEAS Supertab on a Sun 3/60 computer was used to create the finite element model. Rocketdyne drawing number RS009156 was used for the model interpretation. The flight diffuser is denoted as -101. A description of the model, boundary conditions/load case, material properties, structural analysis/results, and a summary are included for documentation.
NASA Astrophysics Data System (ADS)
Poggio, Andrew J.
1988-10-01
This issue of Energy and Technology Review contains: Neutron Penumbral Imaging of Laser-Fusion Targets--using our new penumbral-imaging diagnostic, we have obtained the first images that can be used to measure directly the deuterium-tritium burn region in laser-driven fusion targets; Computed Tomography for Nondestructive Evaluation--various computed tomography systems and computational techniques are used in nondestructive evaluation; Three-Dimensional Image Analysis for Studying Nuclear Chromatin Structure--we have developed an optic-electronic system for acquiring cross-sectional views of cell nuclei, and computer codes to analyze these images and reconstruct the three-dimensional structures they represent; Imaging in the Nuclear Test Program--advanced techniques produce images of unprecedented detail and resolution from Nevada Test Site data; and Computational X-Ray Holography--visible-light experiments and numerically simulated holograms test our ideas about an X-ray microscope for biological research.
Stochastic model simulation using Kronecker product analysis and Zassenhaus formula approximation.
Caglar, Mehmet Umut; Pal, Ranadip
2013-01-01
Probabilistic Models are regularly applied in Genetic Regulatory Network modeling to capture the stochastic behavior observed in the generation of biological entities such as mRNA or proteins. Several approaches including Stochastic Master Equations and Probabilistic Boolean Networks have been proposed to model the stochastic behavior in genetic regulatory networks. It is generally accepted that Stochastic Master Equation is a fundamental model that can describe the system being investigated in fine detail, but the application of this model is computationally enormously expensive. On the other hand, Probabilistic Boolean Network captures only the coarse-scale stochastic properties of the system without modeling the detailed interactions. We propose a new approximation of the stochastic master equation model that is able to capture the finer details of the modeled system including bistabilities and oscillatory behavior, and yet has a significantly lower computational complexity. In this new method, we represent the system using tensors and derive an identity to exploit the sparse connectivity of regulatory targets for complexity reduction. The algorithm involves an approximation based on Zassenhaus formula to represent the exponential of a sum of matrices as product of matrices. We derive upper bounds on the expected error of the proposed model distribution as compared to the stochastic master equation model distribution. Simulation results of the application of the model to four different biological benchmark systems illustrate performance comparable to detailed stochastic master equation models but with considerably lower computational complexity. The results also demonstrate the reduced complexity of the new approach as compared to commonly used Stochastic Simulation Algorithm for equivalent accuracy.
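A minimal numerical sketch of the Zassenhaus idea exploited above: the exponential of a sum of matrices is approximated by a product of exponentials, with the leading correction governed by the commutator [A, B]. Small random dense matrices stand in here for the sparse master-equation operators of the paper.

    # Zassenhaus splitting: exp(t(A+B)) ~ exp(tA) exp(tB) exp(-t^2/2 [A,B]).
    # Random matrices are illustrative only.
    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4)) * 0.1
    B = rng.normal(size=(4, 4)) * 0.1
    t = 0.5

    exact = expm(t * (A + B))
    order1 = expm(t * A) @ expm(t * B)                # plain splitting
    comm = A @ B - B @ A                              # commutator [A, B]
    order2 = order1 @ expm(-t**2 / 2 * comm)          # Zassenhaus correction

    print(np.linalg.norm(exact - order1))   # splitting error ~ O(t^2)
    print(np.linalg.norm(exact - order2))   # with correction, ~ O(t^3)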
NASA Astrophysics Data System (ADS)
García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel
2013-08-01
This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can be used later for a number of purposes in different software applications. In this study, a domain ontology for the field of lead-lag compensator design has been built and used for automatic exercise generation, graphical user interface population, and interaction with the user at any level of detail, including explanations about why things occur. An application called Onto-CELE (ontology-based control engineering learning environment) uses the ontology for implementing a learning environment that can be used for self and lifelong learning purposes. The experience has shown that the use of knowledge models as the basis for educational software applications can show students the whole complexity of the analysis and design processes at any level of detail. Practical experience with postgraduate students has confirmed the benefits and possibilities of the approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abbott, B. P.; Abbott, R.; Abernathy, M. R.
This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
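A hedged sketch of the textbook Poisson rate estimate that underlies analyses of this kind: with n detections in a sensitive volume-time VT and a Jeffreys prior, the rate posterior is a Gamma distribution. The inputs below are invented for illustration and are not the Letter's actual values or pipeline.

    # Poisson rate posterior: with n detections in sensitive volume-time VT
    # and a Jeffreys prior R^(-1/2), R | n ~ Gamma(n + 1/2, scale = 1/VT).
    from scipy.stats import gamma

    n = 2          # hypothetical number of detected coalescences
    VT = 0.01      # hypothetical sensitive <VT> in Gpc^3 yr

    posterior = gamma(a=n + 0.5, scale=1.0 / VT)
    lo, hi = posterior.ppf([0.05, 0.95])     # symmetric 90% credible interval
    print(f"rate in [{lo:.0f}, {hi:.0f}] Gpc^-3 yr^-1")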
The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Lytle, John K.
1999-01-01
Advances in computational technology and in physics-based modeling are making large scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
Nonlinear Aerodynamics and the Design of Wing Tips
NASA Technical Reports Server (NTRS)
Kroo, Ilan
1991-01-01
The analysis and design of wing tips for fixed wing and rotary wing aircraft still remains part art, part science. Although the design of airfoil sections and basic planform geometry is well developed, the tip regions require more detailed consideration. This is important because of the strong impact of wing tip flow on wing drag; although the tip region constitutes a small portion of the wing, its effect on the drag can be significant. The induced drag of a wing is, for a given lift and speed, inversely proportional to the square of the wing span. Concepts are proposed as a means of reducing drag. Modern computational methods provide a tool for studying these issues in greater detail. The purpose of the current research program is to improve the understanding of the fundamental issues involved in the design of wing tips and to develop the range of computational and experimental tools needed for further study of these ideas.
Clustering and Network Analysis of Reverse Phase Protein Array Data.
Byron, Adam
2017-01-01
Molecular profiling of proteins and phosphoproteins using a reverse phase protein array (RPPA) platform, with a panel of target-specific antibodies, enables the parallel, quantitative proteomic analysis of many biological samples in a microarray format. Hence, RPPA analysis can generate a high volume of multidimensional data that must be effectively interrogated and interpreted. A range of computational techniques for data mining can be applied to detect and explore data structure and to form functional predictions from large datasets. Here, two approaches for the computational analysis of RPPA data are detailed: the identification of similar patterns of protein expression by hierarchical cluster analysis and the modeling of protein interactions and signaling relationships by network analysis. The protocols use freely available, cross-platform software, are easy to implement, and do not require any programming expertise. Serving as data-driven starting points for further in-depth analysis, validation, and biological experimentation, these and related bioinformatic approaches can accelerate the functional interpretation of RPPA data.
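The first protocol mentioned above, hierarchical clustering of sample-by-protein expression data, reduces to a few library calls; the sketch below runs it on synthetic data standing in for normalized RPPA intensities.

    # Hierarchical clustering of an RPPA-style sample x protein matrix.
    # Synthetic data stand in for real normalized RPPA intensities.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(1)
    # 12 samples x 8 proteins: second group has shifted expression in 0-3
    data = np.vstack([rng.normal(0.0, 1.0, (6, 8)),
                      rng.normal([3, 3, 3, 3, 0, 0, 0, 0], 1.0, (6, 8))])

    dist = pdist(data, metric="correlation")   # 1 - Pearson correlation
    Z = linkage(dist, method="average")        # agglomerative clustering
    labels = fcluster(Z, t=2, criterion="maxclust")
    print(labels)                              # samples split into two groups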
26 CFR 302.1-4 - Computation of taxes.
Code of Federal Regulations, 2010 CFR
2010-04-01
Taxes under the International Claims Settlement Act, as amended August 9, 1955. § 302.1-4 Computation of taxes. (a) Detail of employees of the Internal Revenue Service. The Commissioner will detail... (Title 26, Internal Revenue, revised as of 2010-04-01)
A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.
Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao
2018-05-23
The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.
DETAIL VIEW OF COMPUTER PANELS, ROOM 8A Cape Canaveral ...
DETAIL VIEW OF COMPUTER PANELS, ROOM 8A - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL
Equation-free multiscale computation: algorithms and applications.
Kevrekidis, Ioannis G; Samaey, Giovanni
2009-01-01
In traditional physicochemical modeling, one derives evolution equations at the (macroscopic, coarse) scale of interest; these are used to perform a variety of tasks (simulation, bifurcation analysis, optimization) using an arsenal of analytical and numerical techniques. For many complex systems, however, although one observes evolution at a macroscopic scale of interest, accurate models are only given at a more detailed (fine-scale, microscopic) level of description (e.g., lattice Boltzmann, kinetic Monte Carlo, molecular dynamics). Here, we review a framework for computer-aided multiscale analysis, which enables macroscopic computational tasks (over extended spatiotemporal scales) using only appropriately initialized microscopic simulation on short time and length scales. The methodology bypasses the derivation of macroscopic evolution equations when these equations conceptually exist but are not available in closed form-hence the term equation-free. We selectively discuss basic algorithms and underlying principles and illustrate the approach through representative applications. We also discuss potential difficulties and outline areas for future research.
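A toy sketch of coarse projective integration, one of the equation-free algorithms reviewed above: run the fine-scale simulator for a short burst, estimate the coarse time derivative from the burst, and leap forward without simulating the gap. A trivial relaxation ODE stands in for a microscopic simulator such as kinetic Monte Carlo.

    # Coarse projective integration on a toy problem.
    import numpy as np

    def fine_burst(u, dt, nsteps):
        # Toy fine-scale simulator: relaxation du/dt = -u (stands in for a
        # microscopic code such as kinetic Monte Carlo).
        for _ in range(nsteps):
            u = u + dt * (-u)
        return u

    u, t = 1.0, 0.0
    dt, burst, jump = 0.01, 10, 0.5   # fine step, burst length, leap size
    while t < 5.0:
        u_new = fine_burst(u, dt, burst)    # short fine-scale burst
        dudt = (u_new - u) / (burst * dt)   # estimated coarse derivative
        u = u_new + jump * dudt             # projective (forward-Euler) leap
        t += burst * dt + jump
    print(u, np.exp(-t))                    # compare with the exact solution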
A gateway for phylogenetic analysis powered by grid computing featuring GARLI 2.0.
Bazinet, Adam L; Zwickl, Derrick J; Cummings, Michael P
2014-09-01
We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a GARLI 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The GARLI web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the GARLI web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results.
NASA Technical Reports Server (NTRS)
Williams, Jessica L.; Bhat, Ramachandra S.; You, Tung-Han
2012-01-01
The Soil Moisture Active Passive (SMAP) mission will perform soil moisture content and freeze/thaw state observations from a low-Earth orbit. The observatory is scheduled to launch in October 2014 and will perform observations from a near-polar, frozen, and sun-synchronous Science Orbit for a 3-year data collection mission. At launch, the observatory is delivered to an Injection Orbit that is biased below the Science Orbit; the spacecraft will maneuver to the Science Orbit during the mission Commissioning Phase. The delta V needed to maneuver from the Injection Orbit to the Science Orbit is computed statistically via a Monte Carlo simulation; the 99th percentile delta V (delta V99) is carried as a line item in the mission delta V budget. This paper details the simulation and analysis performed to compute this figure and the delta V99 computed per current mission parameters.
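The statistical maneuver budgeting described above reduces, at its core, to taking a high percentile of Monte Carlo delta V samples. The sketch below shows that reduction with invented dispersion terms; the actual simulation samples injection and maneuver-execution errors from mission-specific models.

    # Monte Carlo delta V99: reduce sampled maneuver costs to a 99th
    # percentile budget line item. All dispersions below are invented.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000
    base_dv = 30.0                              # m/s, nominal transfer cost
    injection_disp = rng.normal(0.0, 4.0, n)    # injection-error contribution
    exec_error = rng.normal(0.0, 1.5, n)        # maneuver execution errors

    dv = base_dv + np.abs(injection_disp) + np.abs(exec_error)
    dv99 = np.percentile(dv, 99)                # 99th percentile delta V
    print(f"delta V99 = {dv99:.1f} m/s")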
Numerical Analysis of Incipient Separation on 53 Deg Swept Diamond Wing
NASA Technical Reports Server (NTRS)
Frink, Neal T.
2015-01-01
A systematic analysis of incipient separation and subsequent vortex formation from moderately swept blunt leading edges is presented for a 53 deg swept diamond wing. This work contributes to a collective body of knowledge generated within the NATO/STO AVT-183 Task Group titled 'Reliable Prediction of Separated Flow Onset and Progression for Air and Sea Vehicles'. The objective is to extract insights from the experimentally measured and numerically computed flow fields that might enable turbulence experts to further improve their models for predicting swept blunt leading-edge flow separation. Details of vortex formation are inferred from numerical solutions after establishing a good correlation of the global flow field and surface pressure distributions between wind tunnel measurements and computed flow solutions. From this, significant and sometimes surprising insights into the nature of incipient separation and part-span vortex formation are derived from the wealth of information available in the computational solutions.
An attempt to obtain a detailed declination chart from the United States magnetic anomaly map
Alldredge, L.R.
1989-01-01
Modern declination charts of the United States show almost no details. It was hoped that declination details could be derived from the information contained in the existing magnetic anomaly map of the United States. This could be realized only if all of the survey data were corrected to a common epoch, at which time a main-field vector model was known, before the anomaly values were computed. Because this was not done, accurate declination values cannot be determined. In spite of this conclusion, declination values were computed using a common main-field model for the entire United States to see how well they compared with observed values. The computed detailed declination values were found to compare less favourably with observed values of declination than declination values computed from the IGRF 1985 model itself.
ERIC Educational Resources Information Center
Doring, Richard; Hicks, Bruce
A comparison is made of four maxicalculators and two minicomputers with an emphasis on two, the HP 9830 and the Wang 2200. Comparisons are in the form of a table with individual guidelines for analysis followed by the specific characteristics of the particular calculator. Features compared include: manual input facilities, screen, secondary…
Center of Excellence for Hypersonics Research
2012-01-25
detailed simulations of actual combustor configurations, and ultimately for the optimization of hypersonic air-breathing propulsion system flow paths... vehicle development programs. The Center engaged leading experts in experimental and computational analysis of hypersonic flows to provide research... advanced hypersonic vehicles and space access systems will require significant advances in the design methods and ground testing techniques to ensure
Capital Budgeting Guidelines: How to Decide Whether to Fund a New Dorm or an Upgraded Computer Lab.
ERIC Educational Resources Information Center
Swiger, John; Klaus, Allen
1996-01-01
A process for college and university decision making and budgeting for capital outlays that focuses on evaluating the qualitative and quantitative benefits of each proposed project is described and illustrated. The process provides a means to solicit suggestions from those involved and provide detailed information for cost-benefit analysis. (MSE)
Parsing Protocols Using Problem Solving Grammars. AI Memo 385.
ERIC Educational Resources Information Center
Miller, Mark L.; Goldstein, Ira P.
A theory of the planning and debugging of computer programs is formalized as a context free grammar, which is used to reveal the constituent structure of problem solving episodes by parsing protocols in which programs are written, tested, and debugged. This is illustrated by the detailed analysis of an actual session with a beginning student…
Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.
ERIC Educational Resources Information Center
Lindahl, William H.; Gardner, James H.
Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 1 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in radiation. (BT)
Analysis of LH Launcher Arrays (Like the ITER One) Using the TOPLHA Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maggiora, R.; Milanesio, D.; Vecchi, G.
2009-11-26
TOPLHA (Torino Polytechnic Lower Hybrid Antenna) code is an innovative tool for the 3D/1D simulation of Lower Hybrid (LH) antennas, i.e. accounting for realistic 3D waveguide geometry and for accurate 1D plasma models, and without restrictions on waveguide shape, including curvature. This tool provides a detailed performance prediction of any LH launcher, by computing the antenna scattering parameters, the current distribution, electric field maps and power spectra for any user-specified waveguide excitation. In addition, a fully parallelized and multi-cavity version of TOPLHA permits the analysis of large and complex waveguide arrays in a reasonable simulation time. A detailed analysis of the performance of the proposed ITER LH antenna geometry has been carried out, underlining the strong dependence of the antenna input parameters with respect to plasma conditions. A preliminary optimization of the antenna dimensions has also been accomplished. Electric current distribution on conductors, electric field distribution at the interface with plasma, and power spectra have been calculated as well. The analysis shows the strong capabilities of the TOPLHA code as a predictive tool and its usefulness for the detailed design of LH launcher arrays.
Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.
Tang, Liang; Zhang, Jinjie; Cheng, Pengle
2017-01-01
Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is in the process of extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
BEST3D user's manual: Boundary Element Solution Technology, 3-Dimensional Version 3.0
NASA Technical Reports Server (NTRS)
1991-01-01
The theoretical basis and programming strategy utilized in the construction of the computer program BEST3D (boundary element solution technology - three dimensional) and detailed input instructions are provided for the use of the program. An extensive set of test cases and sample problems is included in the manual and is also available for distribution with the program. The BEST3D program was developed under the 3-D Inelastic Analysis Methods for Hot Section Components contract (NAS3-23697). The overall objective of this program was the development of new computer programs allowing more accurate and efficient three-dimensional thermal and stress analysis of hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The BEST3D program allows both linear and nonlinear analysis of static and quasi-static elastic problems and transient dynamic analysis for elastic problems. Calculation of elastic natural frequencies and mode shapes is also provided.
An automated data management/analysis system for space shuttle orbiter tiles. [stress analysis
NASA Technical Reports Server (NTRS)
Giles, G. L.; Ballas, M.
1982-01-01
An engineering data management system was combined with a nonlinear stress analysis program to provide a capability for analyzing a large number of tiles on the space shuttle orbiter. Tile geometry data and all data necessary to define the tile loads environment are accessed automatically as needed for the analysis of a particular tile or a set of tiles. User documentation provided includes: (1) description of computer programs and data files contained in the system; (2) definitions of all engineering data stored in the data base; (3) characteristics of the tile analytical model; (4) instructions for preparation of user input; and (5) a sample problem to illustrate use of the system. Descriptions of data, computer programs, and analytical models of the tile are sufficiently detailed to guide extension of the system to include additional zones of tiles and/or additional types of analyses.
Basic research for the geodynamics program
NASA Technical Reports Server (NTRS)
1991-01-01
The mathematical models of space very long base interferometry (VLBI) observables suitable for least squares covariance analysis were derived and estimatability problems inherent in the space VLBI system were explored, including a detailed rank defect analysis and sensitivity analysis. An important aim is to carry out a comparative analysis of the mathematical models of the ground-based VLBI and space VLBI observables in order to describe the background in detail. Computer programs were developed in order to check the relations, assess errors, and analyze sensitivity. In order to investigate the estimatability of different geodetic and geodynamic parameters from the space VLBI observables, the mathematical models for time delay and time delay rate observables of space VLBI were analytically derived along with the partial derivatives with respect to the parameters. Rank defect analysis was carried out both by analytical and numerical testing of linear dependencies between the columns of the normal matrix thus formed. Definite conclusions were formed about the rank defects in the system.
NASA Technical Reports Server (NTRS)
Foss, W. E., Jr.
1979-01-01
The takeoff and approach performance of an aircraft is calculated in accordance with the airworthiness standards of the Federal Aviation Regulations. The aircraft and flight constraints are represented in sufficient detail to permit realistic sensitivity studies in terms of either configuration modifications or changes in operational procedures. The program may be used to investigate advanced operational procedures for noise alleviation such as programmed throttle and flap controls. Extensive profile time history data are generated and are placed on an interface file which can be input directly to the NASA aircraft noise prediction program (ANOPP).
Identification of cost effective energy conservation measures
NASA Technical Reports Server (NTRS)
Bierenbaum, H. S.; Boggs, W. H.
1978-01-01
In addition to a successful program of readily implemented conservation actions for reducing building energy consumption at Kennedy Space Center, recent detailed analyses have identified further substantial savings for buildings representative of technical facilities designed when energy costs were low. The techniques employed for determination of these energy savings consisted of facility configuration analysis, power and lighting measurements, detailed computer simulations, and simulation verifications. Use of these methods resulted in identification of projected energy savings as large as $330,000 a year (approximately a two-year break-even period) in a single building. Application of these techniques to other commercial buildings is discussed.
GEO3D - Three-Dimensional Computer Model of a Ground Source Heat Pump System
James Menart
2013-06-07
This file is the setup file for the computer program GEO3D. GEO3D is a computer program written by Jim Menart to simulate vertical wells in conjunction with a heat pump for ground source heat pump (GSHP) systems. This is a very detailed three-dimensional computer model. This program produces detailed heat transfer and temperature field information for a vertical GSHP system.
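GEO3D itself is a detailed three-dimensional model; as a point of reference, the classic infinite line-source approximation below is often used for first-cut estimates of the ground temperature rise around a vertical borehole. The soil properties and heat load are typical illustrative values, not GEO3D inputs.

    # Infinite line-source model for a vertical borehole heat exchanger:
    # dT(r, t) = q / (4 pi k) * E1(r^2 / (4 alpha t)).
    import numpy as np
    from scipy.special import exp1

    q = 50.0        # W/m, heat rejection rate per metre of borehole
    k = 2.0         # W/(m K), soil thermal conductivity
    alpha = 1e-6    # m^2/s, soil thermal diffusivity
    r = 0.06        # m, borehole radius

    t = 3600.0 * 24 * np.array([1, 10, 100])    # 1, 10, 100 days
    dT = q / (4 * np.pi * k) * exp1(r**2 / (4 * alpha * t))
    print(dT)       # temperature rise (K) at the borehole wall over time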
NASA Technical Reports Server (NTRS)
Iyer, Venkit
1993-01-01
The theory, formulation, and solution of three-dimensional, compressible attached laminar flows, applied to swept wings in subsonic or supersonic flow are discussed. Several new features and modifications to an earlier general procedure described in NASA CR 4269, Jan. 1990 are incorporated. Details of interfacing the boundary-layer computation with solution of the inviscid Euler equations are discussed. A description of the computer program, complete with user's manual and example cases, is also included. Comparison of solutions with Navier-Stokes computations with or without boundary-layer suction is given. Output of solution profiles and derivatives required in boundary-layer stability analysis is provided.
The computational neurobiology of learning and reward.
Daw, Nathaniel D; Doya, Kenji
2006-04-01
Following the suggestion that midbrain dopaminergic neurons encode a signal, known as a 'reward prediction error', used by artificial intelligence algorithms for learning to choose advantageous actions, the study of the neural substrates for reward-based learning has been strongly influenced by computational theories. In recent work, such theories have been increasingly integrated into experimental design and analysis. Such hybrid approaches have offered detailed new insights into the function of a number of brain areas, especially the cortex and basal ganglia. In part this is because these approaches enable the study of neural correlates of subjective factors (such as a participant's beliefs about the reward to be received for performing some action) that the computational theories purport to quantify.
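The reward prediction error at the heart of these computational theories is the temporal-difference term delta = r + gamma*V(s') - V(s). The sketch below applies the TD(0) update on a one-transition toy problem to show the value estimate converging to the expected reward.

    # TD(0) value learning driven by the reward prediction error.
    import numpy as np

    V = np.zeros(2)            # value estimates; state 1 is terminal (stays 0)
    gamma, alpha = 0.9, 0.1    # discount factor, learning rate
    rng = np.random.default_rng(3)

    for _ in range(2000):
        r = 1.0 if rng.random() < 0.5 else 0.0   # stochastic reward
        delta = r + gamma * V[1] - V[0]          # reward prediction error
        V[0] += alpha * delta                    # nudge value toward target
    print(V[0])                                  # converges near E[r] = 0.5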
Ubiquitous computing in sports: A review and analysis.
Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp
2009-10-01
Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated in applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.
Navier-Stokes analysis of an oxidizer turbine blade with tip clearance
NASA Technical Reports Server (NTRS)
Gibeling, Howard J.; Sabnis, Jayant S.
1992-01-01
The Gas Generator Oxidizer Turbine (GGOT) Blade is being analyzed by various investigators under the NASA MSFC sponsored Turbine Stage Technology Team design effort. The present work concentrates on the tip clearance region flow and associated losses; however, flow details for the passage region are also obtained in the simulations. The present calculations simulate the rotor blade row in a rotating reference frame with the appropriate Coriolis and centrifugal acceleration terms included in the momentum equation. The upstream computational boundary is located about one axial chord from the blade leading edge. The boundary conditions at this location were determined by using a Euler analysis without the vanes to obtain approximately the same flow profiles at the rotor as were obtained with the Euler stage analysis including the vanes. Inflow boundary layer profiles are then constructed assuming the skin friction coefficient at both the hub and the casing. The downstream computational boundary is located about one axial chord from the blade trailing edge, and the circumferentially averaged static pressure at this location was also obtained from the Euler analysis. Results were obtained for the 3-D baseline GGOT geometry at the full scale design Reynolds number. Details of the clearance region flow behavior and blade pressure distributions were computed. The spanwise variation in blade loading distributions is shown, and circumferentially averaged spanwise distributions of total pressure, total temperature, Mach number, and flow angle are shown at several axial stations. The spanwise variation of relative total pressure loss shows a region of high loss in the region near the casing. Particle traces in the near tip region show vortical behavior of the fluid which passes through the clearance region and exits at the downstream edge of the gap.
NASA Astrophysics Data System (ADS)
Sachdeva, Ritika; Soni, Abhinav; Singh, V. P.; Saini, G. S. S.
2018-05-01
Etoricoxib is a selective cyclooxygenase inhibitor drug that plays a significant role in the pharmacological management of arthritis and pain. The theoretical investigation of its reactivity is done using Density Functional Theory calculations. The Molecular Electrostatic Potential Surface of etoricoxib and its Mulliken atomic charge distribution are used for the prediction of its electrophilic and nucleophilic sites. A detailed analysis of its frontier molecular orbitals is also performed.
A visiting scientist program in atmospheric sciences for the Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Davis, M. H.
1989-01-01
A visiting scientist program was conducted in the atmospheric sciences and related areas at the Goddard Laboratory for Atmospheres. Research was performed in mathematical analysis as applied to computer modeling of the atmospheres; development of atmospheric modeling programs; analysis of remotely sensed atmospheric, surface, and oceanic data and its incorporation into atmospheric models; development of advanced remote sensing instrumentation; and related research areas. The specific research efforts are detailed by tasks.
Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface
NASA Technical Reports Server (NTRS)
Rubin, Carol
2003-01-01
State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.
Techniques for digital enhancement of Landsat MSS data using an Apple II+ microcomputer
NASA Technical Reports Server (NTRS)
Harrington, J. A., Jr.; Cartin, K. F.
1984-01-01
The information provided by remotely sensed data collected from orbiting platforms has been useful in many research fields. Particularly convenient for evaluation are generally digital data stored on computer compatible tapes (CCT's). The major advantages of CCT's are the quality of the data and the accessibility to computer manipulation. Minicomputer systems are widely used for the required computer processing operations. However, microprocessor-related technological advances make it now possible to process CCT data with computing systems which can be obtained at a much lower price than minicomputer systems. A detailed description is provided of the design considerations of a microcomputer-based Digital Image Analysis System (DIAS). Particular attention is given to the algorithms which are incorporated for either edge enhancement or smoothing of Landsat multispectral scanner data.
NASA Technical Reports Server (NTRS)
Chen, H. C.; Neback, H. E.; Kao, T. J.; Yu, N. Y.; Kusunose, K.
1991-01-01
This manual explains how to use an Euler based computational method for predicting the airframe/propulsion integration effects for an aft-mounted turboprop transport. The propeller power effects are simulated by the actuator disk concept. This method consists of global flow field analysis and the embedded flow solution for predicting the detailed flow characteristics in the local vicinity of an aft-mounted propfan engine. The computational procedure includes the use of several computer programs performing four main functions: grid generation, Euler solution, grid embedding, and streamline tracing. This user's guide provides information for these programs, including input data preparations with sample input decks, output descriptions, and sample Unix scripts for program execution in the UNICOS environment.
NASA Technical Reports Server (NTRS)
Rodal, J. J. A.; French, S. E.; Witmer, E. A.; Stagliano, T. R.
1979-01-01
The CIVM-JET 4C computer program for the 'finite strain' analysis of 2-D transient structural responses of complete or partial rings and beams subjected to fragment impact is stored on tape as a series of individual files. The subroutines found in these files are described in detail. All references to the CIVM-JET 4C program are made assuming that the user has a copy of NASA CR-134907 (ASRL TR 154-9), which serves as a user's guide to (1) the CIVM-JET 4B computer code and (2) the CIVM-JET 4C computer code 'with the use of the modified input instructions' attached hereto.
Knowledge representation in metabolic pathway databases.
Stobbe, Miranda D; Jansen, Gerbert A; Moerland, Perry D; van Kampen, Antoine H C
2014-05-01
The accurate representation of all aspects of a metabolic network in a structured format, such that it can be used for a wide variety of computational analyses, is a challenge faced by a growing number of researchers. Analysis of five major metabolic pathway databases reveals that each database has made widely different choices to address this challenge, including how to deal with knowledge that is uncertain or missing. In concise overviews, we show how concepts such as compartments, enzymatic complexes and the direction of reactions are represented in each database. Importantly, concepts that a database does not represent are also described. Which aspects of the metabolic network need to be available in a structured format and to what detail differs per application. For example, for in silico phenotype prediction, a detailed representation of gene-protein-reaction relations and the compartmentalization of the network is essential. Our analysis also shows that current databases are still limited in capturing all details of the biology of the metabolic network, further illustrated with a detailed analysis of three metabolic processes. Finally, we conclude that the conceptual differences between the databases, which make knowledge exchange and integration a challenge, have not been resolved, so far, by the exchange formats in which knowledge representation is standardized.
Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools
NASA Astrophysics Data System (ADS)
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
2015-12-01
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the 'matlab-dataone' library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. Finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used to execute analyses and models, and derived data and products that arise from these computations. This provenance is vital to interpretation and understanding of science, and provides an audit trail that researchers can use to understand and replicate computational workflows in the geosciences.
Artificial intelligence techniques used in respiratory sound analysis--a systematic review.
Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian
2014-02-01
Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.
NASA Technical Reports Server (NTRS)
Erb, R. B.
1974-01-01
The results of the ERTS-1 investigations conducted by the Earth Observations Division at the NASA Lyndon B. Johnson Space Center are summarized in this report, which is an overview of documents detailing individual investigations. Conventional image interpretation and computer-aided classification procedures were the two basic techniques used in analyzing the data for detecting, identifying, locating, and measuring surface features related to earth resources. Data from the ERTS-1 multispectral scanner system were useful for all applications studied, which included agriculture, coastal and estuarine analysis, forestry, range, land use and urban land use, and signature extension. Percentage classification accuracies are cited for the conventional and computer-aided techniques.
Optimal low thrust geocentric transfer. [mission analysis computer program
NASA Technical Reports Server (NTRS)
Edelbaum, T. N.; Sackett, L. L.; Malchow, H. L.
1973-01-01
A computer code which will rapidly calculate time-optimal low thrust transfers is being developed as a mission analysis tool. The final program will apply to NEP or SEP missions and will include a variety of environmental effects. The current program assumes constant acceleration. The oblateness effect and shadowing may be included. Detailed state and costate equations are given for the thrust effect, oblateness effect, and shadowing. A simple but adequate model yields analytical formulas for power degradation due to the Van Allen radiation belts for SEP missions. The program avoids the classical singularities by the use of equinoctial orbital elements. Kryloff-Bogoliuboff averaging is used to facilitate rapid calculation. Results for selected cases using the current program are given.
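The equinoctial elements mentioned are a standard nonsingular alternative to the classical set: the singularities at zero eccentricity and zero inclination that plague the classical elements disappear. As an illustration (not the program's own code), the conversion from classical elements:

```python
import numpy as np

def classical_to_equinoctial(a, e, i, raan, argp, nu):
    # Classical elements (a, e, i, RAAN, argument of perigee, true anomaly),
    # angles in radians, to equinoctial (p, f, g, h, k, L). The (f, g) pair
    # stays well defined as e -> 0, and (h, k) as i -> 0.
    p = a * (1.0 - e**2)              # semilatus rectum
    f = e * np.cos(argp + raan)
    g = e * np.sin(argp + raan)
    h = np.tan(i / 2.0) * np.cos(raan)
    k = np.tan(i / 2.0) * np.sin(raan)
    L = raan + argp + nu              # true longitude
    return p, f, g, h, k, L
```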
Analysis of outcomes in radiation oncology: An integrated computational platform
Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.
2009-01-01
Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with the delivered radiation dose distributions and other patient-related clinical factors. In addition, the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785
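A core operation in any such platform is reducing the stored 3-D dose grid and structure contours to a dose-volume histogram (DVH). A minimal numpy sketch of the cumulative DVH, assuming the dose grid and a boolean structure mask have already been extracted from the DICOM RT objects:

```python
import numpy as np

def cumulative_dvh(dose, mask, n_bins=100):
    # dose: 3-D array in Gy; mask: boolean array of the same shape marking
    # the structure. Returns dose levels and the fraction of the structure
    # volume receiving at least each level.
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), n_bins)
    volume_fraction = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_fraction
```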
Computational approaches in the design of synthetic receptors - A review.
Cowen, Todd; Karim, Kal; Piletsky, Sergey
2016-09-14
The rational design of molecularly imprinted polymers (MIPs) has been a major contributor to their reputation as "plastic antibodies" - high-affinity, robust synthetic receptors which can be optimally designed and produced at a much lower cost than their biological equivalents. Computational design has become a routine procedure in the production of MIPs, and has led to major advances in functional monomer screening, selection of cross-linker and solvent, optimisation of the monomer(s)-template ratio, and selectivity analysis. In this review the various computational methods will be discussed with reference to all the relevant literature published since the end of 2013, with each article described by the target molecule, the computational approach applied (whether molecular mechanics/molecular dynamics, semi-empirical quantum mechanics, ab initio quantum mechanics (Hartree-Fock, Møller-Plesset, etc.) or DFT) and the purpose for which it was used. Detailed analysis is given to novel techniques, including analysis of polymer binding sites, the use of novel screening programs, and simulations of the MIP polymerisation reaction. Further advances in molecular modelling and computational design of synthetic receptors in particular will have a serious impact on the future of nanotechnology and biotechnology, permitting the further translation of MIPs into the realms of analytics and medical technology. Copyright © 2016 Elsevier B.V. All rights reserved.
Micro-computed tomography imaging and analysis in developmental biology and toxicology.
Wise, L David; Winkelmann, Christopher T; Dogdas, Belma; Bagchi, Ansuman
2013-06-01
Micro-computed tomography (micro-CT) is a high resolution imaging technique that has expanded and strengthened in use since it was last reviewed in this journal in 2004. The technology has expanded to include more detailed analysis of bone, as well as soft tissues, by use of various contrast agents. It is increasingly applied to questions in developmental biology and developmental toxicology. Relatively high-throughput protocols now provide a powerful and efficient means to evaluate embryos and fetuses subjected to genetic manipulations or chemical exposures. This review provides an overview of the technology, including scanning, reconstruction, visualization, segmentation, and analysis of micro-CT generated images. This is followed by a review of more recent applications of the technology in some common laboratory species that highlight the diverse issues that can be addressed. Copyright © 2013 Wiley Periodicals, Inc.
Integral equation and discontinuous Galerkin methods for the analysis of light-matter interaction
NASA Astrophysics Data System (ADS)
Baczewski, Andrew David
Light-matter interaction is among the most enduring interests of the physical sciences. The understanding and control of this physics is of paramount importance to the design of myriad technologies ranging from stained glass, to molecular sensing and characterization techniques, to quantum computers. The development of complex engineered systems that exploit this physics is predicated at least partially upon in silico design and optimization that properly capture the light-matter coupling. In this thesis, the details of computational frameworks that enable this type of analysis, based upon both Integral Equation and Discontinuous Galerkin formulations will be explored. There will be a primary focus on the development of efficient and accurate software, with results corroborating both. The secondary focus will be on the use of these tools in the analysis of a number of exemplary systems.
RI 1170 advanced strapdown gyro
NASA Technical Reports Server (NTRS)
1973-01-01
The major components of the RI 1170 gyroscope are described. A detailed functional description of the electronics, including block diagrams and photographs of output waveshapes within the loop electronics, is presented. An electronic data flow diagram is included. Those gyro subassemblies that were originally planned and subsequently changed or modified for one reason or another are discussed in detail. Variations to the original design included the capacitive pickoffs, torquer flexleads, magnetic suspension, gas bearings, electronic design, and packaging. The selection of components and changes from the original design and components selected are discussed. Device failures experienced throughout the program are reported, and design corrections to eliminate the failure modes are noted. Major design deficiencies, such as those of the MSE electronics, are described in detail. Modifications made to the gas bearing parts and design improvements to the wheel are noted. Changes to the gas bearing prints are included, as well as a computer-based mathematical analysis of the 1170 gas bearing wheel. The mean free-path effects on gas bearing performance are summarized.
Early classification of pathological heartbeats on wireless body sensor nodes.
Braojos, Rubén; Beretta, Ivan; Ansaloni, Giovanni; Atienza, David
2014-11-27
Smart Wireless Body Sensor Nodes (WBSNs) are a novel class of unobtrusive, battery-powered devices allowing the continuous monitoring and real-time interpretation of a subject's bio-signals, such as the electrocardiogram (ECG). These low-power platforms, while able to perform advanced signal processing to extract information on heart conditions, are usually constrained in terms of computational power and transmission bandwidth. It is therefore essential to identify in the early stages which parts of an ECG are critical for the diagnosis and, only in these cases, activate on demand more detailed and computationally intensive analysis algorithms. In this work, we present a comprehensive framework for real-time automatic classification of normal and abnormal heartbeats, targeting embedded and resource-constrained WBSNs. In particular, we provide a comparative analysis of different strategies to reduce the heartbeat representation dimensionality, and therefore the required computational effort. We then combine these techniques with a neuro-fuzzy classification strategy, which effectively discerns normal and pathological heartbeats with a minimal run time and memory overhead. We prove that, by performing a detailed analysis only on the heartbeats that our classifier identifies as abnormal, a WBSN system can drastically reduce its overall energy consumption. Finally, we assess the choice of neuro-fuzzy classification by comparing its performance and workload with respect to other state-of-the-art strategies. Experimental results using the MIT-BIH Arrhythmia database show energy savings of as much as 60% in the signal processing stage, and 63% in the subsequent wireless transmission, when a neuro-fuzzy classification structure is employed, coupled with a dimensionality reduction technique based on random projections.
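The random-projection reduction the authors pair with the neuro-fuzzy classifier is straightforward to sketch: a fixed random matrix maps each heartbeat window to a much shorter feature vector while approximately preserving distances (the Johnson-Lindenstrauss property). The dimensions below are illustrative assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection_matrix(d_in, d_out):
    # Gaussian random projection; the 1/sqrt(d_out) scaling keeps
    # expected squared norms unchanged.
    return rng.normal(0.0, 1.0 / np.sqrt(d_out), size=(d_out, d_in))

R = random_projection_matrix(256, 16)   # e.g. 256-sample beats -> 16 features

def reduce_beat(beat):
    # beat: 1-D array of 256 ECG samples centered on the heartbeat.
    return R @ beat
```

Because R is fixed at design time, the node pays only one small matrix-vector product per beat before classification, which is consistent with the reported savings in the signal processing stage.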
Edge enhancement and noise suppression for infrared image based on feature analysis
NASA Astrophysics Data System (ADS)
Jiang, Meng
2018-06-01
Infrared images often suffer from background noise, blurred edges, few details, and low signal-to-noise ratios. To improve infrared image quality, it is essential to suppress noise and enhance edges simultaneously. To realize this, we propose a novel algorithm based on feature analysis in the shearlet domain. Firstly, we introduce the theory and advantages of the shearlet transform as a form of multi-scale geometric analysis (MGA). Secondly, after analyzing the shortcomings of the traditional thresholding technique for noise suppression, we propose a novel feature extraction that distinguishes image structures from noise well and use it to improve the traditional thresholding technique. Thirdly, by computing the correlations between neighboring shearlet coefficients, feature attribute maps identifying weak details and strong edges are constructed to improve generalized unsharp masking (GUM). Finally, experimental results with infrared images captured in different scenes demonstrate that the proposed algorithm suppresses noise efficiently and enhances image edges adaptively.
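A runnable shearlet example requires a specialized library, but the thresholding step generalizes across transform domains. A simplified stand-in using a wavelet domain with PyWavelets (soft thresholding against a robust noise estimate; the paper's shearlet-domain and correlation-based refinements are not reproduced here):

```python
import numpy as np
import pywt

def soft_threshold_denoise(img, wavelet="db2", level=3, k=3.0):
    # Transform-domain soft thresholding. The noise level is estimated
    # from the median absolute deviation of the finest diagonal subband.
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    t = k * sigma
    new = [coeffs[0]] + [
        tuple(pywt.threshold(c, t, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(new, wavelet)
```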
Nozzle Numerical Analysis Of The Scimitar Engine
NASA Astrophysics Data System (ADS)
Battista, F.; Marini, M.; Cutrone, L.
2011-05-01
This work describes part of the activities on the LAPCAT-II A2 vehicle, in which, starting from the available conceptual vehicle design and the related precooled turbo-ramjet engine called SCIMITAR, well-thought-out assumptions made for the performance figures of different components during the LAPCAT-I iteration process are assessed in more detail. This paper presents a numerical analysis aimed at the design optimization of the nozzle contour of the LAPCAT A2 SCIMITAR engine designed by Reaction Engines Ltd. (REL) (see Figure 1). In particular, the nozzle shape optimization process is presented for cruise conditions. All computations have been carried out using the CIRA C3NS code under non-equilibrium conditions. The effect of considering detailed or reduced chemical kinetic schemes has been analyzed, with a particular focus on the production of pollutants. An analysis of engine performance parameters, such as thrust and combustion efficiency, has also been carried out.
Study on the Preliminary Design of ARGO-M Operation System
NASA Astrophysics Data System (ADS)
Seo, Yoon-Kyung; Lim, Hyung-Chul; Rew, Dong-Young; Jo, Jung Hyun; Park, Jong-Uk; Park, Eun-Seo; Park, Jang-Hyun
2010-12-01
Korea Astronomy and Space Science Institute has been developing a mobile satellite laser ranging system named the Accurate Ranging system for Geodetic Observation-Mobile (ARGO-M). The preliminary design of the ARGO-M operation system (AOS), one of the ARGO-M subsystems, was completed in 2009. The preliminary design results are applied to the following development phase by performing detailed design based on analysis of the pre-defined requirements and of the derived specifications. This paper addresses the preliminary design of the whole AOS. The design results for the operation and control part, a key part of the operation system, are described in detail. Analysis results for the interface between the operation-supporting hardware and the control computer, which are necessary for defining the requirements of the operation-supporting hardware, are summarized. The results of this study are expected to be used in the critical design phase to finalize the design process.
High speed three-dimensional laser scanner with real time processing
NASA Technical Reports Server (NTRS)
Lavelle, Joseph P. (Inventor); Schuet, Stefan R. (Inventor)
2008-01-01
A laser scanner computes a range from a laser line to an imaging sensor. The laser line illuminates a detail within an area covered by the imaging sensor, the area having a first dimension and a second dimension. The detail has a dimension perpendicular to the area. A traverse moves a laser emitter, coupled to the imaging sensor, at a height above the area. The laser emitter is positioned at an offset along the scan direction with respect to the imaging sensor, and is oriented at a depression angle with respect to the area. The laser emitter projects the laser line along the second dimension of the area at the position where an image frame is acquired. The imaging sensor is sensitive to laser reflections from the detail produced by the laser line. The imaging sensor images the laser reflections from the detail to generate the image frame. A computer having a pipeline structure is connected to the imaging sensor for reception of the image frame, and for computing the range to the detail using the height, depression angle, and/or offset. The computer displays the range to the area and the detail thereon covered by the image frame.
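The range computation described is classic laser triangulation: a detail of height z above the reference plane shifts the imaged laser line toward the emitter by z/tan(theta), where theta is the depression angle. A sketch of that geometry (variable names are mine, not the patent's):

```python
import math

def detail_height(shift_pixels, pixel_size_m, depression_angle_deg):
    # Height of a detail above the reference plane, recovered from the
    # laser line's displacement in the image frame. shift_pixels: line
    # displacement toward the emitter; pixel_size_m: ground footprint of
    # one pixel; depression_angle_deg: beam angle below horizontal.
    shift_m = shift_pixels * pixel_size_m
    return shift_m * math.tan(math.radians(depression_angle_deg))

# e.g. a 12-pixel shift at 0.5 mm/pixel with a 45-degree beam -> 6 mm of relief
```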
Levels of detail analysis of microwave scattering from human head models for brain stroke detection
2017-01-01
In this paper, we present a microwave scattering analysis of multiple human head models. This study incorporates different levels of detail in the human head models and examines their effect on the microwave scattering phenomenon. Two levels of detail are taken into account: (i) a simplified ellipse-shaped head model and (ii) an anatomically realistic head model, implemented using 2-D geometry. In addition, the heterogeneous and frequency-dispersive behavior of the brain tissues has been incorporated in our head models. It is identified during this study that the microwave scattering phenomenon changes significantly once the complexity of the head model is increased by incorporating more details using a magnetic resonance imaging database. It is also found that the microwave scattering results match for both types of head model (i.e., geometrically simple and anatomically realistic) once the measurements are made in the structurally simplified regions. However, the results diverge considerably in the complex areas of the brain due to the arbitrarily shaped interfaces of tissue layers in the anatomically realistic head model. After incorporating the various levels of detail, the solution of the subject microwave scattering problem and the measurement of transmitted and backscattered signals were obtained using the finite element method. A mesh convergence analysis was also performed to achieve error-free results with a minimum number of mesh elements and fewer degrees of freedom, in a fast computational time. The results were promising, and the E-field values converged for both the simple and the complex geometrical models. However, the E-field difference between the two types of head model at the same reference point differed considerably in magnitude: at a complex location a difference of 0.04236 V/m was measured, compared with 0.00197 V/m at a simple location. This study also provides a comparative analysis of direct and iterative solvers, with the aim of solving the subject microwave scattering problem in minimal computational time and with minimal memory requirements. It is seen from this study that microwave imaging may effectively be utilized for the detection, localization, and differentiation of different types of brain stroke. The simulation results verified that microwave imaging can be efficiently exploited to study the significant contrast between the electric field values of normal and abnormal brain tissues for the investigation of brain anomalies. Finally, a specific absorption rate analysis was carried out to compare the effects of microwave signals on the different types of head model, using a factor of safety for brain tissues. After careful study of the various inversion methods in practice for microwave head imaging, it is also suggested that the contrast source inversion method may be more suitable and computationally efficient for such problems. PMID:29177115
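The specific absorption rate used in the safety analysis has a standard local definition, SAR = sigma |E|^2 / rho. A sketch, with illustrative tissue values (not the paper's):

```python
def local_sar(e_rms, sigma, rho):
    # Local specific absorption rate in W/kg.
    # e_rms: RMS electric field magnitude (V/m)
    # sigma: tissue conductivity (S/m); rho: tissue density (kg/m^3)
    return sigma * e_rms**2 / rho

# e.g. illustrative grey-matter-like values around 1 GHz:
# local_sar(10.0, 0.99, 1045.0) -> about 0.095 W/kg
```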
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACKTM high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.
Convergence of sampling in protein simulations
NASA Astrophysics Data System (ADS)
Hess, Berk
2002-03-01
With molecular dynamics, protein dynamics can be simulated in atomic detail. Current computers are not fast enough to probe all available conformations, but fluctuations around one conformation can be sampled to a reasonable extent. The motions with the largest fluctuations can be filtered out of a simulation using covariance or principal component analysis. A problem with this analysis is that random diffusion can appear as correlated motion. An analysis is presented of how long a simulation should be to obtain relevant results for global motions. The analysis reveals that the cosine content of the principal components is a good indicator for bad sampling.
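Hess quantifies this with the cosine content of principal component i, c_i = (2/T) (integral of cos(i*pi*t/T) p_i(t) dt)^2 / (integral of p_i(t)^2 dt), which approaches 1 for pure random diffusion and stays near 0 for well-sampled motion. A discrete sketch of that diagnostic:

```python
import numpy as np

def cosine_content(p, i):
    # p: projection of the trajectory onto principal component i,
    # sampled at T equally spaced times. Returns a value in [0, 1];
    # values near 1 indicate the mode resembles random diffusion.
    T = len(p)
    t = np.arange(T)
    cos_it = np.cos(i * np.pi * t / T)
    num = np.trapz(cos_it * p) ** 2
    den = np.trapz(p * p)
    return (2.0 / T) * num / den
```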
Research in interactive scene analysis
NASA Technical Reports Server (NTRS)
Tenenbaum, J. M.; Barrow, H. G.; Weyl, S. A.
1976-01-01
Cooperative (man-machine) scene analysis techniques were developed whereby humans can provide a computer with guidance when completely automated processing is infeasible. An interactive approach promises significant near-term payoffs in analyzing various types of high volume satellite imagery, as well as vehicle-based imagery used in robot planetary exploration. This report summarizes the work accomplished over the duration of the project and describes in detail three major accomplishments: (1) the interactive design of texture classifiers; (2) a new approach for integrating the segmentation and interpretation phases of scene analysis; and (3) the application of interactive scene analysis techniques to cartography.
Next Generation Distributed Computing for Cancer Research
Agarwal, Pankaj; Owzar, Kouros
2014-01-01
Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing. PMID:25983539
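Hadoop itself is Java, but its streaming interface lets the map and reduce steps be written in any language that reads stdin and writes stdout. An illustrative mapper (a sketch, not the benchmark pipeline from the review) that counts aligned reads per chromosome from SAM-format lines:

```python
#!/usr/bin/env python3
# mapper.py: emit "<chromosome>\t1" for every aligned SAM record on stdin.
import sys

for line in sys.stdin:
    if line.startswith("@"):                 # skip SAM header lines
        continue
    fields = line.rstrip("\n").split("\t")
    rname = fields[2]                        # reference (chromosome) name
    if rname != "*":                         # "*" marks an unaligned read
        print(f"{rname}\t1")
```

A companion reducer sums the counts per key; the pair would be launched with the hadoop-streaming jar, with Hadoop handling the data distribution, sorting, and shuffling that the review describes.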
Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Y; Glascoe, L
The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool for both the field engineer/planner with limited computational resources and the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users to fulfill this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
Sawicka, Monika; Bedini, Rossella; Pecci, Raffaella; Pameijer, Cornelis Hans; Kmiec, Zbigniew
2012-01-01
The purpose of this study was to demonstrate the potential application of micro-computed tomography in the morphometric analysis of root resorption in extracted human first premolars subjected to orthodontic force. In one patient treated in the orthodontic clinic, two mandibular first premolars subjected to orthodontic force for 4 weeks and one control tooth were selected for micro-computed tomographic analysis. The hardware device used in this study was a desktop X-ray microfocus CT scanner (SkyScan 1072). The morphology of the root surfaces was assessed with the TView and Computed Tomography Analyzer (CTAn) software packages (SkyScan bvba), which allowed analysis of all microscans, identification of root resorption craters, and measurement of their length, width, and volume. The microscans showed in detail the surface morphology of the investigated teeth. Analysis of the microscans detected 3 root resorption cavities in each orthodontically moved tooth and only one resorption crater in the control tooth. The volumes of the resorption craters in the orthodontically treated teeth were much larger than in the control tooth. Micro-computed tomography is a reproducible technique for the three-dimensional, non-invasive assessment of root morphology ex vivo. The TView and CTAn software packages are useful for accurate morphometric measurements of root resorption.
Metal Hydrides, MOFs, and Carbon Composites as Space Radiation Shielding Mitigators
NASA Technical Reports Server (NTRS)
Atwell, William; Rojdev, Kristina; Liang, Daniel; Hill, Matthew
2014-01-01
Recently, metal hydrides and MOFs (metal-organic framework/microporous organic polymer composites, studied for their hydrogen and methane storage capabilities) have been investigated for applications in fuel cell technology. We have investigated a dual use of these materials and carbon composites (CNT-HDPE) that includes space radiation shielding mitigation. In this paper we present the results of a detailed study in which we analyzed 64 materials. We used the Band fit spectra for the combined 19-24 October 1989 solar proton events as the input source-term radiation environment. These computational analyses were performed with the NASA high energy particle transport/dose code HZETRN. Through this analysis we have identified several materials that have excellent radiation shielding properties; the details of this analysis are discussed further in the paper.
NASA Technical Reports Server (NTRS)
Amiet, R. K.
1991-01-01
A unified theory for the aerodynamics and noise of advanced turboprops is presented, along with a computer code developed to evaluate the shielding benefits that might be expected from an aircraft wing in a wing-mounted propeller installation. Several computed directivity patterns are presented to demonstrate the theory. Recently, with the advent of the concept of using the wing of an aircraft for noise shielding, the case of diffraction by a surface in a flow has been given attention. The present analysis is based on the case of diffraction with no flow. By combining a Galilean and a Lorentz transform, the wave equation with a mean flow can be reduced to the ordinary wave equation. Allowance is also made in the analysis for the case of a swept wing: the same combination of Galilean and Lorentz transforms leads to a problem with no flow but a different sweep. The solution procedures for the cases of leading and trailing edges are basically the same. Two normalizations of the solution are given by the computer program. FORTRAN computer programs are presented with detailed documentation. The output from these programs compares favorably with the results of other investigators.
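The transform pair alluded to can be written compactly; as a sketch consistent with the description (not necessarily the report's exact notation), for the two-dimensional convected wave equation with uniform mean flow U = Mc along x,

\[
\frac{1}{c^{2}}\left(\frac{\partial}{\partial t}+U\frac{\partial}{\partial x}\right)^{2}\phi
=\frac{\partial^{2}\phi}{\partial x^{2}}+\frac{\partial^{2}\phi}{\partial y^{2}},
\qquad \beta^{2}=1-M^{2},
\]

the combined Galilean-Lorentz change of variables

\[
x' = \frac{x}{\beta},\qquad y' = y,\qquad t' = t+\frac{M\,x}{c\,\beta^{2}}
\]

reduces the problem to the ordinary (no-flow) wave equation

\[
\frac{1}{c^{2}\beta^{2}}\frac{\partial^{2}\phi}{\partial t'^{2}}
=\frac{\partial^{2}\phi}{\partial x'^{2}}+\frac{\partial^{2}\phi}{\partial y'^{2}},
\]

so half-plane diffraction solutions derived for a medium at rest can be mapped back to the moving-stream case.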
CFD research and systems in Kawasaki Heavy Industries and its future prospects
NASA Astrophysics Data System (ADS)
Hiraoka, Koichi
1990-09-01
The KHI Computational Fluid Dynamics (CFD) system is composed of a VP100 computer and 2-D and 3-D Euler and/or Navier-Stokes (NS) analysis software. For KHI, this system has become a very powerful aerodynamic tool together with the Kawasaki 1 m Transonic Wind Tunnel. The 2-D Euler/NS software, developed in-house, is fully automated, requires no special skill, and was successfully applied to the design of the YXX high-lift devices, the SST supersonic inlet, and other components. The 3-D Euler/NS software, developed under joint research with NAL, has an interactively operated multi-block grid generator and can effectively generate grids around complex airplane shapes. Due to main memory size limitations, 3-D analyses of relatively simple shapes, such as the SST wing-body, were computed in-house on the VP100; more detailed 3-D analyses, such as those of ASUKA and HOPE, were computed under KHI-NAL joint research on the NAL VP400, which is 10 times more powerful than the VP100. These analysis results correlate very well with experimental results. However, the present CFD system is less productive than the wind tunnel and has limitations in applicability.
Nesvizhskii, Alexey I.
2010-01-01
This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide-to-spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from the peptide to the protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
Digital avionics design and reliability analyzer
NASA Technical Reports Server (NTRS)
1981-01-01
The description and specifications for a digital avionics design and reliability analyzer are given. Its basic function is to provide for the simulation and emulation of the various fault-tolerant digital avionic computer designs that are developed. It has been established that hardware emulation at the gate level will be utilized. The primary benefit of emulation to reliability analysis is that it provides the capability to model a system at a very detailed level. Emulation allows the direct insertion of faults into the system, rather than waiting for actual hardware failures to occur, which permits controlled and accelerated testing of system reaction to hardware failures. A trade study led to the decision to specify a two-machine system, consisting of an emulation computer connected to a general-purpose computer; potential computers to serve as the emulation computer are also evaluated.
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.
2012-06-01
The introduction of newer joining technologies, such as friction-stir welding (FSW), into automotive engineering requires knowledge of the joint-material microstructure and properties. Since the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of computational engineering analyses (CEA), robust high-fidelity material models are needed for the FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage the computational cost and computer storage requirements of such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) the material model for each weld zone and (b) the material model for the entire weld. The procedure is validated by comparing its predictions with the predictions of more detailed but more costly computational analyses.
Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus
2016-01-01
Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922
Atlas2 Cloud: a framework for personal genome analysis in the cloud.
Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli
2012-01-01
Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data, to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation remains a serious bottleneck. To this end, the cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.
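The cost structure reduces to simple arithmetic once unit prices are fixed. A sketch with illustrative prices (assumptions, not the paper's figures), covering the three axes the Atlas2 Amazon projections report:

```python
def run_cost_usd(storage_gb_months, cpu_hours, io_gb,
                 price_storage=0.10,   # $/GB-month, illustrative
                 price_cpu=0.50,       # $/instance-hour, illustrative
                 price_io=0.09):       # $/GB transferred, illustrative
    # Total cost = storage + compute + I/O.
    return (storage_gb_months * price_storage
            + cpu_hours * price_cpu
            + io_gb * price_io)

# e.g. a 30 GB exome held one month, 50 CPU-hours, 30 GB egress:
# run_cost_usd(30, 50, 30) -> 30.7
```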
DETAIL VIEW OF THE POWER CONNECTIONS (FRONT) AND COMPUTER PANELS (REAR), ROOM 8A - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL
NASA Astrophysics Data System (ADS)
Li, Xiaoyi; Soteriou, Marios C.
2016-08-01
Recent advances in numerical methods coupled with the substantial enhancements in computing power and the advent of high performance computing have presented first principle, high fidelity simulation as a viable tool in the prediction and analysis of spray atomization processes. The credibility and potential impact of such simulations, however, has been hampered by the relative absence of detailed validation against experimental evidence. The numerical stability and accuracy challenges arising from the need to simulate the high liquid-gas density ratio across the sharp interfaces encountered in these flows are key reasons for this. In this work we challenge this status quo by presenting a numerical model able to deal with these challenges, employing it in simulations of liquid jet in crossflow atomization and performing extensive validation of its results against a carefully executed experiment with detailed measurements in the atomization region. We then proceed to the detailed analysis of the flow physics. The computational model employs the coupled level set and volume of fluid approach to directly capture the spatiotemporal evolution of the liquid-gas interface and the sharp-interface ghost fluid method to stably handle high liquid-air density ratio. Adaptive mesh refinement and Lagrangian droplet models are shown to be viable options for computational cost reduction. Moreover, high performance computing is leveraged to manage the computational cost. The experiment selected for validation eliminates the impact of inlet liquid and gas turbulence and focuses on the impact of the crossflow aerodynamic forces on the atomization physics. Validation is demonstrated by comparing column surface wavelengths, deformation, breakup locations, column trajectories and droplet sizes, velocities, and mass rates for a range of intermediate Weber numbers. Analysis of the physics is performed in terms of the instability and breakup characteristics and the features of downstream flow recirculation, and vortex shedding. Formation of "Λ" shape windward column waves is observed and explained by the combined upward and lateral surface motion. The existence of Rayleigh-Taylor instability as the primary mechanism for the windward column waves is verified for this case by comparing wavelengths from the simulations to those predicted by linear stability analyses. Physical arguments are employed to postulate that the type of instability manifested may be related to conditions such as the gas Weber number and the inlet turbulence level. The decreased column wavelength with increasing Weber number is found to cause enhanced surface stripping and early depletion of liquid core at higher Weber number. A peculiar "three-streak-two-membrane" liquid structure is identified at the lowest Weber number and explained as the consequence of the symmetric recirculation zones behind the jet column. It is found that the vortical flow downstream of the liquid column resembles a von Karman vortex street and that the coupling between the gas flow and droplet transport is weak for the conditions explored.
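The Rayleigh-Taylor comparison rests on a closed-form most-amplified wavelength. A sketch using the commonly quoted form for a liquid surface of tension sigma decelerated at rate a, neglecting the gas density against the liquid's (illustrative of, not copied from, the paper's analysis):

```python
import math

def rt_most_unstable_wavelength(sigma, rho_l, accel):
    # lambda_RT = 2*pi*sqrt(3*sigma / (rho_l * a)): the wavelength that
    # grows fastest when the light gas accelerates into the liquid.
    return 2.0 * math.pi * math.sqrt(3.0 * sigma / (rho_l * accel))

# e.g. water (sigma = 0.072 N/m, rho_l = 1000 kg/m^3) decelerating at
# 1e4 m/s^2 gives a wavelength of about 0.9 mm.
```

Shorter predicted wavelengths at higher Weber number are consistent with the enhanced surface stripping the simulations report.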
NASA Technical Reports Server (NTRS)
Fiske, David R.
2004-01-01
In an earlier paper, Misner (2004, Class. Quant. Grav., 21, S243) presented a novel algorithm for computing the spherical harmonic components of data represented on a cubic grid. I extend Misner's original analysis by making detailed estimates of the numerical errors accrued by the algorithm, by using symmetry arguments to suggest a more efficient implementation scheme, and by explaining how the algorithm can be applied efficiently to data with explicit reflection symmetries.
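The quantities in question are surface integrals of the gridded field against conjugated spherical harmonics, a_lm = ∮ f Y*_lm dΩ. A simple quadrature sketch (not Misner's algorithm, which works directly on the cubic grid), assuming the field has already been interpolated from the grid onto an extraction sphere:

```python
import numpy as np
from scipy.special import sph_harm

def sph_harm_coefficient(f_on_sphere, l, m, n_polar=64, n_azim=128):
    # a_lm = integral of f * conj(Y_lm) over the unit sphere, by midpoint
    # quadrature. f_on_sphere(polar, azimuth) samples the gridded field,
    # e.g. via trilinear interpolation onto the sphere.
    polar = (np.arange(n_polar) + 0.5) * np.pi / n_polar
    azim = np.arange(n_azim) * 2.0 * np.pi / n_azim
    P, A = np.meshgrid(polar, azim, indexing="ij")
    Y = sph_harm(m, l, A, P)     # scipy argument order: (m, l, azimuth, polar)
    w = np.sin(P) * (np.pi / n_polar) * (2.0 * np.pi / n_azim)  # dOmega weights
    return np.sum(f_on_sphere(P, A) * np.conj(Y) * w)
```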
Radiological performance assessment for the E-Area Vaults Disposal Facility. Appendices A through M
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, J.R.
1994-04-15
This document contains appendices A through M of the performance assessment. They are A: details of models and assumptions; B: computer codes; C: data tabulation; D: geochemical interactions; E: hydrogeology of the Savannah River Site; F: software QA plans; G: completeness review guide; H: performance assessment peer review panel recommendations; I: suspect soil performance analysis; J: sensitivity/uncertainty analysis; K: vault degradation study; L: description of naval reactor waste disposal; M: PORFLOW input file. (GHH)
Thermal/structural Tailoring of Engine Blades (T/STAEBL) User's Manual
NASA Technical Reports Server (NTRS)
Brown, K. W.; Clevenger, W. B.; Arel, J. D.
1994-01-01
The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a family of computer programs executed by a control program. The T/STAEBL system performs design optimizations of cooled, hollow turbine blades and vanes. This manual contains an overview of the system, fundamentals of the data block structure, and detailed descriptions of the inputs required by the optimizer. Additionally, the thermal analysis input requirements are described as well as the inputs required to perform a finite element blade vibrations analysis.
Transonic propulsion system integration analysis at McDonnell Aircraft Company
NASA Technical Reports Server (NTRS)
Cosner, Raymond R.
1989-01-01
The technology of Computational Fluid Dynamics (CFD) is becoming an important tool in the development of aircraft propulsion systems. Two of the most valuable features of CFD are: (1) quick acquisition of flow field data; and (2) complete description of flow fields, allowing detailed investigation of interactions. Current analysis methods complement wind tunnel testing in several ways. Herein, the discussion is focused on CFD methods. However, aircraft design studies need data from both CFD and wind tunnel testing. Each approach complements the other.
Aeroelastic Stability and Response of Rotating Structures
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Reddy, Tondapu
2004-01-01
A summary of the work performed under the NASA grant is presented; more details can be found in the cited references. This grant led to the development of faster aeroelastic analysis methods for predicting flutter and forced response in fans, compressors, and turbines using computational fluid dynamics (CFD) methods. These methods are based on linearized two- and three-dimensional, unsteady, nonlinear aerodynamic equations. During the period of the grant, an aeroelastic analysis that includes the effects of uncertainties in the design variables was also developed.
Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.
Tauber, J; Lahav, M
1987-11-01
A review of currently available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Classification of Diseases (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant ICD-9 diagnoses as a database; it can be retrieved later for display of patients' problems or analysis of clinical data.
Interaction with an Edu-Game: A Detailed Analysis of Student Emotions and Judges' Perceptions
ERIC Educational Resources Information Center
Conati, Cristina; Gutica, Mirela
2016-01-01
We present the results of a study that explored the emotions experienced by students during interaction with an educational game for math (Heroes of Math Island). Starting from emotion frameworks in affective computing and education, we considered a larger set of emotions than in related research. For emotion labeling, we started from a standard…
Communications network design and costing model users manual
NASA Technical Reports Server (NTRS)
Logan, K. P.; Somes, S. S.; Clark, C. A.
1983-01-01
The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.
Designing Worked Examples in Statics to Promote an Expert Stance: Working THRU vs. Working OUT
ERIC Educational Resources Information Center
Calfee, Robert; Stahovich, Thomas
2011-01-01
The purpose of this study was to examine the performance patterns of freshman engineering students as they completed a tutorial on freebody problems that employed a computer-based pen (CBP) to provide feedback and direct learning. A secondary analysis was conducted on detailed performance data for 16 participants from a freshman Engineering course…
Predictive and mechanistic multivariate linear regression models for reaction development
Santiago, Celine B.; Guo, Jing-Yao
2018-01-01
Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711
ERIC Educational Resources Information Center
Technomics, Inc., McLean, VA.
This publication is Attachment 9 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in medical laboratory technology. (BT)
Timber products output and timber harvests in Alaska: projections for 1989-2010.
David J. Brooks; Richard W. Haynes
1990-01-01
Projections of Alaska timber products output and timber harvest by owner were developed by using a detailed, trend-based analysis. Historical data for 1965-88 were the basis for projections for 1989-2010. Projections of timber products output for each major product (export logs, sawn wood, and market pulp) were used to compute the derived demand for timber. The...
ERIC Educational Resources Information Center
d'Alessio, Matthew; Lundquist, Loraine
2013-01-01
Each year our physical science class for pre-service elementary teachers launches water-powered rockets based on the activity from NASA. We analyze the rocket flight using data from frame-by-frame video analysis of the launches. Before developing the methods presented in this paper, we noticed our students were mired in calculation details while…
Issues, concerns, and initial implementation results for space based telerobotic control
NASA Technical Reports Server (NTRS)
Lawrence, D. A.; Chapel, J. D.; Depkovich, T. M.
1987-01-01
Telerobotic control for space based assembly and servicing tasks presents many problems in system design. Traditional force reflection teleoperation schemes are not well suited to this application, and the approaches to compliance control via computer algorithms have yet to see significant testing and comparison. These observations are discussed in detail, as well as the concerns they raise for imminent design and testing of space robotic systems. As an example of the detailed technical work yet to be done before such systems can be specified, a particular approach to providing manipulator compliance is examined experimentally and through modeling and analysis. This yields some initial insight into the limitations and design trade-offs for this class of manipulator control schemes. Implications of this investigation for space based telerobots are discussed in detail.
Comparative study of solar optics for paraboloidal concentrators
NASA Technical Reports Server (NTRS)
Wen, L.; Poon, P.; Carley, W.; Huang, L.
1979-01-01
Different analytical methods for computing the flux distribution on the focal plane of a paraboloidal solar concentrator are reviewed. An analytical solution in algebraic form is also derived for an idealized model. The effects resulting from using different assumptions in the definition of optical parameters used in these methodologies are compared and discussed in detail. These parameters include solar irradiance distribution (limb darkening and circumsolar), reflector surface specular spreading, surface slope error, and concentrator pointing inaccuracy. The type of computational method selected for use depends on the maturity of the design and the data available at the time the analysis is made.
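To first order, these optical parameters combine as independent Gaussian error cones whose variances add; the surface slope error enters doubled because a slope deviation tilts the reflected ray by twice the angle. A sketch of that common root-sum-square approximation (not any one of the reviewed methods; the default values are illustrative assumptions):

```python
import math

def total_beam_spread_mrad(sigma_sun=2.8, sigma_slope=2.0,
                           sigma_specular=0.5, sigma_pointing=1.0):
    # All inputs in milliradians; returns the effective angular spread of
    # the reflected beam used to estimate focal-plane flux broadening.
    return math.sqrt(sigma_sun**2 + (2.0 * sigma_slope)**2
                     + sigma_specular**2 + sigma_pointing**2)
```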
SPECIFIC HEAT DATA ANALYSIS PROGRAM FOR THE IBM 704 DIGITAL COMPUTER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roach, P.R.
1962-01-01
A computer program was developed to calculate the specific heat of a substance in the temperature range from 0.3 to 4.2 deg K, given temperature calibration data for a carbon resistance thermometer, experimental temperature drift, and heating period data. The specific heats calculated from these data are then fitted with a curve by the method of least squares, and the specific heats are corrected for the effect of the curvature of the data. The method, operation, program details, and program stops are discussed. A program listing is included. (M.C.G.)
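The report does not give the fitting function, but a least-squares fit in the same spirit might look like the following, assuming the common low-temperature form C(T) = gamma*T + beta*T^3 and invented data.

```python
# Least-squares fit of low-temperature specific heat:
# C(T) = gamma*T + beta*T^3 (electronic + lattice terms, a common
# choice below 4.2 K; an assumption, not the report's form).
import numpy as np

T = np.array([0.4, 0.8, 1.2, 1.8, 2.5, 3.2, 4.0])         # K (invented)
C = np.array([0.45, 0.95, 1.55, 2.65, 4.55, 7.45, 12.4])  # mJ/(mol K)

A = np.column_stack([T, T**3])        # design matrix for C = g*T + b*T^3
(gamma, beta), *_ = np.linalg.lstsq(A, C, rcond=None)
print(f"gamma = {gamma:.3f} mJ/(mol K^2), beta = {beta:.4f} mJ/(mol K^4)")
```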
NASA Technical Reports Server (NTRS)
Foss, W. E., Jr.
1981-01-01
A computer technique to determine the mission radius and maneuverability characteristics of combat aircraft was developed. The technique was used to determine critical operational requirements and the areas in which research programs would be expected to yield the most beneficial results. In turn, the results of research efforts were evaluated in terms of aircraft performance on selected mission segments and for complete mission profiles. Extensive use of the technique in evaluation studies indicates that the calculated performance is essentially the same as that obtained by the proprietary programs in use throughout the aircraft industry.
NASA Astrophysics Data System (ADS)
Palchykov, Vitalii A.; Zarovnaya, Iryna S.; Tretiakov, Serhii V.; Reshetnyak, Alyona V.; Omelchenko, Iryna V.; Shishkin, Oleg V.; Okovytyy, Sergiy I.
2018-04-01
Aminolysis of 3,4-epoxysulfolane in aqueous media leads to a very complex mixture of products with unresolved stereochemistry. Herein, we report a detailed theoretical and experimental mechanistic investigation of this reaction along with extensive spectroscopic characterization of the resulting amino alcohols, using 1D and 2D NMR techniques (1H, 13C, NOE, NOESY, COSY, HSQC, HMBC) as well as XRD analysis. In addition to simple amines such as ammonia and benzylamine, our study also employed the more sterically hindered endo-bicyclo[2.2.1]hept-5-en-2-ylmethanamine. The mechanism of the aminolysis of 3,4-epoxysulfolane by aqueous ammonia was studied in more detail using quantum chemical calculations at the M06-2X/6-31++G** level of theory. The computational results led us to conclude that the most probable initial epoxide transformation is a base-catalyzed rearrangement to the corresponding allylic alcohol. Subsequent formation of vicinal amino alcohols and diols proceeds via addition of ammonia or hydroxide anions to the activated C=C double bond, with some preference for cis-attack. The detailed analytical data obtained in the course of our work will be useful for the stereochemical identification of new sulfolane derivatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-05-01
The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced "best estimate" predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.
Damage Tolerance of Sandwich Plates With Debonded Face Sheets
NASA Technical Reports Server (NTRS)
Sankar, Bhavani V.
2001-01-01
A nonlinear finite element analysis was performed to simulate axial compression of sandwich beams with debonded face sheets. The load versus end-shortening diagrams were generated for a variety of specimens used in a previous experimental study. The energy release rate at the crack tip was computed using the J-integral and plotted as a function of the load. A detailed stress analysis was performed and the critical stresses in the face sheet and the core were computed. The core was also modeled as an isotropic elastic-perfectly plastic material and a nonlinear post-buckling analysis was performed. A Graeco-Latin factorial plan was used to study the effects of debond length, face sheet and core thicknesses, and core density on the load-carrying capacity of the sandwich composite. It has been found that a linear buckling analysis is inadequate in determining the maximum load a debonded sandwich beam can carry. A nonlinear post-buckling analysis combined with an elastoplastic model of the core is required to predict the compression behavior of debonded sandwich beams.
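The paper computes the energy release rate with the J-integral; as a hedged illustration of the underlying quantity, the sketch below uses the alternative compliance method, G = (P^2/2b)*dC/da, with invented debond lengths and compliances.

```python
# Energy release rate from load-displacement data by the compliance
# method, G = (P^2 / 2b) * dC/da, an alternative check on J-integral
# values. Specimen width, debond lengths and compliances are invented.
import numpy as np

b = 25.0e-3                                      # specimen width (m), assumed
a = np.array([10., 15., 20., 25., 30.]) * 1e-3   # debond lengths (m)
C = np.array([1.0, 1.4, 2.1, 3.2, 4.8]) * 1e-6   # compliance delta/P (m/N)

dCda = np.gradient(C, a)                         # finite-difference dC/da
P = 800.0                                        # applied load (N), assumed
G = P**2 / (2.0 * b) * dCda                      # J/m^2 at each debond length
print(G)
```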
Simplified and refined structural modeling for economical flutter analysis and design
NASA Technical Reports Server (NTRS)
Ricketts, R. H.; Sobieszczanski, J.
1977-01-01
A coordinated use of two finite-element models of different levels of refinement is presented to reduce the computer cost of the repetitive flutter analysis commonly encountered in structural resizing to meet flutter requirements. One model, termed a refined model (RM), represents a high degree of detail needed for strength-sizing and flutter analysis of an airframe. The other model, called a simplified model (SM), has a relatively much smaller number of elements and degrees-of-freedom. A systematic method of deriving an SM from a given RM is described. The method consists of judgmental and numerical operations to make the stiffness and mass of the SM elements equivalent to the corresponding substructures of RM. The structural data are automatically transferred between the two models. The bulk of analysis is performed on the SM with periodical verifications carried out by analysis of the RM. In a numerical example of a supersonic cruise aircraft with an arrow wing, this approach permitted substantial savings in computer costs and acceleration of the job turn-around.
SRM Internal Flow Test and Computational Fluid Dynamic Analysis. Volume 1; Major Task Summaries
NASA Technical Reports Server (NTRS)
Whitesides, R. Harold; Dill, Richard A.; Purinton, David C.
1995-01-01
During the four-year period of performance for NASA contract NAS8-39095, ERC performed a wide variety of tasks to support the design and continued development of new and existing solid rocket motors and the resolution of operational problems associated with existing solid rocket motors at NASA MSFC. This report summarizes the support provided to NASA MSFC during the contractual period of performance. The report is divided into three main sections. The first section presents summaries for the major tasks performed. These tasks are grouped into three major categories: full-scale motor analysis, subscale motor analysis and cold flow analysis. The second section includes summaries describing the computational fluid dynamics (CFD) tasks performed. The third section, the appendices of the report, presents detailed descriptions of the analysis efforts as well as published papers, memoranda and final reports associated with specific tasks. These appendices are referenced in the summaries. The subsection numbers for the three sections correspond to the same topics for direct cross-referencing.
SciDAC-Data, A Project to Enabling Data Driven Modeling of Exascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mubarak, M.; Ding, P.; Aliaga, L.
The SciDAC-Data project is a DOE funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab Data Center on the organization, movement, and consumption of High Energy Physics data. The project will analyze the analysis patterns and data organization that have been used by the NOvA, MicroBooNE, MINERvA and other experiments, to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership class exascale computing facilities. We will address the use of the SciDAC-Data distributions, acquired from the Fermilab Data Center's analysis workflows and corresponding to around 71,000 HEP jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular we describe in detail how the Sequential Access via Metadata (SAM) data handling system, in combination with the dCache/Enstore based data archive facilities, has been analyzed to develop radically different models of the analysis of HEP data. We present how the simulation may be used to analyze the impact of design choices in archive facilities.
Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities
NASA Astrophysics Data System (ADS)
Esposito, Gaetano
Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty due to limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling-High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected by mostly only first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the reaction rates involving the vinyl radical, C2H3, can drastically reduce the uncertainty of predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate within its own uncertainty bounds. Therefore this study demonstrates that the uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
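A minimal sketch of the PCA-of-sensitivities idea described above, using a random stand-in for the sensitivity matrix; the retained-variance threshold and reaction cutoff are arbitrary choices, not the dissertation's values.

```python
# Sketch of skeletal reduction by Principal Component Analysis of a
# local sensitivity matrix S (rows: targets/conditions, columns:
# reactions). The matrix here is random; in practice S would come from
# the canonical ignition/propagation/extinction cases.
import numpy as np

rng = np.random.default_rng(0)
S = rng.normal(size=(40, 784))        # 40 targets x 784 reactions (dummy)

# PCA via SVD of the centered sensitivity matrix
Sc = S - S.mean(axis=0)
U, sigma, Vt = np.linalg.svd(Sc, full_matrices=False)

k = (np.cumsum(sigma**2) / np.sum(sigma**2) < 0.99).sum() + 1
importance = np.abs(Vt[:k]).max(axis=0)      # reaction weight over top PCs
keep = np.argsort(importance)[::-1][:150]    # retain the most influential
print(f"{k} principal components, {keep.size} reactions retained")
```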
NASA Astrophysics Data System (ADS)
La Barbera, Selina; Vincent, Adrien F.; Vuillaume, Dominique; Querlioz, Damien; Alibart, Fabien
2016-12-01
Bio-inspired computing represents today a major challenge at different levels, ranging from material science for the design of innovative devices and circuits to computer science for the understanding of the key features required for processing of natural data. In this paper, we propose a detailed analysis of resistive switching dynamics in electrochemical metallization cells for synaptic plasticity implementation. We show how filament stability associated with Joule heating during switching can be used to emulate key synaptic features such as the short-term to long-term plasticity transition and spike-timing-dependent plasticity. Furthermore, an interplay between these different synaptic features is demonstrated for object motion detection in a spike-based neuromorphic circuit. System-level simulations present robust learning and promising synaptic operation, paving the way to complex bio-inspired computing systems composed of innovative memory devices.
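For readers unfamiliar with the synaptic rule being emulated, here is a standard pair-based STDP update (illustrative amplitudes and time constants; not the device model from the paper).

```python
# Pair-based STDP rule of the kind such devices emulate: the weight
# change depends exponentially on the pre/post spike timing.
# Amplitudes and time constants are illustrative values.
import numpy as np

A_plus, A_minus = 0.05, 0.025     # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0  # time constants (ms)

def stdp_dw(t_pre, t_post):
    """Weight update for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:                               # pre before post -> LTP
        return A_plus * np.exp(-dt / tau_plus)
    return -A_minus * np.exp(dt / tau_minus)  # post before pre -> LTD

for dt in (-40, -10, 0, 10, 40):
    print(f"dt={dt:+d} ms -> dw={stdp_dw(0.0, dt):+.4f}")
```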
Web-Based Integrated Research Environment for Aerodynamic Analyses and Design
NASA Astrophysics Data System (ADS)
Ahn, Jae Wan; Kim, Jin-Ho; Kim, Chongam; Cho, Jung-Hyun; Hur, Cinyoung; Kim, Yoonhee; Kang, Sang-Hyun; Kim, Byungsoo; Moon, Jong Bae; Cho, Kum Won
e-AIRS [1,2], an abbreviation of 'e-Science Aerospace Integrated Research System', is a virtual organization designed to support aerodynamic flow analyses in aerospace engineering using the e-Science environment. As the first step toward a virtual aerospace engineering organization, e-AIRS intends to give full support to the aerodynamic research process. Currently, e-AIRS can handle both computational and experimental aerodynamic research on the e-Science infrastructure. In detail, users can conduct a full CFD (Computational Fluid Dynamics) research process, request a wind tunnel experiment, perform comparative analysis between computational prediction and experimental measurement, and finally, collaborate with other researchers using the web portal. The present paper describes those services and the internal architecture of the e-AIRS system.
Extreme-Scale De Novo Genome Assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georganas, Evangelos; Hofmeyr, Steven; Egan, Rob
De novo whole genome assembly reconstructs genomic sequence from short, overlapping, and potentially erroneous DNA segments and is one of the most important computations in modern genomics. This work presents HipMer, a high-quality end-to-end de novo assembler designed for extreme-scale analysis, via efficient parallelization of the Meraculous code. Genome assembly software has many components, each of which stresses different parts of a computer system. This chapter explains the computational challenges involved in each step of the HipMer pipeline, the key distributed data structures, and communication costs in detail. We present performance results of assembling the human genome and the large hexaploid wheat genome on large supercomputers up to tens of thousands of cores.
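As a toy illustration of the pipeline's first stage (serial, not HipMer's distributed implementation): k-mer counting over reads, with multiply-seen k-mers treated as likely error-free seeds.

```python
# K-mer counting, the first stage of Meraculous-style assembly that
# HipMer parallelizes with distributed hash tables; serial toy version.
from collections import Counter

def count_kmers(reads, k):
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

reads = ["GATTACAGATTACA", "TTACAGATT", "ACAGATTACA"]   # toy reads
kmers = count_kmers(reads, k=5)
# k-mers seen more than once are likely error-free and seed contigs
solid = {km: c for km, c in kmers.items() if c > 1}
print(sorted(solid.items(), key=lambda kv: -kv[1])[:5])
```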
Broadband computation of the scattering coefficients of infinite arbitrary cylinders.
Blanchard, Cédric; Guizal, Brahim; Felbacq, Didier
2012-07-01
We employ a time-domain method to compute the near field on a contour enclosing infinitely long cylinders of arbitrary cross section and constitution. We therefore recover the cylindrical Hankel coefficients of the expansion of the field outside the circumscribed circle of the structure. The recovered coefficients enable the wideband analysis of complex systems, e.g., the determination of the radar cross section becomes straightforward. The prescription for constructing such a numerical tool is provided in great detail. The method is validated by computing the scattering coefficients for a homogeneous circular cylinder illuminated by a plane wave, a problem for which an analytical solution exists. Finally, some radiation properties of an optical antenna are examined by employing the proposed technique.
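The validation case has a classical closed form; a sketch of the separation-of-variables coefficients for a homogeneous dielectric cylinder at normal incidence, TM polarization, follows. Sign and normalization conventions vary between texts, so treat this as an assumption-laden reference computation, not the paper's code.

```python
# Analytical scattering coefficients of a homogeneous dielectric circular
# cylinder (normal incidence, TM polarization, E parallel to the axis):
# the standard separation-of-variables result, with x = k*a the size
# parameter and m the relative refractive index (nonmagnetic assumed).
import numpy as np
from scipy.special import jv, jvp, hankel1, h1vp

def tm_coefficients(m, x, n_max):
    n = np.arange(-n_max, n_max + 1)
    num = m * jvp(n, m * x) * jv(n, x) - jv(n, m * x) * jvp(n, x)
    den = jv(n, m * x) * h1vp(n, x) - m * jvp(n, m * x) * hankel1(n, x)
    return n, num / den

n, a_n = tm_coefficients(m=1.5, x=2.0, n_max=10)
print(np.abs(a_n))   # coefficients decay rapidly beyond |n| ~ x
```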
Use of computational fluid dynamics in respiratory medicine.
Fernández Tena, Ana; Casan Clarà, Pere
2015-06-01
Computational Fluid Dynamics (CFD) is a computer-based tool for simulating fluid movement. The main advantages of CFD over other fluid mechanics studies include: substantial savings in time and cost, the analysis of systems or conditions that are very difficult to simulate experimentally (as is the case of the airways), and a practically unlimited level of detail. We used the Ansys-Fluent CFD program to develop a conducting airway model to simulate different inspiratory flow rates and the deposition of inhaled particles of varying diameters, obtaining results consistent with those reported in the literature using other procedures. We hope this approach will enable clinicians to further individualize the treatment of different respiratory diseases.
Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng
2014-01-01
Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach is capable of dramatically improving the spatial resolution, enabling finer details to be revealed within a region of interest of a sample larger than the field of view than is possible with conventional techniques. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials. PMID:24763649
Interaction entropy for protein-protein binding
NASA Astrophysics Data System (ADS)
Sun, Zhaoxi; Yan, Yu N.; Yang, Maoyou; Zhang, John Z. H.
2017-03-01
Protein-protein interactions are at the heart of signal transduction and are central to the function of protein machines in biology. The highly specific protein-protein binding is quantitatively characterized by the binding free energy, whose accurate calculation from first principles is a grand challenge in computational biology. In this paper, we show how the interaction entropy approach, which was recently proposed for protein-ligand binding free energy calculation, can be applied to computing the entropic contribution to the protein-protein binding free energy. An explicit theoretical derivation of the interaction entropy approach for protein-protein interaction systems is given in detail from the basic definition. Extensive computational studies for a dozen realistic protein-protein interaction systems are carried out using the present approach, and comparisons of the results for these protein-protein systems with those from the standard normal mode method are presented. Analysis of the present method for application in protein-protein binding as well as the limitation of the method in numerical computation is discussed. Our study and analysis of the results provide useful information for extracting the correct entropic contribution in protein-protein binding from molecular dynamics simulations.
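A minimal numerical sketch of the interaction entropy estimator described above, applied to a synthetic interaction-energy time series; the temperature, units, and energy series are assumed for illustration.

```python
# Interaction entropy estimator: the entropic contribution follows from
# fluctuations of the interaction energy along an MD trajectory,
# -T*dS = kT * ln< exp(beta * (E_int - <E_int>)) >.
# The energy series below is synthetic; units are kcal/mol.
import numpy as np

kT = 0.5922                       # kcal/mol at ~298 K
beta = 1.0 / kT

rng = np.random.default_rng(1)
E_int = -85.0 + 3.0 * rng.normal(size=20000)   # mock interaction energies

dE = E_int - E_int.mean()
minus_TdS = kT * np.log(np.mean(np.exp(beta * dE)))
print(f"-T*dS = {minus_TdS:.2f} kcal/mol")
```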
Exploiting analytics techniques in CMS computing monitoring
NASA Astrophysics Data System (ADS)
Bonacorsi, D.; Kuznetsov, V.; Magini, N.; Repečka, A.; Vaandering, E.
2017-10-01
The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining efforts on this information have rarely been undertaken, but are of crucial importance for a better understanding of how CMS achieved successful operations, and to reach an adequate and adaptive modelling of the CMS operations, in order to allow detailed optimizations and eventually a prediction of system behaviours. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote on disks at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop, and discuss the new possibilities in CMS computing monitoring introduced with the ability to quickly process big data sets from multiple sources, looking forward to a predictive modeling of the system.
A Review of Methods for Analysis of the Expected Value of Information.
Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca
2017-10-01
In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best method for calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
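A hedged sketch of the regression-based EVPPI calculation the review recommends, on a toy two-option decision model; a k-nearest-neighbour smoother stands in for the spline/GAM regression used in practice.

```python
# EVPPI by regression: regress the net benefit of each decision option
# on the parameter of interest, then
# EVPPI = mean(max_d fitted) - max_d(mean NB). Toy two-option model.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(2)
theta = rng.normal(0.0, 1.0, 5000)             # parameter of interest
noise = rng.normal(0.0, 2.0, (5000, 2))        # all remaining uncertainty
nb = np.column_stack([1.0 + 2.0 * theta,       # option 1 net benefit
                      2.0 + 0.5 * theta]) + noise

fitted = np.column_stack([
    KNeighborsRegressor(n_neighbors=100)
    .fit(theta[:, None], nb[:, d]).predict(theta[:, None])
    for d in range(2)])

evppi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"EVPPI ~ {evppi:.3f}")
```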
Thali, Michael J; Taubenreuther, Ulrike; Karolczak, Marek; Braun, Marcel; Brueschweiler, Walter; Kalender, Willi A; Dirnhofer, Richard
2003-11-01
When a knife is stabbed into bone, it leaves an impression in the bone. The characteristics (shape, size, etc.) may indicate the type of tool used to produce the patterned injury in bone. Until now it has been impossible in the forensic sciences to document such damage precisely and non-destructively. Micro-computed tomography (Micro-CT) offers an opportunity to analyze patterned injuries of tool marks made in bone. Using high-resolution Micro-CT and computer software, detailed analysis of three-dimensional (3D) architecture has recently become feasible and allows microstructural 3D bone information to be collected. With adequate viewing software, data for a 2D slice in an arbitrary plane can be extracted from the 3D dataset. Using such software as a "digital virtual knife," the examiner can interactively section and analyze the 3D sample. Analysis of the bone injury revealed that Micro-CT provides an opportunity to correlate a bone injury to an injury-causing instrument. Even broken knife tips can be graphically and non-destructively assigned to a suspect weapon.
Steady shape analysis of tomographic pumping tests for characterization of aquifer heterogeneities
Bohling, Geoffrey C.; Zhan, Xiaoyong; Butler, James J.; Zheng, Li
2002-01-01
Hydraulic tomography, a procedure involving the performance of a suite of pumping tests in a tomographic format, provides information about variations in hydraulic conductivity at a level of detail not obtainable with traditional well tests. However, analysis of transient data from such a suite of pumping tests represents a substantial computational burden. Although steady state responses can be analyzed to reduce this computational burden significantly, the time required to reach steady state will often be too long for practical applications of the tomography concept. In addition, uncertainty regarding the mechanisms driving the system to steady state can propagate to adversely impact the resulting hydraulic conductivity estimates. These disadvantages of a steady state analysis can be overcome by exploiting the simplifications possible under the steady shape flow regime. At steady shape conditions, drawdown varies with time but the hydraulic gradient does not. Thus transient data can be analyzed with the computational efficiency of a steady state model. In this study, we demonstrate the value of the steady shape concept for inversion of hydraulic tomography data and investigate its robustness with respect to improperly specified boundary conditions.
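The steady-shape idea is easy to see numerically with the Theis solution: drawdowns keep growing while their difference between two radii (hence the hydraulic gradient) levels off. Parameter values below are arbitrary illustrations.

```python
# Steady-shape illustration with the Theis solution: drawdown keeps
# increasing, but the difference between two observation radii becomes
# constant at late time. All parameter values are assumed.
import numpy as np
from scipy.special import exp1

Q, T, S = 1.0e-3, 1.0e-4, 1.0e-4      # m^3/s, m^2/s, storativity
r1, r2 = 5.0, 20.0                    # observation radii (m)

def theis(r, t):
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)   # drawdown (m)

for t in (1e2, 1e3, 1e4, 1e5):        # seconds
    s1, s2 = theis(r1, t), theis(r2, t)
    print(f"t={t:8.0f}s  s1={s1:6.3f}  s2={s2:6.3f}  s1-s2={s1 - s2:6.3f}")
```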
NASA Technical Reports Server (NTRS)
Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug
2005-01-01
Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but they do not have access to modify the analysis criteria: such access is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or to rehire experts, further reducing the cost of maintaining and upgrading the code. Information in the PAT database and the results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.
Atomistic Modeling of Pd Site Preference in NiTi
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Mosca, Hugo O.
2004-01-01
An analysis of the site substitution behavior of Pd in NiTi was performed using the BFS method for alloys. Through a combination of Monte Carlo simulations and detailed atom-by-atom energetic analyses of various computational cells, representing compositions of NiTi with up to 10 at.% Pd, a detailed understanding of the site occupancy of Pd in NiTi was revealed. Pd substituted at the expense of Ni in a NiTi alloy will prefer the Ni-sites. Pd substituted at the expense of Ti shows a very weak preference for Ti-sites that diminishes as the amount of Pd in the alloy increases and as the temperature increases.
Duval, J.S.
1987-01-01
A detailed aerial gamma-ray spectrometric survey of the Jabal Ashirah area in the southeastern Arabian Shield has been analyzed using computer-classification algorithms. The analysis resulted in maps that show radiometric map units and gamma-ray anomalies indicating the presence of possible concentrations of potassium and uranium. The radiometric-unit map was interpreted to produce a simplified radiolithic map that was correlated with the mapped geology. The gamma-ray data show uranium anomalies that coincide with a tin-bearing granite, but the known gold and nickel mineralization does not have any associated gamma-ray signatures.
NASA Astrophysics Data System (ADS)
Titov, A. G.; Gordov, E. P.; Okladnikov, I.; Shulgina, T. M.
2011-12-01
Analysis of recent climatic and environmental changes in Siberia performed on the basis of the CLEARS (CLimate and Environment Analysis and Research System) information-computational system is presented. The system was developed using the specialized software framework for rapid development of thematic information-computational systems based on Web-GIS technologies. It comprises structured environmental datasets, computational kernel, specialized web portal implementing web mapping application logic, and graphical user interface. Functional capabilities of the system include a number of procedures for mathematical and statistical analysis, data processing and visualization. At present a number of georeferenced datasets is available for processing including two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 and ERA Interim Reanalysis, meteorological observation data for the territory of the former USSR, and others. Firstly, using functionality of the computational kernel employing approved statistical methods it was shown that the most reliable spatio-temporal characteristics of surface temperature and precipitation in Siberia in the second half of 20th and beginning of 21st centuries are provided by ERA-40/ERA Interim Reanalysis and APHRODITE JMA Reanalysis, respectively. Namely those Reanalyses are statistically consistent with reliable in situ meteorological observations. Analysis of surface temperature and precipitation dynamics for the territory of Siberia performed on the base of the developed information-computational system reveals fine spatial and temporal details in heterogeneous patterns obtained for the region earlier. Dynamics of bioclimatic indices determining climate change impact on structure and functioning of regional vegetation cover was investigated as well. Analysis shows significant positive trends of growing season length accompanied by statistically significant increase of sum of growing degree days and total annual precipitation over the south of Western Siberia. In particular, we conclude that analysis of trends of growing season length, sum of growing degree-days and total precipitation during the growing season reveals a tendency to an increase of vegetation ecosystems productivity across the south of Western Siberia (55°-60°N, 59°-84°E) in the past several decades. The developed system functionality providing instruments for comparison of modeling and observational data and for reliable climatological analysis allowed us to obtain new results characterizing regional manifestations of global change. It should be added that each analysis performed using the system leads also to generation of the archive of spatio-temporal data fields ready for subsequent usage by other specialists. In particular, the archive of bioclimatic indices obtained will allow performing further detailed studies of interrelations between local climate and vegetation cover changes, including changes of carbon uptake related to variations of types and amount of vegetation and spatial shift of vegetation zones. This work is partially supported by RFBR grants #10-07-00547 and #11-05-01190-a, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7.
Computational mass spectrometry for small molecules
2013-01-01
The identification of small molecules from mass spectrometry (MS) data remains a major challenge in the interpretation of such data. This review covers the computational aspects of identifying small molecules, from the identification of a compound by searching a reference spectral library, to the structural elucidation of unknowns. In detail, we describe the basic principles and pitfalls of searching mass spectral reference libraries. Determining the molecular formula of the compound can serve as a basis for subsequent structural elucidation; consequently, we cover different methods for molecular formula identification, focussing on isotope pattern analysis. We then discuss automated methods to deal with mass spectra of compounds that are not present in spectral libraries, and provide an insight into de novo analysis of fragmentation spectra using fragmentation trees. In addition, this review briefly covers the reconstruction of metabolic networks using MS data. Finally, we list available software for different steps of the analysis pipeline. PMID:23453222
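As a small example of the isotope pattern analysis mentioned above, the carbon-only pattern of a molecule follows from repeated convolution of the single-atom isotope distribution. Real tools convolve all elements and handle mass defects; this is a deliberately reduced sketch.

```python
# Isotope pattern sketch: the carbon isotope distribution of a molecule
# with n carbons follows from repeated convolution of the one-atom
# 12C/13C distribution (carbon only; a reduced model).
import numpy as np

p13C = 0.0107                      # natural 13C abundance
def carbon_pattern(n_carbons, n_peaks=5):
    pattern = np.array([1.0])
    one_c = np.array([1.0 - p13C, p13C])   # 12C / 13C
    for _ in range(n_carbons):
        pattern = np.convolve(pattern, one_c)
    return pattern[:n_peaks] / pattern.max()

# e.g. a C27 skeleton: relative intensities of M, M+1, M+2, ...
print(np.round(carbon_pattern(27), 4))
```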
NASA Technical Reports Server (NTRS)
Giles, G. L.; Wallas, M.
1981-01-01
User documentation is presented for a computer program which considers the nonlinear properties of the strain isolator pad (SIP) in the static stress analysis of the shuttle thermal protection system. This program is generalized to handle an arbitrary SIP footprint including cutouts for instrumentation and filler bar. Multiple SIP surfaces are defined to model tiles in unique locations such as leading edges, intersections, and penetrations. The nonlinearity of the SIP is characterized by experimental stress displacement data for both normal and shear behavior. Stresses in the SIP are calculated using a Newton iteration procedure to determine the six rigid body displacements of the tile which develop reaction forces in the SIP to equilibrate the externally applied loads. This user documentation gives an overview of the analysis capabilities, a detailed description of required input data and an example to illustrate use of the program.
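The documented program solves a six-DOF equilibrium by Newton iteration; the sketch below captures the idea on an invented three-DOF model with a placeholder nonlinear SIP reaction law, using SciPy's hybrid solver in place of the program's own Newton loop.

```python
# Spirit of the tile/SIP equilibrium solve: find rigid-body displacements
# u such that the nonlinear SIP reactions balance the applied loads,
# R(u) - F_ext = 0. The 3-DOF model and reaction law are invented
# stand-ins for the experimental stress-displacement data.
import numpy as np
from scipy.optimize import fsolve

F_ext = np.array([120.0, 40.0, 15.0])     # applied normal/shear loads (N)

def sip_reaction(u):
    # nonlinear stiffening spring per DOF, a placeholder only
    k = np.array([800.0, 300.0, 300.0])
    return k * u * (1.0 + 40.0 * np.abs(u))

u = fsolve(lambda u: sip_reaction(u) - F_ext, x0=np.zeros(3))
print("equilibrium displacements:", u)
print("residual:", sip_reaction(u) - F_ext)
```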
Comparative Investigation of Normal Modes and Molecular Dynamics of Hepatitis C NS5B Protein
NASA Astrophysics Data System (ADS)
Asafi, M. S.; Yildirim, A.; Tekpinar, M.
2016-04-01
Understanding the dynamics of proteins has many practical implications for finding cures for protein-related diseases. Normal mode analysis and molecular dynamics are widely used physics-based computational methods for investigating protein dynamics. In this work, we studied the dynamics of the Hepatitis C NS5B protein with molecular dynamics and normal mode analysis. Principal components obtained from a 100-nanosecond molecular dynamics simulation show good overlaps with normal modes calculated with a coarse-grained elastic network model. Coarse-grained normal mode analysis takes at least an order of magnitude less time. Encouraged by these good overlaps and short computation times, we further analyzed the low-frequency normal modes of Hepatitis C NS5B. Motion directions and average spatial fluctuations have been analyzed in detail. Finally, biological implications of these motions for drug design efforts against Hepatitis C infections have been elaborated.
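Overlap between MD principal components and normal modes is typically quantified as the absolute inner product of unit eigenvectors; a sketch with random stand-in vectors follows (real eigenvectors would come from the covariance matrix of the trajectory and from the elastic network Hessian).

```python
# Overlap between MD principal components and elastic-network normal
# modes: for unit eigenvectors the overlap is just |p_i . m_j|. Random
# orthonormal vectors stand in for the real 3N-dimensional eigenvectors.
import numpy as np

rng = np.random.default_rng(3)
dim, n_modes = 300, 10                       # 3N coordinates, modes kept

def orthonormal(n):
    q, _ = np.linalg.qr(rng.normal(size=(dim, n)))
    return q

pcs, modes = orthonormal(n_modes), orthonormal(n_modes)
overlap = np.abs(pcs.T @ modes)              # n_modes x n_modes matrix
print("best matches per PC:", overlap.max(axis=1).round(2))
```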
Computational complexity of the landscape II-Cosmological considerations
NASA Astrophysics Data System (ADS)
Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire
2018-05-01
We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.
3D analysis of macrosegregation in twin-roll cast AA3003 alloy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Šlapáková, Michaela, E-mail: slapakova@karlov.mff.
Twin-roll cast aluminium alloys have a high potential for industrial applications. However, one of the drawbacks of such materials is an inhomogeneous structure generated by macrosegregation, which appears under certain conditions in the center of sheets during solidification. Segregations in AA3003 alloy form as manganese-, iron- and silicon-rich channels spread in the rolling direction. Their spatial distribution was successfully detected by X-ray computed tomography. Scanning electron microscopy was used for a detailed observation of microstructure, morphology and chemical analysis of the segregation. - Highlights: •Macrosegregations in twin-roll cast sheets stretch along the rolling direction. •X-ray computed tomography is an effective tool for visualization of the segregation. •The segregations copy the shape of grain boundaries.
Versluys, Thomas M M; Skylark, William J
2017-10-01
Leg-to-body ratio (LBR) predicts evolutionary fitness and is therefore expected to influence bodily attractiveness. Previous investigations of LBR attractiveness have used a wide variety of stimuli, including line drawings, silhouettes, and computer-generated images based on anthropometric data. In two studies, community samples of heterosexual women from the USA rated the attractiveness of male figures presented as silhouettes and as detailed computer-generated images with three different skin tones (white, black, and an artificial grey). The effects of LBR depended on the image format. In particular, a curve-fitting analysis indicated that the optimally-attractive LBR for silhouettes was fractionally below the baseline, whereas the optimum for the more detailed computer-generated images was approximately 0.5 s.d. above the baseline and was similar for all three skin tones. In addition, the participants' sensitivity to changes in the LBR was lowest for the silhouettes and highest for the grey figures. Our results add to evidence that the most attractive LBR is close to, but slightly above, the population mean, and caution that the effects of limb proportions on attractiveness depend on the ecological validity of the figures.
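A sketch of the curve-fitting analysis described above: fit a quadratic to mean ratings across LBR levels and read off the vertex. The ratings below are invented, not the study's data.

```python
# Quadratic fit for the optimally-attractive LBR: fit mean ratings
# against LBR (in s.d. units from baseline) and take the vertex.
import numpy as np

lbr = np.array([-2.0, -1.0, 0.0, 0.5, 1.0, 2.0])   # s.d. from baseline
rating = np.array([3.1, 3.9, 4.5, 4.7, 4.6, 3.8])  # mean attractiveness

a, b, c = np.polyfit(lbr, rating, 2)
optimum = -b / (2.0 * a)            # vertex of the fitted parabola
print(f"optimal LBR ~ {optimum:+.2f} s.d. from baseline")
```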
Conceptual design of a Bitter-magnet toroidal-field system for the ZEPHYR Ignition Test Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, J.E.C.; Becker, H.D.; Bobrov, E.S.
1981-05-01
The following problems are described and discussed: (1) parametric studies - these studies examine, among other things, the interdependence of throat stresses, plasma parameters (margins of ignition) and stored energy; the latter is a measure of cost and is minimized in the present design; (2) magnet configuration - the shape of the plates is considered in detail, including standard turns and turns located at beam ports, diagnostic and closure flanges; (3) ripple computation - this section describes the codes by which ripple is computed; (4) field diffusion and nuclear heating - the effect of magnetic field diffusion on heating is considered along with neutron heating; current, field and temperature profiles are computed; (5) finite element analysis - the two- and three-dimensional finite element codes are described and the results discussed in detail; (6) structures engineering - this considers the calculation of critical stresses due to toroidal and overturning forces and discusses the method of constraint of these forces; the Materials Testing Program is also discussed; (7) fabrication - the methods available for the manufacture of the constituent parts of the Bitter plates, the method of assembly and remote maintenance are summarized.
NASA Astrophysics Data System (ADS)
Vintila, Iuliana; Gavrus, Adinel
2017-10-01
The present research paper proposes the validation of a rigorous computation model used as a numerical tool to identify the rheological behavior of complex W/O emulsions. Starting from a three-dimensional description of a general viscoplastic flow, the thermo-mechanical equations used to identify the rheological laws of fluids or soft materials from global experimental measurements are detailed. Analyses are conducted for complex W/O emulsions, which generally exhibit Bingham behavior, using a shear stress-strain rate dependency based on a power law and an improved analytical model. Experimental results are investigated for the rheological behavior of crude and refined rapeseed/soybean oils and four types of corresponding W/O emulsions with different physical-chemical compositions. The rheological behavior model was correlated with the thermo-mechanical analysis of a plane-plane rheometer, oil content, chemical composition, particle size and emulsifier concentration. The parameters of the rheological laws describing the behavior of the industrial oils and the concentrated W/O emulsions were computed from estimated shear stresses using a non-linear regression technique, and from experimental torques using the inverse analysis tool designed by A. Gavrus (1992-2000).
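As an illustration of the non-linear regression step mentioned above, the sketch below fits a Herschel-Bulkley law (which reduces to a Bingham law for n = 1) to an invented flow curve; it is not the authors' identification tool.

```python
# Nonlinear regression of a Herschel-Bulkley/Bingham-type law,
# tau = tau0 + K * gammadot**n, to flow-curve data. Measured points
# are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def herschel_bulkley(gd, tau0, K, n):
    return tau0 + K * gd**n

gd = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0])    # shear rate 1/s
tau = np.array([5.9, 7.1, 8.0, 12.4, 15.4, 29.8, 40.0])   # shear stress Pa

popt, _ = curve_fit(herschel_bulkley, gd, tau, p0=(5.0, 3.0, 0.7))
tau0, K, n = popt
print(f"tau0={tau0:.2f} Pa, K={K:.2f} Pa.s^n, n={n:.2f}")
```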
Surface Curvatures Computation from Equidistance Contours
NASA Astrophysics Data System (ADS)
Tanaka, Hiromi T.; Kling, Olivier; Lee, Daniel T. L.
1990-03-01
The subject of our research is the 3D shape representation problem for a special class of range image, one where the natural mode of the acquired range data is in the form of equidistance contours, as exemplified by a moire interferometry range system. In this paper we present a novel surface curvature computation scheme that directly computes the surface curvatures (the principal curvatures, Gaussian curvature and mean curvature) from the equidistance contours without any explicit computation or implicit estimation of partial derivatives. We show how the special nature of the equidistance contours, specifically the dense information about the surface curves in the 2D contour plane, turns into an advantage for the computation of the surface curvatures. The approach is based on using simple geometric constructions to obtain the normal sections and the normal curvatures. This method is general and can be extended to any dense range image data. We show in detail how this computation is formulated and give an analysis of the error bounds of the computation steps, showing that the method is stable. Computation results on real equidistance range contours are also shown.
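One low-level ingredient of such a scheme is the planar curvature of a sampled contour; a finite-difference sketch on a synthetic circle follows. The paper's geometric construction for normal sections is not reproduced here.

```python
# Finite-difference curvature of a sampled contour in its 2D plane,
# kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2): one ingredient of
# curvature computation from contour data, demonstrated on a circle.
import numpy as np

s = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
x, y = 2.0 * np.cos(s), 2.0 * np.sin(s)       # circle of radius 2

dx, dy = np.gradient(x, s), np.gradient(y, s)
ddx, ddy = np.gradient(dx, s), np.gradient(dy, s)
kappa = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5
print(kappa[:3])                               # ~0.5 = 1/radius everywhere
```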
A CAD/CAE analysis of photographic and engineering data
NASA Technical Reports Server (NTRS)
Goza, S. Michael; Peterson, Wayne L.
1987-01-01
In the investigation of the STS 51L accident, NASA engineers were given the task of visual analysis of photographic data extracted from the tracking cameras located at the launch pad. An analysis of the rotations associated with the right Solid Rocket Booster (SRB) was also performed. The visual analysis involved pinpointing coordinates of specific areas on the photographs. The objective of the analysis on the right SRB was to duplicate the rotations provided by the SRB rate gyros and to determine the effects of the rotations on the launch configuration. To accomplish the objectives, computer aided design and engineering was employed. The solid modeler, GEOMOD, inside the Structural Dynamics Research Corp. I-DEAS package, proved invaluable. The problem areas that were encountered and the corresponding solutions that were obtained are discussed. A brief description detailing the construction of the computer generated solid model of the STS launch configuration is given. A discussion of the coordinate systems used in the analysis is provided for the purpose of positioning the model in coordinate space. The techniques and theory used in the model analysis are described.
NASA Astrophysics Data System (ADS)
Klimczak, Marcin; Bojarski, Jacek; Ziembicki, Piotr; Kęskiewicz, Piotr
2017-11-01
The requirements concerning the energy performance of buildings and their internal installations, particularly HVAC systems, have been growing continuously in Poland and all over the world. The existing, traditional calculation methods, which follow from the static heat exchange model, are frequently not sufficient for a reasonable heating design of a building. Both in Poland and elsewhere in the world, methods and software are employed which allow a detailed simulation of the heating and moisture conditions in a building, as well as an analysis of the performance of HVAC systems within a building. However, these systems are usually complex and difficult to use. In addition, the development of a simulation model that is sufficiently adequate to the real building requires considerable involvement of a designer and is time-consuming and laborious. A simplification of the simulation model of a building makes it possible to reduce the cost of computer simulations. The paper analyses in detail the effect of introducing a number of different variants of the simulation model developed in DesignBuilder on the quality of the final results obtained. The objective of this analysis is to find simplifications which yield simulation results with an acceptable level of deviation from the detailed model, thus facilitating a quick energy performance analysis of a given building.
Development of a model and computer code to describe solar grade silicon production processes
NASA Technical Reports Server (NTRS)
Gould, R. K.; Srivastava, R.
1979-01-01
Two computer codes were developed for describing flow reactors in which high-purity, solar-grade silicon is produced via reduction of gaseous silicon halides. The first is the CHEMPART code, an axisymmetric, marching code which treats two-phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. It can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial profiles of gas-phase composition, temperature, velocity, and particle size distribution are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail.
Model-based registration for assessment of spinal deformities in idiopathic scoliosis
NASA Astrophysics Data System (ADS)
Forsberg, Daniel; Lundström, Claes; Andersson, Mats; Knutsson, Hans
2014-01-01
Detailed analysis of spinal deformity is important within orthopaedic healthcare, in particular for assessment of idiopathic scoliosis. This paper addresses this challenge by proposing an image analysis method, capable of providing a full three-dimensional spine characterization. The proposed method is based on the registration of a highly detailed spine model to image data from computed tomography. The registration process provides an accurate segmentation of each individual vertebra and the ability to derive various measures describing the spinal deformity. The derived measures are estimated from landmarks attached to the spine model and transferred to the patient data according to the registration result. Evaluation of the method provides an average point-to-surface error of 0.9 mm ± 0.9 (comparing segmentations), and an average target registration error of 2.3 mm ± 1.7 (comparing landmarks). Comparing automatic and manual measurements of axial vertebral rotation provides a mean absolute difference of 2.5° ± 1.8, which is on a par with other computerized methods for assessing axial vertebral rotation. A significant advantage of our method, compared to other computerized methods for rotational measurements, is that it does not rely on vertebral symmetry for computing the rotational measures. The proposed method is fully automatic and computationally efficient, only requiring three to four minutes to process an entire image volume covering vertebrae L5 to T1. Given the use of landmarks, the method can be readily adapted to estimate other measures describing a spinal deformity by changing the set of employed landmarks. In addition, the method has the potential to be utilized for accurate segmentations of the vertebrae in routine computed tomography examinations, given the relatively low point-to-surface error.
Research on three-dimensional visualization based on virtual reality and Internet
NASA Astrophysics Data System (ADS)
Wang, Zongmin; Yang, Haibo; Zhao, Hongling; Li, Jiren; Zhu, Qiang; Zhang, Xiaohong; Sun, Kai
2007-06-01
To disclose and display water information, a three-dimensional visualization system based on Virtual Reality (VR) and the Internet is investigated, both for demonstrating "digital water conservancy" applications and for routine reservoir management. To explore and mine in-depth information, after completion of a high-resolution DEM of reliable quality, topographical analysis, visibility analysis and reservoir volume computation are studied. In addition, parameters including slope, water level and NDVI are selected to classify landslide-prone zones in the water-level-fluctuation zone of the reservoir area. To establish the virtual reservoir scene, two kinds of methods are used for experiencing immersion, interaction and imagination (3I). The first virtual scene contains more detailed textures to increase realism and runs on a graphical workstation with the virtual reality engine Open Scene Graph (OSG). The second virtual scene is intended for Internet users and uses fewer details to ensure fluent speed.
Buesing, Lars; Bill, Johannes; Nessler, Bernhard; Maass, Wolfgang
2011-01-01
The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural systems in the brain. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and cognitive science. But it has turned out to be surprisingly difficult to create a link between these abstract models for stochastic computations and more detailed models of the dynamics of networks of spiking neurons. Here we create such a link and show that under some conditions the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling. Since common methods for MCMC sampling in distributed systems, such as Gibbs sampling, are inconsistent with the dynamics of spiking neurons, we introduce a different approach based on non-reversible Markov chains that is able to reflect inherent temporal processes of spiking neuronal activity through a suitable choice of random variables. We propose a neural network model and show by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time. This provides a step towards closing the gap between abstract functional models of cortical computation and more detailed models of networks of spiking neurons. PMID:22096452
Validations of CFD against detailed velocity and pressure measurements in water turbine runner flow
NASA Astrophysics Data System (ADS)
Nilsson, H.; Davidson, L.
2003-03-01
This work compares CFD results with experimental results for the flow in two different kinds of water turbine runners. The runners studied are the GAMM Francis runner and the Hölleforsen Kaplan runner. The GAMM Francis runner was used as a test case in the 1989 GAMM Workshop on 3D Computation of Incompressible Internal Flows, where the geometry and detailed best-efficiency measurements were made available. In addition to the best-efficiency measurements, four off-design operating condition measurements are used for the comparisons in this work. The Hölleforsen Kaplan runner was used at the 1999 Turbine 99 and 2001 Turbine 99 - II workshops on draft tube flow, where detailed measurements made after the runner were used as inlet boundary conditions for the draft tube computations. The measurements are used here to validate computations of the flow in the runner. The computations are made in a single runner blade passage, where the inlet boundary conditions are obtained from an extrapolation of detailed measurements (GAMM) or from separate guide vane computations (Hölleforsen). The steady flow in a rotating co-ordinate system is computed. The effects of turbulence are modelled by a low-Reynolds-number k-ω turbulence model, which removes some of the assumptions of the commonly used wall function approach and brings the computations one step further.
NASA Astrophysics Data System (ADS)
Garcia, Jose Luis
2000-10-01
In injection molding processes, computer aided engineering (CAE) allows processors to evaluate different process parameters in order to achieve complete filling of a cavity and, in some cases, it predicts shrinkage and warpage. However, because commercial computational packages are used to design complex geometries, detail in the thickness direction is limited. Approximations in the thickness direction lead to the solution of a 2½-D problem instead of a 3-D problem. These simplifications drastically reduce computational times and memory requirements. However, these approximations hinder the ability to predict thermal and/or mechanical degradation. The goal of this study was to determine the degree of degradation during PVC injection molding and to compare the results with a computational model. Instead of analyzing degradation in complex geometries, the computational analysis and injection molding trials were performed on typical sections found in complex geometries, such as flow in a tube, flow in a rectangular channel, and radial flow. This simplification reduces the flow problem to a 1-D problem and allows one to develop a computational model with a higher level of detail in the thickness direction, essential for the determination of degradation. Two different geometries were examined in this study: a spiral mold, in order to approximate the rectangular channel, and a center gated plate for the radial flow. Injection speed, melt temperature, and shot size were varied. Parts varying in degree of degradation, from none to severe, were produced to determine possible transition points. Furthermore, two different PVC materials were used, low and high viscosity, M3800 and M4200, respectively (The Geon Company, Avon Lake, OH), to correlate the degree of degradation with the viscous heating observed during injection. It was found that a good agreement between experimental and computational results was obtained only if the reaction was assumed to be more thermally sensitive than reported in the literature. The results from this study show that, during injection, the activation energy for degradation was 65 kcal/mol, compared to 17-30 kcal/mol found in the literature for quiescent systems.
Two-dimensional analysis of coupled heat and moisture transport in masonry structures
NASA Astrophysics Data System (ADS)
Krejčí, Tomáš
2016-06-01
Reconstruction and maintenance of historical buildings and bridges require good knowledge of the temperature and moisture distribution. Sharp changes in temperature and moisture can lead to damage. This paper describes an analysis of coupled heat and moisture transfer in masonry based on a two-level approach. The macro-scale level describes the whole structure, while the meso-scale level takes into account the detailed composition of the masonry. The two-level approach is computationally very demanding, so it was implemented in parallel. The approach was used in the analysis of the temperature and moisture distribution in Charles Bridge in Prague, Czech Republic.
Informatics for RNA Sequencing: A Web Resource for Analysis on the Cloud
Griffith, Malachi; Walker, Jason R.; Spies, Nicholas C.; Ainscough, Benjamin J.; Griffith, Obi L.
2015-01-01
Massively parallel RNA sequencing (RNA-seq) has rapidly become the assay of choice for interrogating RNA transcript abundance and diversity. This article provides a detailed introduction to fundamental RNA-seq molecular biology and informatics concepts. We make available open-access RNA-seq tutorials that cover cloud computing, tool installation, relevant file formats, reference genomes, transcriptome annotations, quality-control strategies, expression, differential expression, and alternative splicing analysis methods. These tutorials and additional training resources are accompanied by complete analysis pipelines and test datasets made available without encumbrance at www.rnaseq.wiki. PMID:26248053
Computational Fluid Dynamics (CFD) applications in rocket propulsion analysis and design
NASA Technical Reports Server (NTRS)
Mcconnaughey, P. K.; Garcia, R.; Griffin, L. W.; Ruf, J. H.
1993-01-01
Computational Fluid Dynamics (CFD) has been used in recent applications to affect subcomponent designs in liquid propulsion rocket engines. This paper elucidates three such applications for turbine stage, pump stage, and combustion chamber geometries. Details of these applications include the development of a high-turning airfoil for a gas generator (GG) powered, liquid oxygen (LOX) turbopump, single-stage turbine, using CFD as an integral part of the design process. CFD application to pump stage design has emphasized analysis of inducers, impellers, and diffuser/volute sections. Improvements in pump stage impeller discharge flow uniformity have been obtained through CFD optimization on coarse-grid models. In the area of combustor design, recent CFD analysis of a film-cooled ablating combustion chamber has been used to quantify the interaction between film cooling rate, chamber wall contraction angle, and geometry, and the effects of these quantities on local wall temperature. The results are currently guiding combustion chamber design and coolant flow rate for an upcoming subcomponent test. Critical aspects of successful integration of CFD into the design cycle include close coupling of the CFD and design organizations, quick turnaround of parametric analyses once a baseline CFD benchmark has been established, and the use of CFD methodologies and approaches that address pertinent design issues. In this latter area, some problem details can be simplified while retaining key physical aspects to maintain analytical integrity.
Canstein, C; Cachot, P; Faust, A; Stalder, A F; Bock, J; Frydrychowicz, A; Küffer, J; Hennig, J; Markl, M
2008-03-01
The knowledge of local vascular anatomy and function in the human body is of high interest for the diagnosis and treatment of cardiovascular disease. A comprehensive analysis of the hemodynamics in the thoracic aorta is presented based on the integration of flow-sensitive 4D MRI with state-of-the-art rapid prototyping technology and computational fluid dynamics (CFD). Rapid prototyping was used to transform aortic geometries as measured by contrast-enhanced MR angiography into realistic vascular models with large anatomical coverage. Integration into a flow circuit with patient-specific pulsatile in-flow conditions and application of flow-sensitive 4D MRI permitted detailed analysis of local and global 3D flow dynamics in a realistic vascular geometry. Visualization of characteristic 3D flow patterns and quantitative comparisons of the in vitro experiments with in vivo data and CFD simulations in identical vascular geometries were performed to evaluate the accuracy of vascular model systems. The results indicate the potential of such patient-specific model systems for detailed experimental simulation of realistic vascular hemodynamics. Further studies are warranted to examine the influence of refined boundary conditions of the human circulatory system such as fluid-wall interaction and their effect on normal and pathological blood flow characteristics associated with vascular geometry. (c) 2008 Wiley-Liss, Inc.
Thermal/Structural Tailoring of Engine Blades (T/STAEBL). Theoretical Manual
NASA Technical Reports Server (NTRS)
Brown, K. W.; Clevenger, W. B.
1994-01-01
The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a family of computer programs executed by a control program. The T/STAEBL system performs design optimizations of cooled, hollow turbine blades and vanes. This manual describes the T/STAEBL data block structure and system organization. The approximate analysis and optimization modules are detailed, and a validation test case is provided.
Thermal/structural tailoring of engine blades (T/STAEBL). Theoretical manual
NASA Astrophysics Data System (ADS)
Brown, K. W.; Clevenger, W. B.
1994-03-01
The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a family of computer programs executed by a control program. The T/STAEBL system performs design optimizations of cooled, hollow turbine blades and vanes. This manual describes the T/STAEBL data block structure and system organization. The approximate analysis and optimization modules are detailed, and a validation test case is provided.
Development of a weight/sizing design synthesis computer program. Volume 2: Program Description
NASA Technical Reports Server (NTRS)
Garrison, J. M.
1973-01-01
The program for the computerized analysis of weight estimation relationships for those elements of the space shuttle vehicle which contribute a significant portion of the inert weight is discussed. A listing of each module and subroutine of the program is presented. Included are a generalized flow chart describing the subroutine linkage of the complete program and detailed flow charts for each subprogram.
Cardiac image modelling: Breadth and depth in heart disease.
Suinesiaputra, Avan; McCulloch, Andrew D; Nash, Martyn P; Pontre, Beau; Young, Alistair A
2016-10-01
With the advent of large-scale imaging studies and big health data, and the corresponding growth in analytics, machine learning and computational image analysis methods, there are now exciting opportunities for deepening our understanding of the mechanisms and characteristics of heart disease. Two emerging fields are computational analysis of cardiac remodelling (shape and motion changes due to disease) and computational analysis of physiology and mechanics to estimate biophysical properties from non-invasive imaging. Many large cohort studies now underway around the world have been specifically designed based on non-invasive imaging technologies in order to gain new information about the development of heart disease from asymptomatic to clinical manifestations. These give an unprecedented breadth to the quantification of population variation and disease development. Also, for the individual patient, it is now possible to determine biophysical properties of myocardial tissue in health and disease by interpreting detailed imaging data using computational modelling. For these population and patient-specific computational modelling methods to develop further, we need open benchmarks for algorithm comparison and validation, open sharing of data and algorithms, and demonstration of clinical efficacy in patient management and care. The combination of population and patient-specific modelling will give new insights into the mechanisms of cardiac disease, in particular the development of heart failure, congenital heart disease, myocardial infarction, contractile dysfunction and diastolic dysfunction. Copyright © 2016. Published by Elsevier B.V.
ERIC Educational Resources Information Center
Butler, A. K.; And Others
The performance/design requirements and a detailed technical description for a Computer-Directed Training Subsystem to be integrated into the Air Force Phase II Base Level System are described. The subsystem may be used for computer-assisted lesson construction and has presentation capability for on-the-job training for data automation, staff, and…
How to Create, Modify, and Interface Aspen In-House and User Databanks for System Configuration 1:
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camp, D W
2000-10-27
The goal of this document is to provide detailed instructions to create, modify, interface, and test Aspen User and In-House databanks with minimal frustration. The instructions are aimed at a novice Aspen Plus simulation user who is neither a programming nor a computer-system expert. The instructions are tailored to Version 10.1 of Aspen Plus and the specific computing configuration summarized in the title of this document and detailed in Section 2. Many details of setting up databanks depend on the specifics of the computing environment, such as the machines, operating systems, command languages, directory structures, inter-computer communications software, the version of the Aspen Engine and Graphical User Interface (GUI), and the directory structure in which these were installed.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Non-destructive analysis of DU content in the NIF hohlraums
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gharibyan, Narek; Moody, Ken J.; Shaughnessy, Dawn A.
2015-12-16
The advantage of using depleted uranium (DU) hohlraums in high-yield deuterium-tritium (DT) shots at the National Ignition Facility (NIF) is addressed in great detail by Döppner, et al. [1]. This DU-based hohlraum incorporates a thin layer of DU, ~7 μm thick, on the inner surface along with a thin gold coating, ~0.7 μm thick, while the outer layer is ~22 μm thick gold. A thickness measurement of the DU layer can be performed using an optical microscope, from which the total DU weight can be computed provided the DU layer is uniform. However, the thickness is not uniform throughout the hohlraum, since CAD-drawing calculations of the DU weight do not agree with the values computed from optical measurements [2]. Therefore, a non-destructive method for quantifying the DU content in hohlraums has been established by utilizing gamma-ray spectroscopy. The details of this method, along with results from several hohlraums, are presented in this report.
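The uniform-layer weight estimate that the optical method relies on is simple geometry: mass = density x thickness x coated area. The sketch below illustrates the arithmetic; the cylinder dimensions are hypothetical placeholders, not the actual NIF drawing values:

```python
import math

# Minimal sketch: DU mass from layer thickness, assuming a uniform layer on a
# right-cylindrical inner surface. The diameter and length below are
# illustrative placeholders, NOT the actual hohlraum drawing values.
rho_du = 19.05      # g/cm^3, density of uranium metal (DU is essentially the same)
t_du = 7.0e-4       # cm, ~7 um layer thickness from the abstract
diameter = 0.60     # cm, hypothetical inner diameter
length = 1.0        # cm, hypothetical length

area = math.pi * diameter * length          # lateral inner surface, cm^2
mass_mg = rho_du * t_du * area * 1000.0     # grams -> milligrams
print(f"DU mass for a uniform {t_du * 1e4:.0f} um layer: {mass_mg:.1f} mg")
```

Any non-uniformity in the layer invalidates this single-thickness estimate, which is the disagreement that motivates the gamma-ray spectroscopy approach.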
Application of Dynamic Analysis in Semi-Analytical Finite Element Method.
Liu, Pengfei; Xing, Qinyan; Wang, Dawei; Oeser, Markus
2017-08-30
Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional but requires only a two-dimensional FE discretization, by incorporating Fourier series in the third dimension. In this paper, the algorithm for applying dynamic analysis in SAFEM is introduced in detail. Asphalt pavement models under moving loads were built in SAFEM and in the commercial finite element software ABAQUS to verify the accuracy and efficiency of SAFEM. The verification shows that the computational accuracy of SAFEM is sufficiently high and its computational time is much shorter than that of ABAQUS. Moreover, experimental verification was carried out, and the prediction derived from SAFEM is consistent with the measurement. Therefore, SAFEM can reliably predict the dynamic response of asphalt pavement under moving loads, proving beneficial to road administrations in assessing the pavement's state.
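The reduction from a 3-D to a 2-D discretization rests on expanding the field variables in a Fourier series along the pavement's longitudinal direction. A generic form of such a semi-analytical expansion (the exact SAFEM formulation may differ in detail) is

\[
u(x, y, z) = \sum_{n=1}^{N} \bar{u}_n(x, y)\,\sin\frac{n\pi z}{L},
\]

where L is the length of the analyzed section. For a linear problem the harmonics decouple, so each \(\bar{u}_n\) requires only a 2-D finite element solve; summing the harmonics recovers the 3-D response, which is the source of the speed advantage over a full 3-D model.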
Overview of ICE Project: Integration of Computational Fluid Dynamics and Experiments
NASA Technical Reports Server (NTRS)
Stegeman, James D.; Blech, Richard A.; Babrauckas, Theresa L.; Jones, William H.
2001-01-01
Researchers at the NASA Glenn Research Center have developed a prototype integrated environment for interactively exploring, analyzing, and validating information from computational fluid dynamics (CFD) computations and experiments. The Integrated CFD and Experiments (ICE) project is a first attempt at providing a researcher with a common user interface for control, manipulation, analysis, and data storage for both experiments and simulation. ICE can be used as a live, on-line system that displays and archives data as they are gathered; as a postprocessing system for dataset manipulation and analysis; and as a control interface or "steering mechanism" for simulation codes while visualizing the results. Although the full capabilities of ICE have not been completely demonstrated, this report documents the current system. Various applications of ICE are discussed: a low-speed compressor, a supersonic inlet, real-time data visualization, and a parallel-processing simulation code interface. A detailed data model for the compressor application is included in the appendix.
Simulation on a car interior aerodynamic noise control based on statistical energy analysis
NASA Astrophysics Data System (ADS)
Chen, Xin; Wang, Dengfeng; Ma, Zhengdong
2012-09-01
How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is shown to be the key factor for car interior aerodynamic noise control at high frequency and high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are loaded onto the model to obtain valid results for car interior noise analysis. The model is the solid foundation for further optimization of car interior noise control. After the subsystems whose power contributions to car interior noise are most sensitive are identified by a comprehensive SEA analysis, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing results show that the interior acoustic performance can be improved by using the detailed SEA model, comprising more than 80 subsystems, together with the unsteady aerodynamic pressure calculated on body surfaces and improved sound/damping properties of the materials. A reduction of more than 2 dB is obtained at the central frequencies of the spectrum above 800 Hz. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.
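For context, an SEA model of this kind solves, in each frequency band, a steady-state power balance between subsystems. A standard textbook form of the balance for subsystem i (not code or equations taken from this paper) is

\[
P_i^{\mathrm{in}} = \omega\,\eta_i E_i + \omega \sum_{j \neq i} \left( \eta_{ij} E_i - \eta_{ji} E_j \right),
\]

where ω is the band center frequency, E_i the subsystem energies, η_i the damping loss factors, and η_ij the coupling loss factors. With the unsteady surface pressures supplying the input powers P_i^in, this linear system yields the subsystem energies, from which the interior sound pressure levels follow; the sensitivity analysis amounts to asking how the solution changes with the loss factors.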
NASA Astrophysics Data System (ADS)
Jaranowski, Piotr; Królak, Andrzej
2000-03-01
We develop the analytic and numerical tools for data analysis of the continuous gravitational-wave signals from spinning neutron stars for ground-based laser interferometric detectors. The statistical data analysis method that we investigate is maximum likelihood detection, which for the case of Gaussian noise reduces to matched filtering. We study in detail the statistical properties of the optimum functional that needs to be calculated in order to detect the gravitational-wave signal and estimate its parameters. We find it particularly useful to divide the parameter space into elementary cells such that the values of the optimal functional are statistically independent in different cells. We derive formulas for the false alarm and detection probabilities, both for the optimal and the suboptimal filters. We assess the computational requirements needed to carry out the signal search. We compare a number of criteria for building sufficiently accurate templates for our data analysis scheme. We verify the validity of our concepts and formulas by means of Monte Carlo simulations. We present algorithms by which one can accurately estimate the parameters of the continuous signals. We find, confirming earlier work of other authors, that given 100 Gflops of computational power, an all-sky search with an observation time of 7 days and a directed search with an observation time of 120 days are possible, whereas an all-sky search for 120 days of observation time is computationally prohibitive.
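The role of the cell construction can be made concrete. If the parameter space decomposes into N_c cells in which the detection statistic is statistically independent, and α(F_0) is the single-cell false alarm probability at threshold F_0, then schematically (the paper derives the exact expressions)

\[
P_F^{\mathrm{tot}} = 1 - \left[1 - \alpha(F_0)\right]^{N_c} \;\approx\; N_c\,\alpha(F_0) \quad \text{for } N_c\,\alpha(F_0) \ll 1,
\]

so the detection threshold must grow with the number of cells, which couples the false alarm requirement to the size of the template bank and hence to the computing power a given search needs.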
Satellite Imagery Analysis for Automated Global Food Security Forecasting
NASA Astrophysics Data System (ADS)
Moody, D.; Brumby, S. P.; Chartrand, R.; Keisler, R.; Mathis, M.; Beneke, C. M.; Nicholaeff, D.; Skillman, S.; Warren, M. S.; Poehnelt, J.
2017-12-01
The recent computing performance revolution has driven improvements in sensor, communication, and storage technology. Multi-decadal remote sensing datasets at the petabyte scale are now available in commercial clouds, with new satellite constellations generating petabytes/year of daily high-resolution global coverage imagery. Cloud computing and storage, combined with recent advances in machine learning, are enabling understanding of the world at a scale and at a level of detail never before feasible. We present results from an ongoing effort to develop satellite imagery analysis tools that aggregate temporal, spatial, and spectral information and that can scale with the high-rate and dimensionality of imagery being collected. We focus on the problem of monitoring food crop productivity across the Middle East and North Africa, and show how an analysis-ready, multi-sensor data platform enables quick prototyping of satellite imagery analysis algorithms, from land use/land cover classification and natural resource mapping, to yearly and monthly vegetative health change trends at the structural field level.
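As one concrete example of the per-field vegetative-health trend such a platform computes, the sketch below evaluates the widely used NDVI index from red and near-infrared reflectance. The abstract does not name a specific index, so NDVI here is an illustrative assumption, and the reflectance values are synthetic:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, a standard vegetative-health proxy."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)  # guard against divide-by-zero

# Illustrative monthly change trend for one field (synthetic reflectance arrays):
nir_jan, red_jan = np.array([[0.45]]), np.array([[0.08]])
nir_feb, red_feb = np.array([[0.52]]), np.array([[0.07]])
print(ndvi(nir_feb, red_feb) - ndvi(nir_jan, red_jan))  # positive -> greening
```

At platform scale, the same per-pixel computation is mapped over petabytes of imagery and aggregated to field-level time series for trend detection.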
Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method
NASA Technical Reports Server (NTRS)
Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.
2016-01-01
Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real-world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared to available experimental data and to other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition, and the causality method.
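Of the three analysis techniques named, proper orthogonal decomposition is the most algorithmic, and a minimal snapshot-POD sketch (the generic method, not NASA's implementation) fits in a few lines:

```python
import numpy as np

def pod(snapshots):
    """Snapshot POD: rows = grid points, columns = time snapshots.
    Returns energy-ranked spatial modes, singular values, temporal coefficients."""
    mean = snapshots.mean(axis=1, keepdims=True)
    fluct = snapshots - mean                      # subtract the temporal mean
    modes, sigma, vt = np.linalg.svd(fluct, full_matrices=False)
    return modes, sigma, sigma[:, None] * vt      # coefficients scaled by energy

# Synthetic example: 1000 grid points, 200 snapshots
rng = np.random.default_rng(0)
q = rng.standard_normal((1000, 200))
modes, sigma, coeffs = pod(q)
energy = sigma**2 / np.sum(sigma**2)
print("fraction of fluctuation energy in first 5 modes:", energy[:5].sum())
```

Applied to unsteady pressure fields from the open rotor wake, the leading modes isolate the coherent structures that dominate the noise generation.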
Novel 3D/VR interactive environment for MD simulations, visualization and analysis.
Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P
2014-12-18
The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.
Novel 3D/VR Interactive Environment for MD Simulations, Visualization and Analysis
Doblack, Benjamin N.; Allis, Tim; Dávila, Lilian P.
2014-01-01
The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced. PMID:25549300
Exact and efficient simulation of concordant computation
NASA Astrophysics Data System (ADS)
Cable, Hugo; Browne, Daniel E.
2015-11-01
Concordant computation is a circuit-based model of quantum computation for mixed states, which assumes that all correlations within the register are discord-free (i.e. the correlations are essentially classical) at every step of the computation. The question of whether concordant computation always admits efficient simulation by a classical computer was first considered by Eastin in arXiv:quant-ph/1006.4402v1, where an answer in the affirmative was given for circuits consisting only of one- and two-qubit gates. Building on this work, we develop the theory of classical simulation of concordant computation. We present a new framework for understanding such computations, argue that a larger class of concordant computations admits efficient simulation, and provide alternative proofs for the main results of arXiv:quant-ph/1006.4402v1 with an emphasis on the exactness of simulation, which is crucial for this model. We include a detailed analysis of the arithmetic complexity of solving equations in the simulation, as well as extensions to larger gates and qudits. We explore the limitations of our approach and discuss the challenges faced in developing efficient classical simulation algorithms for all concordant computations.
Workflows and Provenance: Toward Information Science Solutions for the Natural Sciences.
Gryk, Michael R; Ludäscher, Bertram
2017-01-01
The era of big data and ubiquitous computation has brought with it concerns about ensuring reproducibility in this new research environment. It is easy to assume that computational methods self-document by their very nature of being exact, deterministic processes. However, as with laboratory experiments, ensuring reproducibility in the computational realm requires the documentation of both the protocols used (workflows) and a detailed description of the computational environment: algorithms, implementations, software environments, as well as the data ingested and execution logs of the computation. These two aspects of computational reproducibility (workflows and execution details) are discussed in the context of biomolecular Nuclear Magnetic Resonance spectroscopy (bioNMR) as well as the PRIMAD model for computational reproducibility.
NASA Astrophysics Data System (ADS)
Tonini, R.; Lorito, S.; Orefice, S.; Graziani, L.; Brizuela, B.; Smedile, A.; Volpe, M.; Romano, F.; De Martini, P. M.; Maramai, A.; Selva, J.; Piatanesi, A.; Pantosti, D.
2016-12-01
Site-specific probabilistic tsunami hazard analyses demand very high computational effort, which is often reduced by introducing approximations on tsunami sources and/or tsunami modeling. On the one hand, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could easily lead to important bias in the analysis. On the other hand, detailed inundation maps computed by tsunami numerical simulations require very long running times. When tsunami effects are calculated at regional scale, a common practice is to propagate tsunami waves into deep waters (up to 50-100 m depth), neglecting non-linear effects and using coarse bathymetric meshes. Maximum wave heights at the coast are then empirically extrapolated, saving a significant amount of computational time. Moving to local scale, however, such assumptions drop out and tsunami modeling requires much greater computational resources. In this work, we perform a local Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) for the 50 km long coastal segment between Augusta and Siracusa, a touristic and commercial area located along the South-Eastern Sicily coast, Italy. The procedure consists of using the outcomes of a regional SPTHA as input to a two-step filtering method that selects and substantially reduces the number of scenarios contributing to the specific target area. These selected scenarios are modeled using high-resolution topo-bathymetry to produce detailed inundation maps. Results are presented as probabilistic hazard curves and maps, with the goal of analyzing, comparing and highlighting the different results provided by regional and local hazard assessments. Moreover, the analysis is enriched by the use of locally observed tsunami data, both geological and historical. Indeed, the tsunami data-sets available for the selected target areas are particularly rich compared with the scarce and heterogeneous data-sets usually available elsewhere. Therefore, they can represent valuable benchmarks for testing and strengthening the results of such studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 projects ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389), and the INGV-DPC Agreement.
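The hazard curves produced at the end of such an analysis typically take a standard probabilistic form. Assuming Poissonian scenario occurrence (a common SPTHA convention, stated here as an assumption rather than taken from this abstract), the exceedance rate and probability at a site are

\[
\lambda(h) = \sum_{s} \nu_s \, P(H > h \mid s), \qquad P(H > h \text{ in } T) = 1 - e^{-\lambda(h)\,T},
\]

where ν_s is the mean annual rate of source scenario s and P(H > h | s) is the probability that scenario s produces a tsunami intensity exceeding h at the site. The two-step filtering aims to prune the scenario sum without appreciably changing λ(h) at the target coast.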
Geant4 Computing Performance Benchmarking and Monitoring
Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; ...
2015-12-23
Performance evaluation and analysis of large-scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute-intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, in both sequential and multi-threaded modes, includes FAST, IgProf and Open|Speedshop. Finally, the scalability of CPU time and memory performance in the multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.
Johnson, Quentin R; Lindsay, Richard J; Shen, Tongye
2018-02-21
A computational method is presented which extracts the dominant motions from an ensemble of biomolecular conformations via a correlation analysis of residue-residue contacts. The algorithm first renders the structural information into contact matrices, then constructs the collective modes based on the correlated dynamics of a selected set of dynamic contacts. Associated programs can export the results for further visualization using graphics software. The aim of this method is to provide an analysis of biopolymer conformations from the contact viewpoint. It may assist the systematic uncovering of conformational switching mechanisms in proteins and biopolymer systems in general through statistical analysis of simulation snapshots. In contrast to conventional correlation analyses of Cartesian coordinates (such as distance covariance analysis and Cartesian principal component analysis), this program also provides an alternative way to locate essential collective motions in general. Herein, we detail the algorithm in a stepwise manner and comment on the importance of the method as applied to decoding allosteric mechanisms. © 2018 Wiley Periodicals, Inc.
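A minimal sketch of the described pipeline, written here from the abstract's step-by-step description rather than from the authors' released programs; the distance cutoff and the use of one representative atom per residue are illustrative assumptions:

```python
import numpy as np

def contact_modes(traj, cutoff=8.0, n_modes=2):
    """Correlation analysis of residue-residue contacts (generic sketch).
    traj: (n_frames, n_residues, 3) representative-atom coordinates."""
    n_frames, n_res, _ = traj.shape
    i, j = np.triu_indices(n_res, k=1)
    # 1. Render each frame into a binary contact vector over residue pairs.
    d = np.linalg.norm(traj[:, i, :] - traj[:, j, :], axis=2)   # pair distances
    contacts = (d < cutoff).astype(np.float64)                  # (frames, pairs)
    # 2. Keep only the dynamic contacts, i.e. pairs that actually switch.
    dynamic = contacts.std(axis=0) > 0
    x = contacts[:, dynamic] - contacts[:, dynamic].mean(axis=0)
    # 3. PCA of the contact fluctuations -> collective switching modes.
    cov = x.T @ x / (n_frames - 1)
    w, v = np.linalg.eigh(cov)                                  # ascending order
    return w[::-1][:n_modes], v[:, ::-1][:, :n_modes]

rng = np.random.default_rng(1)
eigvals, modes = contact_modes(rng.normal(scale=10.0, size=(50, 30, 3)))
print(eigvals)
```

The key difference from Cartesian PCA is the feature space: fluctuations of contact states rather than of coordinates, which makes discrete switching events directly visible in the leading modes.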
All-atom calculation of protein free-energy profiles
NASA Astrophysics Data System (ADS)
Orioli, S.; Ianeselli, A.; Spagnolli, G.; Faccioli, P.
2017-10-01
The Bias Functional (BF) approach is a variational method which enables one to efficiently generate ensembles of reactive trajectories for complex biomolecular transitions using ordinary computer clusters. For example, this scheme was applied to simulate in atomistic detail the folding of proteins consisting of several hundred amino acids and with experimental folding times of several minutes. A drawback of the BF approach is that it produces trajectories which do not satisfy microscopic reversibility. Consequently, this method cannot be used to directly compute equilibrium observables, such as free energy landscapes or equilibrium constants. In this work, we develop a statistical analysis which permits us to compute the potential of mean force (PMF) along an arbitrary collective coordinate by exploiting the information contained in the reactive trajectories calculated with the BF approach. We assess the accuracy and computational efficiency of this scheme by comparing its results with the PMF obtained for a small protein by means of plain molecular dynamics.
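The target quantity here is the standard potential of mean force,

\[
F(q) = -k_B T \ln P(q) + \text{const},
\]

where P(q) is the equilibrium probability density of the collective coordinate q. Because BF trajectories are not equilibrium samples, P(q) cannot be read off a naive histogram of the trajectories; the statistical analysis developed in the paper supplies the reweighting that recovers the equilibrium density from the reactive ensemble.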
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all candidate schemes, and making the analysis and decision. The presented method can offer a valuable reference for risk computation in building construction projects.
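A compact sketch of the entropy-weighting and synthesis-score steps described above, using the generic information-entropy method; the index values and the benefit-type normalization are illustrative assumptions, not the paper's case data:

```python
import numpy as np

def entropy_weights(x):
    """Information-entropy weighting of evaluation indexes (generic method).
    x: (m schemes, n indexes) matrix of benefit-type scores, all positive."""
    p = x / x.sum(axis=0)                          # normalize each index column
    m = x.shape[0]
    e = -(p * np.log(p)).sum(axis=0) / np.log(m)   # entropy of each index
    d = 1.0 - e                                    # degree of diversification
    return d / d.sum()                             # entropy weights, sum to 1

def synthesis_scores(x):
    """Weighted sum used to rank the candidate construction schemes."""
    return x @ entropy_weights(x)

# Three schemes scored on cost, progress, quality, safety (illustrative numbers):
scores = np.array([[0.8, 0.7, 0.9, 0.6],
                   [0.6, 0.9, 0.8, 0.8],
                   [0.9, 0.6, 0.7, 0.9]])
print(synthesis_scores(scores))  # highest value -> preferred scheme
```

Indexes that discriminate strongly between schemes have low entropy and therefore receive higher weight, which is the rationale for the entropy step in the synthesis score.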
Experience with a sophisticated computer based authoring system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, P.R.
1984-04-01
In the November 1982 issue of the ADCIS SIG CBT Newsletter the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) a CBAS drastically reduces programming time and the need for expert programmers, and (2) a CBAS appears to have minimal impact on initial lesson design. Both of these comments have significant impact on any cost-benefit analysis for Computer-Based Training. The first tends to improve cost-effectiveness, but only toward the limits imposed by the second. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the nuclear power industry. This report details our experience with this system relative to items (1) and (2) above; lesson design time will be compared with lesson input time. Also provided will be the WHC experience in the use of subject matter experts (though computer neophytes) for the design and inputting of CBT materials.
Adly, Amr A.; Abd-El-Hafiz, Salwa K.
2012-01-01
Incorporation of hysteresis models in electromagnetic analysis approaches is indispensable to accurate field computation in complex magnetic media. Throughout those computations, vector nature and computational efficiency of such models become especially crucial when sophisticated geometries requiring massive sub-region discretization are involved. Recently, an efficient vector Preisach-type hysteresis model constructed from only two scalar models having orthogonally coupled elementary operators has been proposed. This paper presents a novel Hopfield neural network approach for the implementation of Stoner–Wohlfarth-like operators that could lead to a significant enhancement in the computational efficiency of the aforementioned model. Advantages of this approach stem from the non-rectangular nature of these operators that substantially minimizes the number of operators needed to achieve an accurate vector hysteresis model. Details of the proposed approach, its identification and experimental testing are presented in the paper. PMID:25685446
Experimental and Computational Study of Sonic and Supersonic Jet Plumes
NASA Technical Reports Server (NTRS)
Venkatapathy, E.; Naughton, J. W.; Fletcher, D. G.; Edwards, Thomas A. (Technical Monitor)
1994-01-01
Studies of sonic and supersonic jet plumes are relevant to understanding such phenomena as jet noise, plume signatures, and rocket base heating and radiation. Jet plumes are simple to simulate and yet have complex flow structures such as Mach disks, triple points, shear layers, barrel shocks, shock/shear-layer interactions, etc. Experimental and computational simulations of sonic and supersonic jet plumes have been performed for under- and over-expanded, axisymmetric plume conditions. The computational simulations compare very well with the experimental schlieren observations. Experimental data such as temperature measurements with hot-wire probes have yet to be acquired and will be compared with computed values. Extensive analysis of the computational simulations presents a clear picture of how the complex flow structure develops and the conditions under which self-similar flow structures evolve. From the computations, the plume structure can be further classified into many sub-groups. In the proposed paper, detailed results from the experimental and computational simulations of single, axisymmetric, under- and over-expanded, sonic and supersonic plumes will be compared, and the fluid-dynamic aspects of the flow structures will be discussed.
WEST-3 wind turbine simulator development
NASA Technical Reports Server (NTRS)
Hoffman, J. A.; Sridhar, S.
1985-01-01
The software developed for WEST-3, a new, all-digital, fully programmable wind turbine simulator, is described. The process of wind turbine simulation on WEST-3 is presented in detail. The major steps are the processing of the mathematical models, the preparation of the constant data, and the use of system-software-generated executable code for running on WEST-3. The mechanics of reformulation, normalization, and scaling of the mathematical models is discussed in detail, in particular the significance of the reformulation, which leads to accurate simulations. Descriptions are given of the preprocessor computer programs that are used to prepare the constant data needed in the simulation. These programs, in addition to scaling and normalizing all the constants, relieve the user from having to generate a large number of constants used in the simulation. Brief descriptions are also given of the components of the WEST-3 system software: Translator, Assembler, Linker, and Loader. Also included are details of the aeroelastic rotor analysis, which is the core of a wind turbine simulation model; an analysis of the gimbal subsystem; and listings of the variables, constants, and equations used in the simulation.
Aeroelastic-Acoustics Simulation of Flight Systems
NASA Technical Reports Server (NTRS)
Gupta, Kajal K.; Choi, S.; Ibrahim, A.
2009-01-01
This paper describes the details of a numerical finite element (FE) based analysis procedure, and a resulting code, for the simulation of the acoustic phenomena arising from aeroelastic interactions. Both the CFD and structural simulations are based on FE discretizations employing unstructured grids. The sound pressure level (SPL) on structural surfaces is calculated from the root mean square (RMS) of the unsteady pressure, and the acoustic wave frequencies are computed from a fast Fourier transform (FFT) of the unsteady pressure distribution as a function of time. The resulting tool proves to be unique, as it is designed to analyze complex practical problems, involving large-scale computations, in a routine fashion.
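The SPL and frequency extraction described here is straightforward to sketch. The snippet below is a generic illustration with a synthetic 250 Hz pressure signal, not the paper's code; it assumes the standard acoustic reference pressure of 20 μPa:

```python
import numpy as np

P_REF = 20e-6  # Pa, standard acoustic reference pressure

def spl_and_peak_frequency(p, dt):
    """SPL from the RMS of the unsteady pressure, and the dominant
    acoustic frequency from its FFT (generic sketch of the procedure)."""
    p_fluct = p - p.mean()                          # keep the unsteady part only
    spl = 20.0 * np.log10(np.sqrt(np.mean(p_fluct**2)) / P_REF)
    spectrum = np.abs(np.fft.rfft(p_fluct))
    freqs = np.fft.rfftfreq(p.size, d=dt)
    return spl, freqs[np.argmax(spectrum)]

t = np.arange(0, 1.0, 1e-4)                         # 10 kHz sampling, 1 s record
p = 2.0 * np.sin(2 * np.pi * 250.0 * t) + 101325.0  # 250 Hz tone on mean pressure
print(spl_and_peak_frequency(p, 1e-4))              # ~ (97 dB, 250.0 Hz)
```

In the paper's workflow the pressure history comes from the coupled CFD/structural simulation at each surface point rather than from a synthetic signal.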
1989-08-01
The program uses working stress analysis in accordance with Corps of Engineers EM 1110-1-2101, "Working Stresses."
X-Ray Computed Tomography of Tranquility Base Moon Rock
NASA Technical Reports Server (NTRS)
Jones, Justin S.; Garvin, Jim; Viens, Mike; Kent, Ryan; Munoz, Bruno
2016-01-01
X-ray Computed Tomography (CT) was used for the first time on the Apollo 11 Lunar Sample number 10057.30, which had been previously maintained by the White House, then transferred back to NASA under the care of Goddard Space Flight Center. Results from this analysis show detailed images of the internal structure of the moon rock, including vesicles (pores), crystal needles, and crystal bundles. These crystals, possibly the common mineral ilmenite, are found in abundance and with random orientation. Future work, in particular a greater understanding of these crystals and their formation, may lead to a more in-depth understanding of the lunar surface evolution and mineral content.
Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.
Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei
2018-06-15
Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden on data centers and mobile networks induced by the rapid growth of the Internet of Things (IoT). This work introduces a cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and the limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, a theoretical analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game, and the condition for Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm scales well with an increasing number of IoT sensors.
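The finite improvement property invoked here means that asynchronous best-response updates in a potential game must terminate at a Nash equilibrium. The toy sketch below illustrates that mechanism for a congestion-style offloading game; the cost model and all parameters are invented for illustration and are not the paper's system model:

```python
import random

def cod_like_best_response(n_sensors, local_cost, edge_base_cost, congestion):
    """Toy decentralized best-response iteration for a computation-offloading
    game. Each sensor picks local execution (0) or cloudlet offloading (1);
    the offloading cost grows with the number of offloaders (shared AP)."""
    choice = [random.randint(0, 1) for _ in range(n_sensors)]
    improved = True
    while improved:                       # finite improvement property of a
        improved = False                  # potential game guarantees termination
        for s in range(n_sensors):
            offloaders = sum(choice) - choice[s]
            cost_off = edge_base_cost + congestion * (offloaders + 1)
            best = 1 if cost_off < local_cost[s] else 0
            if best != choice[s]:
                choice[s] = best
                improved = True
    return choice                         # a Nash equilibrium of the toy game

random.seed(0)
print(cod_like_best_response(8, [5.0] * 4 + [1.0] * 4, 1.0, 0.8))
```

Sensors with expensive local execution offload until the shared channel congests enough that further offloading no longer pays, which is the equilibrium trade-off the COD algorithm exploits.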
Design, development and testing twin pulse tube cryocooler
NASA Astrophysics Data System (ADS)
Gour, Abhay Singh; Sagar, Pankaj; Karunanithi, R.
2017-09-01
The design and development of a Twin Pulse Tube Cryocooler (TPTC) is presented. Both coolers are driven by a single Linear Moving Magnet Synchronous Motor (LMMSM) with piston heads at both ends of the mover shaft. Magnetostatic analysis of the flux line distribution was carried out during the design and development of the LMMSM-based pressure wave generator (PWG). Based on the performance of the PWG, the design of the TPTC was carried out using Sage and computational fluid dynamics (CFD) analysis. The detailed design, fabrication and testing of the LMMSM and the TPTC, and their integration tests, are presented in this paper.