The electromagnetic modeling of thin apertures using the finite-difference time-domain technique
NASA Technical Reports Server (NTRS)
Demarest, Kenneth R.
1987-01-01
A technique which computes transient electromagnetic responses of narrow apertures in complex conducting scatterers was implemented as an extension of previously developed Finite-Difference Time-Domain (FDTD) computer codes. Although these apertures are narrow with respect to the wavelengths contained within the power spectrum of excitation, this technique does not require significantly more computer resources to attain the increased resolution at the apertures. In the report, an analytical technique which utilizes Babinet's principle to model the apertures is developed, and an FDTD computer code which utilizes this technique is described.
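For orientation, the heart of any FDTD code is the leapfrog update of interleaved electric and magnetic fields on a Yee grid. The one-dimensional Python sketch below shows only that core update; the thin-aperture/Babinet treatment developed in the report is not reproduced, and the grid size, source, and normalized units are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D FDTD leapfrog update (free space, normalized units).
# This shows only the core Yee scheme that aperture models extend;
# the thin-aperture/Babinet treatment from the report is not shown.
nz, nt = 200, 400
ez = np.zeros(nz)          # electric field samples
hy = np.zeros(nz - 1)      # magnetic field samples, staggered half a cell
courant = 0.5              # stability requires courant <= 1 in 1-D

for n in range(nt):
    hy += courant * np.diff(ez)          # H update from the curl of E
    ez[1:-1] += courant * np.diff(hy)    # E update from the curl of H
    ez[nz // 4] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source
```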
Wei, Xuelei; Dong, Fuhui
2011-12-01
To review recent advances in the research and application of computer aided forming techniques for constructing bone tissue engineering scaffolds. The literature concerning computer aided forming techniques for constructing bone tissue engineering scaffolds in recent years was reviewed extensively and summarized. Several studies over the last decade have focused on computer aided forming techniques for bone scaffold construction using various scaffold materials, techniques that are based on computer aided design (CAD) and bone scaffold rapid prototyping (RP). CAD includes medical CAD, STL, and reverse design. Reverse design can fully simulate normal bone tissue and can be very useful for CAD. RP techniques include fused deposition modeling, three-dimensional printing, selective laser sintering, three-dimensional bioplotting, and low-temperature deposition manufacturing. These techniques provide a new way to construct bone tissue engineering scaffolds with complex internal structures. With the rapid development of molding and forming techniques, computer aided forming techniques are expected to provide ideal bone tissue engineering scaffolds.
NASA Technical Reports Server (NTRS)
Coen, Peter G.
1991-01-01
A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.
The anatomy of floating shock fitting. [shock wave computation for flow fields]
NASA Technical Reports Server (NTRS)
Salas, M. D.
1975-01-01
The floating shock fitting technique is examined. Second-order difference formulas are developed for the computation of discontinuities. A procedure is developed to compute mesh points that are crossed by discontinuities. The technique is applied to the calculation of internal two-dimensional flows with an arbitrary number of shock waves and contact surfaces. A new procedure, based on the coalescence of characteristics, is developed to detect the formation of shock waves. Results are presented to validate and demonstrate the versatility of the technique.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1982-01-01
Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.
NASA Technical Reports Server (NTRS)
1972-01-01
The solar imaging X-ray telescope experiment (designated the S-056 experiment) is described. It will photograph the sun in the far ultraviolet or soft X-ray region. Because of the imaging characteristics of this telescope and the necessity of using special techniques for capturing images on film at these wavelengths, methods were developed for computer processing of the photographs. The problems of image restoration were addressed to develop and test digital computer techniques for applying a deconvolution process to restore overall S-056 image quality. Additional techniques for reducing or eliminating the effects of noise and nonlinearity in S-056 photographs were developed.
[Cardiac computed tomography: new applications of an evolving technique].
Martín, María; Corros, Cecilia; Calvo, Juan; Mesa, Alicia; García-Campos, Ana; Rodríguez, María Luisa; Barreiro, Manuel; Rozado, José; Colunga, Santiago; de la Hera, Jesús M; Morís, César; Luyando, Luis H
2015-01-01
In recent years we have witnessed an increasing development of imaging techniques applied in cardiology. Among them, cardiac computed tomography is an emerging and evolving technique. With the current possibility of very-low-radiation studies, its applications have expanded beyond coronary angiography. In the present article we review the technical developments of cardiac computed tomography and its new applications. Copyright © 2014 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.
NASA Technical Reports Server (NTRS)
Harwood, P. (Principal Investigator); Finley, R.; Mcculloch, S.; Malin, P. A.; Schell, J. A.
1977-01-01
The author has identified the following significant results. Image interpretation and computer-assisted techniques were developed to analyze LANDSAT scenes in support of resource inventory and monitoring requirements for the Texas coastal region. Land cover and land use maps, at a scale of 1:125,000 for the image interpretation product and 1:24,000 for the computer-assisted product, were generated covering four Texas coastal test sites. Classification schemes which parallel national systems were developed for each procedure, including 23 classes for image interpretation technique and 13 classes for the computer-assisted technique. Results indicate that LANDSAT-derived land cover and land use maps can be successfully applied to a variety of planning and management activities on the Texas coast. Computer-derived land/water maps can be used with tide gage data to assess shoreline boundaries for management purposes.
A demonstrative model of a lunar base simulation on a personal computer
NASA Technical Reports Server (NTRS)
1985-01-01
The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to build the demonstration model on a personal computer running the MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.
NASA Technical Reports Server (NTRS)
Butera, M. K.
1981-01-01
An automatic technique has been developed to measure marsh plant production by inference from a species classification derived from Landsat MSS data. A separate computer technique has been developed to calculate the transport path length of detritus and nutrients from their point of origin in the marsh to the shoreline from Landsat data. A nutrient availability indicator, the ratio of production to transport path length, was derived for each marsh-identified Landsat cell. The use of a data base compatible with the Landsat format facilitated data handling and computations.
A Review of Developments in Computer-Based Systems to Image Teeth and Produce Dental Restorations
Rekow, E. Dianne; Erdman, Arthur G.; Speidel, T. Michael
1987-01-01
Computer-aided design and manufacturing (CAD/CAM) make it possible to automate the creation of dental restorations. Currently practiced techniques are described, and three automated systems under development are described and compared. Advances in CAD/CAM provide a new option for dentistry, creating an alternative technique for producing dental restorations. It is now possible to create automatically produced dental restorations that meet or exceed current requirements for fit and occlusion.
An improved data transfer and storage technique for hybrid computation
NASA Technical Reports Server (NTRS)
Hansing, A. M.
1972-01-01
An improved technique was developed for transferring and storing data at faster-than-real-time speeds on a hybrid computer. Its predominant advantage is the combined use of electronic relays, track-and-store units, and the analog-to-digital and digital-to-analog conversion units of the hybrid computer.
Stability-Constrained Aerodynamic Shape Optimization with Applications to Flying Wings
NASA Astrophysics Data System (ADS)
Mader, Charles Alexander
A set of techniques is developed that allows the incorporation of flight dynamics metrics as an additional discipline in a high-fidelity aerodynamic optimization. Specifically, techniques for including static stability constraints and handling qualities constraints in a high-fidelity aerodynamic optimization are demonstrated. These constraints are developed from stability derivative information calculated using high-fidelity computational fluid dynamics (CFD). Two techniques are explored for computing the stability derivatives from CFD. One technique uses an automatic differentiation adjoint technique (ADjoint) to efficiently and accurately compute a full set of static and dynamic stability derivatives from a single steady solution. The other technique uses a linear regression method to compute the stability derivatives from a quasi-unsteady time-spectral CFD solution, allowing for the computation of static, dynamic and transient stability derivatives. Based on the characteristics of the two methods, the time-spectral technique is selected for further development, incorporated into an optimization framework, and used to conduct stability-constrained aerodynamic optimization. This stability-constrained optimization framework is then used to conduct an optimization study of a flying wing configuration. This study shows that stability constraints have a significant impact on the optimal design of flying wings and that, while static stability constraints can often be satisfied by modifying the airfoil profiles of the wing, dynamic stability constraints can require a significant change in the planform of the aircraft in order for the constraints to be satisfied.
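The linear-regression route to stability derivatives described above amounts to fitting sampled force and moment coefficients against the motion states that produced them. A minimal least-squares sketch in Python, on synthetic data with illustrative names (a stand-in, not the thesis's time-spectral CFD pipeline):

```python
import numpy as np

# Illustrative only: fit pitching-moment samples Cm against angle of attack
# and a pitch-rate proxy to recover static (Cm_alpha) and dynamic (Cm_q)
# derivatives, mimicking regression over quasi-unsteady CFD output.
rng = np.random.default_rng(0)
alpha = 0.05 * np.sin(np.linspace(0, 2 * np.pi, 64))   # rad
q = np.gradient(alpha)                                  # pitch-rate proxy
cm_true = -0.8 * alpha - 4.0 * q                        # assumed "truth"
cm = cm_true + 1e-4 * rng.standard_normal(alpha.size)   # noisy "CFD" samples

A = np.column_stack([alpha, q])
coeffs, *_ = np.linalg.lstsq(A, cm, rcond=None)
cm_alpha, cm_q = coeffs
print(f"Cm_alpha ~ {cm_alpha:.3f}, Cm_q ~ {cm_q:.3f}")
```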
NASA Astrophysics Data System (ADS)
Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian
2018-01-01
We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
Generalized dynamic engine simulation techniques for the digital computer
NASA Technical Reports Server (NTRS)
Sellers, J.; Teren, F.
1974-01-01
Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design-point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar all-digital programs on future engine simulation philosophy is also discussed.
Generalized dynamic engine simulation techniques for the digital computer
NASA Technical Reports Server (NTRS)
Sellers, J.; Teren, F.
1975-01-01
Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar digital programs on future engine simulation philosophy is also discussed.
NASA Technical Reports Server (NTRS)
Smith, Paul H.
1988-01-01
The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.
Digital receiver study and implementation
NASA Technical Reports Server (NTRS)
Fogle, D. A.; Lee, G. M.; Massey, J. C.
1972-01-01
Computer software was developed which makes it possible to use any general purpose computer with A/D conversion capability as a PSK receiver for low data rate telemetry processing. Carrier tracking, bit synchronization, and matched filter detection are all performed digitally. To aid in the implementation of optimum computer processors, a study of general digital processing techniques was performed which emphasized various techniques for digitizing general analog systems. In particular, the phase-locked loop was extensively analyzed as a typical non-linear communication element. Bayesian estimation techniques for PSK demodulation were studied. A hardware implementation of the digital Costas loop was developed.
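As a rough illustration of the digital Costas loop mentioned above, the following Python sketch tracks the carrier phase of a BPSK signal. The sample rate, loop gains, and signal model are assumptions for demonstration, not the receiver design from the report.

```python
import numpy as np

# Minimal digital Costas loop for BPSK carrier recovery (illustrative).
fs, fc = 8000.0, 1000.0
n = np.arange(4000)
rng = np.random.default_rng(1)
bits = np.where(rng.random(n.size // 40) < 0.5, -1.0, 1.0)
data = np.repeat(bits, 40)                         # 40 samples per bit
rx = data * np.cos(2 * np.pi * fc / fs * n + 0.7)  # unknown carrier phase

theta = 0.0                    # NCO phase estimate
freq = 2 * np.pi * fc / fs     # nominal phase increment per sample
k_p, k_i = 0.02, 0.001         # proportional and integral loop gains
for k in range(n.size):
    i_arm = rx[k] * np.cos(theta)    # in-phase arm
    q_arm = -rx[k] * np.sin(theta)   # quadrature arm
    err = i_arm * q_arm              # Costas phase detector (BPSK)
    freq += k_i * err                # integral path tracks frequency
    theta += freq + k_p * err        # proportional path + NCO advance
```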
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Etingov, Pavel V.; Ren, Huiying
This paper describes a probabilistic look-ahead contingency analysis application that incorporates smart sampling and high-performance computing (HPC) techniques. Smart sampling techniques are implemented to effectively represent the structure and statistical characteristics of uncertainty introduced by different sources in the power system. They can significantly reduce the data set size required for multiple look-ahead contingency analyses, and therefore reduce the time required to compute them. HPC techniques are used to further reduce computational time. These two techniques enable a predictive capability that forecasts the impact of various uncertainties on potential transmission limit violations. The developed package has been tested with real-world data from the Bonneville Power Administration. Case study results are presented to demonstrate the performance of the applications developed.
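One widely used "smart sampling" building block is Latin hypercube sampling, which stratifies each uncertain dimension so far fewer samples are needed than with plain Monte Carlo. A minimal Python sketch, offered as a generic stand-in since the paper's specific sampling method is not detailed here:

```python
import numpy as np

# Latin hypercube sampling: one stratified sample per interval in each
# dimension, with strata shuffled independently per dimension.
def latin_hypercube(n_samples, n_dims, rng):
    samples = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        strata = (rng.permutation(n_samples) + rng.random(n_samples)) / n_samples
        samples[:, d] = strata
    return samples  # points in [0, 1)^n_dims

rng = np.random.default_rng(42)
pts = latin_hypercube(100, 3, rng)   # e.g., wind, load, outage-rate axes
```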
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1977-01-01
Models, measures, and techniques were developed for evaluating the effectiveness of aircraft computing systems. The concept of effectiveness involves aspects of system performance, reliability, and worth. Specifically, a model hierarchy was developed in detail at the mission, functional task, and computational task levels. An appropriate class of stochastic models was investigated to serve as bottom-level models in the hierarchical scheme. A unified measure of effectiveness called 'performability' was defined and formulated.
Employing Subgoals in Computer Programming Education
ERIC Educational Resources Information Center
Margulieux, Lauren E.; Catrambone, Richard; Guzdial, Mark
2016-01-01
The rapid integration of technology into our professional and personal lives has left many education systems ill-equipped to deal with the influx of people seeking computing education. To improve computing education, we are applying techniques that have been developed for other procedural fields. The present study applied such a technique, subgoal…
ERIC Educational Resources Information Center
Letmanyi, Helen
Developed to identify and qualitatively assess computer system evaluation techniques for use during acquisition of general purpose computer systems, this document presents several criteria for comparison and selection. An introduction discusses the automatic data processing (ADP) acquisition process and the need to plan for uncertainty through…
CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.
ERIC Educational Resources Information Center
Shermis, Mark D.; Albert, Susan L.
A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…
Applications of CFD and visualization techniques
NASA Technical Reports Server (NTRS)
Saunders, James H.; Brown, Susan T.; Crisafulli, Jeffrey J.; Southern, Leslie A.
1992-01-01
In this paper, three applications are presented to illustrate current techniques for flow calculation and visualization. The first two applications use a commercial computational fluid dynamics (CFD) code, FLUENT, performed on a Cray Y-MP. The results are animated with the aid of data visualization software, apE. The third application simulates a particulate deposition pattern using techniques inspired by developments in nonlinear dynamical systems. These computations were performed on personal computers.
A Fully Distributed Approach to the Design of a KBIT/SEC VHF Packet Radio Network,
1984-02-01
topological change and consequent out-moded routing data. Algorithm development has been aided by computer simulation using a finite state machine technique to model a realistic network of up to fifty nodes. ... use of computer-based equipment in weapons systems and their associated sensors and command and control elements, and the trend from voice to data ...
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1975-01-01
Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.
Transient Faults in Computer Systems
NASA Technical Reports Server (NTRS)
Masson, Gerald M.
1993-01-01
A powerful technique particularly appropriate for the detection of errors caused by transient faults in computer systems was developed. The technique can be implemented in either software or hardware; the research conducted thus far primarily considered software implementations. The error detection technique developed has the distinct advantage of having provably complete coverage of all errors caused by transient faults that affect the output produced by the execution of a program. In other words, the technique does not have to be tuned to a particular error model to enhance error coverage. Also, the correctness of the technique can be formally verified. The technique uses time and software redundancy. The foundation for an effective, low-overhead, software-based certification trail approach to real-time error detection resulting from transient fault phenomena was developed.
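The simplest form of the time-redundancy idea is to execute a computation twice and compare outputs; the certification-trail approach in the report is a cheaper refinement in which the second execution is guided by a trail saved by the first. A minimal sketch of the plain duplicated-execution check (illustrative names, deterministic computations assumed):

```python
# Time-redundancy error check in the spirit described above: run the
# computation twice and compare, flagging a transient-fault symptom on
# mismatch. (The certification-trail technique is lighter-weight: the
# second run is guided by a trail saved by the first run.)
def run_with_time_redundancy(func, *args):
    first = func(*args)
    second = func(*args)
    if first != second:
        raise RuntimeError("outputs differ: possible transient fault")
    return first

result = run_with_time_redundancy(sorted, [3, 1, 2])
```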
Computing in Qualitative Analysis: A Healthy Development?
ERIC Educational Resources Information Center
Richards, Lyn; Richards, Tom
1991-01-01
Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…
Guidelines for developing vectorizable computer programs
NASA Technical Reports Server (NTRS)
Miner, E. W.
1982-01-01
Some fundamental principles for developing computer programs which are compatible with array-oriented computers are presented. The emphasis is on basic techniques for structuring computer codes which are applicable in FORTRAN and do not require a special programming language or exact a significant penalty on a scalar computer. Researchers who are using numerical techniques to solve problems in engineering can apply these basic principles and thus develop transportable computer programs (in FORTRAN) which contain much vectorizable code. The vector architecture of the ASC is discussed so that the requirements of array processing can be better appreciated. The "vectorization" of a finite-difference viscous shock-layer code is used as an example to illustrate the benefits and some of the difficulties involved. Increases in computing speed with vectorization are illustrated with results from the viscous shock-layer code and from a finite-element shock tube code. The applicability of these principles was substantiated through running programs on other computers with array-associated computing characteristics, such as the Hewlett-Packard (H-P) 1000-F.
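The structuring principle carries over directly to modern array libraries: express element-by-element loops as whole-array operations that a vector unit can stream. A small NumPy illustration, with NumPy standing in for the FORTRAN/ASC setting of the report:

```python
import numpy as np

# The vectorization principle in array form: replace an element-by-element
# loop with whole-array operations.
x = np.linspace(0.0, 1.0, 10_000)

# Scalar style (poorly suited to vector hardware):
y_loop = np.empty_like(x)
for i in range(x.size):
    y_loop[i] = 3.0 * x[i] ** 2 + 2.0 * x[i] + 1.0

# Vectorizable style (one statement over the whole array):
y_vec = 3.0 * x**2 + 2.0 * x + 1.0
assert np.allclose(y_loop, y_vec)
```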
NASA Technical Reports Server (NTRS)
Hayden, W. L.; Robinson, L. H.
1972-01-01
Spectral analysis of angle-modulated communication systems is studied by: (1) performing a literature survey of candidate power spectrum computational techniques, determining the computational requirements, and formulating a mathematical model satisfying these requirements; (2) implementing the model on a UNIVAC 1230 digital computer as the Spectral Analysis Program (SAP); and (3) developing the hardware specifications for a data acquisition system which will acquire an input modulating signal for SAP. The SAP computational technique uses an extended fast Fourier transform and represents a generalized approach for simple and complex modulating signals.
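The computational core attributed to SAP, an FFT-based power spectrum of an angle-modulated signal, can be sketched in a few lines of modern Python; the tone-modulated FM test signal and windowing choice here are illustrative assumptions:

```python
import numpy as np

# Power-spectrum estimate of an angle-modulated (FM) signal via the FFT.
fs = 1e4                                   # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
carrier, tone, beta = 1e3, 50.0, 2.0       # carrier, tone, modulation index
s = np.cos(2 * np.pi * carrier * t + beta * np.sin(2 * np.pi * tone * t))

spec = np.fft.rfft(s * np.hanning(s.size))  # windowed FFT
power = np.abs(spec) ** 2 / s.size          # periodogram scaling
freqs = np.fft.rfftfreq(s.size, 1 / fs)
```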
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevrekidis, Ioannis G.
The work explored the linking of recently developed machine learning techniques (manifold learning, and in particular diffusion maps) with traditional PDE modeling/discretization/scientific computation techniques via the equation-free methodology developed by the PI. The result (in addition to several PhD degrees, two of them by CSGF Fellows) was a sequence of strong developments - in part on the algorithmic side, linking data mining with scientific computing, and in part on applications, ranging from PDE discretizations to molecular dynamics and complex network dynamics.
NASA Technical Reports Server (NTRS)
Dieudonne, J. E.
1978-01-01
A numerical technique was developed which generates linear perturbation models from nonlinear aircraft vehicle simulations. The technique is very general and can be applied to simulations of any system that is described by nonlinear differential equations. The computer program used to generate these models is discussed, with emphasis placed on generation of the Jacobian matrices, calculation of the coefficients needed for solving the perturbation model, and generation of the solution of the linear differential equations. An example application of the technique to a nonlinear model of the NASA terminal configured vehicle is included.
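Numerical linearization of this kind perturbs each state in turn and differences the nonlinear function to build the Jacobian matrices. A minimal central-difference sketch in Python (the toy dynamics and step size are assumptions, not the NASA program's internals):

```python
import numpy as np

# Central-difference Jacobian of a nonlinear state function f(x): perturb
# one state at a time and difference, the core of numerical linearization.
def jacobian(f, x0, eps=1e-6):
    x0 = np.asarray(x0, dtype=float)
    f0 = np.asarray(f(x0))
    J = np.zeros((f0.size, x0.size))
    for j in range(x0.size):
        dx = np.zeros_like(x0)
        dx[j] = eps
        J[:, j] = (np.asarray(f(x0 + dx)) - np.asarray(f(x0 - dx))) / (2 * eps)
    return J

# Example: linearize toy pitch dynamics about a trim state, giving the
# perturbation model x_dot ~ A @ x.
f = lambda x: np.array([x[1], -4.0 * np.sin(x[0]) - 0.5 * x[1]])
A = jacobian(f, [0.0, 0.0])
```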
Finite volume model for two-dimensional shallow environmental flow
Simoes, F.J.M.
2011-01-01
This paper presents the development of a two-dimensional, depth integrated, unsteady, free-surface model based on the shallow water equations. The development was motivated by the desire of balancing computational efficiency and accuracy by selective and conjunctive use of different numerical techniques. The base framework of the discrete model uses Godunov methods on unstructured triangular grids, but the solution technique emphasizes the use of a high-resolution Riemann solver where needed, switching to a simpler and computationally more efficient upwind finite volume technique in the smooth regions of the flow. Explicit time marching is accomplished with strong stability preserving Runge-Kutta methods, with additional acceleration techniques for steady-state computations. A simplified mass-preserving algorithm is used to deal with wet/dry fronts. Application of the model is made to several benchmark cases that show the interplay of the diverse solution techniques.
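A Godunov-type scheme of this kind advances each cell using fluxes computed at cell interfaces. The sketch below uses the simple Rusanov (local Lax-Friedrichs) flux for the 1-D shallow water equations as a stand-in for the paper's high-resolution Riemann solver and its 2-D unstructured setting:

```python
import numpy as np

# Interface flux for the 1-D shallow water equations, U = (h, hu), using
# the Rusanov (local Lax-Friedrichs) flux. Illustrative stand-in only.
g = 9.81

def physical_flux(U):
    h, hu = U
    return np.array([hu, hu**2 / h + 0.5 * g * h**2])

def rusanov_flux(UL, UR):
    cL = abs(UL[1] / UL[0]) + np.sqrt(g * UL[0])   # |u| + c, left state
    cR = abs(UR[1] / UR[0]) + np.sqrt(g * UR[0])   # |u| + c, right state
    smax = max(cL, cR)                             # fastest local wave speed
    return 0.5 * (physical_flux(UL) + physical_flux(UR)) - 0.5 * smax * (UR - UL)

F = rusanov_flux(np.array([2.0, 0.0]), np.array([1.0, 0.0]))  # dam-break interface
```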
NASA Technical Reports Server (NTRS)
Tranter, W. H.; Turner, M. D.
1977-01-01
Techniques are developed to estimate power gain, delay, signal-to-noise ratio, and mean square error in digital computer simulations of lowpass and bandpass systems. The techniques are applied to analog and digital communications. The signal-to-noise ratio estimates are shown to be maximum likelihood estimates in additive white Gaussian noise. The methods are seen to be especially useful for digital communication systems where the mapping from the signal-to-noise ratio to the error probability can be obtained. Simulation results show the techniques developed to be accurate and quite versatile in evaluating the performance of many systems through digital computer simulation.
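The flavor of such an estimator can be shown by projecting the simulated output onto the known reference signal to separate signal power from noise power. The following Python sketch is illustrative; its signal model and least-squares gain step are assumptions, not the report's maximum-likelihood derivation:

```python
import numpy as np

# Estimate SNR in a simulation by splitting the received waveform into a
# scaled copy of the known reference plus a residual treated as noise.
rng = np.random.default_rng(7)
ref = np.sin(2 * np.pi * 0.05 * np.arange(1000))      # known clean signal
rx = 0.8 * ref + 0.3 * rng.standard_normal(ref.size)  # simulated channel output

gain = np.dot(rx, ref) / np.dot(ref, ref)   # least-squares gain estimate
noise = rx - gain * ref                     # residual after signal removal
snr_db = 10 * np.log10(np.sum((gain * ref) ** 2) / np.sum(noise**2))
```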
A Modeling and Data Analysis of Laser Beam Propagation in the Maritime Domain
2015-05-18
approach to computing pdfs is the Kernel Density Method (Reference [9] has an introduction to the method), which we will apply to compute the pdf of our ... The project has two parts: 1) we present a computational analysis of different probability density function approximation techniques; and 2) we introduce preliminary steps towards developing a ...
Computer assisted audit techniques for UNIX (UNIX-CAATS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polk, W.T.
1991-12-31
Federal and DOE regulations impose specific requirements for internal controls of computer systems. These controls include adequate separation of duties and sufficient controls for access of system and data. The DOE Inspector General's Office has the responsibility to examine internal controls, as well as efficient use of computer system resources. As a result, DOE supported NIST development of computer assisted audit techniques to examine BSD UNIX computers (UNIX-CAATS). These systems were selected due to the increasing number of UNIX workstations in use within DOE. This paper describes the design and development of these techniques, as well as the results of testing at NIST and the first audit at a DOE site. UNIX-CAATS consists of tools which examine security of passwords, file systems, and network access. In addition, a tool was developed to examine efficiency of disk utilization. Test results at NIST indicated inadequate password management, as well as weak network resource controls. File system security was considered adequate. Audit results at a DOE site indicated weak password management and inefficient disk utilization. During the audit, we also found that improvements to UNIX-CAATS were needed when applied to large systems. NIST plans to enhance the techniques developed for DOE/IG in future work. This future work would leverage currently available tools, along with needed enhancements. These enhancements would enable DOE/IG to audit large systems, such as supercomputers.
Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca
2018-01-01
Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries and their translation to the clinical world may lead to promising advances. PMID:29321268
Donato, David I.
2012-01-01
This report presents the mathematical expressions and the computational techniques required to compute maximum-likelihood estimates for the parameters of the National Descriptive Model of Mercury in Fish (NDMMF), a statistical model used to predict the concentration of methylmercury in fish tissue. The expressions and techniques reported here were prepared to support the development of custom software capable of computing NDMMF parameter estimates more quickly and using less computer memory than is currently possible with available general-purpose statistical software. Computation of maximum-likelihood estimates for the NDMMF by numerical solution of a system of simultaneous equations through repeated Newton-Raphson iterations is described. This report explains the derivation of the mathematical expressions required for computational parameter estimation in sufficient detail to facilitate future derivations for any revised versions of the NDMMF that may be developed.
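Newton-Raphson parameter estimation of this kind repeatedly solves a linear system built from the likelihood's gradient and Hessian. Since the NDMMF expressions are not reproduced here, the sketch below illustrates the same iteration on a logistic-regression log-likelihood (synthetic data, illustrative names):

```python
import numpy as np

# Newton-Raphson maximum-likelihood iteration, illustrated on logistic
# regression; the NDMMF gradient and Hessian are analogous but
# model-specific and not reproduced here.
rng = np.random.default_rng(3)
X = np.column_stack([np.ones(200), rng.normal(size=200)])  # intercept + feature
true_beta = np.array([-0.5, 1.2])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)

beta = np.zeros(2)
for _ in range(25):                                  # Newton-Raphson loop
    p = 1 / (1 + np.exp(-X @ beta))                  # fitted probabilities
    grad = X.T @ (y - p)                             # score vector
    hess = -(X * (p * (1 - p))[:, None]).T @ X       # Hessian of log-likelihood
    step = np.linalg.solve(hess, grad)
    beta -= step
    if np.max(np.abs(step)) < 1e-10:                 # converged
        break
```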
Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chiou, Jin-Chern
1990-01-01
Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAE's) viewpoint. Constraint violations during the time integration process are minimized, and penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference algorithm to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by using an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, parallel implementation of the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was efficiently carried out. The DAE's and the constraint treatment techniques were transformed into arrowhead matrices, from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
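The Schur-complement system described above is solved with a preconditioned conjugate gradient method; the following minimal Jacobi-preconditioned CG sketch in Python shows the shape of that iteration (illustrative, not the thesis implementation):

```python
import numpy as np

# Preconditioned conjugate gradient with a Jacobi (diagonal) preconditioner,
# for a symmetric positive definite system A x = b.
def pcg(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x                      # initial residual
    M_inv = 1.0 / np.diag(A)           # Jacobi preconditioner
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p      # conjugate search direction
        rz = rz_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # small SPD test matrix
x = pcg(A, np.array([1.0, 2.0]))
```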
NASA Technical Reports Server (NTRS)
Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David
1987-01-01
A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and earth sciences data sets via the implementation of computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS) data-independent environment for computer graphics data display to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.
Research on Synthesis of Concurrent Computing Systems.
1982-09-01
[Contents include an informal description of the techniques and formal definitions of aggregation and virtualisation.] ... sparsely interconnected networks. We have also developed techniques to create Kung's systolic array parallel structure from a specification of matrix ... results of the computation of that element. For example, if A_ij is computed using a single enumeration, then virtualisation would produce a three ...
Techniques for generation of control and guidance signals derived from optical fields, part 2
NASA Technical Reports Server (NTRS)
Hemami, H.; Mcghee, R. B.; Gardner, S. R.
1971-01-01
The development is reported of a high resolution technique for the detection and identification of landmarks from spacecraft optical fields. By making use of nonlinear regression analysis, a method is presented whereby a sequence of synthetic images produced by a digital computer can be automatically adjusted to provide a least squares approximation to a real image. The convergence of the method is demonstrated by means of a computer simulation for both elliptical and rectangular patterns. Statistical simulation studies with elliptical and rectangular patterns show that the computational techniques developed are able to at least match human pattern recognition capabilities, even in the presence of large amounts of noise. Unlike most pattern recognition techniques, this ability is unaffected by arbitrary pattern rotation, translation, and scale change. Further development of the basic approach may eventually allow a spacecraft or robot vehicle to be provided with an ability to very accurately determine its spatial relationship to arbitrary known objects within its optical field of view.
Hybrid soft computing systems for electromyographic signals analysis: a review.
Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates
2014-02-03
Electromyographic (EMG) signals are bio-signals collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.
ERIC Educational Resources Information Center
Daley, Michael; Hillier, Douglas
1981-01-01
Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables…
Advanced intellect-augmentation techniques
NASA Technical Reports Server (NTRS)
Engelbart, D. C.
1972-01-01
User experience in applying our augmentation tools and techniques to various normal working tasks within our center is described so as to convey a subjective impression of what it is like to work in an augmented environment. It is concluded that working-support, computer-aid systems for augmenting individuals and teams, are undoubtedly going to be widely developed and used. A very special role in this development is seen for multi-access computer networks.
Shock and vibration technology with applications to electrical systems
NASA Technical Reports Server (NTRS)
Eshleman, R. L.
1972-01-01
A survey is presented of shock and vibration technology for electrical systems developed by the aerospace programs. The shock environment is surveyed along with new techniques for modeling, computer simulation, damping, and response analysis. Design techniques based on the use of analog computers, shock spectra, optimization, and nonlinear isolation are discussed. Shock mounting of rotors for performance and survival, and vibration isolation techniques are reviewed.
Parallel computing in experimental mechanics and optical measurement: A review (II)
NASA Astrophysics Data System (ADS)
Wang, Tianyi; Kemao, Qian
2018-05-01
With advantages such as non-destructiveness, high sensitivity and high accuracy, optical techniques have been successfully applied to measure various important physical quantities in experimental mechanics (EM) and optical measurement (OM). However, in pursuit of higher image resolutions for higher accuracy, the computation burden of optical techniques has become much heavier. Therefore, in recent years, heterogeneous platforms composed of hardware such as CPUs and GPUs have been widely employed to accelerate these techniques due to their cost-effectiveness, short development cycle, easy portability, and high scalability. In this paper, we analyze various works by first illustrating their different architectures, followed by introducing their various parallel patterns for high speed computation. Next, we review the effects of CPU and GPU parallel computing specifically in EM & OM applications in a broad scope, which includes digital image/volume correlation, fringe pattern analysis, tomography, hyperspectral imaging, computer-generated holograms, and integral imaging. In our survey, we have found that high parallelism can always be exploited in such applications for the development of high-performance systems.
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
ERIC Educational Resources Information Center
Tsai, Chih-Fong; Tsai, Ching-Tzu; Hung, Chia-Sheng; Hwang, Po-Sen
2011-01-01
Enabling undergraduate students to develop basic computing skills is an important issue in higher education. As a result, some universities have developed computer proficiency tests, which aim to assess students' computer literacy. Generally, students are required to pass such tests in order to prove that they have a certain level of computer…
Computer Training for Entrepreneurial Meteorologists.
NASA Astrophysics Data System (ADS)
Koval, Joseph P.; Young, George S.
2001-05-01
Computer applications of increasing diversity form a growing part of the undergraduate education of meteorologists in the early twenty-first century. The advent of the Internet economy, as well as a waning demand for traditional forecasters brought about by better numerical models and statistical forecasting techniques, has greatly increased the need for operational and commercial meteorologists to acquire computer skills beyond the traditional techniques of numerical analysis and applied statistics. Specifically, students with the skills to develop data distribution products are in high demand in the private sector job market. Meeting these demands requires greater breadth, depth, and efficiency in computer instruction. The authors suggest that computer instruction for undergraduate meteorologists should include three key elements: a data distribution focus, emphasis on the techniques required to learn computer programming on an as-needed basis, and a project orientation to promote management skills and support student morale. In an exploration of this approach, the authors have reinvented the Applications of Computers to Meteorology course in the Department of Meteorology at The Pennsylvania State University to teach computer programming within the framework of an Internet product development cycle. Because the computer skills required for data distribution programming change rapidly, specific languages are valuable for only a limited time. A key goal of this course was therefore to help students learn how to retrain efficiently as technologies evolve. The crux of the course was a semester-long project during which students developed an Internet data distribution product. As project management skills are also important in the job market, the course teamed students in groups of four for this product development project. The successes, failures, and lessons learned from this experiment are discussed and conclusions drawn concerning undergraduate instructional methods for computer applications in meteorology.
NASA Technical Reports Server (NTRS)
Chan, William M.
1995-01-01
Algorithms and computer code developments were performed for the overset grid approach to solving computational fluid dynamics problems. The techniques developed are applicable to compressible Navier-Stokes flow for any general complex configurations. The computer codes developed were tested on different complex configurations with the Space Shuttle launch vehicle configuration as the primary test bed. General, efficient and user-friendly codes were produced for grid generation, flow solution and force and moment computation.
Construction of dynamic stochastic simulation models using knowledge-based techniques
NASA Technical Reports Server (NTRS)
Williams, M. Douglas; Shiva, Sajjan G.
1990-01-01
Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).
Techniques for Enhancing Web-Based Education.
ERIC Educational Resources Information Center
Barbieri, Kathy; Mehringer, Susan
The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…
Procedure for Adapting Direct Simulation Monte Carlo Meshes
NASA Technical Reports Server (NTRS)
Woronowicz, Michael S.; Wilmoth, Richard G.; Carlson, Ann B.; Rault, Didier F. G.
1992-01-01
A technique is presented for adapting computational meshes used in the G2 version of the direct simulation Monte Carlo method. The physical ideas underlying the technique are discussed, and adaptation formulas are developed for use on solutions generated from an initial mesh. The effect of statistical scatter on adaptation is addressed, and results demonstrate the ability of this technique to achieve more accurate results without increasing necessary computational resources.
Suggested Approaches to the Measurement of Computer Anxiety.
ERIC Educational Resources Information Center
Toris, Carol
Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…
Computer Programming Languages and Expertise Needed by Practicing Engineers.
ERIC Educational Resources Information Center
Doelling, Irvin
1980-01-01
Discussed is the present engineering computer environment of a large aerospace company recognized as a leader in the application and development of computer-aided design and computer-aided manufacturing techniques. A review is given of the exposure spectrum of engineers to the world of computing, the computer languages used, and the career impacts…
High speed civil transport: Sonic boom softening and aerodynamic optimization
NASA Technical Reports Server (NTRS)
Cheung, Samson
1994-01-01
An improvement in sonic boom extrapolation techniques has been the desire of aerospace designers for years. This is because the linear acoustic theory developed in the 60's is incapable of predicting the nonlinear phenomenon of shock wave propagation. On the other hand, CFD techniques are too computationally expensive to employ on sonic boom problems. Therefore, this research focused on the development of a fast and accurate sonic boom extrapolation method that solves the Euler equations for axisymmetric flow. This new technique has brought the sonic boom extrapolation techniques up to the standards of the 90's. Parallel computing is a fast growing subject in the field of computer science because of its promising speed. A new optimizer (IIOWA) for the parallel computing environment has been developed and tested for aerodynamic drag minimization. This is a promising method for CFD optimization making use of the computational resources of workstations, which unlike supercomputers can spend most of their time idle. Finally, the OAW concept is attractive because of its overall theoretical performance. In order to fully understand the concept, a wind-tunnel model was built and is currently being tested at NASA Ames Research Center. The CFD calculations performed under this cooperative agreement helped to identify the problem of the flow separation, and also aided the design by optimizing the wing deflection for roll trim.
Finite Dimensional Approximations for Continuum Multiscale Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berlyand, Leonid
2017-01-24
The completed research project concerns the development of novel computational techniques for modeling nonlinear multiscale physical and biological phenomena. Specifically, it addresses the theoretical development and applications of the homogenization theory (coarse graining) approach to calculation of the effective properties of highly heterogeneous biological and bio-inspired materials with many spatial scales and nonlinear behavior. This theory studies properties of strongly heterogeneous media in problems arising in materials science, geoscience, biology, etc. Modeling of such media raises fundamental mathematical questions, primarily in partial differential equations (PDEs) and calculus of variations, the subject of the PI's research. The focus of completed research was on mathematical models of biological and bio-inspired materials with the common theme of multiscale analysis and coarse grain computational techniques. Biological and bio-inspired materials offer the unique ability to create environmentally clean functional materials used for energy conversion and storage. These materials are intrinsically complex, with hierarchical organization occurring on many nested length and time scales. The potential to rationally design and tailor the properties of these materials for broad energy applications has been hampered by the lack of computational techniques able to bridge from the molecular to the macroscopic scale. The project addressed the challenge of computational treatments of such complex materials by the development of a synergistic approach that combines innovative multiscale modeling/analysis techniques with high performance computing.
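As a pointer to what "effective properties" means here, the classical periodic homogenization formulas (standard textbook material, not specific to this project) read: the multiscale problem

    \[
      -\nabla\cdot\bigl(A(x/\varepsilon)\,\nabla u_\varepsilon\bigr) = f
    \]

    is replaced, as \(\varepsilon \to 0\), by a homogenized problem with a constant effective tensor
    \[
      A^{*}_{ij} = \int_Y e_i \cdot A(y)\,\bigl(e_j + \nabla_y \chi_j(y)\bigr)\,dy,
    \]
    where each corrector \(\chi_j\) is \(Y\)-periodic and solves the cell problem
    \[
      -\nabla_y\cdot\Bigl(A(y)\,\bigl(e_j + \nabla_y \chi_j(y)\bigr)\Bigr) = 0 \quad \text{in the unit cell } Y.
    \]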
A comparative analysis of soft computing techniques for gene prediction.
Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand
2013-07-01
The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important; this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
STONE, PHILIP J.
Automated language processing (content analysis) is engaged in new ventures in computer dialog as a result of new techniques in categorizing responses. A computer "need-achievement" scoring system has been developed. A set of computer programs, labeled "The General Inquirer," will score computer inputs with responses fed from…
NASA Technical Reports Server (NTRS)
Srivastava, Deepak; Menon, Madhu; Cho, Kyeongjae; Biegel, Bryan (Technical Monitor)
2001-01-01
The role of computational nanotechnology in developing the next generation of multifunctional materials, molecular-scale electronic and computing devices, sensors, actuators, and machines is described through a brief review of enabling computational techniques and a few recent examples derived from computer simulations of carbon nanotube based molecular nanotechnology.
Integer Linear Programming in Computational Biology
NASA Astrophysics Data System (ADS)
Althaus, Ernst; Klau, Gunnar W.; Kohlbacher, Oliver; Lenhof, Hans-Peter; Reinert, Knut
Computational molecular biology (bioinformatics) is a young research field that is rich in NP-hard optimization problems. The problem instances encountered are often huge and comprise thousands of variables. Since their introduction into the field of bioinformatics in 1997, integer linear programming (ILP) techniques have been successfully applied to many optimization problems. These approaches have added much momentum to development and progress in related areas. In particular, ILP-based approaches have become a standard optimization technique in bioinformatics. In this review, we present applications of ILP-based techniques developed by members and former members of Kurt Mehlhorn’s group. These techniques were introduced to bioinformatics in a series of papers and popularized by demonstration of their effectiveness and potential.
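To make the approach concrete, the following hypothetical toy instance (invented for illustration, not drawn from the cited papers) formulates a maximum-weight selection of non-overlapping candidate gene annotations as an ILP in Python using the PuLP package:

    # Toy ILP of the kind used in bioinformatics: pick a maximum-weight
    # subset of candidate gene annotations so that no two selected
    # candidates overlap on the genome. Requires the PuLP package.
    from pulp import LpProblem, LpMaximize, LpVariable, lpSum

    # (start, end, weight) for hypothetical candidate annotations.
    candidates = [(0, 120, 3.0), (100, 250, 2.0), (240, 400, 4.0), (90, 260, 5.5)]

    prob = LpProblem("gene_selection", LpMaximize)
    x = [LpVariable(f"x{i}", cat="Binary") for i in range(len(candidates))]

    # Objective: total weight of the selected candidates.
    prob += lpSum(w * x[i] for i, (_, _, w) in enumerate(candidates))

    # Conflict constraints: overlapping candidates cannot both be chosen.
    for i, (s1, e1, _) in enumerate(candidates):
        for j, (s2, e2, _) in enumerate(candidates):
            if i < j and s1 < e2 and s2 < e1:   # intervals overlap
                prob += x[i] + x[j] <= 1

    prob.solve()
    print([i for i in range(len(candidates)) if x[i].value() == 1])  # -> [0, 2]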
Oberg, Kevin A.; Mades, Dean M.
1987-01-01
Four techniques for estimating generalized skew in Illinois were evaluated: (1) a generalized skew map of the US; (2) an isoline map; (3) a prediction equation; and (4) a regional-mean skew. Peak-flow records at 730 gaging stations having 10 or more annual peaks were selected for computing station skews. Station skew values ranged from -3.55 to 2.95, with a mean of -0.11. Frequency curves computed for 30 gaging stations in Illinois using the variations of the regional-mean skew technique are similar to frequency curves computed using a skew map developed by the US Water Resources Council (WRC). Estimates of the 50-, 100-, and 500-yr floods computed for 29 of these gaging stations using the regional-mean skew techniques are within the 50% confidence limits of frequency curves computed using the WRC skew map. Although the three variations of the regional-mean skew technique were slightly more accurate than the WRC map, there is no appreciable difference between flood estimates computed using the variations of the regional-mean technique and flood estimates computed using the WRC skew map. (Peters-PTT)
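For readers unfamiliar with station skew, the sample skew coefficient of the log-transformed annual peaks can be computed as in this short Python sketch; the peak-flow record below is hypothetical, not Illinois data:

    import numpy as np

    def station_skew(peaks):
        # Sample skew of log10 annual peaks: n*sum((x-xbar)^3) / ((n-1)(n-2)s^3)
        x = np.log10(np.asarray(peaks, dtype=float))
        n = len(x)
        s = x.std(ddof=1)
        return n * np.sum((x - x.mean()) ** 3) / ((n - 1) * (n - 2) * s**3)

    # Hypothetical record of annual peak flows at one gaging station.
    print(round(station_skew([4200, 3100, 9800, 1500, 7600, 2900, 5400,
                              8800, 2300, 6100, 3900, 12500]), 3))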
Cheng, Gui-Juan; Zhang, Xinhao; Chung, Lung Wa; Xu, Liping; Wu, Yun-Dong
2015-02-11
Understanding the mechanisms of chemical reactions, especially catalysis, has been an important and active area of computational organic chemistry, and close collaborations between experimentalists and theorists represent a growing trend. This Perspective provides examples of such productive collaborations. The understanding of various reaction mechanisms and the insight gained from these studies are emphasized. The applications of various experimental techniques in elucidation of reaction details as well as the development of various computational techniques to meet the demand of emerging synthetic methods, e.g., C-H activation, organocatalysis, and single electron transfer, are presented along with some conventional developments of mechanistic aspects. Examples of applications are selected to demonstrate the advantages and limitations of these techniques. Some challenges in the mechanistic studies and predictions of reactions are also analyzed.
Prediction of physical protein-protein interactions
NASA Astrophysics Data System (ADS)
Szilágyi, András; Grimm, Vera; Arakaki, Adrián K.; Skolnick, Jeffrey
2005-06-01
Many essential cellular processes such as signal transduction, transport, cellular motion and most regulatory mechanisms are mediated by protein-protein interactions. In recent years, new experimental techniques have been developed to discover the protein-protein interaction networks of several organisms. However, the accuracy and coverage of these techniques have proven to be limited, and computational approaches remain essential both to assist in the design and validation of experimental studies and for the prediction of interaction partners and detailed structures of protein complexes. Here, we provide a critical overview of existing structure-independent and structure-based computational methods. Although these techniques have significantly advanced in the past few years, we find that most of them are still in their infancy. We also provide an overview of experimental techniques for the detection of protein-protein interactions. Although the developments are promising, false positive and false negative results are common, and reliable detection is possible only by taking a consensus of different experimental approaches. The shortcomings of experimental techniques affect both the further development and the fair evaluation of computational prediction methods. For an adequate comparative evaluation of prediction and high-throughput experimental methods, an appropriately large benchmark set of biophysically characterized protein complexes would be needed, but is sorely lacking.
Lauerman, Lloyd H
2004-12-01
Since the discovery of the polymerase chain reaction (PCR) 20 years ago, an avalanche of scientific publications has reported major developments and changes in specialized equipment, reagents, sample preparation, computer programs and techniques, generated through business, government and university research. The requirement for genetic sequences for primer selection and validation has been greatly facilitated by the development of new sequencing techniques, machines and computer programs. Genetic libraries, such as GenBank, EMBL and DDBJ continue to accumulate a wealth of genetic sequence information for the development and validation of molecular-based diagnostic procedures concerning human and veterinary disease agents. The mechanization of various aspects of the PCR assay, such as robotics, microfluidics and nanotechnology, has made it possible for the rapid advancement of new procedures. Real-time PCR, DNA microarray and DNA chips utilize these newer techniques in conjunction with computers and computer programs. Instruments for hand-held PCR assays are being developed. The PCR and reverse transcription-PCR (RT-PCR) assays have greatly accelerated the speed and accuracy of diagnoses of human and animal disease, especially of the infectious agents that are difficult to isolate or demonstrate. The PCR has made it possible to genetically characterize a microbial isolate inexpensively and rapidly for identification, typing and epidemiological comparison.
Algorithms and software for solving finite element equations on serial and parallel architectures
NASA Technical Reports Server (NTRS)
George, Alan
1989-01-01
Over the past 15 years numerous new techniques have been developed for solving systems of equations and eigenvalue problems arising in finite element computations. A package called SPARSPAK has been developed by the author and his co-workers which exploits these new methods. The broad objective of this research project is to incorporate some of this software in the Computational Structural Mechanics (CSM) testbed, and to extend the techniques for use on multiprocessor architectures.
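SPARSPAK itself is not shown here, but the core idea behind such packages, sparse factorization with a fill-reducing ordering, can be sketched with SciPy's SuperLU interface on a one-dimensional Poisson model problem (illustrative substitute, not the package described above):

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 100                       # 1-D Poisson model problem (tridiagonal)
    A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
    b = np.ones(n)

    lu = spla.splu(A, permc_spec="COLAMD")   # fill-reducing column ordering
    x = lu.solve(b)
    print(np.allclose(A @ x, b))             # True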
Simulating Drosophila Genetics with the Computer.
ERIC Educational Resources Information Center
Small, James W., Jr.; Edwards, Kathryn L.
1979-01-01
Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model of the genetic system of the fruit fly. Includes discussion and evaluation of this computer assisted program. (MA)
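A minimal Python sketch of the kind of Mendelian simulation such programs perform (a hypothetical monohybrid cross, not the program described in the record):

    import random

    def cross(parent1, parent2, n_offspring=100):
        # Simulate a monohybrid cross; genotypes are 2-character strings.
        counts = {}
        for _ in range(n_offspring):
            child = "".join(sorted(random.choice(p) for p in (parent1, parent2)))
            counts[child] = counts.get(child, 0) + 1
        return counts

    # Heterozygous cross at a hypothetical locus with alleles 'A' and 'a'.
    # Expect roughly a 1:2:1 ratio of AA : Aa : aa.
    random.seed(1)
    print(cross("Aa", "Aa", 1000))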
An Algebra-Based Introductory Computational Neuroscience Course with Lab.
Fink, Christian G
2017-01-01
A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
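The course itself uses MATLAB; purely for illustration, a leaky integrate-and-fire neuron, a staple model in such courses, can be sketched in a few lines of Python with assumed parameter values:

    import numpy as np

    dt, T = 1e-4, 0.5                 # time step and duration (s)
    tau, v_rest, v_thresh, v_reset = 0.02, -0.070, -0.054, -0.070
    R, I = 1e7, 2.0e-9                # membrane resistance (ohm), input current (A)

    v, spikes = v_rest, []
    for step in range(int(T / dt)):
        v += dt / tau * (v_rest - v + R * I)   # Euler step of dv/dt
        if v >= v_thresh:                      # threshold crossing: spike and reset
            spikes.append(step * dt)
            v = v_reset
    print(f"{len(spikes)} spikes in {T} s")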
Ultra high speed image processing techniques. [electronic packaging techniques
NASA Technical Reports Server (NTRS)
Anthony, T.; Hoeschele, D. F.; Connery, R.; Ehland, J.; Billings, J.
1981-01-01
Packaging techniques for ultra high speed image processing were developed. These techniques involve the development of a signal feedthrough technique through LSI/VLSI sapphire substrates. This allows the stacking of LSI/VLSI circuit substrates in a 3-dimensional package with greatly reduced length of interconnecting lines between the LSI/VLSI circuits. The reduced parasitic capacitances result in higher LSI/VLSI computational speeds at significantly reduced power consumption levels.
Resolution Study of a Hyperspectral Sensor using Computed Tomography in the Presence of Noise
2012-06-14
diffraction efficiency is dependent on wavelength. Compared to techniques developed by later work, simple algebraic reconstruction techniques were used... spectral dimension, using computed tomography (CT) techniques with only a finite number of diverse images. CTHIS require a reconstruction algorithm in... many frames are needed to reconstruct the spectral cube of a simple object using a theoretical lower bound. In this research a new algorithm is derived
Redundancy management for efficient fault recovery in NASA's distributed computing system
NASA Technical Reports Server (NTRS)
Malek, Miroslaw; Pandya, Mihir; Yau, Kitty
1991-01-01
The management of redundancy in computer systems was studied and guidelines were provided for the development of NASA's fault-tolerant distributed systems. Fault recovery and reconfiguration mechanisms were examined. A theoretical foundation was laid for redundancy management by efficient reconfiguration methods and algorithmic diversity. Algorithms were developed to optimize resources for embedding computational graphs of tasks in the system architecture and for reconfiguring these tasks after a failure has occurred. Computational structures represented by a path and a complete binary tree were considered, and the mesh and hypercube architectures were targeted for their embeddings. The innovative concept of the Hybrid Algorithm Technique was introduced; this new technique provides a mechanism for obtaining fault tolerance while exhibiting improved performance.
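One classical embedding of this kind (standard background, not necessarily the authors' algorithm) maps a path of 2^d tasks onto a d-dimensional hypercube via the reflected Gray code, so that communicating neighbors occupy adjacent nodes:

    def gray(i):
        # i-th reflected Gray code word; consecutive words differ in one bit.
        return i ^ (i >> 1)

    # Embed a path of 2^d tasks into a d-dimensional hypercube: task k runs on
    # node gray(k), so neighboring tasks sit on adjacent hypercube nodes.
    d = 4
    mapping = [gray(k) for k in range(2 ** d)]
    assert all(bin(a ^ b).count("1") == 1 for a, b in zip(mapping, mapping[1:]))
    print(mapping)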
Using Animation to Support the Teaching of Computer Game Development Techniques
ERIC Educational Resources Information Center
Taylor, Mark John; Pountney, David C.; Baskett, M.
2008-01-01
In this paper, we examine the potential use of animation for supporting the teaching of some of the mathematical concepts that underlie computer games development activities, such as vector and matrix algebra. An experiment was conducted with a group of UK undergraduate computing students to compare the perceived usefulness of animated and static…
A Study of Computer Techniques for Music Research. Final Report.
ERIC Educational Resources Information Center
Lincoln, Harry B.
Work in three areas comprised this study of computer use in thematic indexing for music research: (1) acquisition, encoding, and keypunching of data--themes of which now number about 50,000 (primarily 16th Century Italian vocal music) and serve as a test base for program development; (2) development of computer programs to process this data; and…
NASA Astrophysics Data System (ADS)
Cerwin, Steve; Barnes, Julie; Kell, Scott; Walters, Mark
2003-09-01
This paper describes development and application of a novel method to accomplish real-time solid angle acoustic direction finding using two 8-element orthogonal microphone arrays. The developed prototype system was intended for localization and signature recognition of ground-based sounds from a small UAV. Recent advances in computer speeds have enabled the implementation of microphone arrays in many audio applications. Still, the real-time presentation of a two-dimensional sound field for the purpose of audio target localization is computationally challenging. In order to overcome this challenge, a cross-power spectrum phase (CSP) technique was applied to each 8-element arm of a 16-element cross array to provide audio target localization. In this paper, we describe the technique and compare it with two other commonly used techniques: Cross-Spectral Matrix and MUSIC. The results show that the CSP technique applied to two 8-element orthogonal arrays provides a computationally efficient solution with reasonable accuracy and tolerable artifacts, sufficient for real-time applications. Additional topics include development of a synchronized 16-channel transmitter and receiver to relay the airborne data to the ground-based processor and presentation of test data demonstrating both ground-mounted operation and airborne localization of ground-based gunshots and loud engine sounds.
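A minimal sketch of the CSP (also known as GCC-PHAT) delay estimate between two channels, with invented signal parameters; the time difference of arrival it returns is what such arrays convert into direction:

    import numpy as np

    def csp_delay(x, y, fs):
        # Delay of y relative to x, in seconds (positive means y lags x).
        n = len(x) + len(y)
        G = np.conj(np.fft.rfft(x, n)) * np.fft.rfft(y, n)
        G /= np.abs(G) + 1e-12                    # phase transform: keep phase only
        cc = np.roll(np.fft.irfft(G, n), n // 2)  # move zero lag to the center
        return (np.argmax(cc) - n // 2) / fs

    fs = 16000
    rng = np.random.default_rng(0)
    x = rng.standard_normal(1024)                 # broadband source (white noise)
    y = np.concatenate((np.zeros(8), x))[:1024]   # same signal, 8 samples late
    print(round(csp_delay(x, y, fs) * fs))        # prints 8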
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required for Web Processing Service (WPS) standards, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
Approaches to Classroom-Based Computational Science.
ERIC Educational Resources Information Center
Guzdial, Mark
Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…
Designing a hands-on brain computer interface laboratory course.
Khalighinejad, Bahar; Long, Laura Kathleen; Mesgarani, Nima
2016-08-01
Devices and systems that interact with the brain have become a growing field of research and development in recent years. Engineering students are well positioned to contribute to both hardware development and signal analysis techniques in this field. However, this area has been left out of most engineering curricula. We developed an electroencephalography (EEG) based brain computer interface (BCI) laboratory course to educate students through hands-on experiments. The course is offered jointly by the Biomedical Engineering, Electrical Engineering, and Computer Science Departments of Columbia University in the City of New York and is open to senior undergraduate and graduate students. The course provides an effective introduction to the experimental design, neuroscience concepts, data analysis techniques, and technical skills required in the field of BCI.
Computer system performance measurement techniques for ARTS III computer systems.
DOT National Transportation Integrated Search
1973-12-01
Direct measurement of computer systems is of vital importance in: a) developing an intelligent grasp of the variables which affect overall performance; b)tuning the system for optimum benefit; c)determining under what conditions saturation thresholds...
General review of the MOSTAS computer code for wind turbines
NASA Technical Reports Server (NTRS)
Dungundji, J.; Wendell, J. H.
1981-01-01
The MOSTAS computer code for wind turbine analysis is reviewed, and techniques and methods used in its analyses are described. Impressions of its strengths and weaknesses, and recommendations for its application, modification, and further development are made. Basic techniques used in wind turbine stability and response analyses for systems with constant and periodic coefficients are reviewed.
A Simulation Algorithm to Approximate the Area of Mapped Forest Inventory Plots
William A. Bechtold; Naser E. Heravi; Matthew E. Kinkenon
2003-01-01
Calculating the area of polygons associated with mapped forest inventory plots can be mathematically cumbersome, especially when computing change between inventories. We developed a simulation technique that utilizes a computer-generated dot grid and geometry to estimate the area of mapped polygons within any size circle. The technique also yields a matrix of change in...
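The dot-grid idea can be sketched as follows (illustrative Python with an invented plot radius and condition boundary, not the authors' program):

    import numpy as np

    R = 7.32                          # plot-circle radius (illustrative value)
    xs = np.linspace(-R, R, 200)      # regular dot grid over the bounding square
    gx, gy = np.meshgrid(xs, xs)
    in_circle = gx**2 + gy**2 <= R**2
    in_polygon = gx >= 1.5            # hypothetical mapped-condition boundary

    cell = (xs[1] - xs[0]) ** 2       # area each dot represents
    print(in_circle.sum() * cell)                 # ~ pi * R^2
    print((in_circle & in_polygon).sum() * cell)  # condition area inside the plot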
New coding technique for computer generated holograms.
NASA Technical Reports Server (NTRS)
Haskell, R. E.; Culver, B. C.
1972-01-01
A coding technique is developed for recording computer generated holograms on a computer controlled CRT in which each resolution cell contains two beam spots of equal size and equal intensity. This provides a binary hologram in which only the position of the two dots is varied from cell to cell. The amplitude associated with each resolution cell is controlled by selectively diffracting unwanted light into a higher diffraction order. The recording of the holograms is fast and simple.
Control Law Design in a Computational Aeroelasticity Environment
NASA Technical Reports Server (NTRS)
Newsom, Jerry R.; Robertshaw, Harry H.; Kapania, Rakesh K.
2003-01-01
A methodology for designing active control laws in a computational aeroelasticity environment is given. The methodology involves employing a systems identification technique to develop an explicit state-space model for control law design from the output of a computational aeroelasticity code. The particular computational aeroelasticity code employed in this paper solves the transonic small disturbance aerodynamic equation using a time-accurate, finite-difference scheme. Linear structural dynamics equations are integrated simultaneously with the computational fluid dynamics equations to determine the time responses of the structure. These structural responses are employed as the input to a modern systems identification technique that determines the Markov parameters of an "equivalent linear system". The Eigensystem Realization Algorithm is then employed to develop an explicit state-space model of the equivalent linear system. The Linear Quadratic Gaussian control law design technique is employed to design a control law. The computational aeroelasticity code is modified to accept control laws and perform closed-loop simulations. Flutter control of a rectangular wing model is chosen to demonstrate the methodology. Various cases are used to illustrate the usefulness of the methodology as the nonlinearity of the aeroelastic system is increased through increased angle-of-attack changes.
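A bare-bones sketch of the Eigensystem Realization Algorithm for a single-input, single-output system, assuming noise-free Markov parameters; this is illustrative only, not the paper's implementation:

    import numpy as np

    def era(h, order, rows=20, cols=20):
        # Hankel matrices built from the Markov parameter sequence h[0], h[1], ...
        H0 = np.array([[h[i + j] for j in range(cols)] for i in range(rows)])
        H1 = np.array([[h[i + j + 1] for j in range(cols)] for i in range(rows)])
        U, s, Vt = np.linalg.svd(H0)
        U, s, Vt = U[:, :order], s[:order], Vt[:order, :]
        Sih = np.diag(1.0 / np.sqrt(s))          # Sigma^(-1/2)
        Sh = np.diag(np.sqrt(s))                 # Sigma^(1/2)
        A = Sih @ U.T @ H1 @ Vt.T @ Sih          # realized state matrix
        B = Sh @ Vt[:, :1]                       # realized input matrix
        C = U[:1, :] @ Sh                        # realized output matrix
        return A, B, C

    # Markov parameters h_k = C0 A0^k B0 of a known 2-state system, then
    # identify an equivalent realization and compare eigenvalues.
    A0 = np.array([[0.9, 0.2], [-0.2, 0.8]])
    B0 = np.array([[1.0], [0.0]])
    C0 = np.array([[1.0, 1.0]])
    h = [(C0 @ np.linalg.matrix_power(A0, k) @ B0).item() for k in range(41)]
    A, B, C = era(h, order=2)
    print(np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                      np.sort_complex(np.linalg.eigvals(A0))))   # True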
Sainath, Kamalesh; Teixeira, Fernando L; Donderici, Burkay
2014-01-01
We develop a general-purpose formulation, based on two-dimensional spectral integrals, for computing electromagnetic fields produced by arbitrarily oriented dipoles in planar-stratified environments, where each layer may exhibit arbitrary and independent anisotropy in both its (complex) permittivity and permeability tensors. Among the salient features of our formulation are (i) computation of eigenmodes (characteristic plane waves) supported in arbitrarily anisotropic media in a numerically robust fashion, (ii) implementation of an hp-adaptive refinement for the numerical integration to evaluate the radiation and weakly evanescent spectra contributions, and (iii) development of an adaptive extension of an integral convergence acceleration technique to compute the strongly evanescent spectrum contribution. While other semianalytic techniques exist to solve this problem, none have full applicability to media exhibiting arbitrary double anisotropies in each layer, where one must account for the whole range of possible phenomena (e.g., mode coupling at interfaces and nonreciprocal mode propagation). Brute-force numerical methods can tackle this problem but only at a much higher computational cost. The present formulation provides an efficient and robust technique for field computation in arbitrary planar-stratified environments. We demonstrate the formulation for a number of problems related to geophysical exploration.
Computerized technique for recording board defect data
R. Bruce Anderson; R. Edward Thomas; Charles J. Gatchell; Neal D. Bennett
1993-01-01
A computerized technique for recording board defect data has been developed that is faster and more accurate than manual techniques. The lumber database generated by this technique is a necessary input to computer simulation models that estimate potential cutting yields from various lumber breakdown sequences. The technique allows collection of detailed information...
Fan noise prediction assessment
NASA Technical Reports Server (NTRS)
Bent, Paul H.
1995-01-01
This report is an evaluation of two techniques for predicting the fan noise radiation from engine nacelles. The first is a relatively computationally intensive finite element technique. The code is named ARC, an abbreviation of Acoustic Radiation Code, and was developed by Eversman. This is actually a suite of software that first generates a grid around the nacelle, then solves for the potential flowfield, and finally solves the acoustic radiation problem. The second approach is an analytical technique requiring minimal computational effort. This is termed the cutoff ratio technique and was developed by Rice. Details of the duct geometry, such as the hub-to-tip ratio and Mach number of the flow in the duct, and modal content of the duct noise are required for proper prediction.
High-freezing-point fuel studies
NASA Technical Reports Server (NTRS)
Tolle, F. F.
1980-01-01
Considerable progress in developing the experimental and analytical techniques needed to design airplanes to accommodate fuels with less stringent low temperature specifications is reported. A computer technique for calculating fuel temperature profiles in full tanks was developed. The computer program is being extended to include the case of partially empty tanks. Ultimately, the completed package is to be incorporated into an aircraft fuel tank thermal analyser code to permit the designer to fly various thermal exposure patterns, study fuel temperatures versus time, and determine holdup.
NASA Technical Reports Server (NTRS)
Faller, K. H.
1976-01-01
A technique for the detection and measurement of surface feature interfaces in remotely acquired data was developed and evaluated. A computer implementation of this technique was effected to automatically process classified data derived from various sources such as the LANDSAT multispectral scanner and other scanning sensors. The basic elements of the operational theory of the technique are described, followed by the details of the procedure. An example of an application of the technique to the analysis of tidal shoreline length is given with a breakdown of manpower requirements.
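The core measurement can be sketched as edge counting on a classified raster; the classes and cell size below are hypothetical:

    import numpy as np

    classes = np.array([[0, 0, 1, 1],
                        [0, 0, 1, 1],
                        [0, 1, 1, 1],
                        [1, 1, 1, 1]])       # 0 = water, 1 = land (hypothetical)
    cell_size = 30.0                         # meters per pixel

    # Count cell faces where horizontally or vertically adjacent cells differ,
    # then convert the count to an interface length.
    edges = (np.sum(classes[:, 1:] != classes[:, :-1]) +
             np.sum(classes[1:, :] != classes[:-1, :]))
    print(edges * cell_size, "m of water-land interface")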
NASA Technical Reports Server (NTRS)
Anuta, P. E.
1975-01-01
Least squares approximation techniques were developed for use in computer aided correction of spatial image distortions for registration of multitemporal remote sensor imagery. Polynomials were first used to define image distortion over the entire two dimensional image space. Spline functions were then investigated to determine if the combination of lower order polynomials could approximate a higher order distortion with less computational difficulty. Algorithms for generating approximating functions were developed and applied to the description of image distortion in aircraft multispectral scanner imagery. Other applications of the techniques were suggested for earth resources data processing areas other than geometric distortion representation.
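A first-order (affine) version of such a least squares fit can be sketched in Python; the control-point coordinates below are invented:

    import numpy as np

    # Hypothetical control points: (x, y) in the distorted image and the
    # corresponding (u, v) in the reference image.
    xy = np.array([[10, 12], [200, 15], [190, 220], [15, 210], [100, 110]], float)
    uv = np.array([[13, 9], [205, 20], [188, 225], [11, 207], [102, 112]], float)

    # Design matrix for u = a0 + a1*x + a2*y (and likewise for v).
    A = np.column_stack([np.ones(len(xy)), xy[:, 0], xy[:, 1]])
    coef_u, *_ = np.linalg.lstsq(A, uv[:, 0], rcond=None)
    coef_v, *_ = np.linalg.lstsq(A, uv[:, 1], rcond=None)

    def warp(x, y):
        # Map a distorted-image location into reference coordinates.
        return (coef_u @ [1, x, y], coef_v @ [1, x, y])

    print(warp(100, 110))   # should land near (102, 112)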
RNA secondary structure prediction using soft computing.
Ray, Shubhra Sankar; Pal, Sankar K
2013-01-01
Prediction of RNA structure is invaluable in creating new drugs and understanding genetic diseases. Several deterministic algorithms and soft computing-based techniques have been developed for more than a decade to determine the structure from a known RNA sequence. Soft computing gained importance with the need to get approximate solutions for RNA sequences by considering the issues related with kinetic effects, cotranscriptional folding, and estimation of certain energy parameters. A brief description of some of the soft computing-based techniques, developed for RNA secondary structure prediction, is presented along with their relevance. The basic concepts of RNA and its different structural elements like helix, bulge, hairpin loop, internal loop, and multiloop are described. These are followed by different methodologies, employing genetic algorithms, artificial neural networks, and fuzzy logic. The role of various metaheuristics, like simulated annealing, particle swarm optimization, ant colony optimization, and tabu search is also discussed. A relative comparison among different techniques, in predicting 12 known RNA secondary structures, is presented, as an example. Future challenging issues are then mentioned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Song
CFD (Computational Fluid Dynamics) is a widely used technique in the engineering design field. It uses mathematical methods to simulate and predict flow characteristics in a certain physical space. Since the numerical results of CFD computation are very hard to understand, VR (virtual reality) and data visualization techniques are introduced into CFD post-processing to improve the understandability and functionality of CFD computation. In many cases CFD datasets are very large (multi-gigabyte), and more and more interaction between the user and the datasets is required. For the traditional VR application, the limitation of computing power is a major factor preventing effective visualization of large datasets. This thesis presents a new system design that speeds up the traditional VR application by using parallel and distributed computing, along with the idea of using hand-held devices to enhance the interaction between a user and a VR CFD application. Techniques from different research areas, including scientific visualization, parallel computing, distributed computing, and graphical user interface design, are used in the development of the final system. As a result, the new system can be built flexibly on a heterogeneous computing environment and dramatically shortens the computation time.
Technical Development and Application of Soft Computing in Agricultural and Biological Engineering
USDA-ARS?s Scientific Manuscript database
Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...
Development of Soft Computing and Applications in Agricultural and Biological Engineering
USDA-ARS?s Scientific Manuscript database
Soft computing is a set of “inexact” computing techniques, which are able to model and analyze very complex problems. For these complex problems, more conventional methods have not been able to produce cost-effective, analytical, or complete solutions. Soft computing has been extensively studied and...
NASA Technical Reports Server (NTRS)
Layton, G. P.
1984-01-01
New flight test techniques in use at Ames Dryden are reviewed. The use of the pilot in combination with ground and airborne computational capabilities to maximize data return is discussed, including the remotely piloted research vehicle technique for high-risk testing, the remotely augmented vehicle technique for handling qualities research, and use of ground computed flight director information to fly unique profiles such as constant Reynolds number profiles through the transonic flight regime. Techniques used for checkout and design verification of systems-oriented aircraft are discussed, including descriptions of the various simulations, iron bird setups, and vehicle tests. Some newly developed techniques to support the aeronautical research disciplines are discussed, including a new approach to position-error determination, and the use of a large skin friction balance for the measurement of drag caused by various excrescences.
Computer assessment of atherosclerosis from angiographic images
NASA Technical Reports Server (NTRS)
Selzer, R. H.; Blankenhorn, D. H.; Brooks, S. H.; Crawford, D. W.; Cashin, W. L.
1982-01-01
A computer method for detection and quantification of atherosclerosis from angiograms has been developed and used to measure lesion change in human clinical trials. The technique involves tracking the vessel edges and measuring individual lesions as well as the overall irregularity of the arterial image. Application of the technique to conventional arterial-injection femoral and coronary angiograms is outlined and an experimental study to extend the technique to analysis of intravenous angiograms of the carotid and coronary arteries is described.
Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.
Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E
2016-01-01
Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.
NASA Astrophysics Data System (ADS)
Karagiannis, Georgios Th.
2016-04-01
The development of non-destructive techniques is a reality in the field of conservation science. These techniques are usually less accurate than analytical micro-sampling techniques; however, properly developed soft-computing techniques can improve their accuracy. In this work, we propose a real-time, fast-acquisition spectroscopic mapping imaging system that operates from the ultraviolet to the mid-infrared (UV/Vis/nIR/mIR) region of the electromagnetic spectrum and is supported by a set of soft-computing methods to identify the materials that exist in a stratigraphic structure of paint layers. Specifically, the system acquires spectra in diffuse-reflectance mode, scanning a Region-Of-Interest (ROI) over a wavelength range from 200 up to 5000 nm. A fuzzy c-means clustering algorithm, the soft-computing method used here, produces the mapping images. The method was evaluated on a Byzantine painted icon.
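A minimal fuzzy c-means implementation of the standard update equations, applied to invented three-band "spectra" (illustrative only, not the authors' code):

    import numpy as np

    def fcm(X, c, m=2.0, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        U = rng.random((c, len(X)))
        U /= U.sum(axis=0)                       # random fuzzy memberships
        for _ in range(iters):
            W = U ** m
            centers = (W @ X) / W.sum(axis=1, keepdims=True)
            d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-9
            # u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
            U = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)), axis=1)
        return centers, U

    # Two hypothetical families of reflectance "spectra" in 3 bands.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.2, 0.02, (50, 3)),
                   rng.normal(0.7, 0.02, (50, 3))])
    centers, U = fcm(X, c=2)
    print(np.round(centers, 2))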
NASA Technical Reports Server (NTRS)
Harwood, P. (Principal Investigator); Malin, P.; Finley, R.; Mcculloch, S.; Murphy, D.; Hupp, B.; Schell, J. A.
1977-01-01
The author has identified the following significant results. Four LANDSAT scenes were analyzed for the Harbor Island area test sites to produce land cover and land use maps using both image interpretation and computer-assisted techniques. When evaluated against aerial photography, the mean accuracy for three scenes was 84% for the image interpretation product and 62% for the computer-assisted classification maps. Analysis of the fourth scene was not completed using the image interpretation technique because of a poor-quality false color composite, but results were available from the computer technique. Preliminary results indicate that these LANDSAT products can be applied to a variety of planning and management activities in the Texas coastal zone.
NASA Technical Reports Server (NTRS)
Bekey, G. A.
1971-01-01
Studies are summarized on the application of advanced analytical and computational methods to the development of mathematical models of human controllers in multiaxis manual control systems. Specific accomplishments include the following: (1) Analytical and computer methods were developed for the measurement of random parameters in linear models of human operators. (2) Discrete models of human operator behavior in a multiple display situation were developed. (3) Sensitivity techniques were developed which make possible the identification of unknown sampling intervals in linear systems. (4) The adaptive behavior of human operators following particular classes of vehicle failures was studied and a model structure proposed.
Development of X-TOOLSS: Preliminary Design of Space Systems Using Evolutionary Computation
NASA Technical Reports Server (NTRS)
Schnell, Andrew R.; Hull, Patrick V.; Turner, Mike L.; Dozier, Gerry; Alverson, Lauren; Garrett, Aaron; Reneau, Jarred
2008-01-01
Evolutionary computational (EC) techniques such as genetic algorithms (GA) have been identified as promising methods to explore the design space of mechanical and electrical systems at the earliest stages of design. In this paper the authors summarize their research in the use of evolutionary computation to develop preliminary designs for various space systems. An evolutionary computational solver developed over the course of the research, X-TOOLSS (Exploration Toolset for the Optimization of Launch and Space Systems) is discussed. With the success of early, low-fidelity example problems, an outline of work involving more computationally complex models is discussed.
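X-TOOLSS itself is not reproduced here, but the GA machinery such solvers build on can be sketched in a few lines: selection, one-point crossover, and point mutation evolving a bit string toward a target "design" (toy fitness function, invented parameters):

    import random

    random.seed(0)
    TARGET = [1] * 20
    fitness = lambda ind: sum(a == b for a, b in zip(ind, TARGET))

    pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
    for gen in range(100):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == 20:                # stop once the target is reached
            break
        parents = pop[:10]                       # truncation selection
        children = []
        while len(children) < 20:
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, 19)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            i = random.randrange(20)             # point mutation with prob 0.1
            child[i] ^= random.random() < 0.1
            children.append(child)
        pop = parents + children
    print(gen, fitness(pop[0]))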
Secure distributed genome analysis for GWAS and sequence comparison computation.
Zhang, Yihua; Blanton, Marina; Almashaqbeh, Ghada
2015-01-01
The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice.
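The flavor of such protocols can be conveyed by plain additive secret sharing, sketched below with invented allele counts; this is a textbook primitive of the kind the paper builds on, not the competition submission itself:

    import random

    P = 2**61 - 1          # a public prime modulus

    def share(secret):
        # Split a value into 3 additive shares modulo P.
        s1, s2 = random.randrange(P), random.randrange(P)
        return s1, s2, (secret - s1 - s2) % P

    # Each record holder shares its local minor-allele count (hypothetical data).
    counts = [2, 0, 1, 2, 1]
    shares = [share(c) for c in counts]

    # Each party sums only its own shares; no party learns any single count.
    partial = [sum(s[i] for s in shares) % P for i in range(3)]
    print(sum(partial) % P)     # reconstructed total: 6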
NASA Technical Reports Server (NTRS)
Gordy, R. S.
1972-01-01
An improved broadband impedance matching technique was developed. The technique is capable of resolving points in the waveguide which generate reflected energy. A version of the comparison reflectometer was developed and fabricated to determine the mean amplitude of the reflection coefficient excited at points in the guide as a function of distance, and the complex reflection coefficient of a specific discontinuity in the guide as a function of frequency. An impedance matching computer program was developed which is capable of impedance matching the characteristics of each disturbance independent of other reflections in the guide. The characteristics of four standard matching elements were compiled, and their associated curves of reflection coefficient and shunt susceptance as a function of frequency are presented. It is concluded that an economical, fast, and reliable impedance matching technique has been established which can provide broadband impedance matches.
An efficient, explicit finite-rate algorithm to compute flows in chemical nonequilibrium
NASA Technical Reports Server (NTRS)
Palmer, Grant
1989-01-01
An explicit finite-rate code was developed to compute hypersonic viscous chemically reacting flows about three-dimensional bodies. Equations describing the finite-rate chemical reactions were fully coupled to the gas dynamic equations using a new coupling technique. The new technique maintains stability in the explicit finite-rate formulation while permitting relatively large global time steps.
Recent Advances in X-ray Cone-beam Computed Laminography.
O'Brien, Neil S; Boardman, Richard P; Sinclair, Ian; Blumensath, Thomas
2016-10-06
X-ray computed tomography is an established volume imaging technique used routinely in medical diagnosis, industrial non-destructive testing, and a wide range of scientific fields. Traditionally, computed tomography uses scanning geometries with a single axis of rotation together with reconstruction algorithms specifically designed for this setup. Recently, however, there has been increasing interest in more complex scanning geometries. These include so-called X-ray computed laminography systems capable of imaging specimens with large lateral dimensions or large aspect ratios, neither of which is well suited to conventional CT scanning procedures. Developments throughout this field have thus been rapid, including the introduction of novel system trajectories, the application and refinement of various reconstruction methods, and the use of recently developed computational hardware and software techniques to accelerate reconstruction times. Here we examine the advances made in the last several years and consider their impact on the state of the art.
Competency Reference for Computer Assisted Drafting.
ERIC Educational Resources Information Center
Oregon State Dept. of Education, Salem. Div. of Vocational Technical Education.
This guide, developed in Oregon, lists competencies essential for students in computer-assisted drafting (CAD). Competencies are organized in eight categories: computer hardware, file usage and manipulation, basic drafting techniques, mechanical drafting, specialty disciplines, three dimensional drawing/design, plotting/printing, and advanced CAD.…
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Feiler, Charles E. (Compiler)
1993-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) was established at the NASA Lewis Research Center in Cleveland, Ohio to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. The activities at ICOMP during 1992 are described.
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center (MSFC) computer code package; this package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.
Algorithms for computing the geopotential using a simple density layer
NASA Technical Reports Server (NTRS)
Morrison, F.
1976-01-01
Several algorithms have been developed for computing the potential and attraction of a simple density layer. These are numerical cubature, Taylor series, and a mixed analytic and numerical integration using a singularity-matching technique. A computer program has been written to combine these techniques for computing the disturbing acceleration on an artificial earth satellite. A total of 1640 equal-area, constant surface density blocks on an oblate spheroid are used. The singularity-matching algorithm is used in the subsatellite region, Taylor series in the surrounding zone, and numerical cubature on the rest of the earth.
Satellite orbit computation methods
NASA Technical Reports Server (NTRS)
1977-01-01
Mathematical and algorithmic techniques were developed for the solution of problems in satellite dynamics, along with solutions for satellite orbital motion. Dynamical analyses of shuttle on-orbit operations were conducted. Computer software routines for use in shuttle mission planning were developed and analyzed, while mathematical models of atmospheric density were formulated.
Computing Newsletter for Schools of Business.
ERIC Educational Resources Information Center
Couger, J. Daniel, Ed.
1973-01-01
The first of the two issues included here reports on various developments concerning the use of computers for schools of business. One-page articles cover these topics: widespread use of simulation games, survey of computer use in higher education, ten new computer cases which teach techniques for management analysis, advantages of the use of…
Designing Ubiquitous Computing to Enhance Children's Learning in Museums
ERIC Educational Resources Information Center
Hall, T.; Bannon, L.
2006-01-01
In recent years, novel paradigms of computing have emerged, which enable computational power to be embedded in artefacts and in environments in novel ways. These developments may create new possibilities for using computing to enhance learning. This paper presents the results of a design process that set out to explore interactive techniques,…
Application of computer graphics in the design of custom orthopedic implants.
Bechtold, J E
1986-10-01
Newly developed computer modeling techniques, together with computer graphics displays and software, have greatly aided the orthopedic design engineer and physician in creating a custom implant with good anatomic conformity in a short turnaround time. Further advances in computerized design and manufacturing will continue to simplify the development of custom prostheses and enlarge their niche in the joint replacement market.
Modeling software systems by domains
NASA Technical Reports Server (NTRS)
Dippolito, Richard; Lee, Kenneth
1992-01-01
The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis techniques are then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
Nontrivial, Nonintelligent, Computer-Based Learning.
ERIC Educational Resources Information Center
Bork, Alfred
1987-01-01
This paper describes three interactive computer programs used with personal computers to present science learning modules for all ages. Developed by groups of teachers at the Educational Technology Center at the University of California, Irvine, these instructional materials do not use the techniques of contemporary artificial intelligence. (GDC)
Pseudo-random number generator for the Sigma 5 computer
NASA Technical Reports Server (NTRS)
Carroll, S. N.
1983-01-01
A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
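The construction described, a prime modulus with a primitive-root multiplier, is the classical Lehmer generator; the sketch below uses the widely known constants 2^31 - 1 and 16807 = 7^5, which may differ from the Sigma 5 values chosen in the report:

    class Lehmer:
        # Multiplicative linear congruential generator: state' = A * state mod M,
        # with M prime and A a primitive root of M.
        M, A = 2**31 - 1, 16807

        def __init__(self, seed=1):
            self.state = seed % self.M or 1   # state must stay in 1..M-1

        def next(self):
            self.state = (self.A * self.state) % self.M
            return self.state / self.M        # uniform float in (0, 1)

    rng = Lehmer(seed=12345)
    print([round(rng.next(), 6) for _ in range(3)])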
NASA Technical Reports Server (NTRS)
Fishbach, L. H.
1979-01-01
The computational techniques utilized to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. The characteristics and use of the following computer codes are discussed: (1) NNEP - a very general cycle analysis code that can assemble an arbitrary matrix of fans, turbines, ducts, shafts, etc., into a complete gas turbine engine and compute on- and off-design thermodynamic performance; (2) WATE - a preliminary design procedure for calculating engine weight using the component characteristics determined by NNEP; (3) POD DRG - a table look-up program to calculate wave and friction drag of nacelles; (4) LIFCYC - a computer code developed to calculate life cycle costs of engines based on the output from WATE; and (5) INSTAL - a computer code developed to calculate installation effects, inlet performance, and inlet weight. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight, and cost for representative types of aircraft and missions.
The Impact of Machine Translation and Computer-aided Translation on Translators
NASA Astrophysics Data System (ADS)
Peng, Hao
2018-03-01
Under the context of globalization, communication between countries and cultures is becoming increasingly frequent, making translation technology indispensable. This paper explores the influence of machine translation (MT) and computer-aided translation (CAT) on translators. Following an introduction to the development of machine and computer-aided translation, it surveys the technologies available to translators, analyzes the demands that translation practice places on the design of CAT tools, and examines the operability of CAT techniques in translation. The findings outline the advantages and disadvantages of MT and CAT tools, as well as their serviceability and future development. Finally, the paper probes the impact of these new technologies on translators, in the hope that more translators and translation researchers will learn to use such tools to improve their productivity.
Automated selection of BI-RADS lesion descriptors for reporting calcifications in mammograms
NASA Astrophysics Data System (ADS)
Paquerault, Sophie; Jiang, Yulei; Nishikawa, Robert M.; Schmidt, Robert A.; D'Orsi, Carl J.; Vyborny, Carl J.; Newstead, Gillian M.
2003-05-01
We are developing an automated computer technique to describe calcifications in mammograms according to the BI-RADS lexicon. We evaluated this technique by its agreement with radiologists' description of the same lesions. Three expert mammographers reviewed our database of 90 cases of digitized mammograms containing clustered microcalcifications and described the calcifications according to BI-RADS. In our study, the radiologists used only 4 of the 5 calcification distribution descriptors and 5 of the 14 calcification morphology descriptors contained in BI-RADS. Our computer technique was therefore designed specifically for these 4 calcification distribution descriptors and 5 calcification morphology descriptors. For calcification distribution, 4 linear discriminant analysis (LDA) classifiers were developed using 5 computer-extracted features to produce scores of how well each descriptor describes a cluster. Similarly, for calcification morphology, 5 LDAs were designed using 10 computer-extracted features. We trained the LDAs using only the BI-RADS data reported by the first radiologist and compared the computer output to the descriptor data reported by all 3 radiologists (for the first radiologist, the leave-one-out method was used). The computer output consisted of the best calcification distribution descriptor and the best 2 calcification morphology descriptors. The results of the comparison with the data from each radiologist, respectively, were: for calcification distribution, percent agreement, 74%, 66%, and 73%, kappa value, 0.44, 0.36, and 0.46; for calcification morphology, percent agreement, 83%, 77%, and 57%, kappa value, 0.78, 0.70, and 0.44. These results indicate that the proposed computer technique can select BI-RADS descriptors in good agreement with radiologists.
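The classifier structure described above can be illustrated with a short Python sketch: one linear discriminant analysis model per BI-RADS distribution descriptor, each scoring how well its descriptor fits a cluster. The feature values, labels, and descriptor names below are placeholders, not the study's data or lexicon subset.

```python
# Hedged sketch: one one-vs-rest LDA per distribution descriptor, each
# producing a score of how well that descriptor describes a cluster.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 5))          # 5 computer-extracted features per cluster
labels = rng.integers(0, 4, size=90)  # radiologist's descriptor per cluster (placeholder)
descriptors = ["clustered", "linear", "segmental", "regional"]

ldas = {name: LinearDiscriminantAnalysis().fit(X, (labels == i).astype(int))
        for i, name in enumerate(descriptors)}

# For a new cluster, report the descriptor whose LDA scores it highest.
x_new = rng.normal(size=(1, 5))
scores = {name: lda.decision_function(x_new)[0] for name, lda in ldas.items()}
print(max(scores, key=scores.get))
```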
NASA Technical Reports Server (NTRS)
Foss, W. E., Jr.
1981-01-01
A computer technique to determine the mission radius and maneuverability characteristics of combat aircraft was developed. The technique was used to determine critical operational requirements and the areas in which research programs would be expected to yield the most beneficial results. In turn, the results of research efforts were evaluated in terms of aircraft performance on selected mission segments and for complete mission profiles. Extensive use of the technique in evaluation studies indicates that the calculated performance is essentially the same as that obtained by the proprietary programs in use throughout the aircraft industry.
D-region blunt probe data analysis using hybrid computer techniques
NASA Technical Reports Server (NTRS)
Burkhard, W. J.
1973-01-01
The feasibility of performing data reduction with a hybrid computer was studied. The data were obtained from the flight of a parachute-borne probe through the D-region of the ionosphere. A presentation of the theory of blunt probe operation is included, with emphasis on the equations necessary to perform the analysis. This is followed by a discussion of computer program development, including a comparison of computer and hand reduction results for the blunt probe launched on 31 January 1972. The comparison showed that it was both feasible and desirable to use the computer for data reduction. The results of computer data reduction performed on flight data acquired from five blunt probes are also presented.
Definition and trade-off study of reconfigurable airborne digital computer system organizations
NASA Technical Reports Server (NTRS)
Conn, R. B.
1974-01-01
A highly reliable, fault-tolerant, reconfigurable computer system for aircraft applications was developed. The development and application of reliability and fault-tolerance assessment techniques are described. Particular emphasis is placed on the needs of an all-digital, fly-by-wire control system appropriate for a passenger-carrying airplane.
Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire
NASA Astrophysics Data System (ADS)
Gladyshev, Pavel; Almansoori, Afrah
RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer configuration and cannot be used reliably on a previously unknown computer in a crime scene. This paper proposes a novel method for acquiring RAM content of Apple Mac computers over FireWire, which automatically discovers necessary information about the target computer and can be used in the crime scene setting. As an application of the developed method, the techniques for recovery of AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed in this paper.
NASA Technical Reports Server (NTRS)
Adams, J. R.; Hawley, S. W.; Peterson, G. R.; Salinger, S. S.; Workman, R. A.
1971-01-01
A hardware and software specification covering requirements for the computer enhancement of structural weld radiographs was considered. Three scanning systems were used to digitize more than 15 weld radiographs. The performance of these systems was evaluated by determining modulation transfer functions and noise characteristics. Enhancement techniques were developed and applied to the digitized radiographs. The scanning parameters of spot size and spacing and film density were studied to optimize the information content of the digital representation of the image.
NASA Astrophysics Data System (ADS)
Kajiwara, K.; Shobu, T.; Toyokawa, H.; Sato, M.
2014-04-01
A technique for three-dimensional visualization of grain boundaries was developed at BL28B2 at SPring-8. The technique uses white X-ray microbeam diffraction and a rotating slit. Three-dimensional images of small silicon single crystals filled in a plastic tube were successfully obtained using this technique for demonstration purposes. The images were consistent with those obtained by X-ray computed tomography.
NASA Technical Reports Server (NTRS)
Wiswell, E. R.; Cooper, G. R. (Principal Investigator)
1978-01-01
The author has identified the following significant results. The concept of average mutual information in the received spectral random process about the spectral scene was developed. Techniques amenable to implementation on a digital computer were also developed to make the required average mutual information calculations. These techniques required identification of models for the spectral response process of scenes. Stochastic modeling techniques were adapted for use. These techniques were demonstrated on empirical data from wheat and vegetation scenes.
Mostafavi, Kamal; Tutunea-Fatan, O Remus; Bordatchev, Evgueni V; Johnson, James A
2014-12-01
The strong advent of computer-assisted technologies in modern orthopedic surgery calls for the expansion of computationally efficient techniques built on the broad base of readily available computer-aided engineering tools. However, one of the common challenges during the current developmental phase remains the lack of reliable frameworks for fast and precise conversion of the anatomical information acquired through computed tomography to a format acceptable to computer-aided engineering software. To address this, this study proposes an integrated and automatic framework capable of extracting and then postprocessing the original imaging data into a common planar and closed B-spline representation. The core of the developed platform relies on approximating the discrete computed tomography data by means of an original two-step B-spline fitting technique based on successive deformations of the control polygon. In addition to its rapidity and robustness, the developed fitting technique was validated to produce accurate representations that deviate by no more than 0.2 mm from alternate representations of the bone geometry obtained through different contact-based data acquisition or data processing methods. © IMechE 2014.
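As a rough illustration of the target representation, the sketch below fits a closed planar B-spline to a contour using SciPy's least-squares spline routines. This stands in for, and is not, the paper's two-step control-polygon deformation technique; the noisy ellipse is a placeholder for a segmented CT bone contour.

```python
# Hedged sketch: fit a closed planar B-spline to a slice contour.
import numpy as np
from scipy.interpolate import splprep, splev

theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
# Placeholder "contour": a noisy ellipse standing in for segmented CT data.
x = 30 * np.cos(theta) + np.random.normal(0, 0.1, theta.size)
y = 20 * np.sin(theta) + np.random.normal(0, 0.1, theta.size)
x, y = np.append(x, x[0]), np.append(y, y[0])   # close the curve

tck, _ = splprep([x, y], s=1.0, per=True)       # per=True -> periodic spline
u_fine = np.linspace(0, 1, 1000)
xs, ys = splev(u_fine, tck)                     # smooth closed B-spline curve
```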
Fourier Method for Calculating Fission Chain Neutron Multiplicity Distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chambers, David H.; Chandrasekaran, Hema; Walston, Sean E.
2017-03-27
Here, a new way of utilizing the fast Fourier transform is developed to compute the probability distribution for a fission chain to create n neutrons. We then extend this technique to compute the probability distributions for detecting n neutrons. Lastly, our technique can be used for fission chains initiated by either a single neutron inducing a fission or by the spontaneous fission of another isotope.
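The core FFT idea can be illustrated briefly: the distribution of a sum of independent neutron counts is the convolution of the individual multiplicity distributions, and the FFT turns that convolution into a pointwise product. The sketch below fixes the number of fissions, whereas the actual method handles randomly branching chains; the multiplicity values are placeholders, not evaluated nuclear data.

```python
# Hedged illustration: distribution of total neutrons from a fixed number
# of independent fissions, computed by FFT-based convolution.
import numpy as np

p_nu = np.array([0.03, 0.16, 0.33, 0.30, 0.14, 0.04])  # P(nu = 0..5), placeholder
n_fissions = 10
size = n_fissions * (p_nu.size - 1) + 1                 # support of the sum

P = np.fft.rfft(p_nu, size)                             # transform once
chain = np.fft.irfft(P ** n_fissions, size)             # n-fold convolution
chain = np.clip(chain, 0, None)                         # clean tiny negatives
print(chain.sum(), chain.argmax())                      # ~1.0, most likely total
```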
Implementation of a 3D mixing layer code on parallel computers
NASA Technical Reports Server (NTRS)
Roe, K.; Thakur, R.; Dang, T.; Bogucz, E.
1995-01-01
This paper summarizes our progress and experience in developing a computational fluid dynamics code on parallel computers to simulate three-dimensional, spatially developing mixing layers. In this initial study, the three-dimensional time-dependent Euler equations are solved using a finite-volume explicit time-marching algorithm. The code was first programmed in Fortran 77 for sequential computers and was then converted for use on parallel computers using the conventional message-passing technique; we have not yet been able to compile the code with the present version of High Performance Fortran (HPF) compilers.
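A minimal sketch of the conventional message-passing conversion follows, assuming a one-dimensional domain decomposition: each rank owns a slab of the grid and swaps one-cell ghost planes with its neighbors before every explicit time step. mpi4py stands in here for the Fortran 77 message-passing code of the original; the grid sizes are placeholders.

```python
# Hedged sketch of a halo (ghost-plane) exchange for an explicit solver.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nx = 16                                   # local slab width, placeholder
u = np.full((nx + 2, 8, 8), float(rank))  # +2 ghost planes in x

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Swap ghost planes with both neighbors (Sendrecv avoids deadlock).
comm.Sendrecv(u[1].copy(), dest=left, recvbuf=u[nx + 1], source=right)
comm.Sendrecv(u[nx].copy(), dest=right, recvbuf=u[0], source=left)
```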
Advances and trends in computational structural mechanics
NASA Technical Reports Server (NTRS)
Noor, A. K.
1986-01-01
Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.
An overview of computer viruses in a research environment
NASA Technical Reports Server (NTRS)
Bishop, Matt
1991-01-01
The threat of attack by computer viruses is in reality a very small part of a much more general threat, specifically threats aimed at subverting computer security. Here, computer viruses are examined as a form of malicious logic in a research and development environment. A relation is drawn between viruses and various models of security and integrity. Current research techniques aimed at controlling the threats posed to computer systems, by viruses in particular and malicious logic in general, are examined. Finally, the vulnerabilities of research and development systems that malicious logic and computer viruses may exploit are briefly examined.
Computer Applications in Assessment and Counseling.
ERIC Educational Resources Information Center
Veldman, Donald J.; Menaker, Shirley L.
Public school counselors and psychologists can expect valuable assistance from computer-based assessment and counseling techniques within a few years, as programs now under development become generally available for the typical computers now used by schools for grade-reporting and class-scheduling. Although routine information-giving and gathering…
NASA Technical Reports Server (NTRS)
Smith, R. E.
1981-01-01
A grid generation technique called the two-boundary technique is developed and applied for the solution of the three-dimensional Navier-Stokes equations. The Navier-Stokes equations are transformed from a Cartesian coordinate system to a computational coordinate system, and the grid generation technique provides the Jacobian matrix describing the transformation. The two-boundary technique is based on algebraically defining two distinct boundaries of a flow domain; the distribution of the grid is achieved by applying functions to the uniform computational grid which redistribute the computational independent variables and consequently concentrate or disperse the grid points in the physical domain. The Navier-Stokes equations are solved using a MacCormack time-split technique. Grids and supersonic laminar flow solutions are obtained for a family of three-dimensional corners and two spike-nosed bodies.
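A minimal sketch of the two-boundary idea follows: define two boundary curves algebraically, apply a stretching function to the uniform computational coordinate, and blend between the boundaries. The boundary shapes and the tanh stretching below are illustrative choices, not the paper's formulation.

```python
# Hedged sketch of algebraic two-boundary grid generation in 2D.
import numpy as np

ni, nj = 41, 21
xi = np.linspace(0.0, 1.0, ni)            # uniform computational coordinates
eta = np.linspace(0.0, 1.0, nj)

lower = np.stack([xi, 0.1 * np.sin(np.pi * xi)], axis=1)   # boundary 1
upper = np.stack([xi, np.ones_like(xi)], axis=1)           # boundary 2

beta = 3.0                                 # larger beta -> stronger clustering
s = 1.0 - np.tanh(beta * (1.0 - eta)) / np.tanh(beta)      # clusters near eta = 0

# Linear blend of the two boundaries along the stretched coordinate.
grid = (1.0 - s)[None, :, None] * lower[:, None, :] + \
       s[None, :, None] * upper[:, None, :]                # shape (ni, nj, 2)
```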
Sharma, Nandita; Gedeon, Tom
2012-12-01
Stress is a major growing concern in our day and age, adversely impacting both individuals and society. Stress research has a wide range of benefits, from improving personal operations and learning to increasing work productivity and benefiting society, making it an interesting and socially beneficial area of research. This survey reviews sensors that have been used to measure stress and investigates techniques for modelling stress. It discusses non-invasive and unobtrusive sensors for measuring computed stress, a term we coin in the paper. The discussion focuses on sensors that do not impede everyday activities and that could be used by those who would like to monitor stress levels on a regular basis (e.g., vehicle drivers, patients with illnesses linked to stress). Computational techniques have the capacity to determine optimal sensor fusion and automate data analysis for stress recognition and classification. Several computational techniques have been developed to model stress, based on methods such as Bayesian networks, artificial neural networks, and support vector machines, which this survey investigates. The survey concludes with a summary and possible directions for further computational stress research. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
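One of the modelling routes the survey covers can be sketched in a few lines: a support vector machine classifying stressed versus unstressed samples from fused sensor features. The features and labels below are synthetic placeholders, not physiological data.

```python
# Hedged sketch: SVM classification of "stress" from fused sensor features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))   # e.g., heart rate, GSR, skin temp, pupil size
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic stress label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```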
NASA Technical Reports Server (NTRS)
Thorp, Scott A.
1992-01-01
This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.
Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika
2017-01-01
Computer work is associated with musculoskeletal disorders (MSDs), and several methods have been developed to assess the risk factors of computer work related to MSDs. This review gives an overview of current pen-and-paper-based observational methods for assessing the ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015, selecting methods focused on computer work, pen-and-paper observation, office risk factors, and musculoskeletal disorders. The review assesses the risk factors covered by each method as well as its reliability and validity; two evaluators independently carried out the review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were posture, office components, force, and repetition. Of the seven methods, only five had been tested for reliability; these were proven reliable and rated moderate to good. For validity, only four of the seven methods were tested, with moderate results. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition, and the office environment, at office workstations and in computer work. Although proper validation of exposure assessment techniques is the most important factor in tool development, some existing observational methods have not been tested for reliability and validity. Furthermore, this review suggests ways in which researchers could improve pen-and-paper-based observational methods for assessing the ergonomic risk factors of computer work.
Development of Moire machine vision
NASA Technical Reports Server (NTRS)
Harding, Kevin G.
1987-01-01
Three-dimensional perception is essential to the development of versatile robotic systems that can handle complex manufacturing tasks in future factories and provide the high-accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques that take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three-dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high-quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation technique is developed to make full-field range measurement and three-dimensional scene analysis possible.
Optimization of thermal protection systems for the space shuttle vehicle. Volume 1: Final report
NASA Technical Reports Server (NTRS)
1972-01-01
A study performed to continue development of computational techniques for the Space Shuttle Thermal Protection System (TPS) is reported. The resulting computer code was used to perform some additional optimization studies on several TPS configurations. The program was developed in Fortran 4 for the CDC 6400 and was converted to Fortran 5 for use on the Univac 1108. The computational methodology is developed in modular fashion to facilitate changes and updating of the techniques and to allow overlaying the computer code to fit into approximately 131,000 octal words of core storage. The program logic involves subroutines which handle input and output of information between computer and user, and which perform thermodynamic, stress, dynamic, and weight-estimate analyses of a variety of panel configurations. These include metallic, ablative, and RSI panels (with and without an underlying phase-change material), and a thermodynamic analysis only of carbon-carbon systems applied to the leading edge and flat cover panels. Two different thermodynamic analyses are used. The first is a two-dimensional explicit procedure with variable time steps, used to describe the behavior of metallic and carbon-carbon leading edges. The second is a one-dimensional implicit technique used to predict temperatures in the charring ablator and the noncharring RSI; the latter analysis is performed simply by suppressing the chemical reactions and pyrolysis of the TPS material.
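The second analysis type can be illustrated with a short sketch: a one-dimensional implicit (backward-Euler) conduction solve of the kind used for the RSI layer, with the chemical reactions and pyrolysis suppressed. The grid, material properties, and boundary conditions below are illustrative placeholders, not the study's inputs.

```python
# Hedged sketch: 1D implicit (backward-Euler) heat conduction through a tile.
import numpy as np

n, dx, dt, alpha = 50, 1e-3, 0.1, 1e-7   # nodes, spacing (m), step (s), diffusivity
T = np.full(n, 300.0)                     # initial temperature (K)
T_surface = 1500.0                        # imposed hot-wall temperature (K)

r = alpha * dt / dx**2
A = np.diag((1 + 2 * r) * np.ones(n)) \
  + np.diag(-r * np.ones(n - 1), 1) + np.diag(-r * np.ones(n - 1), -1)
A[0, :], A[0, 0] = 0.0, 1.0               # Dirichlet hot wall
A[-1, -2], A[-1, -1] = -1.0, 1.0          # adiabatic back face

for _ in range(1000):                     # march in time
    b = T.copy()
    b[0], b[-1] = T_surface, 0.0
    T = np.linalg.solve(A, b)
```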
Estimation of conformational entropy in protein-ligand interactions: a computational perspective.
Polyansky, Anton A; Zubac, Ruben; Zagrovic, Bojan
2012-01-01
Conformational entropy is an important component of the change in free energy upon binding of a ligand to its target protein. As a consequence, development of computational techniques for reliable estimation of conformational entropies is currently receiving an increased level of attention in the context of computational drug design. Here, we review the most commonly used techniques for conformational entropy estimation from classical molecular dynamics simulations. Although by and large still not directly used in practical drug design, these techniques provide a gold standard for developing other, computationally less demanding methods for such applications, in addition to furthering our understanding of protein-ligand interactions in general. In particular, we focus on the quasi-harmonic approximation and discuss different approaches that can be used to go beyond it, most notably when it comes to treating anharmonic and/or correlated motions. In addition to reviewing basic theoretical formalisms, we provide a concrete set of steps required to successfully calculate conformational entropy from molecular dynamics simulations, as well as discuss a number of practical issues that may arise in such calculations.
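The quasi-harmonic approximation mentioned above can be sketched concretely: diagonalize the mass-weighted covariance of atomic fluctuations from a trajectory, convert the eigenvalues to effective frequencies, and sum quantum harmonic-oscillator entropies over the modes. The trajectory below is random placeholder data, not an MD simulation.

```python
# Hedged sketch of quasi-harmonic conformational entropy from fluctuations.
import numpy as np

kB, hbar, T = 1.380649e-23, 1.054571817e-34, 300.0

coords = np.random.normal(scale=1e-11, size=(5000, 30))  # frames x 3N coords (m)
masses = np.repeat(1.66e-26, 30)                          # per-coordinate masses (kg)

X = (coords - coords.mean(0)) * np.sqrt(masses)           # mass-weighted fluctuations
cov = X.T @ X / len(X)
lam = np.linalg.eigvalsh(cov)
lam = lam[lam > 1e-60]                                    # drop near-zero modes

omega = np.sqrt(kB * T / lam)                             # effective mode frequencies
x = hbar * omega / (kB * T)
S = kB * np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x)))  # QHO entropy per mode
print("quasi-harmonic entropy (J/K):", S)
```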
Supersonic reacting internal flow fields
NASA Technical Reports Server (NTRS)
Drummond, J. Philip
1989-01-01
The national program to develop a trans-atmospheric vehicle has kindled a renewed interest in the modeling of supersonic reacting flows. A supersonic combustion ramjet, or scramjet, has been proposed to provide the propulsion system for this vehicle. The development of computational techniques for modeling supersonic reacting flow fields, and the application of these techniques to an increasingly difficult set of combustion problems, are studied. Since the scramjet problem has been largely responsible for motivating this computational work, a brief history is given of hypersonic vehicles and their propulsion systems, along with a discussion of some early modeling efforts applied to high-speed reacting flows. Current activities to develop accurate and efficient algorithms and improved physical models for modeling supersonic combustion are then discussed. Some new problems to which computer codes based on these algorithms and models are being applied are described.
The Screen Display Syntax for CAI.
ERIC Educational Resources Information Center
Richards, Boyd F.; Salisbury, David F.
1987-01-01
Describes four storyboard techniques frequently used in designing computer assisted instruction (CAI) programs, and explains screen display syntax (SDS), a new technique combining the major advantages of the storyboard techniques. SDS was developed to facilitate communication among designers, programmers, and editors working on a large CAI basic…
Computational technique for stepwise quantitative assessment of equation correctness
NASA Astrophysics Data System (ADS)
Othman, Nuru'l Izzah; Bakar, Zainab Abu
2017-04-01
Many of the computer-aided mathematics assessment systems available today can implement stepwise correctness checking of a working scheme for solving equations. The usual computational technique for assessing the correctness of each response in the scheme involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts techniques from textual information retrieval involving tokenization, document modelling, and similarity evaluation. The performance of the SCCS technique was tested using worked solutions to linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype developed on the basis of the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation, and a high degree of agreement with manual scores, with small average absolute and mixed errors.
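A toy version of the multiset comparison can be sketched as follows: tokenize each equation line, intersect the token multisets of the student's step and a reference step, and score the overlap. The tokenizer and the score below are simplifications for illustration, not the SCCS technique itself.

```python
# Hedged sketch: multiset token overlap between a student step and a reference.
import re
from collections import Counter

def tokens(equation: str) -> Counter:
    """Split an equation into numbers, names, and operator symbols."""
    return Counter(re.findall(r"\d+|[a-zA-Z]+|[=+\-*/()]", equation))

def step_score(student: str, reference: str) -> float:
    s, r = tokens(student), tokens(reference)
    overlap = sum((s & r).values())        # multiset intersection size
    total = max(sum(r.values()), 1)
    return overlap / total                 # 1.0 = structurally identical

print(step_score("2x + 4 = 10", "2x + 4 = 10"))   # 1.0
print(step_score("2x = 6", "2x = 10 - 4"))        # partial credit
```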
Computational techniques for flows with finite-rate condensation
NASA Technical Reports Server (NTRS)
Candler, Graham V.
1993-01-01
A computational method to simulate the inviscid two-dimensional flow of a two-phase fluid was developed. This computational technique treats the gas phase and each of a prescribed number of particle sizes as separate fluids which are allowed to interact with one another. Thus, each particle-size class is allowed to move through the fluid at its own velocity at each point in the flow field. Mass, momentum, and energy are exchanged between each particle class and the gas phase. It is assumed that the particles do not collide with one another, so that there is no inter-particle exchange of momentum and energy. However, the particles are allowed to grow, and therefore, they may change from one size class to another. Appropriate rates of mass, momentum, and energy exchange between the gas and particle phases and between the different particle classes were developed. A numerical method was developed for use with this equation set. Several test cases were computed and show qualitative agreement with previous calculations.
Prediction of quantitative intrathoracic fluid volume to diagnose pulmonary oedema using LabVIEW.
Urooj, Shabana; Khan, M; Ansari, A Q; Lay-Ekuakille, Aimé; Salhan, Ashok K
2012-01-01
Pulmonary oedema is a life-threatening disease that requires special attention in research and clinical diagnosis. Computer-based techniques are rarely used to quantify the intrathoracic fluid volume (IFV) for diagnostic purposes. This paper discusses a software program developed to detect and diagnose pulmonary oedema using LabVIEW. The software operates on anthropometric dimensions and physiological parameters, mainly transthoracic electrical impedance (TEI), and is accurate and faster than existing manual techniques. The LabVIEW software was used to compute the parameters required to quantify IFV, and an equation relating per cent control and IFV was obtained. The predicted and measured TEI values were compared with previously reported data to validate the developed program; the TEI values predicted by the computer-based technique were much closer to the measured values. Six new subjects were enrolled to measure and predict transthoracic impedance and hence quantify IFV, and a similar difference between measured and predicted TEI was observed for the new subjects.
A multitasking finite state architecture for computer control of an electric powertrain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burba, J.C.
1984-01-01
Finite state techniques provide a common design language between the control engineer and the computer engineer for event driven computer control systems. They simplify communication and provide a highly maintainable control system understandable by both. This paper describes the development of a control system for an electric vehicle powertrain utilizing finite state concepts. The basics of finite state automata are provided as a framework to discuss a unique multitasking software architecture developed for this application. The architecture employs conventional time-sliced techniques with task scheduling controlled by a finite state machine representation of the control strategy of the powertrain. The complexities of excitation variable sampling in this environment are also considered.
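A table-driven finite state machine of the kind described can be sketched compactly: the control strategy becomes a map from (state, event) pairs to next states, driven by a periodic scheduler. The states, events, and transitions below are illustrative, not the paper's actual powertrain strategy.

```python
# Hedged sketch of a table-driven finite state machine for a powertrain.
from enum import Enum, auto

class State(Enum):
    IDLE = auto(); DRIVE = auto(); REGEN = auto(); FAULT = auto()

class Event(Enum):
    ACCEL = auto(); BRAKE = auto(); OVERCURRENT = auto(); RELEASE = auto()

TABLE = {
    (State.IDLE, Event.ACCEL): State.DRIVE,
    (State.DRIVE, Event.BRAKE): State.REGEN,
    (State.DRIVE, Event.OVERCURRENT): State.FAULT,
    (State.REGEN, Event.RELEASE): State.IDLE,
}

def step(state: State, event: Event) -> State:
    return TABLE.get((state, event), state)   # undefined pairs keep the state

s = State.IDLE
for ev in [Event.ACCEL, Event.OVERCURRENT]:
    s = step(s, ev)
print(s)   # State.FAULT
```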
Computer quantitation of coronary angiograms
NASA Technical Reports Server (NTRS)
Ledbetter, D. C.; Selzer, R. H.; Gordon, R. M.; Blankenhorn, D. H.; Sanmarco, M. E.
1978-01-01
A computer technique is being developed at the Jet Propulsion Laboratory to automate the measurement of coronary stenosis. A Vanguard 35-mm film transport is optically coupled to a Spatial Data System vidicon/digitizer, which in turn is controlled by a DEC PDP 11/55 computer. Programs have been developed to track the edges of the arterial shadow, to locate normal and atherosclerotic vessel sections, and to measure percent stenosis. Multiple-frame analysis techniques are being investigated that involve, on the one hand, averaging stenosis measurements from adjacent frames and, on the other, averaging adjacent frame images directly and then measuring stenosis from the averaged image. For the latter case, geometric transformations are used to force registration of vessel images whose spatial orientation changes.
Three-dimensional monochromatic x-ray computed tomography using synchrotron radiation
NASA Astrophysics Data System (ADS)
Saito, Tsuneo; Kudo, Hiroyuki; Takeda, Tohoru; Itai, Yuji; Tokumori, Kenji; Toyofuku, Fukai; Hyodo, Kazuyuki; Ando, Masami; Nishimura, Katsuyuki; Uyama, Chikao
1998-08-01
We describe a technique of 3D computed tomography (3D CT) using monochromatic x rays generated by synchrotron radiation, which performs a direct reconstruction of a 3D volume image of an object from its cone-beam projections. For the development, we propose a practical scanning orbit of the x-ray source to obtain complete 3D information on an object, and its corresponding 3D image reconstruction algorithm. The validity and usefulness of the proposed scanning orbit and reconstruction algorithm were confirmed by computer simulation studies. Based on these investigations, we have developed a prototype 3D monochromatic x-ray CT using synchrotron radiation, which provides exact 3D reconstruction and material-selective imaging by using the K-edge energy subtraction technique.
NASA Technical Reports Server (NTRS)
Mielke, Amy F.; Seasholtz, Richard G.; Elam, Kristie A.; Panda, Jayanta
2005-01-01
Nonintrusive optical point-wise measurement techniques utilizing the principles of molecular Rayleigh scattering have been developed at the NASA Glenn Research Center to obtain time-averaged information about gas velocity, density, temperature, and turbulence, or dynamic information about gas velocity and density in unseeded flows. These techniques enable measurements that are necessary for validating computational fluid dynamics (CFD) and computational aeroacoustic (CAA) codes. Dynamic measurements allow the calculation of power spectra for the various flow properties. This type of information is currently being used in jet noise studies, correlating sound pressure fluctuations with velocity and density fluctuations to determine noise sources in jets. These nonintrusive techniques are particularly useful in supersonic flows, where seeding the flow with particles is not an option, and where the environment is too harsh for hot-wire measurements.
NASA Astrophysics Data System (ADS)
Mahaboob, B.; Venkateswarlu, B.; Sankar, J. Ravi; Balasiddamuni, P.
2017-11-01
This paper uses matrix calculus techniques to obtain the Nonlinear Least Squares Estimator (NLSE), the Maximum Likelihood Estimator (MLE), and a linear pseudo model for the nonlinear regression model. David Pollard and Peter Radchenko [1] explained analytic techniques to compute the NLSE; the present paper introduces a method to compute the NLSE using principles of multivariate calculus. This study is concerned with new optimization techniques used to compute the MLE and NLSE. Anh [2] derived the NLSE and MLE of a heteroscedastic regression model. Lemcoff [3] discussed a procedure for obtaining a linear pseudo model for a nonlinear regression model. In this article a new technique is developed to obtain the linear pseudo model for the nonlinear regression model using multivariate calculus; the linear pseudo model of Edmond Malinvaud [4] is thereby explained in a different way. In 2006, David Pollard et al. used empirical process techniques to study the asymptotics of the least squares estimator (LSE) for the fitting of a nonlinear regression function. In Jae Myung [13] provided a conceptual guide to maximum likelihood estimation in his work "Tutorial on maximum likelihood estimation".
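One standard way to compute an NLSE numerically is Gauss-Newton iteration, sketched below for an exponential model. The model, data, and starting point are illustrative; this does not reproduce the paper's matrix-calculus derivation.

```python
# Hedged sketch: Gauss-Newton computation of the NLSE for y = a*exp(b*x) + noise.
import numpy as np

x = np.linspace(0, 1, 50)
a_true, b_true = 2.0, -1.3
y = a_true * np.exp(b_true * x) + np.random.normal(0, 0.01, x.size)

theta = np.array([1.0, 0.0])                       # initial guess (a, b)
for _ in range(20):
    a, b = theta
    f = a * np.exp(b * x)
    J = np.column_stack([np.exp(b * x),            # df/da
                         a * x * np.exp(b * x)])   # df/db
    r = y - f
    theta += np.linalg.lstsq(J, r, rcond=None)[0]  # Gauss-Newton step
print(theta)   # close to (2.0, -1.3)
```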
NASA Technical Reports Server (NTRS)
Noor, A. K.; Andersen, C. M.; Tanner, J. A.
1984-01-01
An effective computational strategy is presented for the large-rotation, nonlinear axisymmetric analysis of shells of revolution. The three key elements of the computational strategy are: (1) use of mixed finite-element models with discontinuous stress resultants at the element interfaces; (2) substantial reduction in the total number of degrees of freedom through the use of a multiple-parameter reduction technique; and (3) reduction in the size of the analysis model through the decomposition of asymmetric loads into symmetric and antisymmetric components coupled with the use of the multiple-parameter reduction technique. The potential of the proposed computational strategy is discussed. Numerical results are presented to demonstrate the high accuracy of the mixed models developed and to show the potential of using the proposed computational strategy for the analysis of tires.
How to use hand-held computers to evaluate wood drying.
Howard N. Rosen; Darrell S. Martin
1985-01-01
Techniques have been developed to evaluate and generate wood drying curves with hand-held computers (3-5K memory). Predictions of time to dry to a specific moisture content, drying rates, and other characteristics of wood drying curves can be made. The paper describes the development of the programs and illustrates their use.
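A small program of the kind such a hand-held computer could run is sketched below, assuming a one-parameter exponential drying model, M(t) = Me + (M0 - Me) * exp(-k t), solved for time-to-target. The model form and constants are illustrative, not the paper's equations.

```python
# Hedged sketch: exponential wood drying curve and time-to-target prediction.
import math

M0, Me, k = 60.0, 8.0, 0.015        # initial %MC, equilibrium %MC, rate (1/h)

def moisture(t_hours: float) -> float:
    """Predicted moisture content (%) after t_hours of drying."""
    return Me + (M0 - Me) * math.exp(-k * t_hours)

def time_to(target_mc: float) -> float:
    """Hours required to dry to a target moisture content (%)."""
    return -math.log((target_mc - Me) / (M0 - Me)) / k

print(moisture(24.0))               # predicted %MC after one day
print(time_to(12.0))                # hours to reach 12% moisture content
```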
ERIC Educational Resources Information Center
Hahn, H. A.; And Others
The purpose of this handbook is to provide background and guidelines for course designers and instructional developers who will be developing Reserve Component training for the United States military using asynchronous computer conferencing techniques. The recommendations in this report are based on an international review of the literature in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1976-08-05
During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized. (CH)
NASA Technical Reports Server (NTRS)
1995-01-01
The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.
Yoo, Dongjin
2012-07-01
Advanced additive manufacture (AM) techniques are now being developed to fabricate scaffolds with controlled internal pore architectures in the field of tissue engineering. In general, these techniques use a hybrid method which combines computer-aided design (CAD) with computer-aided manufacturing (CAM) tools to design and fabricate complicated three-dimensional (3D) scaffold models. The mathematical descriptions of micro-architectures along with the macro-structures of the 3D scaffold models are limited by current CAD technologies as well as by the difficulty of transferring the designed digital models to standard formats for fabrication. To overcome these difficulties, we have developed an efficient internal pore architecture design system based on triply periodic minimal surface (TPMS) unit cell libraries and associated computational methods to assemble TPMS unit cells into an entire scaffold model. In addition, we have developed a process planning technique based on TPMS internal architecture pattern of unit cells to generate tool paths for freeform fabrication of tissue engineering porous scaffolds. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.
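The TPMS idea can be illustrated with one unit cell: sample the gyroid implicit surface sin(x)cos(y) + sin(y)cos(z) + sin(z)cos(x) = c on a voxel grid, taking the solid side of the isosurface as scaffold material. Assembling cells into a full scaffold and planning tool paths, as in the paper, are omitted; the grid size and offset are illustrative.

```python
# Hedged sketch: voxelize one gyroid TPMS unit cell and report its porosity.
import numpy as np

n = 64
t = np.linspace(0, 2 * np.pi, n)
x, y, z = np.meshgrid(t, t, t, indexing="ij")

gyroid = np.sin(x) * np.cos(y) + np.sin(y) * np.cos(z) + np.sin(z) * np.cos(x)
c = 0.4                                   # isosurface offset controls porosity
solid = gyroid < c                        # boolean voxel model of the cell

porosity = 1.0 - solid.mean()
print(f"unit-cell porosity: {porosity:.2f}")
```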
NASA Technical Reports Server (NTRS)
Biemann, K.
1973-01-01
Data processing techniques were developed to measure with high precision and sensitivity the line spectra produced by a high resolution mass spectrometer. The most important aspect of this phase was the interfacing of a modified precision microphotometer-comparator with a computer and the improvement of existing software to serve the special needs of the investigation of lunar samples. In addition, a gas-chromatograph mass spectrometer system was interfaced with the same computer to allow continuous recording of mass spectra on a gas chromatographic effluent and efficient evaluation of the resulting data. These techniques were then used to detect and identify organic compounds present in the samples returned by the Apollo 11 and 12 missions.
ERIC Educational Resources Information Center
Forwood, Bruce S.
This bibliography has been produced as part of a research program attempting to develop a new approach to building environment and service systems design using computer-aided design techniques. As such it not only classifies available literature on the service systems themselves, but also contains sections on the application of computers and…
Nonlinear relaxation algorithms for circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, R.A.
Circuit simulation is an important Computer-Aided Design (CAD) tool in the design of Integrated Circuits (IC). However, the standard techniques used in programs such as SPICE result in very long computer run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits. The results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.
Computation of transonic flow past projectiles at angle of attack
NASA Technical Reports Server (NTRS)
Reklis, R. P.; Sturek, W. B.; Bailey, F. R.
1978-01-01
Aerodynamic properties of artillery shells, such as normal force and pitching moment, reach peak values in a narrow transonic Mach number range. In order to compute these quantities, numerical techniques have been developed to obtain solutions to the three-dimensional transonic small disturbance equation about slender bodies at angle of attack. The computation is based on a plane relaxation technique involving Fourier transforms to partially decouple the three-dimensional difference equations. Particular care is taken to assure accurate solutions near corners found in shell designs. Computed surface pressures are compared to experimental measurements for circular-arc and cone-cylinder bodies, which have been selected as test cases. Computed pitching moments are compared to range measurements for a typical projectile shape.
NASA Technical Reports Server (NTRS)
Vest, C. M.
1982-01-01
The use of holographic interferometry to measure two- and three-dimensional flows and the interpretation of multiple-view interferograms with computer tomography are discussed. Computational techniques developed for tomography are reviewed. Current research topics are outlined, including the development of an automated fringe readout system, optimum reconstruction procedures for cases in which an opaque test model is present in the field, and interferometry and tomography with strongly refracting fields and shocks.
NASA Technical Reports Server (NTRS)
Gassaway, J. D.; Mahmood, Q.; Trotter, J. D.
1980-01-01
This quarterly report describes progress in three programs: a dc sputtering machine for aluminum and aluminum alloys; two-dimensional computer modeling of MOS transistors; and the development of computer techniques for calculating the redistribution diffusion of dopants in silicon-on-sapphire films.
Cloud Computing Techniques for Space Mission Design
NASA Technical Reports Server (NTRS)
Arrieta, Juan; Senent, Juan
2014-01-01
The overarching objective of space mission design is to tackle complex problems producing better results, and faster. In developing the methods and tools to fulfill this objective, the user interacts with the different layers of a computing system.
Study to develop techniques for a self-organizing computer
NASA Technical Reports Server (NTRS)
Schaffner, M. R.
1972-01-01
The main emphasis has been on the programming language for a self-organizing computer. An example of programming with the language of finite state machines is presented for real-time processing of weather radar data.
Advanced 3D Characterization and Reconstruction of Reactor Materials FY16 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fromm, Bradley; Hauch, Benjamin; Sridharan, Kumar
2016-12-01
A coordinated effort to link advanced materials characterization methods and computational modeling approaches is critical to future success in understanding and predicting the behavior of reactor materials that operate at extreme conditions. The difficulty and expense of working with nuclear materials have inhibited the use of modern characterization techniques on this class of materials. Likewise, mesoscale simulation efforts have been impeded by insufficient experimental data for initialization and validation of the computer models. The objective of this research is to develop methods to integrate advanced materials characterization techniques developed for reactor materials with state-of-the-art mesoscale modeling and simulation tools. Research to develop broad ion beam sample preparation, high-resolution electron backscatter diffraction, and digital microstructure reconstruction techniques, along with methods for integrating these techniques into mesoscale modeling tools, is detailed. Results for both irradiated and unirradiated reactor materials are presented for FY14 - FY16, and final remarks are provided.
Understanding Islamist political violence through computational social simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G
Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
A service based adaptive U-learning system using UX.
Jeong, Hwa-Young; Yi, Gangman
2014-01-01
In recent years, traditional development techniques for e-learning systems have been evolving to become more convenient and efficient. Cloud and ubiquitous computing are two newer technologies in the development of application systems: cloud computing can support learning-system processes through services, while ubiquitous computing can provide system operation and management via high-performance technical processes and networks. In a cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing learning materials and course processes, organized by learning units, using services in a ubiquitous computing environment. We also investigate functions that tailor materials to each user's learning style: we analyze users' data and characteristics according to their user experience, and adapt the learning process to their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects than existing techniques.
NASA Technical Reports Server (NTRS)
Marconi, F.; Salas, M.; Yaeger, L.
1976-01-01
A numerical procedure has been developed to compute the inviscid super/hypersonic flow field about complex vehicle geometries accurately and efficiently. A second order accurate finite difference scheme is used to integrate the three dimensional Euler equations in regions of continuous flow, while all shock waves are computed as discontinuities via the Rankine Hugoniot jump conditions. Conformal mappings are used to develop a computational grid. The effects of blunt nose entropy layers are computed in detail. Real gas effects for equilibrium air are included using curve fits of Mollier charts. Typical calculated results for shuttle orbiter, hypersonic transport, and supersonic aircraft configurations are included to demonstrate the usefulness of this tool.
A study of trends and techniques for space base electronics
NASA Technical Reports Server (NTRS)
Trotter, J. D.; Wade, T. E.; Gassaway, J. D.; Mahmood, Q.
1978-01-01
A sputtering system was developed to deposit aluminum and aluminum alloys by the dc sputtering technique. This system is designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation, and it is now ready for studying the effects of deposition and annealing parameters on double-level metal preparation. A technique recently applied to semiconductor analysis, the finite element method, was studied for use in the computer modeling of two-dimensional MOS transistor structures. It was concluded that the method has not been sufficiently well developed for confident use at this time, so an algorithm based on the finite difference method was developed for implementing the computer study. The resulting program was modified and used to calculate redistribution data for boron and phosphorus which had been predeposited by ion implantation with given range and straggle conditions. Data were generated for (111)-oriented SOS films with redistribution in N2, dry O2, and steam ambients.
NASA Technical Reports Server (NTRS)
Smetana, F. O.; Summery, D. C.; Johnson, W. D.
1972-01-01
Techniques quoted in the literature for the extraction of stability derivative information from flight test records are reviewed. A recent technique developed at NASA's Langley Research Center was regarded as the most productive yet developed. Results of tests of the sensitivity of this procedure to various types of data noise and to the accuracy of the estimated values of the derivatives are reported. Computer programs for providing these initial estimates are given. The literature review also includes a discussion of flight test measuring techniques, instrumentation, and piloting techniques.
Shaded computer graphic techniques for visualizing and interpreting analytic fluid flow models
NASA Technical Reports Server (NTRS)
Parke, F. I.
1981-01-01
Mathematical models which predict the behavior of fluid flow in different experiments are simulated using digital computers. The simulations predict values of flow parameters (pressure, temperature, and velocity vector) at many points in the fluid. Visualization of the spatial variation of these parameters is important for comprehending and checking the data generated, for identifying the regions of interest in the flow, and for effectively communicating information about the flow to others. State-of-the-art imaging techniques developed in the field of three-dimensional shaded computer graphics are applied to the visualization of fluid flow. The use of an imaging technique known as 'SCAN' for visualizing fluid flow is studied and the results are presented.
Stochastic Feedforward Control Technique
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1990-01-01
A class of commanded trajectories is modeled as a stochastic process. The Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center is aimed at developing capabilities for increased airport capacity; safe and accurate flight in adverse weather conditions, including wind shear; avoidance of wake vortexes; and reduced fuel consumption. The program draws on advances in techniques for the design of modern controls and the increased capabilities of digital flight computers, coupled with accurate guidance information from the Microwave Landing System (MLS). The stochastic feedforward control technique was developed within the context of the ATOPS program.
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Feiler, Charles E. (Editor)
1991-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is operated jointly by Case Western Reserve University and the NASA Lewis Research Center in Cleveland, Ohio. The purpose of ICOMP is to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. The activities at ICOMP during 1990 are described.
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
1989-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is operated jointly by Case Western Reserve University and the NASA Lewis Research Center in Cleveland, Ohio. The purpose of ICOMP is to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. This report describes the activities at ICOMP during 1988.
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
1988-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is operated jointly by Case Western Reserve University and the NASA Lewis Research Center in Cleveland, Ohio. The purpose of ICOMP is to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. Described are the activities of ICOMP during 1987.
The application of computational chemistry to lignin
Thomas Elder; Laura Berstis; Nele Sophie Zwirchmayr; Gregg T. Beckham; Michael F. Crowley
2017-01-01
Computational chemical methods have become an important technique in the examination of the structure and reactivity of lignin. The calculations can be based either on classical or quantum mechanics, with concomitant differences in computational intensity and size restrictions. The current paper will concentrate on results developed from the latter type of calculations...
Institute for Computational Mechanics in Propulsion (ICOMP) fourth annual review, 1989
NASA Technical Reports Server (NTRS)
1990-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is operated jointly by Case Western Reserve University and the NASA Lewis Research Center. The purpose of ICOMP is to develop techniques to improve problem solving capabilities in all aspects of computational mechanics related to propulsion. The activities at ICOMP during 1989 are described.
Heterogeneity in Health Care Computing Environments
Sengupta, Soumitra
1989-01-01
This paper discusses issues of heterogeneity in computer systems, networks, databases, and presentation techniques, and the problems heterogeneity creates in developing integrated medical information systems. The need for institutional, comprehensive goals is emphasized. Using the Columbia-Presbyterian Medical Center's computing environment as the case study, various steps to solve the heterogeneity problem are presented.
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Feiler, Charles E. (Editor)
1992-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is a combined activity of Case Western Reserve University, Ohio Aerospace Institute (OAI) and NASA Lewis. The purpose of ICOMP is to develop techniques to improve problem solving capabilities in all aspects of computational mechanics related to propulsion. The activities at ICOMP during 1991 are described.
Soft computing techniques toward modeling the water supplies of Cyprus.
Iliadis, L; Maris, F; Tachos, S
2011-10-01
This research effort aims at the application of soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machines (ε-RSVM) and fuzzy weighted ε-RSVM models that accept five input parameters have been developed. At the same time, reliable artificial neural networks have been developed to perform the same job. The 5-fold cross-validation approach has been employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
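As a concrete illustration of the modeling procedure described above, the following minimal sketch pairs ε-SVR with 5-fold cross-validation. It uses scikit-learn as a stand-in implementation (an assumption of convenience; the original study did not necessarily use it), and the five inputs are synthetic placeholders rather than the Cyprus watershed data.

import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in data: 200 samples of five input parameters.
rng = np.random.default_rng(0)
X = rng.random((200, 5))
y = X @ np.array([0.5, 1.2, -0.3, 0.8, 0.1]) + 0.05 * rng.standard_normal(200)

# epsilon-insensitive Support Vector Regression with standardized inputs.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", epsilon=0.05, C=10.0))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")   # 5-fold cross-validation
print("5-fold R^2 scores:", scores.round(3))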
NASA Technical Reports Server (NTRS)
Cannizzaro, Frank E.; Ash, Robert L.
1992-01-01
A state-of-the-art computer code has been developed that incorporates a modified Runge-Kutta time integration scheme, upwind numerical techniques, multigrid acceleration, and multi-block capabilities (RUMM). A three-dimensional thin-layer formulation of the Navier-Stokes equations is employed. For turbulent flow cases, the Baldwin-Lomax algebraic turbulence model is used. Two different upwind techniques are available: van Leer's flux-vector splitting and Roe's flux-difference splitting. Full-approximation multigrid plus implicit residual and corrector smoothing were implemented to enhance the rate of convergence. Multi-block capabilities were developed to provide geometric flexibility. This feature allows the developed computer code to accommodate any grid topology or grid configuration with multiple topologies. The results shown in this dissertation were chosen to validate the computer code and display its geometric flexibility, which is provided by the multi-block structure.
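To make the ingredients concrete, here is a minimal sketch of a multistage Runge-Kutta time integration with first-order upwind differencing, applied to 1D linear advection. It illustrates the flavor of the scheme only; the stage coefficients are illustrative, and nothing here is taken from the RUMM code itself.

import numpy as np

nx, a, cfl = 200, 1.0, 0.5
dx = 1.0 / nx
dt = cfl * dx / abs(a)
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-200.0 * (x - 0.3) ** 2)          # initial Gaussian pulse

def residual(u):
    # First-order upwind difference for u_t + a u_x = 0 with a > 0:
    # the spatial derivative uses the left (upwind) neighbor.
    return -a * (u - np.roll(u, 1)) / dx

for _ in range(200):
    # Three-stage Runge-Kutta update of the Jameson type; the coefficients
    # (0.5, 0.5, 1.0) are illustrative, not those of the actual code.
    u1 = u + 0.5 * dt * residual(u)
    u2 = u + 0.5 * dt * residual(u1)
    u = u + dt * residual(u2)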
Materials-by-design: computation, synthesis, and characterization from atoms to structures
NASA Astrophysics Data System (ADS)
Yeo, Jingjie; Jung, Gang Seob; Martín-Martínez, Francisco J.; Ling, Shengjie; Gu, Grace X.; Qin, Zhao; Buehler, Markus J.
2018-05-01
In the 50 years that followed Richard Feynman’s exposition of the idea that there is ‘plenty of room at the bottom’ for manipulating individual atoms for the synthesis and manufacturing processing of materials, the materials-by-design paradigm has been developed gradually through synergistic integration of experimental material synthesis and characterization with predictive computational modeling and optimization. This paper reviews how this paradigm creates the possibility to develop materials according to specific, rational designs from the molecular to the macroscopic scale. We discuss promising techniques in experimental small-scale material synthesis and large-scale fabrication methods to manipulate atomistic or macroscale structures, which can be designed by computational modeling. These include recombinant protein technology to produce peptides and proteins with tailored sequences encoded by recombinant DNA, self-assembly processes induced by conformational transition of proteins, additive manufacturing for designing complex structures, and qualitative and quantitative characterization of materials at different length scales. We describe important material characterization techniques using numerous methods of spectroscopy and microscopy. We detail numerous multi-scale computational modeling techniques that complement these experimental techniques: DFT at the atomistic scale, fully atomistic and coarse-grained molecular dynamics at the molecular to mesoscale, and continuum modeling at the macroscale. Additionally, we present case studies that utilize experimental and computational approaches in an integrated manner to broaden our understanding of the properties of two-dimensional materials and materials based on silk and silk-elastin-like proteins.
The silicon synapse or, neural net computing.
Frenger, P
1989-01-01
Recent developments have rekindled interest in the electronic neural network, a form of parallel computer architecture loosely based on the nervous system of living creatures. This paper describes the elements of neural net computers, reviews the historical milestones in their development, and lists the advantages and disadvantages of their use. Methods for software simulation of neural network systems on existing computers, as well as creation of hardware analogues, are given. The most successful applications of these techniques, involving emulation of biological system responses, are presented. The author's experiences with neural net systems are discussed.
Parallelized modelling and solution scheme for hierarchically scaled simulations
NASA Technical Reports Server (NTRS)
Padovan, Joe
1995-01-01
This two-part paper presents the results of a benchmarked analytical-numerical investigation into the operational characteristics of a unified parallel processing strategy for implicit fluid mechanics formulations. This hierarchical poly tree (HPT) strategy is based on multilevel substructural decomposition. The tree morphology is chosen to minimize memory, communications, and computational effort. The methodology is general enough to apply to existing finite difference (FD), finite element (FEM), finite volume (FV), or spectral element (SE) based computer programs without an extensive rewrite of code. In addition to finding large reductions in memory, communications, and computational effort associated with a parallel computing environment, substantial reductions are generated in the sequential mode of application. Such improvements grow with increasing problem size. Along with a theoretical development of general 2-D and 3-D HPT, several techniques for expanding the problem size that the current generation of computers is capable of solving are presented and discussed. Among these techniques are several interpolative reduction methods. It was found that by combining several of these techniques, a relatively small interpolative reduction resulted in substantial performance gains. Several other unique features/benefits are discussed in this paper. Along with Part 1's theoretical development, Part 2 presents a numerical approach to the HPT along with four prototype CFD applications. These demonstrate the potential of the HPT strategy.
Iteration and Prototyping in Creating Technical Specifications.
ERIC Educational Resources Information Center
Flynt, John P.
1994-01-01
Claims that the development process for computer software can be greatly aided by the writers of specifications if they employ basic iteration and prototyping techniques. Asserts that computer software configuration management practices provide ready models for iteration and prototyping. (HB)
ERIC Educational Resources Information Center
Dumas, Helene M.
2010-01-01
The PEDI-CAT is a new computer adaptive test (CAT) version of the Pediatric Evaluation of Disability Inventory (PEDI). Additional PEDI-CAT items specific to postacute pediatric hospital care were recently developed using expert reviews and cognitive interviewing techniques. Expert reviews established face and construct validity, providing positive…
A Framework for the Design of Computer-Assisted Simulation Training for Complex Police Situations
ERIC Educational Resources Information Center
Söderström, Tor; Åström, Jan; Anderson, Greg; Bowles, Ron
2014-01-01
Purpose: The purpose of this paper is to report progress concerning the design of a computer-assisted simulation training (CAST) platform for developing decision-making skills in police students. The overarching aim is to outline a theoretical framework for the design of CAST to facilitate police students' development of search techniques in…
Computer Simulation as an Aid for Management of an Information System.
ERIC Educational Resources Information Center
Simmonds, W. H.; And Others
The aim of this study was to develop methods, based upon computer simulation, of designing information systems and illustrate the use of these methods by application to an information service. The method developed is based upon Monte Carlo and discrete event simulation techniques and is described in an earlier report - Sira report R412 Organizing…
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1974-01-01
Forestry, geology, and water resource applications were the focus of this study, which involved the use of computer-implemented pattern-recognition techniques to analyze ERTS-1 data. The results have proven the value of computer-aided analysis techniques, even in areas of mountainous terrain. Several analysis capabilities were developed during these ERTS-1 investigations. A procedure to rotate, deskew, and geometrically scale the MSS data results in 1:24,000-scale printouts that can be directly overlaid on 7 1/2-minute U.S.G.S. topographic maps. Several scales of computer-enhanced "false color-infrared" composites of MSS data can be obtained from a digital display unit, and emphasize the tremendous detail present in the ERTS-1 data. A grid can also be superimposed on the displayed data to aid in specifying areas of interest.
[Advancements of computer chemistry in separation of Chinese medicine].
Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei
2011-12-01
The separation techniques of Chinese medicine are not only a key technology in the research and development of Chinese medicine, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models of, and find regularities in, the complicated data of Chinese medicine systems. This paper analyzes the applicability, key technologies, basic modes, and common algorithms of computer chemistry as applied to the separation of Chinese medicine, introduces the mathematical models and parameter-setting methods of extraction kinetics, investigates several problems based on traditional Chinese medicine membrane processes, and forecasts the application prospects.
ERIC Educational Resources Information Center
Sayre, Scott Alan
The purpose of this study was to develop and validate a computer-based system that would allow interactive video developers to integrate and manage the design components prior to production. These components of an interactive video (IVD) program include visual information in a variety of formats, audio information, and instructional techniques,…
Cost Estimation Techniques for C3I System Software.
1984-07-01
...opment manmonth have been determined for maxi, midi, and mini type computers. Small to median size timeshared developments used 0.2 to 1.5 hours... development schedule 1.23 1.00 1.10 2.1.3 Detailed Model: The final codification of the COCOMO regressions was the development of separate effort... regardless of the software structure level being estimated: DEVC -- the expected development computer (maxi, midi, mini, micro); MODE -- the expected...
The use of computer graphic simulation in the development of on-orbit tele-robotic systems
NASA Technical Reports Server (NTRS)
Fernandez, Ken; Hinman, Elaine
1987-01-01
This paper describes the use of computer graphic simulation techniques to resolve critical design and operational issues for robotic systems used for on-orbit operations. These issues are robot motion control, robot path-planning/verification, and robot dynamics. The major design issues in developing effective telerobotic systems are discussed, and the use of ROBOSIM, a NASA-developed computer graphic simulation tool, to address these issues is presented. Simulation plans for the Space Station and the Orbital Maneuvering Vehicle are presented and discussed.
X-ray computed tomography to study rice (Oryza sativa L.) panicle development
Jhala, Vibhuti M.; Thaker, Vrinda S.
2015-01-01
Computed tomography is an important technique for developing digital agricultural models that may help farmers and breeders increase crop quality and yield. In the present study an attempt has been made to understand rice seed development within the panicle at different developmental stages using this technique. During the first phase of cell division the Hounsfield Unit (HU) value remained low, increased in the dry matter accumulation phase, and finally reached a maximum at the maturation stage. HU value and seed dry weight showed a linear relationship in the varieties studied. This relationship was confirmed subsequently using seven other varieties. This is therefore an easy, simple, and non-invasive technique which may help breeders to select the best varieties. In addition, it may also help farmers to optimize post-anthesis agronomic practices as well as to decide the crop harvest time for higher grain yield. PMID:26265763
Computer methods for sampling from the gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, M.E.; Tadikamalla, P.R.
1978-01-01
Considerable attention has recently been directed at developing ever faster algorithms for generating gamma random variates on digital computers. This paper surveys the current state of the art including the leading algorithms of Ahrens and Dieter, Atkinson, Cheng, Fishman, Marsaglia, Tadikamalla, and Wallace. General random variate generation techniques are explained with reference to these gamma algorithms. Computer simulation experiments on IBM and CDC computers are reported.
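The generators surveyed are mostly acceptance-rejection methods; the sketch below shows that idea in compact form, using the squeeze method of Marsaglia and Tsang for shape alpha > 1 (a later method than those in the 1978 survey, chosen because it illustrates the same acceptance-rejection principle in a few lines).

import math, random

def gamma_rv(alpha):
    # Marsaglia-Tsang: squeeze a transformed normal proposal; valid for alpha > 1.
    d = alpha - 1.0 / 3.0
    c = 1.0 / math.sqrt(9.0 * d)
    while True:
        x = random.gauss(0.0, 1.0)
        v = (1.0 + c * x) ** 3
        if v <= 0.0:
            continue                         # reject proposals outside the support
        u = random.random()
        if math.log(u) < 0.5 * x * x + d - d * v + d * math.log(v):
            return d * v                     # accepted gamma(alpha, 1) variate

print([round(gamma_rv(2.5), 3) for _ in range(5)])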
Simulation techniques in hyperthermia treatment planning
Paulides, MM; Stauffer, PR; Neufeld, E; Maccarini, P; Kyriakou, A; Canters, RAM; Diederich, C; Bakker, JF; Van Rhoon, GC
2013-01-01
Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44°C, significantly enhances radiotherapy and chemotherapy effectiveness (1). Driven by the developments in computational techniques and computing power, personalized hyperthermia treatment planning (HTP) has matured and has become a powerful tool for optimizing treatment quality. Electromagnetic, ultrasound, and thermal simulations using realistic clinical setups are now being performed to achieve patient-specific treatment optimization. In addition, extensive studies aimed at properly implementing novel HT tools and techniques, and at assessing the quality of HT, are becoming more common. In this paper, we review the simulation tools and techniques developed for clinical hyperthermia, and evaluate their current status on the path from “model” to “clinic”. In addition, we illustrate the major techniques employed for validation and optimization. HTP has become an essential tool for improvement, control, and assessment of HT treatment quality. As such, it plays a pivotal role in the quest to establish HT as an efficacious addition to multi-modality treatment of cancer. PMID:23672453
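The thermal simulations referenced above are commonly built on the Pennes bioheat equation; one standard form (presented here as general background, not quoted from the paper) is

\rho c \, \frac{\partial T}{\partial t} = \nabla \cdot \left( k \nabla T \right) - \rho_b c_b \, \omega \left( T - T_b \right) + Q_{met} + Q_{ext}

where \rho, c, and k are the tissue density, specific heat, and thermal conductivity, \omega is the blood perfusion rate with blood properties \rho_b, c_b and arterial temperature T_b, Q_{met} is metabolic heating, and Q_{ext} is the power deposited by the heating applicators (for electromagnetic HT, the SAR times tissue density).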
Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment
Imai, Kazuhiro
2015-01-01
Finite element analysis (FEA) is a computer technique for structural stress analysis developed in engineering mechanics. Over the past 40 years, FEA has been applied to investigate the structural behavior of human bones. As faster computers were acquired, better FEA using 3-dimensional computed tomography (CT) was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA that evaluate its accuracy and reliability in human bones, and clinical application studies assessing fracture risk and the effects of osteoporosis medication are overviewed. PMID:26309819
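In typical CT/FEA pipelines, the CT attenuation of each element is first calibrated to a bone density and then mapped to an elastic modulus through an empirical power law; a common form (general background, with coefficients that are site- and study-specific, not values from this review) is

\rho = a + b \cdot \mathrm{HU}, \qquad E = c \, \rho^{d}

where HU is the Hounsfield unit value of the element, \rho the calibrated bone density, E the assigned Young's modulus, and a, b, c, d constants obtained from phantom calibration and published density-modulus relations.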
Structural biology computing: Lessons for the biomedical research sciences.
Morin, Andrew; Sliz, Piotr
2013-11-01
The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. Copyright © 2013 Wiley Periodicals, Inc.
WELDSMART: A vision-based expert system for quality control
NASA Technical Reports Server (NTRS)
Andersen, Kristinn; Barnett, Robert Joel; Springfield, James F.; Cook, George E.
1992-01-01
This work was aimed at exploring means for utilizing computer technology in quality inspection and evaluation. Inspection of metallic welds was selected as the main application for this development, and primary emphasis was placed on visual inspection, as opposed to other inspection methods, such as radiographic techniques. Emphasis was placed on methodologies with the potential for use in real-time quality control systems. Because quality evaluation is somewhat subjective, despite various efforts to classify discontinuities and standardize inspection methods, the task of using a computer for both inspection and evaluation was not trivial. The work started out with a review of the various inspection techniques that are used for quality control in welding. Among other observations from this review was the finding that most weld defects result in abnormalities that may be seen by visual inspection. This supports the approach of emphasizing visual inspection for this work. Quality control consists of two phases: (1) identification of weld discontinuities (some of which may be severe enough to be classified as defects), and (2) assessment or evaluation of the weld based on the observed discontinuities. Usually the latter phase results in a pass/fail judgement for the inspected piece. It is the conclusion of this work that the first of the above tasks, identification of discontinuities, is the most challenging one. It calls for sophisticated image processing and image analysis techniques, and frequently ad hoc methods have to be developed to identify specific features in the weld image. The difficulty of this task is generally not due to limited computing power. In most cases it was found that a modest personal computer or workstation could carry out most computations in a reasonably short time period. Rather, the algorithms and methods necessary for identifying weld discontinuities were in some cases limited. The fact that specific techniques were finally developed and successfully demonstrated to work illustrates that the general approach taken here appears to be promising for commercial development of computerized quality inspection systems. Inspection based on these techniques may be used to supplement or substitute for more elaborate inspection methods, such as x-ray inspections.
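A toy sketch of the discontinuity-identification step: flag pixels that deviate strongly from a smoothed local background of the weld image. This illustrates only the general idea; as the abstract notes, practical systems require far more sophisticated, often ad hoc, feature extraction. The data here are synthetic.

import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(0)
image = rng.normal(100.0, 2.0, (64, 64))       # synthetic weld image intensities
image[30:33, 40:55] -= 25.0                    # a dark, crack-like discontinuity

background = uniform_filter(image, size=11)    # local background estimate
candidates = np.abs(image - background) > 10.0 # deviation threshold
print("candidate discontinuity pixels:", int(candidates.sum()))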
Computational materials design of crystalline solids.
Butler, Keith T; Frost, Jarvist M; Skelton, Jonathan M; Svane, Katrine L; Walsh, Aron
2016-11-07
The modelling of materials properties and processes from first principles is becoming sufficiently accurate as to facilitate the design and testing of new systems in silico. Computational materials science is both valuable and increasingly necessary for developing novel functional materials and composites that meet the requirements of next-generation technology. A range of simulation techniques are being developed and applied to problems related to materials for energy generation, storage and conversion including solar cells, nuclear reactors, batteries, fuel cells, and catalytic systems. Such techniques may combine crystal-structure prediction (global optimisation), data mining (materials informatics) and high-throughput screening with elements of machine learning. We explore the development process associated with computational materials design, from setting the requirements and descriptors to the development and testing of new materials. As a case study, we critically review progress in the fields of thermoelectrics and photovoltaics, including the simulation of lattice thermal conductivity and the search for Pb-free hybrid halide perovskites. Finally, a number of universal chemical-design principles are advanced.
Model reduction for the numerical analysis of Low Impact Development techniques
NASA Astrophysics Data System (ADS)
Brunetti, Giuseppe; Šimůnek, Jirka; Wöhling, Thomas; Piro, Patrizia
2017-04-01
Mechanistic models have proven to be accurate and reliable tools for the numerical analysis of the hydrological behavior of Low Impact Development (LID) techniques. However, their widespread adoption is limited by their complexity and computational cost. Recent studies have tried to address this issue by investigating the application of new techniques, such as surrogate-based modeling. However, current results are still limited and fragmented. One such approach, the Model Order Reduction (MOR) technique, can represent a valuable tool for reducing the computational complexity of a numerical problem by computing an approximation of the original model. While this technique has been extensively used in water-related problems, no studies have evaluated its use in LID modeling. Thus, the main aim of this study is to apply the MOR technique to the development of a reduced-order model (ROM) for the numerical analysis of the hydrologic behavior of LIDs, in particular green roofs. The model should be able to correctly reproduce all the hydrological processes of a green roof while reducing the computational cost. The proposed model decouples the subsurface water dynamics of a green roof into a) one-dimensional (1D) vertical flow through the green roof itself and b) one-dimensional saturated lateral flow along the impervious rooftop. The green roof is horizontally discretized into N elements. Each element represents a vertical domain, which can have different properties or boundary conditions. The 1D Richards equation is used to simulate flow in the substrate and drainage layers. Simulated outflow from the vertical domain is used as a recharge term for the saturated lateral flow, which is described using the kinematic wave approximation of the Boussinesq equation. The proposed model has been compared with the mechanistic model HYDRUS-2D, which numerically solves the Richards equation for the whole domain. The HYDRUS-1D code has been used for the description of vertical flow, while a finite volume scheme has been adopted for lateral flow. Two scenarios involving flat and steep green roofs were analyzed. Results confirmed the accuracy of the reduced-order model, which was able to reproduce both subsurface outflow and the moisture distribution in the green roof, significantly reducing the computational cost.
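For reference, the two governing equations named above can be written in one standard form (a sketch of the usual formulations, not equations transcribed from the paper). The 1D Richards equation for vertical unsaturated flow is

\frac{\partial \theta}{\partial t} = \frac{\partial}{\partial z} \left[ K(h) \left( \frac{\partial h}{\partial z} + 1 \right) \right]

with \theta the volumetric water content, h the pressure head, and K(h) the unsaturated hydraulic conductivity. The kinematic wave approximation of the Boussinesq equation for saturated lateral flow over the rooftop is

\phi \, \frac{\partial h_s}{\partial t} = - K_s \sin\alpha \, \frac{\partial h_s}{\partial x} + r(x, t)

with h_s the saturated thickness, \phi the drainable porosity, K_s the saturated conductivity, \alpha the roof slope, and r the recharge supplied by the vertical domains.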
CFD Techniques for Propulsion Applications
NASA Technical Reports Server (NTRS)
1992-01-01
The symposium was composed of the following sessions: turbomachinery computations and validations; flow in ducts, intakes, and nozzles; and reacting flows. Forty papers were presented, and they covered full 3-D code validation and numerical techniques; multidimensional reacting flow; and unsteady viscous flow for the entire spectrum of propulsion system components. The capabilities of the various numerical techniques were assessed and significant new developments were identified. The technical evaluation spells out where progress has been made and concludes that the present state of the art has almost reached the level necessary to tackle the comprehensive topic of computational fluid dynamics (CFD) validation for propulsion.
NASA Technical Reports Server (NTRS)
Greene, William H.
1990-01-01
A study was performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal of the study was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of the number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first type of technique is an overall finite difference method where the analysis is repeated for perturbed designs. The second type of technique is termed semi-analytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models. In several cases this fixed-mode approach resulted in very poor approximations of the stress sensitivities. Almost all of the original modes were required for an accurate sensitivity, and for small numbers of modes the accuracy was extremely poor. To overcome this poor accuracy, two semi-analytical techniques were developed. The first technique accounts for the change in eigenvectors through approximate eigenvector derivatives. The second technique applies the mode acceleration method of transient analysis to the sensitivity calculations. Both result in accurate values of the stress sensitivities with a small number of modes and much lower computational costs than if the vibration modes were recalculated and then used in an overall finite difference method.
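The semi-analytical idea can be summarized compactly. Differentiating the transient equations of motion with respect to a design parameter p gives a pseudo-load problem for the response sensitivity (a standard derivation, sketched here rather than taken from the report):

M \ddot{u} + C \dot{u} + K u = f(t)

M \frac{\partial \ddot{u}}{\partial p} + C \frac{\partial \dot{u}}{\partial p} + K \frac{\partial u}{\partial p} = \frac{\partial f}{\partial p} - \left( \frac{\partial M}{\partial p} \ddot{u} + \frac{\partial C}{\partial p} \dot{u} + \frac{\partial K}{\partial p} u \right)

The coefficient-matrix derivatives are then approximated by finite differences, e.g. \partial K / \partial p \approx [K(p + \Delta p) - K(p)] / \Delta p, which is what makes the method semi-analytical rather than fully analytical.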
NASA Technical Reports Server (NTRS)
Ray, R. J.; Hicks, J. W.; Alexander, R. I.
1988-01-01
The X-29A advanced technology demonstrator has shown the practicality and advantages of the capability to compute and display, in real time, aeroperformance flight results. This capability includes the calculation of the in-flight measured drag polar, lift curve, and aircraft specific excess power. From these elements many other types of aeroperformance measurements can be computed and analyzed. The technique can be used to give an immediate postmaneuver assessment of data quality and maneuver technique, thus increasing the productivity of a flight program. A key element of this new method was the concurrent development of a real-time in-flight net thrust algorithm, based on the simplified gross thrust method. This net thrust algorithm allows for the direct calculation of total aircraft drag.
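The quantities named above are tied together by standard flight-performance relations (quoted here as general background; the exact formulation of the X-29A algorithms is not reproduced). With net thrust T_{net} from the simplified gross thrust method, the along-path equation of motion gives drag directly, and specific excess power follows:

D = T_{net} - W \left( \frac{1}{g} \frac{dV}{dt} + \sin\gamma \right), \qquad P_s = \frac{(T_{net} - D) \, V}{W}

where V is true airspeed, \gamma the flight-path angle, and W the aircraft weight. The measured drag polar is then the locus of drag coefficient versus lift coefficient obtained from D and the lift required to sustain the maneuver.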
NASA Technical Reports Server (NTRS)
1979-01-01
A comprehensive review of all NASA airfoil research, conducted both in-house and under grant and contract, as well as a broad spectrum of airfoil research outside of NASA is presented. Emphasis is placed on the development of computational aerodynamic codes for airfoil analysis and design, the development of experimental facilities and test techniques, and all types of airfoil applications.
NASA Technical Reports Server (NTRS)
Dumas, A.
1981-01-01
Three major areas considered in the development of an overall maintenance scheme for computer equipment are described. The areas of concern related to fault isolation techniques are: the programmer (or user), the company and its policies, and the manufacturer of the equipment.
Photo-reconnaissance applications of computer processing of images.
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1971-01-01
An image-processing technique is developed for the enhancement and calibration of imaging experiments. The technique is shown to be useful not only for the original application but also when applied to images from a wide variety of sources.
Simulation of intelligent object behavior in a virtual reality system
NASA Astrophysics Data System (ADS)
Mironov, Sergey F.
1998-01-01
This article presents a technique for computer control of power boat movement in real-time marine trainers and arcade games. The author developed and successfully implemented a general technique allowing intelligent navigation of computer-controlled moving objects that proved to be appropriate for real-time applications. This technique covers a significant part of the necessary behavioral tasks that appear in such titles. At the same time, the technique forms part of a more general system that involves control of less complicated characters of another nature. Being an open system, it can easily be used by action or arcade programmers to improve the overall quality of the characters' artificial intelligence.
The Mind Research Network - Mental Illness Neuroscience Discovery Grant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, J.; Calhoun, V.
The scientific and technological programs of the Mind Research Network (MRN) reflect DOE missions in basic science and associated instrumentation, computational modeling, and experimental techniques. MRN's technical goals over the course of this project have been to develop and apply integrated, multi-modality functional imaging techniques derived from a decade of DOE-supported research and technology development.
Multiprocessor programming environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, M.B.; Fornaro, R.
Programming tools and techniques have been well developed for traditional uniprocessor computer systems. The focus of this research project is on the development of a programming environment for a high speed real time heterogeneous multiprocessor system, with special emphasis on languages and compilers. The new tools and techniques will allow a smooth transition for programmers with experience only on single processor systems.
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Feiler, Charles E. (Editor)
1994-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is operated by the Ohio Aerospace Institute (OAI) and the NASA Lewis Research Center in Cleveland, Ohio. The purpose of ICOMP is to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. This report describes the accomplishments and activities at ICOMP during 1993.
ERIC Educational Resources Information Center
Nagasinghe, Iranga
2010-01-01
This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS methods are two highly successful applications of modern Linear Algebra in computer science and engineering. They constitute the essential technologies that account for the immense growth and…
ERIC Educational Resources Information Center
Wilkins, Charles L.
Computer-assisted instruction (CAI) has proven useful in teaching chemistry instrumentation techniques to undergraduate students. The work completed at the time of this interim report has clearly shown that a general purpose laboratory computer system, equipped with suitable devices to allow direct data input from experiments, can be an effective…
Different but Similar: Computer Use Patterns between Young Korean Males and Females
ERIC Educational Resources Information Center
Lim, Keol; Meier, Ellen B.
2011-01-01
This study was developed to identify and describe new trends and gender differences in the use of computers and the Internet in South Korea. In this mixed-method study, both quantitative and qualitative techniques were used. Results indicated that both males and females used computers generally for four purposes: social networking, personal…
Reduction and analysis of data collected during the electromagnetic tornado experiment
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1976-01-01
Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitalization and analysis of the data; data reduction techniques for short pulse radar data, and the simulation of radar returns from the sea surface by computer models.
NASA Technical Reports Server (NTRS)
Jones, Robert E.; Kramarchuk, Ihor; Williams, Wallace D.; Pouch, John J.; Gilbert, Percy
1989-01-01
Computer-controlled thermal-wave microscope developed to investigate III-V compound semiconductor devices and materials. Is a nondestructive technique providing information on subsurface thermal features of solid samples. Furthermore, because this is a subsurface technique, three-dimensional imaging is also possible. Microscope uses intensity-modulated electron beam of modified scanning electron microscope to generate thermal waves in sample. Acoustic waves generated by thermal waves are received by transducer and processed in computer to form images displayed on video display of microscope or recorded on magnetic disk.
Multiple regression technique for Pth degree polynomials with and without linear cross products
NASA Technical Reports Server (NTRS)
Davis, J. W.
1973-01-01
A multiple regression technique was developed by which the nonlinear behavior of specified independent variables can be related to a given dependent variable. The polynomial expression can be of Pth degree and can incorporate N independent variables. Two cases are treated such that mathematical models can be studied both with and without linear cross products. The resulting surface fits can be used to summarize trends for a given phenomenon and provide a mathematical relationship for subsequent analysis. To implement this technique, separate computer programs were developed for the case without linear cross products and for the case incorporating such cross products; these programs evaluate the various constants in the model regression equation. In addition, the significance of the estimated regression equation is considered, and the standard deviation, the F statistic, the maximum absolute percent error, and the average of the absolute values of the percent errors are evaluated. The computer programs and their manner of utilization are described. Sample problems are included to illustrate the use and capability of the technique, and show the output formats and typical plots comparing computer results to each set of input data.
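A minimal sketch of the two cases, using numpy least squares in place of the original FORTRAN programs (the data and coefficients below are hypothetical). A second-degree model in two variables is fitted with and without the linear cross-product term, and the maximum absolute percent error, one of the goodness-of-fit measures named above, is reported.

import numpy as np

rng = np.random.default_rng(1)
x1, x2 = rng.random(50), rng.random(50)
y = 3.0 + 2.0 * x1 - 1.5 * x2 + 0.7 * x1 * x2 + 0.3 * x1**2 + 0.05 * rng.standard_normal(50)

A_no_cross = np.column_stack([np.ones(50), x1, x2, x1**2, x2**2])
A_cross = np.column_stack([A_no_cross, x1 * x2])    # adds the linear cross product

for label, A in [("without cross product", A_no_cross), ("with cross product", A_cross)]:
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares fit of the constants
    pred = A @ coef
    max_abs_pct_err = np.max(np.abs((y - pred) / y)) * 100.0
    print(label, "max |% error| =", round(max_abs_pct_err, 2))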
Developments in Signature Process Control
NASA Astrophysics Data System (ADS)
Keller, L. B.; Dominski, Marty
1993-01-01
Developments in the adaptive process control technique known as Signature Process Control for Advanced Composites (SPCC) are described. This computer control method for autoclave processing of composites was used to develop an optimum cure cycle for AFR 700B polyimide and for an experimental poly-isoimide. An improved process cycle was developed for Avimid N polyimide. The potential for extending the SPCC technique to pre-preg quality control, press molding, pultrusion, and RTM is briefly discussed.
Efficient grid-based techniques for density functional theory
NASA Astrophysics Data System (ADS)
Rodriguez-Hernandez, Juan Ignacio
Understanding the chemical and physical properties of molecules and materials at a fundamental level often requires quantum-mechanical models of these substances' electronic structure. This type of many-body quantum mechanics calculation is computationally demanding, hindering its application to substances with more than a few hundred atoms. The supreme goal of many researchers in quantum chemistry, and the topic of this dissertation, is to develop more efficient computational algorithms for electronic structure calculations. In particular, this dissertation develops two new numerical integration techniques for computing molecular and atomic properties within conventional Kohn-Sham Density Functional Theory (KS-DFT) of molecular electronic structure. The first of these grid-based techniques is based on the transformed sparse grid construction. In this construction, a sparse grid is generated in the unit cube and then mapped to real space according to the pro-molecular density using the conditional distribution transformation. The transformed sparse grid was implemented in the program deMon2k, where it is used as the numerical integrator for the exchange-correlation energy and potential in the KS-DFT procedure. We tested our grid by computing ground state energies, equilibrium geometries, and atomization energies. The accuracy of these test calculations shows that our grid is more efficient than some previous integration methods: our grids use fewer points to obtain the same accuracy. The transformed sparse grids were also tested for integrating, interpolating, and differentiating in different dimensions (n = 1, 2, 3, 6). The second technique is a grid-based method for computing atomic properties within QTAIM. It was also implemented in deMon2k. The performance of the method was tested by computing QTAIM atomic energies, charges, dipole moments, and quadrupole moments. For medium accuracy, our method is the fastest one we know of.
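A 1D illustration of the conditional-distribution transform at the heart of the first technique: uniformly spaced points in the unit interval are mapped through the inverse CDF of a model density, so that grid points concentrate where the density is large. A Gaussian stands in for the pro-molecular density, and the code is a pedagogical sketch, not the deMon2k implementation.

import numpy as np
from scipy.stats import norm

n = 15
u = (np.arange(1, n + 1) - 0.5) / n          # uniform grid on (0, 1)
x = norm.ppf(u)                              # mapped grid points in real space
w = (1.0 / n) / norm.pdf(x)                  # quadrature weights from the Jacobian

# Sanity check: integrating the density itself returns ~1.
print(np.sum(w * norm.pdf(x)))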
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Originally, computer programs for engineering design focused on detailed geometric design. Later, computer programs for algorithmically performing the preliminary design of specific well-defined classes of objects became commonplace. However, due to the need for extreme flexibility, it appears unlikely that conventional programming techniques will prove fruitful in developing computer aids for engineering conceptual design. Symbolic processing techniques, such as object-oriented programming and constraint propagation, facilitate such flexibility. Object-oriented programming allows programs to be organized around the objects and behavior to be simulated, rather than around fixed sequences of function and subroutine calls. Constraint propagation allows declarative statements to be understood as designating multi-directional mathematical relationships among all the variables of an equation, rather than as unidirectional assignments to the variable on the left-hand side of the equation, as in conventional computer programs. The research has concentrated on applying these two techniques to the development of a general-purpose computer aid for engineering conceptual design. Object-oriented programming techniques are utilized to implement a user-extensible database of design components. The mathematical relationships which model both the geometry and physics of these components are managed via constraint propagation. In addition to this component-based hierarchy, special-purpose data structures are provided for describing component interactions and supporting state-dependent parameters. In order to investigate the utility of this approach, a number of sample design problems from the field of aerospace engineering were implemented using the prototype design tool, Rubber Airplane. The additional level of organizational structure obtained by representing design knowledge in terms of components is observed to provide greater convenience to the program user, and to result in a database of engineering information which is easier both to maintain and to extend.
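A toy sketch of constraint propagation as described above: a single product constraint that solves for whichever variable is missing, so the same declarative relation works in any direction. Class and variable names are hypothetical, not taken from Rubber Airplane.

class Cell:
    def __init__(self, name):
        self.name, self.value, self.constraints = name, None, []
    def set(self, value):
        self.value = value
        for c in self.constraints:        # wake up attached constraints
            c.propagate()

class Product:                            # enforces a * b = c, multi-directionally
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        for cell in (a, b, c):
            cell.constraints.append(self)
    def propagate(self):
        a, b, c = self.a, self.b, self.c
        if a.value is not None and b.value is not None and c.value is None:
            c.set(a.value * b.value)
        elif c.value is not None and a.value is not None and b.value is None:
            b.set(c.value / a.value)
        elif c.value is not None and b.value is not None and a.value is None:
            a.set(c.value / b.value)

span, chord, area = Cell("span"), Cell("chord"), Cell("area")
Product(span, chord, area)
area.set(30.0)
span.set(12.0)
print(chord.value)                        # -> 2.5, solved backward from area and span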
Linear static structural and vibration analysis on high-performance computers
NASA Technical Reports Server (NTRS)
Baddourah, M. A.; Storaasli, O. O.; Bostic, S. W.
1993-01-01
Parallel computers offer the opportunity to significantly reduce the computation time necessary to analyze large-scale aerospace structures. This paper presents algorithms developed for and implemented on massively parallel computers, hereafter referred to as Scalable High-Performance Computers (SHPC), for the most computationally intensive tasks involved in structural analysis, namely, generation and assembly of system matrices, solution of systems of equations, and calculation of eigenvalues and eigenvectors. Results on SHPC are presented for large-scale structural problems (i.e., models for High-Speed Civil Transport). The goal of this research is to develop a new, efficient technique which extends structural analysis to SHPC and makes large-scale structural analyses tractable.
Computational Approaches to Drug Repurposing and Pharmacology
Hodos, Rachel A; Kidd, Brian A; Khader, Shameer; Readhead, Ben P; Dudley, Joel T
2016-01-01
Data in the biological, chemical, and clinical domains are accumulating at ever-increasing rates and have the potential to accelerate and inform drug development in new ways. Challenges and opportunities now lie in developing analytic tools to transform these often complex and heterogeneous data into testable hypotheses and actionable insights. This is the aim of computational pharmacology, which uses in silico techniques to better understand and predict how drugs affect biological systems, which can in turn improve clinical use, avoid unwanted side effects, and guide selection and development of better treatments. One exciting application of computational pharmacology is drug repurposing: finding new uses for existing drugs. Already yielding many promising candidates, this strategy has the potential to improve the efficiency of the drug development process and reach patient populations with previously unmet needs such as those with rare diseases. While current techniques in computational pharmacology and drug repurposing often focus on just a single data modality such as gene expression or drug-target interactions, we rationalize that methods such as matrix factorization that can integrate data within and across diverse data types have the potential to improve predictive performance and provide a fuller picture of a drug's pharmacological action. PMID:27080087
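A minimal sketch of the matrix-factorization idea mentioned above: factor a partially observed drug-target interaction matrix into low-rank components, then use the reconstruction to score unobserved pairs as repurposing hypotheses. scikit-learn's NMF and the synthetic matrix are stand-ins chosen for illustration.

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(42)
true = rng.random((30, 4)) @ rng.random((4, 20))      # hidden low-rank interactions
mask = rng.random((30, 20)) > 0.3                     # ~70% of entries observed
observed = true * mask                                # unknown entries set to zero

model = NMF(n_components=4, init="nndsvda", max_iter=500)
W = model.fit_transform(observed)
scores = W @ model.components_                        # filled-in interaction scores
print("mean abs error on held-out entries:", np.abs(scores - true)[~mask].mean().round(3))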
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
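An example of the moment-based waveform features alluded to above (statistical classification in terms of moments): variance, skewness, and kurtosis of a vibration signal. A developing fault that produces periodic impacts typically shows up most clearly as elevated kurtosis. The signals here are synthetic stand-ins, not SSME data.

import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(7)
healthy = rng.standard_normal(4096)
faulty = healthy.copy()
faulty[::256] += 8.0 * rng.standard_normal(16)   # periodic impacts from a defect

for name, sig in [("healthy", healthy), ("faulty", faulty)]:
    print(name, "var:", np.var(sig).round(2),
          "skew:", skew(sig).round(2), "kurtosis:", kurtosis(sig).round(2))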
Optimization of thermal protection systems for the space vehicle. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
1972-01-01
The development of computational techniques for the design optimization of thermal protection systems for the space shuttle vehicle is discussed. The resulting computer program was used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in FORTRAN IV for the CDC 6400 computer, but it was subsequently converted to the FORTRAN V language for use on the Univac 1108.
Acoustic Source Bearing Estimation (ASBE) computer program development
NASA Technical Reports Server (NTRS)
Wiese, Michael R.
1987-01-01
A new bearing estimation algorithm (Acoustic Source Analysis Technique - ASAT) and an acoustic analysis computer program (Acoustic Source Bearing Estimation - ASBE) are described, which were developed by Computer Sciences Corporation for NASA Langley Research Center. The ASBE program is used by the Acoustics Division/Applied Acoustics Branch and the Instrument Research Division/Electro-Mechanical Instrumentation Branch to analyze acoustic data and estimate the azimuths from which the source signals radiated. Included are the input and output from a benchmark test case.
Assessment of computer-related health problems among post-graduate nursing students.
Khan, Shaheen Akhtar; Sharma, Veena
2013-01-01
The study was conducted to assess computer-related health problems among post-graduate nursing students and to develop a Self-Instructional Module for prevention of computer-related health problems in a selected university situated in Delhi. A descriptive survey with a correlational design was adopted. A total of 97 samples were selected from different faculties of Jamia Hamdard by multistage sampling with a systematic random sampling technique. Among post-graduate students, the majority of sample subjects had average compliance with computer-related ergonomics principles. As regards computer-related health problems, the majority of post-graduate students had moderate computer-related health problems. The Self-Instructional Module developed for prevention of computer-related health problems was found to be acceptable by the post-graduate students.
Development and Current Status of Skull-Image Superimposition - Methodology and Instrumentation.
Lan, Y
1992-12-01
This article presents a review of the literature and an evaluation of the development and application of skull-image superimposition technology - both instrumentation and methodology - contributed by a number of scholars since 1935. Along with a comparison of the methodologies involved in the two superimposition techniques - photographic and video - the author characterizes the techniques in use and the recent advances in computer image superimposition processing technology. The major disadvantage of conventional approaches is their reliance on subjective interpretation. Through painstaking comparison and analysis, computer image processing technology can make more conclusive identifications by directly testing and evaluating the various programmed indices. Copyright © 1992 Central Police University.
1984-09-01
"Verification Technique for a Class of Security Kernels," International Symposium on Programming, Lecture Notes in Computer Science 137, Springer-Verlag, New York... September 1984. MTR9531, J. K. Millen and C. M. Cerniglia, Computer Security Models. CONTRACT SPONSOR: OUSDRE/C3I & ESD/ALEE. ABSTRACT: The purpose of this report is to provide a basis for evaluating security models in the context of secure computer system development.
Computer-Aided Geometry Modeling
NASA Technical Reports Server (NTRS)
Shoosmith, J. N. (Compiler); Fulton, R. E. (Compiler)
1984-01-01
Techniques in computer-aided geometry modeling and their application are addressed. Mathematical modeling, solid geometry models, management of geometric data, development of geometry standards, and interactive and graphic procedures are discussed. The applications include aeronautical and aerospace structures design, fluid flow modeling, and gas turbine design.
Nuclear Physics Laboratory 1979 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adelberger, E.G.
1979-07-01
Research progress is reported in the following areas: astrophysics and cosmology, fundamental symmetries, nuclear structure, radiative capture, medium energy physics, heavy ion reactions, research by users and visitors, accelerator and ion source development, instrumentation and experimental techniques, and computers and computing. Publications are listed. (WHK)
Computer-Assisted Programmed Instruction in Textiles.
ERIC Educational Resources Information Center
Kean, Rita C.; Laughlin, Joan
Students in an introductory textiles course at the University of Nebraska's College of Home Economics actively participate in the learning experience through a self-paced instructional technique. Specific learning packets were developed adapting programmed instructional learning materials to computer assisted instruction (CAI). A study booklet…
A System for Generating Instructional Computer Graphics.
ERIC Educational Resources Information Center
Nygard, Kendall E.; Ranganathan, Babusankar
1983-01-01
Description of the Tektronix-Based Interactive Graphics System for Instruction (TIGSI), which was developed for generating graphics displays in computer-assisted instruction materials, discusses several applications (e.g., reinforcing learning of concepts, principles, rules, and problem-solving techniques) and presents advantages of the TIGSI…
Nationwide forestry applications program. Analysis of forest classification accuracy
NASA Technical Reports Server (NTRS)
Congalton, R. G.; Mead, R. A.; Oderwald, R. G.; Heinen, J. (Principal Investigator)
1981-01-01
The development of LANDSAT classification accuracy assessment techniques, and of a computerized system for assessing wildlife habitat from land cover maps, is considered. A literature review on accuracy assessment techniques and an explanation of the techniques developed under both projects are included, along with listings of the computer programs. The presentations and discussions at the National Working Conference on LANDSAT Classification Accuracy are summarized. Two symposium papers which were published on the results of this project are appended.
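Accuracy assessment of a land-cover classification is conventionally summarized by an error (confusion) matrix, from which overall accuracy and a chance-corrected agreement statistic such as the kappa coefficient are computed. A small sketch with hypothetical counts (not data from the project):

import numpy as np

m = np.array([[50,  5,  2],      # rows: classified category; columns: reference
              [ 4, 60,  3],
              [ 1,  2, 40]])
n = m.sum()
overall = np.trace(m) / n                             # overall accuracy
expected = (m.sum(axis=0) @ m.sum(axis=1)) / n**2     # chance agreement
kappa = (overall - expected) / (1 - expected)
print("overall:", round(overall, 3), "kappa:", round(kappa, 3))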
Determining Training Device Requirements in Army Aviation Systems
NASA Technical Reports Server (NTRS)
Poumade, M. L.
1984-01-01
A decision-making methodology which applies the systems approach to the training problem is discussed. Training is viewed as a total system instead of a collection of individual devices and unrelated techniques. The core of the methodology is the use of optimization techniques, such as the transportation algorithm and multiobjective goal programming, with training-task- and training-device-specific data. The role of computers, especially automated data bases and computer simulation models, in the development of training programs is also discussed. The approach can provide significant training enhancement and cost savings over the more traditional, intuitive form of training development and device requirements process. While given from an aviation perspective, the methodology is equally applicable to other training development efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, George S.; Brown, William Michael
2007-09-01
Techniques for high-throughput determinations of interactomes, together with high-resolution protein co-localization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high-dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.
Llanes, Antonio; Muñoz, Andrés; Bueno-Crespo, Andrés; García-Valverde, Teresa; Sánchez, Antonia; Arcas-Túnez, Francisco; Pérez-Sánchez, Horacio; Cecilia, José M
2016-01-01
The protein-folding problem has been extensively studied during the last fifty years. Understanding the dynamics of the global shape of a protein and the influence on its biological function can help us to discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed by different researchers in order to foresee the three-dimensional arrangement of the atoms of proteins from their sequences. However, the computational complexity of this problem makes mandatory the search for new models, novel algorithmic strategies, and hardware platforms that provide solutions in a reasonable time frame. In this review we present past and recent tendencies regarding protein folding simulations from both perspectives: hardware and software. Of particular interest to us are both the use of inexact solutions to this computationally hard problem as well as the hardware platforms that have been used for running this kind of soft computing technique.
Verification, Validation and Sensitivity Studies in Computational Biomechanics
Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.
2012-01-01
Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646
Yang, J; Feng, H L
2018-04-09
With the rapid development of chair-side computer-aided design and computer-aided manufacture (CAD/CAM) technology, its accuracy and operability have been greatly improved in recent years. Chair-side CAD/CAM systems can produce all kinds of indirect restorations, with the advantages of rapid, accurate, and stable production, and the technology has become a key direction of development in stomatology. This paper describes the clinical application of chair-side CAD/CAM technology for anterior aesthetic restorations from the aspects of shade and shape.
High-Performance Computing for the Electromagnetic Modeling and Simulation of Interconnects
NASA Technical Reports Server (NTRS)
Schutt-Aine, Jose E.
1996-01-01
The electromagnetic modeling of packages and interconnects plays a very important role in the design of high-speed digital circuits, and is most efficiently performed by using computer-aided design algorithms. In recent years, packaging has become a critical area in the design of high-speed communication systems and fast computers, and the importance of the software support for their development has increased accordingly. Throughout this project, our efforts have focused on the development of modeling and simulation techniques and algorithms that permit the fast computation of the electrical parameters of interconnects and the efficient simulation of their electrical performance.
Constraint-based component-modeling for knowledge-based design
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1992-01-01
The paper describes the application of various advanced programming techniques derived from artificial intelligence research to the development of flexible design tools for conceptual design. Special attention is given to two techniques which appear to be readily applicable to such design tools: constraint propagation and object-oriented programming. The implementation of these techniques in a prototype computer tool, Rubber Airplane, is described.
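The constraint-propagation idea can be sketched in a few lines: whenever enough of a constraint's variables become bound, the remaining one is computed and the update propagates onward. The toy network below relates wing area, span, and aspect ratio; it is a generic illustration, not the Rubber Airplane implementation:

```python
# Minimal local constraint propagation: each constraint knows how to
# solve for any one of its variables once the others are bound.
values = {}

def set_value(name, v, constraints):
    if name in values:
        return
    values[name] = v
    for c in constraints:
        c(constraints)

def aspect_ratio_constraint(constraints):
    # AR = span**2 / area; solve for whichever variable is still unbound.
    if "span" in values and "area" in values and "AR" not in values:
        set_value("AR", values["span"] ** 2 / values["area"], constraints)
    elif "AR" in values and "area" in values and "span" not in values:
        set_value("span", (values["AR"] * values["area"]) ** 0.5, constraints)
    elif "AR" in values and "span" in values and "area" not in values:
        set_value("area", values["span"] ** 2 / values["AR"], constraints)

constraints = [aspect_ratio_constraint]
set_value("area", 16.0, constraints)   # m^2
set_value("span", 12.0, constraints)   # m
print(values["AR"])                    # propagated result: 9.0
```

The appeal for conceptual design is that values can be supplied in any order, and derived quantities appear automatically as soon as the network is sufficiently constrained.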
Methodologies for extracting kinetic constants for multiphase reacting flow simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, S.L.; Lottes, S.A.; Golchert, B.
1997-03-01
Flows in industrial reactors often involve complex reactions of many species. A computational fluid dynamics (CFD) computer code, ICRKFLO, was developed to simulate multiphase, multi-species reacting flows. ICRKFLO uses a hybrid technique to calculate species concentration and reaction for a large number of species in a reacting flow. This technique includes a hydrodynamic and reacting flow simulation with a small but sufficient number of lumped reactions to compute flow field properties, followed by a calculation of local reaction kinetics and transport of many subspecies (on the order of 10 to 100). Kinetic rate constants for the numerous subspecies chemical reactions are difficult to determine. A methodology has been developed to extract kinetic constants from experimental data efficiently. A flow simulation of a fluid catalytic cracking (FCC) riser was successfully used to demonstrate this methodology.
Fast Learning for Immersive Engagement in Energy Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M
The fast computation that is critical for immersive engagement with, and learning from, energy simulations would be furthered by a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced-form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times (typically less than one minute of wall-clock time) suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations do not replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
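One common way to build such a reduced-form representation is to sample the full simulation offline and fit a fast statistical emulator to the results. The sketch below uses a random-forest regressor with a cheap stand-in function playing the role of the expensive simulation; the report does not specify which learning method was used, so this is purely illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def expensive_simulation(x):
    # Stand-in for a computation-intensive energy model.
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

X_train = rng.uniform(-1, 1, size=(500, 2))   # sampled scenario inputs
y_train = expensive_simulation(X_train)        # full-model runs, done offline

surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)

# Reduced-form evaluation is near-instant compared with the full model.
X_new = rng.uniform(-1, 1, size=(5, 2))
print(surrogate.predict(X_new))       # surrogate estimates
print(expensive_simulation(X_new))    # ground truth for comparison
```

Holding out part of the sampled runs for validation is one simple way to document the surrogate's accuracy and domain of validity, as the abstract suggests.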
Light aircraft lift, drag, and moment prediction: A review and analysis
NASA Technical Reports Server (NTRS)
Smetana, F. O.; Summey, D. C.; Smith, N. S.; Carden, R. K.
1975-01-01
The historical development of analytical methods for predicting the lift, drag, and pitching moment of complete light aircraft configurations in cruising flight is reviewed. Theoretical methods, based in part on techniques described in the literature and in part on original work, are developed. These methods form the basis for understanding the computer programs given to: (1) compute the lift, drag, and moment of conventional airfoils, (2) extend these two-dimensional characteristics to three dimensions for moderate-to-high aspect ratio unswept wings, (3) plot complete configurations, (4) convert the fuselage geometric data to the correct input format, (5) compute the fuselage lift and drag, (6) compute the lift and moment of symmetrical airfoils to M = 1.0 by a simplified semi-empirical procedure, and (7) compute, in closed form, the pressure distribution over a prolate spheroid at alpha = 0. Comparisons of the predictions with experiment indicate excellent lift and drag agreement for conventional airfoils and wings. Limited comparisons of body-alone drag characteristics yield reasonable agreement. Also included are discussions of interference effects and techniques for summing the results above to obtain predictions for complete configurations.
Chang, Ching-I; Yan, Huey-Yeu; Sung, Wen-Hsu; Shen, Shu-Cheng; Chuang, Pao-Yu
2006-01-01
The purpose of this research was to develop a computer-aided instruction system for intra-aortic balloon pumping (IABP) skills in clinical nursing with virtual instrument (VI) concepts. Computer graphic technologies were incorporated to provide not only static clinical nursing education, but also the simulated function of operating an expensive medical instrument with VI techniques. The content of nursing knowledge was adapted from current well-accepted clinical training materials. The VI functions were developed using computer graphic technology with photographs of real medical instruments taken by digital camera. We hope the system can provide important teaching assistance to beginners in nursing education.
Multiphase Fluid Dynamics for Spacecraft Applications
NASA Astrophysics Data System (ADS)
Shyy, W.; Sim, J.
2011-09-01
Multiphase flows involving moving interfaces between different fluids/phases are observed in nature as well as in a wide range of engineering applications. With the recent development of high-fidelity computational techniques, a number of challenging multiphase flow problems can now be computed. We introduce the basic notions of the main categories of multiphase flow computation: Lagrangian, Eulerian, and Eulerian-Lagrangian techniques to represent and follow interfaces, and sharp and continuous interface methods to model interfacial dynamics. The marker-based adaptive Eulerian-Lagrangian method, one of the most popular methods, is highlighted with microgravity and space applications, including droplet collision and spacecraft liquid fuel tank surface stability.
Machine vision for real time orbital operations
NASA Technical Reports Server (NTRS)
Vinz, Frank L.
1988-01-01
Machine vision for automation and robotic operation of Space Station era systems has the potential for increasing the efficiency of orbital servicing, repair, assembly, and docking tasks. A machine vision research project is described in which a TV camera is used for inputting visual data to a computer so that image processing may be achieved for real-time control of these orbital operations. A technique has resulted from this research which reduces computer memory requirements and greatly increases typical computational speed, such that it has the potential for development into a real-time orbital machine vision system. This technique is called AI BOSS (Analysis of Images by Box Scan and Syntax).
[Axial computer tomography of the neurocranium (author's transl)].
Stöppler, L
1977-05-27
Computed tomography (CT), a new radiographic examination technique, is highly efficient: it offers high informative content with little stress for the patient. In contrast to conventional X-ray technology, CT succeeds, by direct presentation of the structure of the soft tissues, in obtaining information that comes close to that of macroscopic neuropathology. The capacity and limitations of the method at the present stage of development are reported. Computed tomography cannot displace conventional neuroradiological methods of investigation, but it rightly serves as a screening method and supports their selective use. Indications, technical integration, and proper handling of CT are prerequisites for the full benefit of this excellent new technique.
Prediction of Scour below Flip Bucket using Soft Computing Techniques
NASA Astrophysics Data System (ADS)
Azamathulla, H. Md.; Ab Ghani, Aminuddin; Azazi Zakaria, Nor
2010-05-01
The accurate prediction of the depth of scour around hydraulic structures (trajectory spillways) has been based on experimental studies, and the equations developed are mainly empirical in nature. This paper evaluates the performance of two soft computing (intelligence) techniques, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Gene Expression Programming (GEP) approach, in predicting scour below a flip bucket spillway. The results are very promising, which supports the use of these intelligent techniques in the prediction of highly non-linear scour parameters.
Automated Quantification of Pneumothorax in CT
Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer
2012-01-01
An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091
Sedlár, Drahomír; Potomková, Jarmila; Rehorová, Jarmila; Seckár, Pavel; Sukopová, Vera
2003-11-01
The information explosion and globalization place great demands on keeping pace with new trends in the healthcare sector. The contemporary level of computer and information literacy among most health care professionals in the Teaching Hospital Olomouc (Czech Republic) is not sufficient for efficient exploitation of modern information technology in diagnostics, therapy, and nursing. The present contribution describes the application of two basic problem-solving techniques (brainstorming, SWOT analysis) to develop a project aimed at enhancing information literacy.
Geological mapping in northwestern Saudi Arabia using LANDSAT multispectral techniques
NASA Technical Reports Server (NTRS)
Blodget, H. W.; Brown, G. F.; Moik, J. G.
1975-01-01
Various computer enhancement and data extraction systems using LANDSAT data were assessed and used to complement a continuing geologic mapping program. Interactive digital classification techniques using both the parallelepiped and maximum-likelihood statistical approaches achieved very limited success in areas of highly dissected terrain. Computer-enhanced imagery developed by color compositing of stretched MSS ratio data was constructed for a test site in northwestern Saudi Arabia. Initial results indicate that several igneous and sedimentary rock types can be discriminated.
A high level language for a high performance computer
NASA Technical Reports Server (NTRS)
Perrott, R. H.
1978-01-01
The proposed computational aerodynamic facility will join the ranks of the supercomputers due to its architecture and increased execution speed. At present, the languages used to program these supercomputers have been modifications of programming languages which were designed many years ago for sequential machines. A new programming language should be developed based on the techniques which have proved valuable for sequential programming languages and incorporating the algorithmic techniques required for these supercomputers. The design objectives for such a language are outlined.
WAATS: A computer program for Weights Analysis of Advanced Transportation Systems
NASA Technical Reports Server (NTRS)
Glatt, C. R.
1974-01-01
A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data is presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems) embracing the techniques discussed has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.
Digital optical computers at the optoelectronic computing systems center
NASA Technical Reports Server (NTRS)
Jordan, Harry F.
1991-01-01
The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.
Integrating Multimedia Techniques into CS Pedagogy.
ERIC Educational Resources Information Center
Adams, Sandra Honda; Jou, Richard; Nasri, Ahmad; Radimsky, Anne-Louise; Sy, Bon K.
Through its grants, the National Science Foundation sponsors workshops that inform faculty of current topics in computer science. Such a workshop, entitled, "Developing Multimedia-based Interactive Laboratory Modules for Computer Science," was given July 27-August 6, 1998, at Illinois State University at Normal. Each participant was…
Solution of 3-dimensional time-dependent viscous flows. Part 2: Development of the computer code
NASA Technical Reports Server (NTRS)
Weinberg, B. C.; Mcdonald, H.
1980-01-01
There is considerable interest in developing a numerical scheme for solving the time-dependent viscous compressible three-dimensional flow equations to aid in the design of helicopter rotors. The development of a computer code to solve a three-dimensional unsteady approximate form of the Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is described. Results of calculations of several Cartesian test cases are presented. The computer code can be applied to more complex flow fields such as those encountered on rotating airfoils.
NASA Astrophysics Data System (ADS)
Gilbert, B. K.; Robb, R. A.; Chu, A.; Kenue, S. K.; Lent, A. H.; Swartzlander, E. E., Jr.
1981-02-01
Rapid advances during the past ten years of several forms of computer-assisted tomography (CT) have resulted in the development of numerous algorithms to convert raw projection data into cross-sectional images. These reconstruction algorithms are either 'iterative,' in which a large matrix algebraic equation is solved by successive approximation techniques; or 'closed form'. Continuing evolution of the closed form algorithms has allowed the newest versions to produce excellent reconstructed images in most applications. This paper will review several computer software and special-purpose digital hardware implementations of closed form algorithms, either proposed during the past several years by a number of workers or actually implemented in commercial or research CT scanners. The discussion will also cover a number of recently investigated algorithmic modifications which reduce the amount of computation required to execute the reconstruction process, as well as several new special-purpose digital hardware implementations under development in laboratories at the Mayo Clinic.
Mades, Dean M.; Weiss, Linda S.; Gray, John R.
1991-01-01
Techniques for computing discharge are developed for Brandon Road Dam on the Des Plaines River and for Dresden Island, Marseilles, and Starved Rock Dams on the Illinois River. At Brandon Road Dam, streamflow is regulated by the operation of Tainter gates and headgates. At Dresden Island, Marseilles, and Starved Rock Dams, only Tainter gates are operated to regulate streamflow. The locks at all dams are equipped with culvert valves that are used to fill and empty the lock. The techniques facilitate determination of discharge at locations along the upper Illinois Waterway where no streamflow-gaging stations exist. The techniques are also useful for computing low flows when the water-surface slope between control structures on the river approaches zero and traditional methods of determining discharge based on slope are unsatisfactory. Two techniques can be used to compute discharge at the dams--gate ratings and tailwater ratings. A gate rating describes the relation between discharge, gate opening, tailwater stage, and headwater stage. A tailwater rating describes the relation between tailwater stage and discharge. Gate ratings for Tainter gates at Dresden Island, Marseilles, and Starved Rock Dams are based on a total of 78 measurements of discharge that range from 569 to 86,400 cubic feet per second. Flood hydrographs developed from the gate ratings and Lockmaster records of gate opening and stage compare closely with streamflow records published for nearby streamflow-gaging stations. Additional measurements are needed to verify gate ratings for Tainter gates and headgates at Brandon Road Dam after the dam rehabilitation is completed. Extensive leakage past deteriorated headgates and sluice gates contributed to uncertainty in the ratings developed for this dam. A useful tailwater rating is developed for Marseilles Dam. Tailwater ratings for Dresden Island Dam and Starved Rock Dam are of limited use because of varying downstream channel-storage conditions. A tailwater rating could not be developed for Brandon Road Dam because its tailwater pool is substantially affected by the headwater pool of Dresden Island Dam.
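The general form of such a gate rating can be illustrated with the classical submerged-orifice relation Q = Cd * B * G * sqrt(2g * dh). The coefficient, gate width, and stages in this sketch are hypothetical placeholders, not the calibrated ratings from this study:

```python
import math

def tainter_gate_discharge(gate_opening_ft, headwater_ft, tailwater_ft,
                           gate_width_ft=60.0, cd=0.72):
    """Submerged-orifice approximation of a Tainter-gate rating.

    Q = Cd * B * G * sqrt(2 g dh). Cd and B are hypothetical here; in
    practice they would be fitted to discharge measurements such as the
    78 measurements described above.
    """
    g = 32.2  # ft/s^2
    dh = max(headwater_ft - tailwater_ft, 0.0)
    return cd * gate_width_ft * gate_opening_ft * math.sqrt(2 * g * dh)

# One gate open 4 ft with a 6 ft headwater-tailwater difference:
print(tainter_gate_discharge(4.0, 483.0, 477.0))  # ~3,400 ft^3/s
```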
Concurrent Probabilistic Simulation of High Temperature Composite Structural Response
NASA Technical Reports Server (NTRS)
Abdi, Frank
1996-01-01
A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, GENOA, is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high-temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in performing the development were: (1) utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of the structure, material, and processing of high-temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics, and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of a framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed-workstation types of computers; and (5) market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Feiler, Charles E. (Editor)
1995-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is operated by the Ohio Aerospace Institute (OAI) and funded under a cooperative agreement by the NASA Lewis Research Center in Cleveland, Ohio. The purpose of ICOMP is to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. This report describes the activities at ICOMP during 1994.
Institute for Computational Mechanics in Propulsion (ICOMP). 10
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr. (Editor); Balog, Karen (Editor); Povinelli, Louis A. (Editor)
1996-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is operated by the Ohio Aerospace Institute (OAI) and funded under a cooperative agreement by the NASA Lewis Research Center in Cleveland, Ohio. The purpose of ICOMP is to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. This report describes the activities at ICOMP during 1995.
What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?
ERIC Educational Resources Information Center
Cushion, Steve
2006-01-01
We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…
ERIC Educational Resources Information Center
Hummel, Hans G. K.; Smit, Herjan
1996-01-01
Describes mathematics courses developed for guided self-study at a distance. The courses incorporate new educational media and special didactic techniques. Provides details about how a computer can be used in such a setting, and focuses on a computer practical examination used for an advanced course in Fourier transforms. (27 references) (DDR)
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr. (Editor); Balog, Karen (Editor); Povinelli, Louis A. (Editor)
1997-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) is operated by the Ohio Aerospace Institute (OAI) and funded under a cooperative agreement by the NASA Lewis Research Center in Cleveland, Ohio. The purpose of ICOMP is to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. This report describes the activities at ICOMP during 1996.
Recent Developments in Computational Techniques for Applied Hydrodynamics.
1979-12-07
Keywords (from the report documentation page): numerical methods, fluids, incompressible flow, finite-difference methods, Poisson equation, convective equations. Abstract (fragment): ...weaknesses of the different approaches are analyzed. Finite-difference techniques have particularly attractive properties in this framework. Hence it will be worthwhile to correct, at least partially, the difficulties from which Eulerian and Lagrangian finite-difference techniques suffer, discussed in...
ERIC Educational Resources Information Center
Storer, I. J.; Campbell, R. I.
2012-01-01
Industrial Designers need to understand and command a number of modelling techniques to communicate their ideas to themselves and others. Verbal explanations, sketches, engineering drawings, computer aided design (CAD) models and physical prototypes are the most commonly used communication techniques. Within design, unlike some disciplines,…
Facial Animations: Future Research Directions & Challenges
NASA Astrophysics Data System (ADS)
Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul
2014-06-01
Nowadays, computer facial animation is used in a multitude of fields, from computer games and films to interactive multimedia, bringing together human and social studies with technical growth. Authoring computer facial animation with complex and subtle expressions is challenging and fraught with problems. As a result, most facial animation is currently authored using general-purpose computer animation techniques, which often limits the production quality and quantity of facial animation. Although computing power, facial understanding, software sophistication, and new face-centric methods continue to advance, many emerging techniques remain immature. This paper therefore surveys facial animation experts in order to define and categorize the current state of the field, the observed bottlenecks, and the developing techniques. The paper further presents a real-time simulation model of human worry and howling, with a detailed discussion of the perception of astonishment, sorrow, annoyance, and panic.
NASA Astrophysics Data System (ADS)
Poggio, Andrew J.
1988-10-01
This issue of Energy and Technology Review contains: Neutron Penumbral Imaging of Laser-Fusion Targets--using our new penumbral-imaging diagnostic, we have obtained the first images that can be used to measure directly the deuterium-tritium burn region in laser-driven fusion targets; Computed Tomography for Nondestructive Evaluation--various computed tomography systems and computational techniques are used in nondestructive evaluation; Three-Dimensional Image Analysis for Studying Nuclear Chromatin Structure--we have developed an optic-electronic system for acquiring cross-sectional views of cell nuclei, and computer codes to analyze these images and reconstruct the three-dimensional structures they represent; Imaging in the Nuclear Test Program--advanced techniques produce images of unprecedented detail and resolution from Nevada Test Site data; and Computational X-Ray Holography--visible-light experiments and numerically simulated holograms test our ideas about an X-ray microscope for biological research.
NASA Astrophysics Data System (ADS)
Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro
2012-06-01
ACAT2011. This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University. The PDF also contains details of the workshop's committees and sponsors.
NASA Technical Reports Server (NTRS)
Srokowski, A. J.
1978-01-01
The problem of obtaining accurate estimates of suction requirements on swept laminar flow control wings was discussed. A fast accurate computer code developed to predict suction requirements by integrating disturbance amplification rates was described. Assumptions and approximations used in the present computer code are examined in light of flow conditions on the swept wing which may limit their validity.
As-built design specification for proportion estimate software subsystem
NASA Technical Reports Server (NTRS)
Obrien, S. (Principal Investigator)
1980-01-01
The Proportion Estimate Processor evaluates four estimation techniques in order to get an improved estimate of the proportion of a scene that is planted in a selected crop. The four techniques to be evaluated were provided by the techniques development section and are: (1) random sampling; (2) proportional allocation, relative count estimate; (3) proportional allocation, Bayesian estimate; and (4) sequential Bayesian allocation. The user is given two options for computation of the estimated mean square error. These are referred to as the cluster calculation option and the segment calculation option. The software for the Proportion Estimate Processor is operational on the IBM 3031 computer.
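For concreteness, the random-sampling and Bayesian proportion estimates can be contrasted in a few lines. The Beta prior below is a generic textbook illustration, not the exact formulation used by the processor:

```python
import numpy as np

rng = np.random.default_rng(1)
true_p = 0.30                       # true crop proportion in the scene
pixels = rng.random(200) < true_p   # simple random sample of 200 pixels
x, n = int(pixels.sum()), pixels.size

# Technique (1), random sampling: the sample fraction.
p_random = x / n

# A Bayesian estimate in the spirit of technique (3): posterior mean
# under a Beta(a, b) prior (hypothetical prior counts).
a, b = 2.0, 5.0
p_bayes = (x + a) / (n + a + b)

print(f"random sampling: {p_random:.3f}, Bayesian: {p_bayes:.3f}")
```

The Bayesian estimate shrinks the sample fraction toward the prior mean, which reduces mean square error when the prior is informative, the quantity the processor's two calculation options are designed to evaluate.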
NASA Technical Reports Server (NTRS)
Wu, Andy
1995-01-01
Allan deviation computations for linear frequency synthesizer systems have been reported previously using real-time simulations. Even though this takes less time than actual measurement, it is still very time consuming to compute the Allan deviation for long sample times with the desired confidence level. Moreover, noises such as flicker phase noise and flicker frequency noise cannot be simulated precisely. The use of frequency domain techniques can overcome these drawbacks. In this paper the system error model of a fictitious linear frequency synthesizer is developed, and its performance using a cesium (Cs) atomic frequency standard (AFS) as a reference is evaluated using frequency domain techniques. For a linear timing system, the power spectral density at the system output can be computed with known system transfer functions and known power spectral densities of the input noise sources. The resulting power spectral density can then be used to compute the Allan variance at the system output. Sensitivities of the Allan variance at the system output to each of its independent input noises are obtained, and they are valuable for design trade-offs and troubleshooting.
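The PSD-to-Allan-variance step rests on the standard relation sigma_y^2(tau) = 2 * Int_0^inf S_y(f) * sin^4(pi f tau) / (pi f tau)^2 df. The sketch below checks it numerically for white frequency noise, where the closed form is sigma_y^2(tau) = h0 / (2 tau); the paper's synthesizer transfer functions would enter by multiplying S_y(f) by |H(f)|^2 before integrating:

```python
import numpy as np

def allan_variance_from_psd(S_y, tau, f_max=1e4, n=2_000_000):
    """sigma_y^2(tau) = 2 * Int S_y(f) sin^4(pi f tau)/(pi f tau)^2 df,
    truncated at f_max and evaluated with a simple Riemann sum."""
    f = np.linspace(1e-9, f_max, n)
    x = np.pi * f * tau
    kernel = np.sin(x) ** 4 / x ** 2
    df = f[1] - f[0]
    return 2.0 * float(np.sum(S_y(f) * kernel) * df)

h0 = 1e-22    # white-frequency-noise level (hypothetical value)
tau = 1.0     # averaging time, seconds
print(allan_variance_from_psd(lambda f: h0 * np.ones_like(f), tau))
print(h0 / (2 * tau))   # closed-form check; both are ~5e-23
```

This is exactly why the frequency-domain route handles flicker noises gracefully: they are trivial to specify as S_y(f) power laws even though they are awkward to synthesize as time series.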
Data-driven Applications for the Sun-Earth System
NASA Astrophysics Data System (ADS)
Kondrashov, D. A.
2016-12-01
Advances in observational and data mining techniques allow the extraction of information from the large volume of Sun-Earth observational data that can be assimilated into first-principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from the atmospheric and oceanic sciences that have proved useful for space physics applications, such as a computationally efficient implementation of the Kalman filter in radiation belt modeling, solar wind gap-filling by singular spectrum analysis, and a low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will also be demonstrated.
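As a flavor of the assimilation machinery mentioned above, here is a minimal scalar Kalman filter in its generic textbook form (not the radiation-belt implementation, whose state and observation operators are far larger):

```python
import numpy as np

rng = np.random.default_rng(42)

# Scalar state x_k = a x_{k-1} + w, observed as y_k = x_k + v.
a, q, r = 0.95, 0.1, 0.5          # dynamics, process var., observation var.
x_true, x_est, p_est = 0.0, 0.0, 1.0

for k in range(50):
    # Truth evolution and a noisy observation of it.
    x_true = a * x_true + rng.normal(0, np.sqrt(q))
    y = x_true + rng.normal(0, np.sqrt(r))
    # Forecast step: propagate estimate and its variance.
    x_f = a * x_est
    p_f = a * a * p_est + q
    # Analysis step: blend forecast and observation by their uncertainties.
    gain = p_f / (p_f + r)
    x_est = x_f + gain * (y - x_f)
    p_est = (1 - gain) * p_f

print(f"final estimate {x_est:.3f}, truth {x_true:.3f}")
```

The computational burden in real applications comes from the covariance update, which is why low-rank and reduced-order variants like those named above matter.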
Educational Software: A Developer's Perspective.
ERIC Educational Resources Information Center
Armstrong, Timothy C.; Loane, Russell F.
1994-01-01
Examines the current status and short-term future of computer software development in higher education. Topics discussed include educational advantages of software; current program development techniques, including object oriented programming; and market trends, including IBM versus Macintosh and multimedia programs. (LRW)
Alves, Pedro; Liu, Shuang; Wang, Daifeng; Gerstein, Mark
2018-01-01
Machine learning is an integral part of computational biology, and has already shown its use in various applications, such as prognostic tests. In the last few years in the non-biological machine learning community, ensembling techniques have shown their power in data mining competitions such as the Netflix challenge; however, such methods have not found wide use in computational biology. In this work, we endeavor to show how ensembling techniques can be applied to practical problems, including problems in the field of bioinformatics, and how they often outperform other machine learning techniques in both predictive power and robustness. Furthermore, we develop a methodology of ensembling, Multi-Swarm Ensemble (MSWE) by using multiple particle swarm optimizations and demonstrate its ability to further enhance the performance of ensembles.
Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes
NASA Astrophysics Data System (ADS)
Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry
2007-04-01
This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computing the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a level set and Hamilton-Jacobi method.
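The distance-function and fill-factor computation at the heart of the immersion technique can be illustrated on a background grid with an analytically defined cross-section (a circle here, standing in for a rotor); the smoothing width and geometry are illustrative choices, not the paper's:

```python
import numpy as np

# Background computational grid (a 2D slice of the barrel cavity).
x, y = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-1, 1, 200))

# Signed distance to an immersed circular "rotor", radius 0.4 at (0.2, 0);
# phi is negative inside the immersed domain.
phi = np.hypot(x - 0.2, y) - 0.4

# A smoothed Heaviside of -phi gives a fill factor in [0, 1], as in VOF.
eps = 2.0 / 200                    # smoothing width ~ one grid cell
fill = 0.5 * (1.0 - np.tanh(phi / eps))

print(fill.min(), fill.max())      # ~0 outside the rotor, ~1 inside
print(fill.mean() * 4.0)           # ~ rotor area: pi * 0.4**2 ~ 0.503
```

Nodes with fill near 1 are the ones that would receive the imposed rigid rotation in the velocity-pressure solve.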
Applying service learning to computer science: attracting and engaging under-represented students
NASA Astrophysics Data System (ADS)
Dahlberg, Teresa; Barnes, Tiffany; Buch, Kim; Bean, Karen
2010-09-01
This article describes a computer science course that uses service learning as a vehicle to accomplish a range of pedagogical and BPC (broadening participation in computing) goals: (1) to attract a diverse group of students and engage them in outreach to younger students to help build a diverse computer science pipeline, (2) to develop leadership and team skills using experiential techniques, and (3) to develop student attitudes associated with success and retention in computer science. First, we describe the course and how it was designed to incorporate good practice in service learning. We then report preliminary results showing a positive impact of the course on all pedagogical goals and discuss the implications of the results for broadening participation in computing.
Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John N.
1997-01-01
A multidisciplinary design optimization procedure which couples formal multiobjective techniques and complex analysis procedures (such as computational fluid dynamics (CFD) codes) was developed. The procedure has been demonstrated on a specific high-speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a problem with multiple constrained objective functions into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are introduced for each objective function during the transformation process. This enhanced procedure gives the designer the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a computational fluid dynamics (CFD) code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as the design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique is used within the optimizer to improve the overall computational efficiency of the procedure and make it suitable for design applications in an industrial setting.
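The K-S aggregation itself is compact: KS(x) = f_max + (1/rho) * ln(sum_i exp(rho * (f_i(x) - f_max))), a smooth upper envelope of the weighted objectives that a gradient method such as BFGS can minimize directly. The sketch below uses two toy weighted objectives, not the CFD and sonic boom functions of the paper:

```python
import numpy as np
from scipy.optimize import minimize

def ks(fs, rho=50.0):
    """Kreisselmeier-Steinhauser envelope of a vector of function values.
    The f_max shift keeps the exponentials numerically safe."""
    fmax = np.max(fs)
    return fmax + np.log(np.sum(np.exp(rho * (fs - fmax)))) / rho

def objectives(x, weights=(1.0, 0.5)):
    # Two competing toy objectives (stand-ins for "drag" and "boom").
    f1 = (x[0] - 1.0) ** 2 + x[1] ** 2
    f2 = x[0] ** 2 + (x[1] - 2.0) ** 2
    return np.array([weights[0] * f1, weights[1] * f2])

res = minimize(lambda x: ks(objectives(x)), x0=np.zeros(2), method="BFGS")
print(res.x, res.fun)   # a compromise point between the two minima
```

Raising a weight factor pulls the compromise toward the corresponding objective, which is exactly the designer control the enhanced procedure described above provides.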
Parallel processors and nonlinear structural dynamics algorithms and software
NASA Technical Reports Server (NTRS)
Belytschko, Ted
1990-01-01
Techniques are discussed for the implementation and improvement of vectorization and concurrency in nonlinear explicit structural finite element codes. In explicit integration methods, the computation of the element internal force vector consumes the bulk of the computer time. The program can be efficiently vectorized by subdividing the elements into blocks and executing all computations in vector mode. The structuring of elements into blocks also provides a convenient way to implement concurrency by creating tasks which can be assigned to available processors for evaluation. The techniques were implemented in a 3-D nonlinear program with one-point quadrature shell elements. Concurrency and vectorization were first implemented in a single time step version of the program. Techniques were developed to minimize processor idle time and to select the optimal vector length. A comparison of run times between the program executed in scalar, serial mode and the fully vectorized code executed concurrently using eight processors shows speed-ups of over 25. Conjugate gradient methods for solving nonlinear algebraic equations are also readily adapted to a parallel environment. A new technique for improving convergence properties of conjugate gradients in nonlinear problems is developed in conjunction with other techniques such as diagonal scaling. A significant reduction in the number of iterations required for convergence is shown for a statically loaded rigid bar suspended by three equally spaced springs.
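The diagonal-scaling idea mentioned for conjugate gradients amounts to Jacobi preconditioning. A minimal sketch on a generic symmetric positive-definite system (not the shell-element code) follows:

```python
import numpy as np

def jacobi_pcg(A, b, tol=1e-10, max_iter=500):
    """Conjugate gradients with diagonal (Jacobi) scaling as preconditioner."""
    M_inv = 1.0 / np.diag(A)          # the diagonal scaling
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for k in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, k

# SPD test matrix with a poorly scaled diagonal, where scaling helps most:
n = 100
A = np.diag(np.linspace(1.0, 1e4, n)) + 0.1 * np.ones((n, n))
b = np.ones(n)
x, iters = jacobi_pcg(A, b)
print(iters, np.linalg.norm(A @ x - b))
```

Both the matrix-vector product and the elementwise scaling vectorize and parallelize naturally, which is why the method fits the blocked, concurrent setting described above.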
Minimization search method for data inversion
NASA Technical Reports Server (NTRS)
Fymat, A. L.
1975-01-01
A technique has been developed for determining values of selected subsets of independent variables in mathematical formulations. The required computation time increases with the first power of the number of variables. This is in contrast with classical minimization methods, for which computation time increases with the third power of the number of variables.
ERIC Educational Resources Information Center
Tataw, Oben Moses
2013-01-01
Interdisciplinary research in computer science requires the development of computational techniques for practical application in different domains. This usually requires careful integration of different areas of technical expertise. This dissertation presents image and time series analysis algorithms, with practical interdisciplinary applications…
Computer graphics testbed to simulate and test vision systems for space applications
NASA Technical Reports Server (NTRS)
Cheatham, John B.
1991-01-01
Research activity has shifted from computer graphics and vision systems to the broader scope of applying concepts of artificial intelligence to robotics. Specifically, the research is directed toward developing Artificial Neural Networks, Expert Systems, and Laser Imaging Techniques for Autonomous Space Robots.
Modeling Spanish Mood Choice in Belief Statements
ERIC Educational Resources Information Center
Robinson, Jason R.
2013-01-01
This work develops a computational methodology new to linguistics that empirically evaluates competing linguistic theories on Spanish verbal mood choice through the use of computational techniques to learn mood and other hidden linguistic features from Spanish belief statements found in corpora. The machine learned probabilistic linguistic models…
Microcomputer Software Development: New Strategies for a New Technology.
ERIC Educational Resources Information Center
Kehrberg, Kent T.
1979-01-01
Provides a guide for the development of educational computer programs for use on microcomputers. Making use of the features of microcomputers, including visual, audio, and tactile techniques, is encouraged. (Author/IRT)
Off-Body Boundary-Layer Measurement Techniques Development for Supersonic Low-Disturbance Flows
NASA Technical Reports Server (NTRS)
Owens, Lewis R.; Kegerise, Michael A.; Wilkinson, Stephen P.
2011-01-01
Investigations were performed to develop accurate boundary-layer measurement techniques in a Mach 3.5 laminar boundary layer on a 7° half-angle cone at 0° angle of attack. A discussion of the measurement challenges is presented as well as how each was addressed. A computational study was performed to minimize the probe aerodynamic interference effects, resulting in improved pitot and hot-wire probe designs. Probe calibration and positioning processes were also developed with the goal of reducing the measurement uncertainties from 10% levels to less than 5% levels. Efforts were made to define the experimental boundary conditions for the cone flow so comparisons could be made with a set of companion computational simulations. The development status of the mean and dynamic boundary-layer flow measurements for a nominally sharp cone in a low-disturbance supersonic flow is presented.
NASA Technical Reports Server (NTRS)
Nez, G. (Principal Investigator); Mutter, D.
1977-01-01
The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.
NASA Astrophysics Data System (ADS)
Lanzagorta, Marco O.; Gomez, Richard B.; Uhlmann, Jeffrey K.
2003-08-01
In recent years, computer graphics has emerged as a critical component of the scientific and engineering process, and it is recognized as an important computer science research area. Computer graphics are extensively used for a variety of aerospace and defense training systems and by Hollywood's special effects companies. All these applications require the computer graphics systems to produce high quality renderings of extremely large data sets in short periods of time. Much research has been done in "classical computing" toward the development of efficient methods and techniques to reduce the rendering time required for large datasets. Quantum Computing's unique algorithmic features offer the possibility of speeding up some of the known rendering algorithms currently used in computer graphics. In this paper we discuss possible implementations of quantum rendering algorithms. In particular, we concentrate on the implementation of Grover's quantum search algorithm for Z-buffering, ray-tracing, radiosity, and scene management techniques. We also compare the theoretical performance between the classical and quantum versions of the algorithms.
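Grover's search, the algorithm named above, can be simulated classically on a state vector to make the speed-up concrete: finding one marked item among N takes about (pi/4) * sqrt(N) iterations rather than ~N/2 classical lookups. A minimal numpy sketch (an idealized simulation, not a rendering pipeline) follows:

```python
import numpy as np

n_qubits = 8
N = 2 ** n_qubits
marked = 137                      # the index the oracle recognizes

state = np.full(N, 1.0 / np.sqrt(N))          # uniform superposition
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))

for _ in range(iterations):
    state[marked] *= -1.0                      # oracle: phase-flip marked item
    state = 2.0 * state.mean() - state         # diffusion: inversion about mean

probs = state ** 2
print(iterations, probs.argmax(), probs[marked])   # 12, 137, ~0.9999
```

In a rendering context, the oracle would encode a predicate such as "this primitive is the nearest along the ray", turning an O(N) scan into an O(sqrt(N)) search on a quantum machine.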
On the usage of ultrasound computational models for decision making under ambiguity
NASA Astrophysics Data System (ADS)
Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron
2018-04-01
Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
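The two comparison metrics named above are straightforward to compute. A sketch on synthetic waveforms, which stand in for measured and simulated ultrasonic A-scans (the sampling rate and pulse parameters are hypothetical), might look like:

```python
import numpy as np
from scipy.signal import welch

fs = 50e6                                  # sampling rate, Hz (hypothetical)
t = np.arange(0, 20e-6, 1 / fs)

def pulse(t0):
    # A Gaussian-windowed 2 MHz tone burst as a stand-in A-scan.
    return np.exp(-((t - t0) / 0.5e-6) ** 2) * np.sin(2 * np.pi * 2e6 * t)

measured = pulse(5e-6) + 0.05 * np.random.default_rng(0).normal(size=t.size)
simulated = 0.9 * pulse(5.1e-6)            # model with slight amplitude/time error

# Metric 1: maximum-amplitude difference, in dB.
amp_db = 20 * np.log10(np.abs(simulated).max() / np.abs(measured).max())

# Metric 2: power spectral density comparison via Welch's method.
f, psd_m = welch(measured, fs=fs, nperseg=256)
_, psd_s = welch(simulated, fs=fs, nperseg=256)
psd_err = np.sum(np.abs(psd_s - psd_m)) / np.sum(psd_m)

print(f"amplitude difference: {amp_db:.2f} dB, relative PSD error: {psd_err:.2f}")
```

Quantifying agreement this way under controlled, known variabilities is what lets the modeled and measured populations be compared statistically rather than by eye.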
NASA Astrophysics Data System (ADS)
Sherley, Patrick L.; Pujol, Alfonso, Jr.; Meadow, John S.
1990-07-01
To provide a means of rendering complex computer architectures, languages, and input/output modalities transparent to experienced and inexperienced users, research is being conducted to develop a voice-driven/voice-response computer graphics imaging system. The system will be used for reconstructing and displaying computed tomography and magnetic resonance imaging scan data. In conjunction with this study, an artificial intelligence (AI) control strategy was developed to interface the voice components and support software to the computer graphics functions implemented on the Sun Microsystems 4/280 color graphics workstation. Based on generated text and converted renditions of verbal utterances by the user, the AI control strategy determines the user's intent and develops and validates a plan. The program type and parameters within the plan are used as input to the graphics system for reconstructing and displaying medical image data corresponding to that perceived intent. If the plan is not valid, the control strategy queries the user for additional information. The control strategy operates in a conversation mode and vocally provides system status reports. A detailed examination of the various AI techniques is presented, with major emphasis placed on their specific roles within the total control strategy structure.
ERIC Educational Resources Information Center
LEBEDEV, P.D.
On the premises that the development of programed learning by research teams of subject and technique specialists is indisputable, and that the experienced teacher in the role of individual tutor is indispensable, the technology to support programed instruction must be advanced. Automated devices employing sequential and branching techniques for…
Fluid Structure Interaction Techniques For Extrusion And Mixing Processes
NASA Astrophysics Data System (ADS)
Valette, Rudy; Vergnes, Bruno; Coupez, Thierry
2007-05-01
This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each sub-domain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computing the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a level set and Hamilton-Jacobi method.
NASA Technical Reports Server (NTRS)
Greene, William H.
1989-01-01
A study has been performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of the number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first type of technique is an overall finite difference method where the analysis is repeated for perturbed designs. The second type of technique is termed semianalytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models.
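The overall finite-difference approach can be shown on a small system: perturb a design parameter, repeat the analysis, and difference the responses. A central-difference sketch on a static two-spring model (a toy stand-in, not the paper's transient finite element cases) is:

```python
import numpy as np

def displacements(k1, k2, f=(0.0, 1.0)):
    """Static response u = K^{-1} f of a two-spring series system."""
    K = np.array([[k1 + k2, -k2],
                  [-k2,      k2]])
    return np.linalg.solve(K, np.array(f))

k1, k2, h = 100.0, 50.0, 1e-4
# Overall finite difference: the analysis is repeated for perturbed designs.
du_dk2 = (displacements(k1, k2 + h) - displacements(k1, k2 - h)) / (2 * h)

# Analytical check: u2 = f2/k1 + f2/k2, so du2/dk2 = -f2/k2**2 = -4e-4.
print(du_dk2)
```

The semianalytical alternative differentiates K itself and solves once per parameter, which avoids repeating the full analysis but requires access to the coefficient matrices.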
Applications of Deep Learning and Reinforcement Learning to Biological Data.
Mahmud, Mufti; Kaiser, Mohammed Shamim; Hussain, Amir; Vassanelli, Stefano
2018-06-01
Rapid advances in hardware-based technologies during the past decades have opened up new possibilities for life scientists to gather multimodal data in various application domains, such as omics, bioimaging, medical imaging, and (brain/body)-machine interfaces. These have generated novel opportunities for development of dedicated data-intensive machine learning techniques. In particular, recent research in deep learning (DL), reinforcement learning (RL), and their combination (deep RL) promise to revolutionize the future of artificial intelligence. The growth in computational power accompanied by faster and increased data storage, and declining computing costs have already allowed scientists in various fields to apply these techniques on data sets that were previously intractable owing to their size and complexity. This paper provides a comprehensive survey on the application of DL, RL, and deep RL techniques in mining biological data. In addition, we compare the performances of DL techniques when applied to different data sets across various application domains. Finally, we outline open issues in this challenging research area and discuss future development perspectives.
ten Kate, Gerrit L.; Sijbrands, Eric J. G.; Valkema, Roelf; ten Cate, Folkert J.; Feinstein, Steven B.; van der Steen, Antonius F. W.; Daemen, Mat J. A. P.
2010-01-01
Current developments in cardiovascular biology and imaging enable the noninvasive molecular evaluation of atherosclerotic vascular disease. Intraplaque neovascularization sprouting from the adventitial vasa vasorum has been identified as an independent predictor of intraplaque hemorrhage and plaque rupture. These intraplaque vasa vasorum result from angiogenesis, most likely under influence of hypoxic and inflammatory stimuli. Several molecular imaging techniques are currently available. Most experience has been obtained with molecular imaging using positron emission tomography and single photon emission computed tomography. Recently, the development of targeted contrast agents has allowed molecular imaging with magnetic resonance imaging, ultrasound and computed tomography. The present review discusses the use of these molecular imaging techniques to identify inflammation and intraplaque vasa vasorum to identify vulnerable atherosclerotic plaques at risk of rupture and thrombosis. The available literature on molecular imaging techniques and molecular targets associated with inflammation and angiogenesis is discussed, and the clinical applications of molecular cardiovascular imaging and the use of molecular techniques for local drug delivery are addressed. PMID:20552308
NASA Technical Reports Server (NTRS)
Easton, John W.; Struk, Peter M.; Rotella, Anthony
2008-01-01
As a part of efforts to develop an electronics repair capability for long duration space missions, techniques and materials for soldering components on a circuit board in reduced gravity must be developed. This paper presents results from testing solder joint formation in low gravity on a NASA Reduced Gravity Research Aircraft. The results presented include joints formed using eutectic tin-lead solder and one of the following fluxes: (1) a no-clean flux core, (2) a rosin flux core, and (3) a solid solder wire with external liquid no-clean flux. The solder joints are analyzed with a computed tomography (CT) technique which imaged the interior of the entire solder joint. This replaced an earlier technique that required the solder joint to be destructively ground down revealing a single plane which was subsequently analyzed. The CT analysis technique is described and results presented with implications for future testing as well as implications for the overall electronics repair effort discussed.
NASA Technical Reports Server (NTRS)
Abramson, N.
1974-01-01
The Aloha system was studied, developed, and extended to advanced forms of computer communications networks. Theoretical and simulation studies of Aloha type radio channels for use in packet switched communications networks were performed. Improved versions of the Aloha communications techniques and their extensions were tested experimentally. A packet radio repeater suitable for use with the Aloha system operational network was developed. General studies of the organization of multiprocessor systems centered on the development of the BCC 500 computer were concluded.
Computer-assisted techniques to evaluate fringe patterns
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Bhat, Gopalakrishna K.
1992-01-01
Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
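In its simplest form, the carrier-fringe (Fourier-transform) analysis mentioned above reduces to isolating one carrier sideband in the spectrum and taking the angle of the inverse transform. The sketch below is a minimal illustration of that idea on a synthetic fringe pattern; the carrier frequency, filter width, and phase map are invented assumptions, not the authors' data.

```python
# Minimal sketch of carrier-fringe (Fourier-transform) phase extraction.
import numpy as np

# Synthetic fringe pattern: carrier fx plus an unknown smooth phase map.
N = 256
y, x = np.mgrid[0:N, 0:N]
fx = 0.1                                  # carrier frequency (cycles/pixel)
phi = 2.0 * np.exp(-((x - N/2)**2 + (y - N/2)**2) / (2 * 40.0**2))
I = 1.0 + np.cos(2 * np.pi * fx * x + phi)

# 1) FFT, 2) isolate one carrier sideband with a band-pass window,
# 3) inverse FFT, 4) remove the carrier and take the wrapped phase.
F = np.fft.fftshift(np.fft.fft2(I))
u = np.fft.fftshift(np.fft.fftfreq(N))    # frequency axis (cycles/pixel)
U, V = np.meshgrid(u, u)
window = (np.abs(U - fx) < 0.05) & (np.abs(V) < 0.05)
analytic = np.fft.ifft2(np.fft.ifftshift(F * window))
wrapped = np.angle(analytic * np.exp(-2j * np.pi * fx * x))

print("recovered phase range:", wrapped.min(), wrapped.max())
```

Phase unwrapping and the differentiation that yields strain, which the paper obtains from the phase contour map, would follow this step.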
Yamada, Shigehito; Uwabe, Chigako; Nakatsu-Komatsu, Tomoko; Minekura, Yutaka; Iwakura, Masaji; Motoki, Tamaki; Nishimiya, Kazuhiko; Iiyama, Masaaki; Kakusho, Koh; Minoh, Michihiko; Mizuta, Shinobu; Matsuda, Tetsuya; Matsuda, Yoshimasa; Haishi, Tomoyuki; Kose, Katsumi; Fujii, Shingo; Shiota, Kohei
2006-02-01
Morphogenesis in the developing embryo takes place in three dimensions, and in addition, the dimension of time is another important factor in development. Therefore, the presentation of sequential morphological changes occurring in the embryo (4D visualization) is essential for understanding the complex morphogenetic events and the underlying mechanisms. Until recently, 3D visualization of embryonic structures was possible only by reconstruction from serial histological sections, which was tedious and time-consuming. During the past two decades, 3D imaging techniques have made significant advances thanks to the progress in imaging and computer technologies, computer graphics, and other related techniques. Such novel tools have enabled precise visualization of the 3D topology of embryonic structures and demonstration of spatiotemporal 4D sequences of organogenesis. Here, we describe a project in which staged human embryos were imaged by the magnetic resonance (MR) microscope, and 3D images of embryos and their organs at each developmental stage were reconstructed based on the MR data, with the aid of computer graphics techniques. On the basis of the 3D models of staged human embryos, we constructed a data set of 3D images of human embryos and made movies to illustrate the sequential process of human morphogenesis. Furthermore, a computer-based self-learning program of human embryology is being developed for educational purposes, using the photographs, histological sections, MR images, and 3D models of staged human embryos. Copyright 2005 Wiley-Liss, Inc.
Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke
2015-01-01
Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842
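The profile-first workflow described above translates directly to other languages. The sketch below is an analogous example using Python's standard-library profiler rather than the authors' R package aprof; the bottleneck function and data are invented for illustration.

```python
# Illustrative sketch of profile-guided optimization: locate the
# bottleneck first, then replace it with a vectorized equivalent.
import cProfile
import pstats
import numpy as np

def slow_mean(rows):
    # Deliberate bottleneck: per-element Python loop.
    out = []
    for row in rows:
        total = 0.0
        for v in row:
            total += v
        out.append(total / len(row))
    return out

def fast_mean(rows):
    # Vectorized equivalent; typically orders of magnitude faster.
    return np.asarray(rows).mean(axis=1)

data = np.random.rand(2000, 500)
profiler = cProfile.Profile()
profiler.enable()
slow_mean(data)
fast_mean(data)
profiler.disable()
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```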
Applications of penetrating radiation for small animal imaging
NASA Astrophysics Data System (ADS)
Hasegawa, Bruce H.; Wu, Max C.; Iwata, Koji; Hwang, Andrew B.; Wong, Kenneth H.; Barber, William C.; Dae, Michael W.; Sakdinawat, Anne E.
2002-11-01
Researchers long have relied on research involving small animals to unravel scientific mysteries in the biological sciences, and to develop new diagnostic and therapeutic techniques in the medical and health sciences. Within the past 2 decades, new techniques have been developed to manipulate the genome of the mouse, allowing the development of transgenic and knockout models of mammalian and human disease, development, and physiology. Traditionally, much biological research involving small animals has relied on the use of invasive methods such as organ harvesting, tissue sampling, and autoradiography during which the animal was sacrificed to perform a single measurement. More recently, imaging techniques have been developed that assess anatomy and physiology in the intact animal, in a way that allows the investigator to follow the progression of disease, or to monitor the response to therapeutic interventions. Imaging techniques that use penetrating radiation at millimeter or submillimeter levels to image small animals include x-ray computed tomography (microCT), single-photon emission computed tomography (microSPECT), and positron emission tomography (microPET). MicroCT generates cross-sectional slices which reveal the structure of the object with spatial resolution in the range of 50 to 100 microns. MicroSPECT and microPET are radionuclide imaging techniques in which a radiopharmaceutical injected into the animal accumulates according to metabolism, blood flow, bone remodeling, tumor growth, or other biological processes. Both microSPECT and microPET offer spatial resolutions in the range of 1-2 millimeters. However, microPET records annihilation photons produced by a positron-emitting radiopharmaceutical using electronic coincidence, and has a sensitivity approximately two orders of magnitude better than microSPECT, while microSPECT is compatible with gamma-ray emitting radiopharmaceuticals that are less expensive and more readily available than those used with microPET. High-resolution dual-modality imaging systems now are being developed that combine microPET or microSPECT with microCT in a way that facilitates more direct correlation of anatomy and physiology in the same animal. Small animal imaging allows researchers to perform experiments that are not possible with conventional invasive techniques; these modalities are thereby becoming increasingly important tools for the discovery of fundamental biological information and the development of new diagnostic and therapeutic techniques in the biomedical sciences.
A multigrid nonoscillatory method for computing high speed flows
NASA Technical Reports Server (NTRS)
Li, C. P.; Shieh, T. H.
1993-01-01
A multigrid method using different smoothers has been developed to solve the Euler equations discretized by a nonoscillatory scheme up to fourth order accuracy. The best smoothing property is provided by a five-stage Runge-Kutta technique with optimized coefficients, yet the most efficient smoother is a backward Euler technique in factored and diagonalized form. The single-grid solution for a hypersonic, viscous conic flow is in excellent agreement with the solution obtained by the third order MUSCL and Roe's method. Mach 8 inviscid flow computations for a complete entry probe have shown that the accuracy is at least as good as the symmetric TVD scheme of Yee and Harten. The implicit multigrid method is four times more efficient than the explicit multigrid technique and 3.5 times faster than the single-grid implicit technique. For a Mach 8.7 inviscid flow over a blunt delta wing at 30 deg incidence, the CPU reduction factor from the three-level multigrid computation is 2.2 on a grid of 37 x 41 x 73 nodes.
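As background for the multigrid idea itself (not the authors' Runge-Kutta or backward-Euler smoothers for the Euler equations), the sketch below shows a minimal two-grid cycle for the 1D Poisson equation with weighted-Jacobi smoothing and a coarse-grid correction; all parameters are illustrative assumptions.

```python
# Minimal two-grid cycle for -u'' = f on [0, 1] with u(0) = u(1) = 0.
import numpy as np

def jacobi(u, f, h, sweeps, w=2/3):
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h*h*f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2*u[1:-1] - u[:-2] - u[2:]) / (h*h)
    return r

def two_grid(u, f, h):
    u = jacobi(u, f, h, sweeps=3)              # pre-smooth
    r = residual(u, f, h)
    rc = r[::2].copy()                         # restrict (injection)
    ec = jacobi(np.zeros_like(rc), rc, 2*h, sweeps=50)  # coarse "solve"
    e = np.zeros_like(u)                       # prolong (linear interp)
    e[::2] = ec
    e[1:-1:2] = 0.5 * (ec[:-1] + ec[1:])
    u += e                                     # coarse-grid correction
    return jacobi(u, f, h, sweeps=3)           # post-smooth

n = 129                                        # fine grid points (2^k + 1)
h = 1.0 / (n - 1)
x = np.linspace(0, 1, n)
f = np.pi**2 * np.sin(np.pi * x)               # exact solution: sin(pi x)
u = np.zeros(n)
for cycle in range(20):
    u = two_grid(u, f, h)
print("max error:", np.abs(u - np.sin(np.pi * x)).max())
```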
Natural Inspired Intelligent Visual Computing and Its Application to Viticulture.
Ang, Li Minn; Seng, Kah Phooi; Ge, Feng Lu
2017-05-23
This paper presents an investigation of nature-inspired intelligent computing and its corresponding application towards visual information processing systems for viticulture. The paper has three contributions: (1) a review of visual information processing applications for viticulture; (2) the development of nature-inspired computing algorithms based on artificial immune system (AIS) techniques for grape berry detection; and (3) the application of the developed algorithms towards real-world grape berry images captured in natural conditions from vineyards in Australia. The AIS algorithms in (2) were developed based on a nature-inspired clonal selection algorithm (CSA) which is able to detect the arcs in the berry images with precision, based on a fitness model. The arcs detected are then extended to perform the multiple-arc and ring-detector information processing for the berry detection application. The performance of the developed algorithms was compared with traditional image processing algorithms like the circular Hough transform (CHT) and other well-known circle detection methods. The proposed AIS approach gave an F-score of 0.71, compared with F-scores of 0.28 and 0.30 for the CHT and a parameter-free circle detection technique (RPCD), respectively.
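For reference, the circular Hough transform baseline the authors compare against is available in common libraries. A minimal sketch using OpenCV follows; the file name and all parameter values are hypothetical, not the study's settings.

```python
# Sketch of the circular Hough transform (CHT) baseline, using OpenCV.
import cv2
import numpy as np

img = cv2.imread("berries.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input
img = cv2.medianBlur(img, 5)                           # suppress speckle

circles = cv2.HoughCircles(
    img, cv2.HOUGH_GRADIENT,
    dp=1.2,          # inverse accumulator resolution
    minDist=20,      # minimum distance between detected centers
    param1=100,      # Canny high threshold
    param2=30,       # accumulator vote threshold
    minRadius=5, maxRadius=60)

if circles is not None:
    for cx, cy, r in np.round(circles[0]).astype(int):
        print(f"berry candidate at ({cx}, {cy}), radius {r}")
```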
NASA Technical Reports Server (NTRS)
Smith, Phillip N.
1990-01-01
The automation of low-altitude rotorcraft flight depends on the ability to detect, locate, and navigate around obstacles lying in the rotorcraft's intended flightpath. Computer vision techniques provide a passive method of obstacle detection and range estimation, for obstacle avoidance. Several algorithms based on computer vision methods have been developed for this purpose using laboratory data; however, further development and validation of candidate algorithms require data collected from rotorcraft flight. A data base containing low-altitude imagery augmented with the rotorcraft and sensor parameters required for passive range estimation is not readily available. Here, the emphasis is on the methodology used to develop such a data base from flight-test data consisting of imagery, rotorcraft and sensor parameters, and ground-truth range measurements. As part of the data preparation, a technique for obtaining the sensor calibration parameters is described. The data base will enable the further development of algorithms for computer vision-based obstacle detection and passive range estimation, as well as provide a benchmark for verification of range estimates against ground-truth measurements.
Utilization of a CRT display light pen in the design of feedback control systems
NASA Technical Reports Server (NTRS)
Thompson, J. G.; Young, K. R.
1972-01-01
A hierarchical structure of the interlinked programs was developed to provide a flexible computer-aided design tool. A graphical input technique and a data structure are considered which provide the capability of entering the control system model description into the computer in block diagram form. An information storage and retrieval system was developed to keep track of the system description, and analysis and simulation results, and to provide them to the correct routines for further manipulation or display. Error analysis and diagnostic capabilities are discussed, and a technique was developed to reduce a transfer function to a set of nested integrals suitable for digital simulation. A general, automated block diagram reduction procedure was set up to prepare the system description for the analysis routines.
Conservative zonal schemes for patched grids in 2 and 3 dimensions
NASA Technical Reports Server (NTRS)
Hessenius, Kristin A.
1987-01-01
The computation of flow over complex geometries, such as realistic aircraft configurations, poses difficult grid generation problems for computational aerodynamicists. The creation of a traditional, single-module grid of acceptable quality about an entire configuration may be impossible even with the most sophisticated of grid generation techniques. A zonal approach, wherein the flow field is partitioned into several regions within which grids are independently generated, is a practical alternative for treating complicated geometries. This technique not only alleviates the problems of discretizing a complex region, but also facilitates a block processing approach to computation thereby circumventing computer memory limitations. The use of such a zonal scheme, however, requires the development of an interfacing procedure that ensures a stable, accurate, and conservative calculation for the transfer of information across the zonal borders.
NASA Astrophysics Data System (ADS)
Zamorano, Lucia J.; Jiang, Charlie Z. W.
1993-09-01
In this decade, the concept and development of computer-assisted stereotactic neurological surgery have advanced dramatically. First, the computer network replaced the tape as the data transportation medium. Second, newer systems include multi-modality image correlation and frameless stereotaxis as an integral part of their functionality, and offer extensive assistance to the neurosurgeon from the preplanning stages throughout the operation itself. These are very important changes, and they have spurred the development of many interesting techniques. Successful systems include the ISG and NSPS-3.0.
NASA Technical Reports Server (NTRS)
Bratanow, T.; Ecer, A.
1973-01-01
A general computational method for analyzing unsteady flow around pitching and plunging airfoils was developed. The finite element method was applied in developing an efficient numerical procedure for the solution of equations describing the flow around airfoils. The numerical results were employed in conjunction with computer graphics techniques to produce visualization of the flow. The investigation involved mathematical model studies of flow in two phases: (1) analysis of a potential flow formulation and (2) analysis of an incompressible, unsteady, viscous flow from Navier-Stokes equations.
Space shuttle propulsion parameter estimation using optimal estimation techniques
NASA Technical Reports Server (NTRS)
1983-01-01
A regression analysis on tabular aerodynamic data provided a representative aerodynamic model for coefficient estimation. It also reduced the storage requirements for the 'normal' model used to check out the estimation algorithms. The results of the regression analyses are presented. The computer routines for the filter portion of the estimation algorithm were developed, and the SRB predictive program was brought up on the computer. For the filter program, approximately 54 routines were developed. The routines were highly subsegmented to facilitate overlaying program segments within the partitioned storage space on the computer.
Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods
NASA Technical Reports Server (NTRS)
Berry, J. K.; Tomlin, C. D.
1982-01-01
Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.
Analysis on laser plasma emission for characterization of colloids by video-based computer program
NASA Astrophysics Data System (ADS)
Putri, Kirana Yuniati; Lumbantoruan, Hendra Damos; Isnaeni
2016-02-01
Laser-induced breakdown detection (LIBD) is a sensitive technique for characterization of colloids with small size and low concentration. There are two types of detection, optical and acoustic. Optical LIBD employs a CCD camera to capture the plasma emission and uses the information to quantify the colloids. This technique requires sophisticated technology which is often pricey. In order to build a simple, home-made LIBD system, a dedicated computer program based on MATLAB™ for analyzing laser plasma emission was developed. The analysis was conducted by counting the number of plasma emissions (breakdowns) during a certain period of time. Breakdown probability provided information on colloid size and concentration. Validation experiments showed that the computer program performed well in analyzing the plasma emissions. A graphical user interface (GUI) was also developed to make the program more user-friendly.
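The core counting step described above can be stated compactly: threshold each video frame for a plasma flash and report the fraction of frames containing a breakdown. The sketch below is a minimal Python stand-in for the MATLAB program; the threshold, frame sizes, and synthetic data are assumptions.

```python
# Minimal sketch of breakdown counting for optical LIBD.
import numpy as np

def breakdown_probability(frames, threshold=200):
    """frames: array (n_frames, h, w) of grayscale video frames."""
    flashes = frames.max(axis=(1, 2)) > threshold   # one bool per frame
    return flashes.mean(), int(flashes.sum())

rng = np.random.default_rng(0)
frames = rng.integers(0, 180, size=(500, 64, 64))   # synthetic dark frames
flash_idx = rng.choice(500, size=60, replace=False)
frames[flash_idx, 32, 32] = 255                     # inject bright flashes

p, n = breakdown_probability(frames)
print(f"{n} breakdowns in {len(frames)} frames; probability = {p:.3f}")
```

The breakdown probability computed this way is what the paper relates to colloid size and concentration.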
X-Ray Radiography of Gas Turbine Ceramics.
1979-10-20
Microfocus X-ray equipment; the definition of equipment concepts for a computer assisted tomography (CAT) system; and the development of a CAT ... were obtained from these test coupons using Microfocus X-ray and image enhancement techniques. A Computer Assisted Tomography (CAT) design concept ... monitor. Computer reconstruction algorithms were investigated with respect to CAT and a preferred approach was determined. An appropriate CAT algorithm
Wright, Cameron H G; Barrett, Steven F; Pack, Daniel J
2005-01-01
We describe a new approach to attacking the problem of robust computer vision for mobile robots. The overall strategy is to mimic the biological evolution of animal vision systems. Our basic imaging sensor is based upon the eye of the common house fly, Musca domestica. The computational algorithms are a mix of traditional image processing, subspace techniques, and multilayer neural networks.
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr. (Editor); Balog, Karen (Editor); Povinelli, Louis A. (Editor)
1999-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) was formed to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. ICOMP is operated by the Ohio Aerospace Institute (OAI) and funded via numerous cooperative agreements by the NASA Glenn Research Center in Cleveland, Ohio. This report describes the activities at ICOMP during 1998, the Institute's thirteenth year of operation.
Research in nonlinear structural and solid mechanics
NASA Technical Reports Server (NTRS)
Mccomb, H. G., Jr. (Compiler); Noor, A. K. (Compiler)
1981-01-01
Recent and projected advances in applied mechanics, numerical analysis, computer hardware and engineering software, and their impact on modeling and solution techniques in nonlinear structural and solid mechanics are discussed. The fields covered are rapidly changing and are strongly impacted by current and projected advances in computer hardware. To foster effective development of the technology, perceptions on computing systems and nonlinear analysis software systems are presented.
An Initial Look at Alternative Computing Technologies for the Intelligence Community
2014-01-01
Recommendation (N-1): Guide hardware development with lessons from machine learning and neuroscience. Neuro-inspired computing suffers from a lack ... not new to either the government or industry. We have described Google’s approach. The government—most notably The National Security Agency (NSA) and ... increasing accumulation of knowledge in neuroscience and bio-molecular methods, new computational techniques may become available in the near future
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr. (Editor); Balog, Karen (Editor); Povinelli, Louis A. (Editor)
2001-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) was formed to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. ICOMP is operated by the Ohio Aerospace Institute (OAI) and funded via numerous cooperative agreements by the NASA Glenn Research Center in Cleveland, Ohio. This report describes the activities at ICOMP during 1999, the Institute's fourteenth year of operation.
Institute for Computational Mechanics in Propulsion (ICOMP)
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr. (Editor); Balog, Karen (Editor); Povinelli, Louis A. (Editor)
1998-01-01
The Institute for Computational Mechanics in Propulsion (ICOMP) was formed to develop techniques to improve problem-solving capabilities in all aspects of computational mechanics related to propulsion. ICOMP is operated by the Ohio Aerospace Institute (OAI) and funded via numerous cooperative agreements by the NASA Lewis Research Center in Cleveland, Ohio. This report describes the activities at ICOMP during 1997, the Institute's twelfth year of operation.
Sigint Application for Polymorphous Computing Architecture (PCA): Wideband DF
2006-08-01
Polymorphous Computing Architecture (PCA) program as stated by Robert Graybill is to develop the computing foundation for agile systems by establishing ... ubiquitous MUSIC algorithm rely upon an underlying narrowband signal model [8]. In this case, narrowband means that the signal bandwidth is less than ... a wideband DF algorithm is needed to compensate for this model inadequacy. Among the various wideband DF techniques available, the coherent signal
GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique
NASA Astrophysics Data System (ADS)
Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.
2015-12-01
Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models, by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a package called GRAVTool, based on MATLAB software, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil (~6000 km², wavy relief, heights varying from 600 m to 1340 m), located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example show the local geoid model computed by the GRAVTool package using 1377 terrestrial gravity observations, SRTM data with 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m, minimum = -0.123 m) matches the uncertainty (σ = ± 0.073 m) of 21 randomly spaced points where the geoid was determined by geometric leveling supported by GNSS positioning. The results were also better than those of the official Brazilian regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m, minimum = -0.040 m).
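As a heavily simplified illustration of the remove-compute-restore steps (a toy, not the GRAVTool package), the sketch below removes global-model and terrain contributions from observed anomalies, converts the residual anomalies to a residual geoid with a planar Stokes-type sum, and restores the global-model geoid. Every field value and grid parameter is synthetic, and the indirect terrain effect is omitted.

```python
# Toy remove-compute-restore (RCR) sequence on a small synthetic grid.
import numpy as np

gamma = 9.81                 # normal gravity (m/s^2), rough value
n, step = 20, 1000.0         # 20 x 20 grid, 1 km spacing
rng = np.random.default_rng(1)
dg_obs = rng.normal(0, 2e-4, (n, n))   # observed anomalies (m/s^2)
dg_ggm = rng.normal(0, 1e-4, (n, n))   # global-model contribution
dg_rtm = rng.normal(0, 1e-4, (n, n))   # terrain (DTM) contribution
N_ggm = rng.normal(0, 0.1, (n, n))     # global-model geoid (m)

# Remove step: subtract the long- and short-wavelength parts.
dg_res = dg_obs - dg_ggm - dg_rtm

# Compute step: planar approximation N = (1 / 2*pi*gamma) * sum(dg/l) dA,
# skipping the singular (zero-distance) cell for simplicity.
xs = np.arange(n) * step
X, Y = np.meshgrid(xs, xs)
N_res = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        dist = np.hypot(X - X[i, j], Y - Y[i, j])
        dist[i, j] = np.inf                # exclude the computation point
        N_res[i, j] = (dg_res / dist).sum() * step**2 / (2 * np.pi * gamma)

# Restore step (indirect terrain effect omitted in this toy).
N = N_ggm + N_res
print("geoid heights (m): min %.3f, max %.3f" % (N.min(), N.max()))
```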
Ng, K H; Peh, W C G
2010-02-01
A technical note is a short article giving a brief description of a specific development, technique or procedure, or it may describe a modification of an existing technique, procedure or device applicable to medicine. The technique, procedure or device described should have practical value and should contribute to clinical diagnosis or management. It could also present a software tool, or an experimental or computational method. Technical notes are variously referred to as technical innovations or technical developments. The main criteria for publication will be the novelty of concepts involved, the validity of the technique and its potential for clinical applications.
Laurence Lin; J.R. Webster
2012-01-01
The constant nutrient addition technique has been used extensively to measure nutrient uptake in streams. However, this technique is impractical for large streams, and the pulse nutrient addition (PNA) has been suggested as an alternative. We developed a computer model to simulate nutrient uptake with Monod kinetics in large rivers and used this model to evaluate the...
NASA Astrophysics Data System (ADS)
Ehmann, Andreas F.; Downie, J. Stephen
2005-09-01
The objective of the International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) project is the creation of a large, secure corpus of audio and symbolic music data accessible to the music information retrieval (MIR) community for the testing and evaluation of various MIR techniques. As part of the IMIRSEL project, a cross-platform JAVA based visual programming environment called Music to Knowledge (M2K) is being developed for a variety of music information retrieval related tasks. The primary objective of M2K is to supply the MIR community with a toolset that provides the ability to rapidly prototype algorithms, as well as foster the sharing of techniques within the MIR community through the use of a standardized set of tools. Due to the relatively large size of audio data and the computational costs associated with some digital signal processing and machine learning techniques, M2K is also designed to support distributed computing across computing clusters. In addition, facilities to allow the integration of non-JAVA based (e.g., C/C++, MATLAB, etc.) algorithms and programs are provided within M2K. [Work supported by the Andrew W. Mellon Foundation and NSF Grants No. IIS-0340597 and No. IIS-0327371.]
L-O-S-T: Logging Optimization Selection Technique
Jerry L. Koger; Dennis B. Webster
1984-01-01
L-O-S-T is a FORTRAN computer program developed to systematically quantify, analyze, and improve user selected harvesting methods. Harvesting times and costs are computed for road construction, landing construction, system move between landings, skidding, and trucking. A linear programming formulation utilizing the relationships among marginal analysis, isoquants, and...
Computers in Public Schools: Changing the Image with Image Processing.
ERIC Educational Resources Information Center
Raphael, Jacqueline; Greenberg, Richard
1995-01-01
The kinds of educational technologies selected can make the difference between uninspired, rote computer use and challenging learning experiences. University of Arizona's Image Processing for Teaching Project has worked with over 1,000 teachers to develop image-processing techniques that provide students with exciting, open-ended opportunities for…
Electronic Computer and Switching Systems Specialist (AFSC 30554).
ERIC Educational Resources Information Center
Air Univ., Gunter AFS, Ala. Extension Course Inst.
This course is intended to train Air Force personnel to become electronic computer and switching systems specialists. One part of the course consists of a three-volume career development course. Topics are maintenance orientation (15 hours), electronic principles and digital techniques (87 hours), and systems maintenance (51 hours). Each volume…
Software Engineering Techniques for Computer-Aided Learning.
ERIC Educational Resources Information Center
Ibrahim, Bertrand
1989-01-01
Describes the process for developing tutorials for computer-aided learning (CAL) using a programing language rather than an authoring system. The workstation used is described, the use of graphics is discussed, the role of a local area network (LAN) is explained, and future plans are discussed. (five references) (LRW)
An Introduction to Archival Automation: A RAMP Study with Guidelines.
ERIC Educational Resources Information Center
Cook, Michael
Developed under a contract with the International Council on Archives, these guidelines are designed to emphasize the role of automation techniques in archives and records services, provide an indication of existing computer systems used in different archives services and of specific computer applications at various stages of archives…
Research in Distance Education: A System Modeling Approach.
ERIC Educational Resources Information Center
Saba, Farhad; Twitchell, David
1988-01-01
Describes how a computer simulation research method can be used for studying distance education systems. Topics discussed include systems research in distance education; a technique of model development using the System Dynamics approach and DYNAMO simulation language; and a computer simulation of a prototype model. (18 references) (LRW)
WE-D-303-00: Computational Phantoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, John; Brigham and Women’s Hospital and Dana-Farber Cancer Institute, Boston, MA
2015-06-15
Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: (1) Understand the need and requirements of computational phantoms in medical physics research; (2) Discuss the developments and applications of computational phantoms; (3) Know the promises and limitations of computational phantoms in solving complex problems.
Applications integration in a hybrid cloud computing environment: modelling and platform
NASA Astrophysics Data System (ADS)
Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang
2013-08-01
With the development of application services providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.
Methods for Prediction of High-Speed Reacting Flows in Aerospace Propulsion
NASA Technical Reports Server (NTRS)
Drummond, J. Philip
2014-01-01
Research to develop high-speed airbreathing aerospace propulsion systems was underway in the late 1950s. A major part of the effort involved the supersonic combustion ramjet, or scramjet, engine. Work had also begun to develop computational techniques for solving the equations governing the flow through a scramjet engine. However, scramjet technology and the computational methods to assist in its evolution would remain apart for another decade. The principal barrier was that the computational methods needed for engine evolution lacked the computer technology required for solving the discrete equations resulting from the numerical methods. Even today, computer resources remain a major pacing item in overcoming this barrier. Significant advances have been made over the past 35 years, however, in modeling the supersonic chemically reacting flow in a scramjet combustor. To see how scramjet development and the required computational tools finally merged, we briefly trace the evolution of the technology in both areas.
The Admissions Strategist: Recruiting in the 1980s.
ERIC Educational Resources Information Center
College Entrance Examination Board, New York, NY.
Proven market/institutional research techniques and technological developments in computer systems that can be used by college admissions offices to enhance recruitment efforts are discussed in 13 articles. Titles and authors of the articles are as follows: "Student Reactions to College Recruiting Techniques" (Arthur D. Lopatin);…
Applying Nonverbal Techniques to Organizational Diagnosis.
ERIC Educational Resources Information Center
Tubbs, Stewart L.; Koske, W. Cary
Ongoing research programs conducted at General Motors Institute are motivated by the practical objective of improving the company's organizational effectiveness. Computer technology is being used whenever possible; for example, a technique developed by Herman Chernoff was used to process data from a survey of employee attitudes into 18 different…
Graphics Processing Unit Assisted Thermographic Compositing
NASA Technical Reports Server (NTRS)
Ragasa, Scott; Russell, Samuel S.
2012-01-01
Objective: Develop a software application utilizing high performance computing techniques, including general purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods to work them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU which yield significant increases in performance. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.
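The data-parallel pattern described above is easy to see in a per-pixel composite: each output pixel depends only on its own temporal samples. The sketch below uses NumPy as a stand-in; on a GPGPU the same element-wise kernel would be mapped across thousands of threads. Frame counts and the chosen statistics are illustrative assumptions.

```python
# Illustrative per-pixel compositing over a stack of thermographic frames.
import numpy as np

rng = np.random.default_rng(0)
frames = rng.random((100, 256, 256), dtype=np.float32)  # synthetic stack

# Per-pixel composite statistics: every pixel's result depends only on
# its own column of samples, so all pixels can be computed in parallel.
peak = frames.max(axis=0)             # peak-contrast image
mean = frames.mean(axis=0)            # average image
sigma = frames.std(axis=0)            # temporal-noise image
composite = (peak - mean) / np.maximum(sigma, 1e-6)

print("composite image:", composite.shape, composite.dtype)
```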
NASA Technical Reports Server (NTRS)
Kavi, K. M.
1984-01-01
There have been a number of simulation packages developed for the purpose of designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research, data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment which enables highly parallel complex systems to be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.
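A minimal sketch of the data flow idea follows: nodes fire when all their input tokens are available, so hardware and software blocks can be simulated by the same mechanism. The graph, node names, and functions are invented for illustration.

```python
# Tiny data flow simulator: a node fires once all its inputs hold tokens.
class DataflowGraph:
    def __init__(self):
        self.nodes = {}            # name -> (function, input edge names)
        self.tokens = {}           # edge name -> value

    def add_node(self, name, func, inputs):
        self.nodes[name] = (func, inputs)

    def put(self, edge, value):
        self.tokens[edge] = value

    def run(self):
        fired = True
        while fired:               # keep firing until no node is enabled
            fired = False
            for name, (func, inputs) in self.nodes.items():
                if name not in self.tokens and all(i in self.tokens for i in inputs):
                    self.tokens[name] = func(*(self.tokens[i] for i in inputs))
                    fired = True
        return self.tokens

g = DataflowGraph()
g.put("a", 3)
g.put("b", 4)
g.add_node("sum", lambda x, y: x + y, ["a", "b"])
g.add_node("sq", lambda s: s * s, ["sum"])
print(g.run()["sq"])               # -> 49
```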
Employing subgoals in computer programming education
NASA Astrophysics Data System (ADS)
Margulieux, Lauren E.; Catrambone, Richard; Guzdial, Mark
2016-01-01
The rapid integration of technology into our professional and personal lives has left many education systems ill-equipped to deal with the influx of people seeking computing education. To improve computing education, we are applying techniques that have been developed for other procedural fields. The present study applied such a technique, subgoal labeled worked examples, to explore whether it would improve programming instruction. The first two experiments, conducted in a laboratory, suggest that the intervention improves undergraduate learners' problem-solving performance and affects how learners approach problem-solving. The third experiment demonstrates that the intervention has similar, and perhaps stronger, effects in an online learning environment with in-service K-12 teachers who want to become qualified to teach computing courses. By implementing this subgoal intervention as a tool for educators to teach themselves and their students, education systems could improve computing education and better prepare learners for an increasingly technical world.
NASA Technical Reports Server (NTRS)
Effinger, Michael; Beshears, Ron; Hufnagle, David; Walker, James; Russell, Sam; Stowell, Bob; Myers, David
2002-01-01
Nondestructive characterization techniques have been used to steer development and testing of CMCs. Computed tomography is used to determine the volumetric integrity of the CMC plates and components. Thermography is used to determine the near surface integrity of the CMC plates and components. For process and material development, information such as density uniformity, part delamination, and dimensional tolerance conformity is generated. The information from the thermography and computed tomography is correlated, and then specimen cutting maps are superimposed on the thermography images. This enables tighter correlation of data and a potential explanation of off-nominal test data. Examples of using nondestructive characterization to make decisions in process and material development and testing are presented.
A novel potential/viscous flow coupling technique for computing helicopter flow fields
NASA Technical Reports Server (NTRS)
Summa, J. Michael; Strash, Daniel J.; Yoo, Sungyul
1990-01-01
Because of the complexity of the helicopter flow field, a zonal method of computational aerodynamic analysis is required. Here, a new procedure for coupling potential and viscous flow is proposed. An overlapping, velocity coupling technique is to be developed with the unique feature that the potential-flow surface singularity strengths are obtained directly from the Navier-Stokes solution at a smooth inner fluid boundary. The closed-loop iteration method proceeds until the velocity field is converged. This coupling should provide the means for more accurate viscous computations of the near-body and rotor flow fields, with resultant improved analysis of such important performance parameters as helicopter fuselage drag and rotor airloads.
Applications of multiple-constraint matrix updates to the optimal control of large structures
NASA Technical Reports Server (NTRS)
Smith, S. W.; Walcott, B. L.
1992-01-01
Low-authority control or vibration suppression in large, flexible space structures can be formulated as a linear feedback control problem requiring computation of displacement and velocity feedback gain matrices. To ensure stability in the uncontrolled modes, these gain matrices must be symmetric and positive definite. In this paper, efficient computation of symmetric, positive-definite feedback gain matrices is accomplished through the use of multiple-constraint matrix update techniques originally developed for structural identification applications. Two systems were used to illustrate the application: a simple spring-mass system and a planar truss. From these demonstrations, use of this multiple-constraint technique is seen to provide a straightforward approach for computing the low-authority gains.
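The symmetry and positive-definiteness requirement on the gains can be illustrated compactly. The sketch below is not the paper's multiple-constraint matrix update; it simply symmetrizes a candidate gain matrix and clips its eigenvalues, which is one standard way to enforce the stated property.

```python
# Enforce symmetric positive definiteness on a candidate feedback gain.
import numpy as np

def nearest_spd(G, eps=1e-8):
    Gs = 0.5 * (G + G.T)               # enforce symmetry
    vals, vecs = np.linalg.eigh(Gs)
    vals = np.maximum(vals, eps)       # enforce positive definiteness
    return vecs @ np.diag(vals) @ vecs.T

rng = np.random.default_rng(0)
G_raw = rng.normal(size=(4, 4))        # candidate velocity-feedback gains
G = nearest_spd(G_raw)
print("symmetric:", np.allclose(G, G.T))
print("eigenvalues:", np.linalg.eigvalsh(G))
```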
The smiling scan technique: Facially driven guided surgery and prosthetics.
Pozzi, Alessandro; Arcuri, Lorenzo; Moy, Peter K
2018-04-11
To introduce a proof-of-concept technique and new integrated workflow to optimize the functional and esthetic outcome of implant-supported restorations by means of a 3-dimensional (3D), facially driven, digitally assisted treatment plan. The Smiling Scan technique permits the creation of a virtual dental patient (VDP) showing a broad smile under static conditions. The patient undergoes a cone beam computed tomography (CBCT) scan, displaying a broad smile for the duration of the examination. Intraoral optical surface scanning (IOS) of the dental and soft tissue anatomy or extraoral optical surface scanning (EOS) of the study casts is performed. The superimposition of the digital imaging and communications in medicine (DICOM) files with standard tessellation language (STL) files is carried out using the virtual planning software program, permitting the creation of a VDP. The Smiling Scan is an effective, easy-to-use, low-cost technique to develop a more comprehensive and simplified facially driven computer-assisted treatment plan, allowing prosthetically driven implant placement and the delivery of an immediate computer-aided design/computer-aided manufacturing (CAD/CAM) temporary fixed dental prosthesis. Copyright © 2018 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Daily pan evaporation modelling using a neuro-fuzzy computing technique
NASA Astrophysics Data System (ADS)
Kişi, Özgür
2006-10-01
Evaporation, as a major component of the hydrologic cycle, is important in water resources development and management. This paper investigates the abilities of the neuro-fuzzy (NF) technique to improve the accuracy of daily evaporation estimation. Five different NF models, comprising various combinations of daily climatic variables, that is, air temperature, solar radiation, wind speed, pressure, and humidity, are developed to evaluate the degree of effect of each of these variables on evaporation. A comparison is made between the estimates provided by the NF model and the artificial neural networks (ANNs). The Stephens-Stewart (SS) method is also considered for the comparison. Various statistical measures are used to evaluate the performance of the models. Based on the comparisons, it was found that the NF computing technique could be employed successfully in modelling the evaporation process from the available climatic data. The ANN was also found to perform better than the SS method.
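To make the neuro-fuzzy structure concrete, the sketch below shows a single Sugeno-type inference step with Gaussian membership functions over two of the paper's inputs (air temperature and wind speed). All membership centers, widths, and rule consequents are invented placeholders; in an actual NF model they would be fitted to data.

```python
# Minimal Sugeno-type fuzzy inference step for evaporation estimation.
import numpy as np

def gauss(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def nf_evaporation(temp_c, wind_ms):
    # Two fuzzy sets per input (low / high), four rules in total.
    mf_t = [gauss(temp_c, 10, 8), gauss(temp_c, 30, 8)]
    mf_w = [gauss(wind_ms, 1, 2), gauss(wind_ms, 6, 2)]
    # Rule firing strengths (product t-norm) with constant consequents
    # (mm/day); in an ANFIS these consequents would be trained.
    rules = [(mf_t[i] * mf_w[j], conseq)
             for (i, j), conseq in zip([(0, 0), (0, 1), (1, 0), (1, 1)],
                                       [1.0, 2.0, 4.0, 7.0])]
    w = np.array([r[0] for r in rules])
    z = np.array([r[1] for r in rules])
    return float((w * z).sum() / w.sum())  # weighted-average defuzzification

print(nf_evaporation(temp_c=25.0, wind_ms=3.0))  # estimated pan evaporation
```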
Characterization and Developmental History of Problem Solving Methods in Medicine
Harbort, Robert A.
1980-01-01
The central thesis of this paper is the importance of the framework in which information is structured. It is technically important in the design of systems; it is also important in guaranteeing that systems are usable by clinicians. Progress in medical computing depends on our ability to develop a more quantitative understanding of the role of context in our choice of problem solving techniques. This in turn will help us to design more flexible and responsive computer systems. The paper contains an overview of some models of knowledge and problem solving methods, a characterization of modern diagnostic techniques, and a discussion of skill development in medical practice. Diagnostic techniques are examined in terms of how they are taught, what problem solving methods they use, and how they fit together into an overall theory of interpretation of the medical status of a patient.
Computational System For Rapid CFD Analysis In Engineering
NASA Technical Reports Server (NTRS)
Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.
1995-01-01
Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.
Computer-aided light sheet flow visualization using photogrammetry
NASA Technical Reports Server (NTRS)
Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.
1994-01-01
A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
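The core photogrammetric step described above is a ray-plane intersection: given the camera pose and the light-sheet plane, each 2-D pixel is projected back to a 3-D point. A minimal sketch follows, with invented camera parameters and plane coefficients, not the experiment's calibration.

```python
# Project a 2-D image pixel onto the light-sheet plane in 3-D.
import numpy as np

def pixel_to_3d(u, v, C, R, f, cx, cy, n, d0):
    """Intersect the ray through pixel (u, v) with the plane n . X = d0.
    C: camera center, R: camera-to-world rotation, f: focal length (px),
    (cx, cy): principal point, n: plane normal, d0: plane offset."""
    ray_cam = np.array([(u - cx) / f, (v - cy) / f, 1.0])
    d = R @ ray_cam                       # ray direction in world frame
    t = (d0 - n @ C) / (n @ d)            # ray-plane intersection parameter
    return C + t * d

C = np.array([0.0, 0.0, -2.0])            # camera 2 m behind the origin
R = np.eye(3)                             # camera aligned with world axes
n, d0 = np.array([0.0, 0.0, 1.0]), 0.0    # light sheet: the plane z = 0
point = pixel_to_3d(u=320, v=240, C=C, R=R, f=800.0, cx=320.0, cy=240.0,
                    n=n, d0=d0)
print("reconstructed 3-D point:", point)  # lies on the light-sheet plane
```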
Computer-Aided Light Sheet Flow Visualization
NASA Technical Reports Server (NTRS)
Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.
1993-01-01
A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.
Assessment of traffic noise levels in urban areas using different soft computing techniques.
Tomić, J; Bogojević, N; Pljakić, M; Šumarac-Pavlović, D
2016-10-01
Available traffic noise prediction models are usually based on regression analysis of experimental data, and this paper presents the application of soft computing techniques in traffic noise prediction. Two mathematical models are proposed and their predictions are compared to data collected by traffic noise monitoring in urban areas, as well as to predictions of commonly used traffic noise models. The results show that applying evolutionary algorithms and neural networks may improve both the model-development process and the accuracy of traffic noise prediction.
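As an illustration of the neural-network half of such an approach, the sketch below fits an equivalent noise level from traffic descriptors with scikit-learn; the synthetic data generator stands in for the paper's monitoring data and is purely an assumption.

```python
# Neural-network regression of equivalent noise level (Leq, dBA).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
flow = rng.uniform(100, 3000, n)        # vehicles per hour
heavy = rng.uniform(0, 0.3, n)          # fraction of heavy vehicles
speed = rng.uniform(30, 90, n)          # mean speed, km/h
leq = (40 + 10 * np.log10(flow) + 8 * heavy + 0.08 * speed
       + rng.normal(0, 1.0, n))         # synthetic target, dBA

X = np.column_stack([flow, heavy, speed])
X_tr, X_te, y_tr, y_te = train_test_split(X, leq, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```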
A study of the electromagnetic interaction between planetary bodies and the solar wind
NASA Technical Reports Server (NTRS)
Schwartz, K.
1971-01-01
Theoretical and computational techniques were developed for calculating the time-dependent electromagnetic response of a radially inhomogeneous moon. The techniques were used to analyze the experimental data from the LSM (lunar surface magnetometer), thus providing an in-depth diagnostic of the lunar interior. The theory was also incorporated into an existing computer code designed to calculate the thermal evolution of planetary bodies. The program will provide a tool for examining the effect of heating from the TE mode (poloidal magnetic field) as well as the TM mode (toroidal magnetic field).
NASA Technical Reports Server (NTRS)
Buechler, W.; Tucker, A. G.
1981-01-01
Several methods were employed to detect both the occurrence and source of errors in the operational software of the AN/SLQ-32, a large embedded real-time electronic warfare command and control system for the ROLM 1606 computer. The ROLM computer provides information about invalid addressing, improper use of privileged instructions, stack overflows, and unimplemented instructions. Additionally, software techniques were developed to detect invalid jumps, indices out of range, infinite loops, stack underflows, and field size errors. Finally, data are saved to provide information about the status of the system when an error is detected. This information includes I/O buffers, interrupt counts, stack contents, and recently passed locations. The various errors detected, techniques to assist in debugging problems, and segment simulation on a nontarget computer are discussed. These error detection techniques were a major factor in the success of finding the primary cause of error in 98% of over 500 system dumps.
Classified one-step high-radix signed-digit arithmetic units
NASA Astrophysics Data System (ADS)
Cherri, Abdallah K.
1998-08-01
High-radix number systems enable higher information storage density, less complexity, fewer system components, and fewer cascaded gates and operations. A simple one-step fully parallel high-radix signed-digit arithmetic is proposed for parallel optical computing based on new joint spatial encodings. This reduces hardware requirements and improves throughput by reducing the space-bandwidth product needed. The high-radix signed-digit arithmetic operations are based on classifying the neighboring input digit pairs into various groups to reduce the computation rules. A new joint spatial encoding technique is developed to present both the operands and the computation rules. This technique increases the spatial bandwidth product of the spatial light modulators of the system. An optical implementation of the proposed high-radix signed-digit arithmetic operations is also presented. It is shown that our one-step trinary signed-digit and quaternary signed-digit arithmetic units are much simpler and better than all previously reported high-radix signed-digit techniques.
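The one-step (carry-free) signed-digit principle can be shown numerically. The sketch below implements radix-4 signed-digit addition with digit set {-3,...,3}: each digit pair is classified into a transfer and an interim sum so the final digit never overflows. The thresholds follow the standard signed-digit rule; the optical joint spatial encoding itself is not modeled here.

```python
# Carry-free radix-4 signed-digit addition, digits in {-3,...,3}.

def sd4_add(x, y):
    """x, y: little-endian lists of digits in {-3..3}; returns their sum."""
    n = max(len(x), len(y))
    x = x + [0] * (n - len(x))
    y = y + [0] * (n - len(y))
    interim, transfer = [], [0]            # transfer[i] feeds position i
    for xi, yi in zip(x, y):
        s = xi + yi                        # in [-6, 6]
        if s >= 2:                         # classify the digit pair:
            t, w = 1, s - 4                # transfer +1, interim in [-2, 2]
        elif s <= -2:
            t, w = -1, s + 4               # transfer -1, interim in [-2, 2]
        else:
            t, w = 0, s                    # no transfer
        interim.append(w)
        transfer.append(t)
    # Final digits: interim sum plus incoming transfer, always in [-3, 3].
    return [w + t for w, t in zip(interim + [0], transfer)]

def sd4_value(digits):
    return sum(d * 4**i for i, d in enumerate(digits))

a, b = [3, -1, 2], [2, 3, -2]              # values 31 and -18
s = sd4_add(a, b)
print(sd4_value(a) + sd4_value(b), "==", sd4_value(s))  # 13 == 13
```

Because each position depends only on its own digit pair and the single incoming transfer, the addition completes in one step regardless of word length, which is the property the optical implementation exploits.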
Development and application of computational aerothermodynamics flowfield computer codes
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj
1993-01-01
Computations are presented for one-dimensional, strong shock waves that are typical of those that form in front of a reentering spacecraft. The fluid mechanics and thermochemistry are modeled using two different approaches. The first employs traditional continuum techniques in solving the Navier-Stokes equations. The second approach employs a particle simulation technique (the direct simulation Monte Carlo method, DSMC). The thermochemical models employed in these two techniques are quite different. The present investigation evaluates thermochemical models for nitrogen under hypersonic flow conditions. Four separate cases are considered. The cases are governed, respectively, by the following: vibrational relaxation; weak dissociation; strong dissociation; and weak ionization. In near-continuum, hypersonic flow, the nonequilibrium thermochemical models employed in continuum and particle simulations produce nearly identical solutions. Further, the two approaches are evaluated successfully against available experimental data for weakly and strongly dissociating flows.
Computer-aided assessment of pulmonary disease in novel swine-origin H1N1 influenza on CT
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Dwyer, Andrew J.; Summers, Ronald M.; Mollura, Daniel J.
2011-03-01
The 2009 pandemic is a global outbreak of novel H1N1 influenza. Radiologic images can be used to assess the presence and severity of pulmonary infection. We develop a computer-aided assessment system to analyze the CT images from Swine-Origin Influenza A virus (S-OIV) novel H1N1 cases. The technique is based on the analysis of lung texture patterns and classification using a support vector machine (SVM). Pixel-wise tissue classification is computed from the SVM value. The method was validated on four H1N1 cases and ten normal cases. We demonstrated that the technique can detect regions of pulmonary abnormality in novel H1N1 patients and differentiate these regions from visually normal lung (area under the ROC curve is 0.993). This technique can also be applied to differentiate regions infected by different pulmonary diseases.
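The classification core described above, an SVM applied pixel-wise to texture features, can be sketched briefly. The two synthetic features below stand in for the paper's lung-texture features and are assumptions, not the authors' feature set.

```python
# SVM texture classification applied pixel-wise, as a minimal sketch.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Feature rows: [local mean intensity, local standard deviation].
normal = rng.normal([0.2, 0.05], 0.03, size=(200, 2))
diseased = rng.normal([0.5, 0.15], 0.05, size=(200, 2))
X = np.vstack([normal, diseased])
y = np.r_[np.zeros(200), np.ones(200)]

clf = SVC(kernel="rbf", probability=True).fit(X, y)

# Pixel-wise application: each pixel's feature vector gets a score,
# which can be thresholded or mapped to an abnormality image.
pixels = rng.normal([0.45, 0.12], 0.05, size=(5, 2))
print(clf.predict_proba(pixels)[:, 1])    # probability of abnormality
```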
An efficient technique for the numerical solution of the bidomain equations.
Whiteley, Jonathan P
2008-08-01
Computing the numerical solution of the bidomain equations is widely accepted to be a significant computational challenge. In this study we extend a previously published semi-implicit numerical scheme with good stability properties that has been used to solve the bidomain equations (Whiteley, J.P. IEEE Trans. Biomed. Eng. 53:2139-2147, 2006). A new, efficient numerical scheme is developed which utilizes the observation that the only component of the ionic current that must be calculated on a fine spatial mesh and updated frequently is the fast sodium current. Other components of the ionic current may be calculated on a coarser mesh and updated less frequently, and then interpolated onto the finer mesh. Use of this technique to calculate the transmembrane potential and extracellular potential induces very little error in the solution. For the simulations presented in this study an increase in computational efficiency of over two orders of magnitude over standard numerical techniques is obtained.
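The multiresolution idea can be sketched in a few lines; the 1-D model below uses placeholder ionic-current formulas and a purely illustrative explicit update, not the paper's semi-implicit bidomain scheme:

```python
import numpy as np

# Fast sodium current lives on the fine mesh and is refreshed every step;
# the slower ionic currents are evaluated on a coarse mesh, updated only
# every `k` steps, and interpolated onto the fine mesh.
x_fine = np.linspace(0.0, 1.0, 201)
x_coarse = x_fine[::10]
k = 5                                  # slow-current refresh interval

def i_na_fast(v):                      # placeholder fast sodium current
    return -v * (v - 0.2) * (v - 1.0)

def i_slow(v):                         # placeholder for remaining currents
    return 0.1 * v

v = np.exp(-((x_fine - 0.5) ** 2) / 0.01)   # initial transmembrane potential
for step in range(20):
    if step % k == 0:
        slow_coarse = i_slow(v[::10])        # coarse-mesh evaluation
    slow_fine = np.interp(x_fine, x_coarse, slow_coarse)
    i_ion = i_na_fast(v) + slow_fine         # total ionic current, fine mesh
    v = v + 0.01 * i_ion                     # explicit update (illustrative)
```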
Application of artificial neural networks with backpropagation technique in the financial data
NASA Astrophysics Data System (ADS)
Jaiswal, Jitendra Kumar; Das, Raja
2017-11-01
The application of neural networks has proliferated across multiple disciplines in recent decades because of their tunable parameters and strong performance in pattern recognition and classification. They are also widely applied to forecasting in numerous domains. Since financial data have become readily available through the computers and computing systems of stock market premises throughout the world, researchers have developed numerous techniques and algorithms to analyze data from this sector. In this paper we apply a neural network trained by backpropagation to identify patterns in financial data and to predict stock values.
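A minimal numpy sketch of a one-hidden-layer network trained by backpropagation on synthetic windowed "prices"; the architecture, learning rate, and data are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(1)
prices = np.cumsum(rng.normal(0, 1, 300)) + 100   # synthetic closing prices
win = 5
X = np.stack([prices[i:i + win] for i in range(len(prices) - win)])
y = prices[win:]                                  # next-day target
X = (X - X.mean()) / X.std()
y = (y - y.mean()) / y.std()

W1 = rng.normal(0, 0.1, (win, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1));   b2 = np.zeros(1)
lr = 0.01
for epoch in range(500):
    h = np.tanh(X @ W1 + b1)          # forward pass, hidden layer
    pred = (h @ W2 + b2).ravel()
    err = pred - y                    # squared-error gradient at the output
    # backpropagation through the two layers
    g2 = h.T @ err[:, None] / len(y)
    g1 = (err[:, None] @ W2.T) * (1 - h**2)
    W2 -= lr * g2
    b2 -= lr * err.mean()
    W1 -= lr * X.T @ g1 / len(y)
    b1 -= lr * g1.mean(0)
```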
WE-D-303-01: Development and Application of Digital Human Phantoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Segars, P.
2015-06-15
Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: Understand the need and requirements of computational phantoms in medical physics research; Discuss the developments and applications of computational phantoms; Know the promises and limitations of computational phantoms in solving complex problems.
NASA Technical Reports Server (NTRS)
Klumpar, D. M. (Principal Investigator)
1981-01-01
Progress is reported in reading MAGSAT tapes and in a modeling procedure developed to compute the magnetic fields at satellite orbit due to current distributions in the ionosphere. The modeling technique utilizes a linear current element representation of the large-scale space-current system.
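A linear-current-element model reduces to summing Biot-Savart contributions at the satellite position. The sketch below is a generic illustration of that summation with made-up geometry and current values, not the report's procedure:

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability, T*m/A

def dB(r_obs, r_elem, dl, current):
    """Biot-Savart contribution (tesla) of one linear current element.

    r_obs, r_elem, dl in metres; current in amperes. An ionospheric
    current system is represented as many such elements, summed at
    the satellite position.
    """
    r = r_obs - r_elem
    rmag = np.linalg.norm(r)
    return MU0 * current * np.cross(dl, r) / (4 * np.pi * rmag**3)

# Hypothetical example: field at ~400 km altitude from a crude
# 1000-element line current at ~100 km altitude carrying 100 kA.
obs = np.array([0.0, 0.0, 6.771e6])            # satellite position (m)
elems = [np.array([x, 0.0, 6.471e6]) for x in np.linspace(-5e5, 5e5, 1000)]
dl = np.array([1e3, 0.0, 0.0])                 # ~1 km segments along x
B = sum(dB(obs, re, dl, 1e5) for re in elems)  # total field vector (T)
```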
Computer Techniques for Studying Coverage, Overlaps, and Gaps in Collections.
ERIC Educational Resources Information Center
White, Howard D.
1987-01-01
Describes techniques for using the Statistical Package for the Social Sciences (SPSS) to create tables for cooperative collection development across a number of libraries. Specific commands are given to generate holdings profiles focusing on collection coverage, overlaps, gaps, or other areas of interest, from a master bibliographic list. (CLB)
Analysis techniques for multivariate root loci. [a tool in linear control systems
NASA Technical Reports Server (NTRS)
Thompson, P. M.; Stein, G.; Laub, A. J.
1980-01-01
Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
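One standard formulation (not necessarily the paper's exact generalized-eigenvalue setup) traces the optimal root locus through the eigenvalues of the LQR Hamiltonian matrix as the control weighting varies; a minimal sketch with an illustrative second-order plant:

```python
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # example plant dynamics
B = np.array([[0.0], [1.0]])
Q = np.eye(2)                              # state weighting

def optimal_poles(rho):
    # Hamiltonian matrix of the LQR problem; its stable (left-half-plane)
    # eigenvalues are the optimal closed-loop poles for control weight rho.
    H = np.block([[A, -B @ B.T / rho],
                  [-Q, -A.T]])
    ev = np.linalg.eigvals(H)
    return ev[ev.real < 0]

# Sweep rho from expensive to cheap control: the poles migrate along
# the optimal root locus toward its high-gain asymptotes.
loci = [optimal_poles(r) for r in np.logspace(2, -4, 60)]
```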
Development of a support software system for real-time HAL/S applications
NASA Technical Reports Server (NTRS)
Smith, R. S.
1984-01-01
Methodologies employed in defining and implementing a software support system for the HAL/S computer language for real-time operations on the Shuttle are detailed. Attention is also given to the management and validation techniques used during software development and software maintenance. Utilities developed to support the real-time operating conditions are described. With the support system produced on Cyber computers and executable code processed through Cyber or PDP machines, the system has production-level status and can serve as a model for other software development projects.
Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
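The contrast between the two uncertainty treatments can be sketched on a toy fault tree. The gate formulas are the standard independent AND/OR rules, the failure numbers are illustrative, and the triangular fuzzy arithmetic uses a common endpoint approximation rather than the FUZZYFTA implementation:

```python
# Top-event probability for a tiny fault tree: (E1 AND E2) OR E3.
def and_gate(p, q):
    return p * q                 # independent basic events

def or_gate(p, q):
    return p + q - p * q

# Probabilistic treatment: randomness captured by point estimates.
p_top = or_gate(and_gate(1e-3, 2e-3), 5e-4)

# Fuzzy treatment: vagueness captured by (low, mode, high) triangles,
# combined by endpoint arithmetic (valid for small positive numbers).
def f_and(a, b):
    return tuple(x * y for x, y in zip(a, b))

def f_or(a, b):
    return tuple(x + y - x * y for x, y in zip(a, b))

fz_top = f_or(f_and((0.5e-3, 1e-3, 2e-3), (1e-3, 2e-3, 4e-3)),
              (2e-4, 5e-4, 1e-3))
print(p_top, fz_top)   # point estimate vs. fuzzy interval around it
```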
Proceedings of the Fifth NASA/NSF/DOD Workshop on Aerospace Computational Control
NASA Technical Reports Server (NTRS)
Wette, M. (Editor); Man, G. K. (Editor)
1993-01-01
The Fifth Annual Workshop on Aerospace Computational Control was one in a series of workshops sponsored by NASA, NSF, and the DOD. The purpose of these workshops is to address computational issues in the analysis, design, and testing of flexible multibody control systems for aerospace applications. The intention in holding these workshops is to bring together users, researchers, and developers of computational tools in aerospace systems (spacecraft, space robotics, aerospace transportation vehicles, etc.) for the purpose of exchanging ideas on the state of the art in computational tools and techniques.
On the Solution of the Three-Dimensional Flowfield About a Flow-Through Nacelle. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Compton, William Bernard
1985-01-01
The solution of the three-dimensional flow field for a flow-through nacelle was studied. Both inviscid and viscous-inviscid interacting solutions were examined. Inviscid solutions were obtained with two different computational procedures for solving the three-dimensional Euler equations. The first procedure employs an alternating direction implicit numerical algorithm and required the development of a complete computational model for the nacelle problem. The second computational technique employs a fourth-order Runge-Kutta numerical algorithm which was modified to fit the nacelle problem. Viscous effects on the flow field were evaluated with a viscous-inviscid interacting computational model. This model was constructed by coupling the explicit Euler solution procedure with a lag-entrainment boundary layer solution procedure in a global iteration scheme. The computational techniques were used to compute the flow field for a long-duct turbofan engine nacelle at free-stream Mach numbers of 0.80 and 0.94 and angles of attack of 0 and 4 deg.
NASA Technical Reports Server (NTRS)
Daywitt, J.; Kutler, P.; Anderson, D.
1977-01-01
The technique of floating shock fitting is adapted to the computation of the inviscid flowfield about circular cones in a supersonic free stream at angles of attack that exceed the cone half-angle. The resulting equations are applicable over the complete range of free-stream Mach numbers, angles of attack and cone half-angles for which the bow shock is attached. A finite difference algorithm is used to obtain the solution by an unsteady relaxation approach. The bow shock, embedded cross-flow shock, and vortical singularity in the leeward symmetry plane are treated as floating discontinuities in a fixed computational mesh. Where possible, the flowfield is partitioned into windward, shoulder, and leeward regions with each region computed separately to achieve maximum computational efficiency. An alternative shock fitting technique which treats the bow shock as a computational boundary is developed and compared with the floating-fitting approach. Several surface boundary condition schemes are also analyzed.
Methodologies and systems for heterogeneous concurrent computing
NASA Technical Reports Server (NTRS)
Sunderam, V. S.
1994-01-01
Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.
NASA Astrophysics Data System (ADS)
Dhingra, Shonali; Sandler, Roman; Rios, Rodrigo; Vuong, Cliff; Mehta, Mayank
All animals naturally perceive the abstract concept of space-time. A brain region called the hippocampus is known to be important in creating these perceptions, but the underlying mechanisms are unknown. In our lab we employ several experimental and computational techniques from physics to tackle this fundamental puzzle. Experimentally, we use ideas from nanoscience and materials science to develop techniques to measure the activity of hippocampal neurons in freely behaving animals. Computationally, we develop models to study neuronal activity patterns, which are point processes that are highly stochastic and multidimensional. We then apply these techniques to collect and analyze neuronal signals from rodents while they explore space in the real world or virtual reality with various stimuli. Our findings show that under these conditions neuronal activity depends on various parameters, such as sensory cues, including visual and auditory, and behavioral cues, including linear and angular position and velocity. Further, neuronal networks create internally generated rhythms, which influence the perception of space and time. In totality, these results further our understanding of how the brain develops a cognitive map of our surrounding space and keeps track of time.
A stirling engine computer model for performance calculations
NASA Technical Reports Server (NTRS)
Tew, R.; Jefferies, K.; Miao, D.
1978-01-01
To support the development of the Stirling engine as a possible alternative to the automobile spark-ignition engine, the thermodynamic characteristics of the Stirling engine were analyzed and modeled on a computer. The modeling techniques used are presented. The performance of an existing rhombic-drive Stirling engine was simulated by use of this computer program, and some typical results are presented. Engine tests are planned in order to evaluate this model.
Reduction of lithologic-log data to numbers for use in the digital computer
Morgan, C.O.; McNellis, J.M.
1971-01-01
The development of a standardized system for conveniently coding lithologic-log data for use in the digital computer has long been needed. The technique suggested involves a reduction of the original written alphanumeric log to a numeric log by use of computer programs. This numeric log can then be retrieved as a written log, interrogated for pertinent information, or analyzed statistically. © 1971 Plenum Publishing Corporation.
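A toy illustration of reducing a written log to numeric codes; the dictionary, code values, and record layout below are hypothetical, not the paper's actual coding scheme:

```python
# Hypothetical lithology-to-code table; the published scheme differs.
LITH_CODES = {"shale": 10, "sandstone": 20, "limestone": 30, "dolomite": 40}

def encode_log(entries):
    """entries: list of (top_depth_ft, bottom_depth_ft, description).

    Returns (top, bottom, numeric_code) tuples; unmatched rock
    descriptions fall through to code 0.
    """
    numeric = []
    for top, bottom, desc in entries:
        code = next((c for w, c in LITH_CODES.items()
                     if w in desc.lower()), 0)
        numeric.append((top, bottom, code))
    return numeric

log = [(0, 15, "Shale, gray, silty"), (15, 42, "Limestone, cherty")]
print(encode_log(log))   # [(0, 15, 10), (15, 42, 30)]
```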
Computer architecture evaluation for structural dynamics computations: Project summary
NASA Technical Reports Server (NTRS)
Standley, Hilda M.
1989-01-01
The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.
Interface projection techniques for fluid-structure interaction modeling with moving-mesh methods
NASA Astrophysics Data System (ADS)
Tezduyar, Tayfun E.; Sathe, Sunil; Pausewang, Jason; Schwaab, Matthew; Christopher, Jason; Crabtree, Jason
2008-12-01
The stabilized space-time fluid-structure interaction (SSTFSI) technique developed by the Team for Advanced Flow Simulation and Modeling (T★AFSM) was applied to a number of 3D examples, including arterial fluid mechanics and parachute aerodynamics. Here we focus on the interface projection techniques that were developed as supplementary methods targeting the computational challenges associated with the geometric complexities of the fluid-structure interface. Although these supplementary techniques were developed in conjunction with the SSTFSI method and in the context of air-fabric interactions, they can also be used in conjunction with other moving-mesh methods, such as the Arbitrary Lagrangian-Eulerian (ALE) method, and in the context of other classes of FSI applications. The supplementary techniques currently consist of using split nodal values for pressure at the edges of the fabric and incompatible meshes at the air-fabric interfaces, the FSI Geometric Smoothing Technique (FSI-GST), and the Homogenized Modeling of Geometric Porosity (HMGP). Using split nodal values for pressure at the edges and incompatible meshes at the interfaces stabilizes the structural response at the edges of the membrane used in modeling the fabric. With the FSI-GST, the fluid mechanics mesh is sheltered from the consequences of the geometric complexity of the structure. With the HMGP, we bypass the intractable complexities of the geometric porosity by approximating it with an “equivalent”, locally-varying fabric porosity. As test cases demonstrating how the interface projection techniques work, we compute the air-fabric interactions of windsocks, sails and ringsail parachutes.
Molecular Modeling in Drug Design for the Development of Organophosphorus Antidotes/Prophylactics.
1986-06-01
...multidimensional statistical QSAR analysis techniques to suggest new structures for synthesis and evaluation. C. Application of quantum chemical techniques to...compounds for synthesis and testing for antidotal potency. E. Use of computer-assisted methods to determine the steric constraints at the active site...modeling techniques to model the enzyme acetylcholinesterase. H. Suggestion of some novel compounds for synthesis and testing for reactivating...
AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, D.; Alfonsi, A.; Talbot, P.
2016-10-01
The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution being evaluated to address the computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (microseconds instead of hours/days).
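A minimal sketch of the surrogate idea, assuming scikit-learn's Gaussian process regressor as the reduced order model and a toy function standing in for an expensive thermal-hydraulic code:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)

def expensive_simulation(x):
    # Stand-in for a long-running simulation code run
    return np.sin(3 * x) + 0.5 * x

# Train the surrogate on a handful of expensive runs...
X_train = np.linspace(0, 2, 12)[:, None]
y_train = expensive_simulation(X_train).ravel()
surrogate = GaussianProcessRegressor().fit(X_train, y_train)

# ...then query it thousands of times at negligible cost, with an
# uncertainty estimate attached to each prediction.
X_query = rng.uniform(0, 2, (10000, 1))
y_fast, y_std = surrogate.predict(X_query, return_std=True)
```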
GLAD: a system for developing and deploying large-scale bioinformatics grid.
Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong
2005-03-01
Grid computing is used to solve large-scale bioinformatics problems with gigabyte-scale databases by distributing the computation across multiple platforms. In developing bioinformatics grid applications, it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware, which exploits task-based parallelism. Two bioinformatics benchmark applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.
Development of a Very Dense Liquid Cooled Compute Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Phillip N.; Lipp, Robert J.
2013-12-10
The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.
Using Robotics to Improve Retention and Increase Comprehension in Introductory Programming Courses
ERIC Educational Resources Information Center
Pullan, Marie
2013-01-01
Several college majors, outside of computer science, require students to learn computer programming. Many students have difficulty getting through the programming sequence and ultimately change majors or drop out of college. To deal with this problem, active learning techniques were developed and implemented in a freshman programming logic and…
Automatic Grading of 3D Computer Animation Laboratory Assignments
ERIC Educational Resources Information Center
Lamberti, Fabrizio; Sanna, Andrea; Paravati, Gianluca; Carlevaris, Gilles
2014-01-01
Assessment is a delicate task in the overall teaching process because it may require significant time and may be prone to subjectivity. Subjectivity is especially true for disciplines in which perceptual factors play a key role in the evaluation. In previous decades, computer-based assessment techniques were developed to address the…
Cognition, Corpora, and Computing: Triangulating Research in Usage-Based Language Learning
ERIC Educational Resources Information Center
Ellis, Nick C.
2017-01-01
Usage-based approaches explore how we learn language from our experience of language. Related research thus involves the analysis of the usage from which learners learn and of learner usage as it develops. This program involves considerable data recording, transcription, and analysis, using a variety of corpus and computational techniques, many of…
Comparability of a Paper-Based Language Test and a Computer-Based Language Test.
ERIC Educational Resources Information Center
Choi, Inn-Chull; Kim, Kyoung Sung; Boo, Jaeyool
2003-01-01
Utilizing the Test of English Proficiency, developed by Seoul National University (TEPS), examined comparability between the paper-based language test and the computer-based language test based on content and construct validation employing content analyses based on corpus linguistic techniques in addition to such statistical analyses as…
The Effectiveness of Computer-Based Hypermedia Teaching Modules for Radiology Residents.
ERIC Educational Resources Information Center
Azevedo, Roger; And Others
This paper explains the rationale for utilizing computer-based, hypermedia tutorials for radiology education and presents the results of a field test of this educational technique. It discusses the development of the hypermedia tutorials at Montreal General Hospital (Quebec, Canada) in 1991-92 and their use in the radiology residency program. The…
Computer Assisted Assignment of Students to Schools to Achieve Desegregation.
ERIC Educational Resources Information Center
Illinois Inst. of Tech., Chicago. Research Inst.
To help school districts with the task of assigning students to schools in order to achieve desegregation, the Illinois Institute of Technology has developed a system involving the use of planning techniques and computer technology that greatly simplifies the school district's job. The key features of the system are objectivity, minimum…
A Diagnostic Study of Computer Application of Structural Communication Grid
ERIC Educational Resources Information Center
Bahar, Mehmet; Aydin, Fatih; Karakirik, Erol
2009-01-01
In this article, the structural communication grid (SCG), an alternative measurement and evaluation technique, is first summarised, and the design, development, and implementation of a computer-based SCG system are introduced. The system is then tested on a sample of 154 participants consisting of candidate students, science teachers and…
Erosion in radial inflow turbines. Volume 5: Computer programs for tracing particle trajectories
NASA Technical Reports Server (NTRS)
Clevenger, W. B., Jr.; Tabakoff, W.
1975-01-01
Computer programs used to study the trajectories of particles in the radial inflow turbines are presented. The general technique of each program is described. A set of subroutines developed during the study are described. Descriptions, listings, and typical examples of each of the main programs are included.
An Intelligent Computer Assisted Language Learning System for Arabic Learners
ERIC Educational Resources Information Center
Shaalan, Khaled F.
2005-01-01
This paper describes the development of an intelligent computer-assisted language learning (ICALL) system for learning Arabic. This system could be used for learning Arabic by students at primary schools or by learners of Arabic as a second or foreign language. It explores the use of Natural Language Processing (NLP) techniques for learning…
Program Design for Retrospective Searches on Large Data Bases
ERIC Educational Resources Information Center
Thiel, L. H.; Heaps, H. S.
1972-01-01
Retrospective search of large data bases requires development of special techniques for automatic compression of data and minimization of the number of input-output operations to the computer files. The computer program should require a relatively small amount of internal memory. This paper describes the structure of such a program. (9 references)…
Volumetric visualization algorithm development for an FPGA-based custom computing machine
NASA Astrophysics Data System (ADS)
Sallinen, Sami J.; Alakuijala, Jyrki; Helminen, Hannu; Laitinen, Joakim
1998-05-01
Rendering volumetric medical images is a burdensome computational task for contemporary computers due to the large size of the data sets. Custom designed reconfigurable hardware could considerably speed up volume visualization if an algorithm suitable for the platform is used. We present an algorithm and speedup techniques for visualizing volumetric medical CT and MR images with a custom-computing machine based on a Field Programmable Gate Array (FPGA). We also present simulated performance results of the proposed algorithm calculated with a software implementation running on a desktop PC. Our algorithm is capable of generating perspective projection renderings of single and multiple isosurfaces with transparency, simulated X-ray images, and Maximum Intensity Projections (MIP). Although more speedup techniques exist for parallel projection than for perspective projection, we have constrained ourselves to perspective viewing, because of its importance in the field of radiotherapy. The algorithm we have developed is based on ray casting, and the rendering is sped up by three different methods: shading speedup by gradient precalculation, a new generalized version of Ray-Acceleration by Distance Coding (RADC), and background ray elimination by speculative ray selection.
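Orthographic maximum-intensity and line-integral projections, the simplest members of the rendering family discussed, can be sketched in numpy; the perspective ray caster and RADC-style acceleration described in the paper are omitted here:

```python
import numpy as np

# Toy volume standing in for a CT/MR voxel grid.
rng = np.random.default_rng(0)
vol = rng.random((64, 64, 64)).astype(np.float32)

# Maximum Intensity Projection (MIP) along the z axis: each output
# pixel keeps the brightest voxel along its ray.
mip = vol.max(axis=2)

# Simulated X-ray image: a line integral (here, a plain sum) of
# attenuation values along the same parallel rays.
xray = vol.sum(axis=2)
```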
NASA Astrophysics Data System (ADS)
Aonishi, Toru; Mimura, Kazushi; Utsunomiya, Shoko; Okada, Masato; Yamamoto, Yoshihisa
2017-10-01
The coherent Ising machine (CIM) has attracted attention as one of the most effective Ising computing architectures for solving large scale optimization problems because of its scalability and high-speed computational ability. However, it is difficult to implement the Ising computation in the CIM because the theories and techniques of classical thermodynamic equilibrium Ising spin systems cannot be directly applied to the CIM. This means we have to adapt these theories and techniques to the CIM. Here we focus on a ferromagnetic model and a finite loading Hopfield model, which are canonical models sharing a common mathematical structure with almost all other Ising models. We derive macroscopic equations to capture nonequilibrium phase transitions in these models. The statistical mechanical methods developed here constitute a basis for constructing evaluation methods for other Ising computation models.
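The shared mathematical structure of the ferromagnetic and Hopfield models is the Ising energy with Hebbian couplings; a minimal classical-spin sketch (not the CIM's optical dynamics) showing the energy and the macroscopic overlap order parameter:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5
xi = rng.choice([-1, 1], size=(P, N))   # stored patterns
J = (xi.T @ xi) / N                     # Hebbian couplings
np.fill_diagonal(J, 0.0)

def energy(s):
    # Ising energy shared by the ferromagnetic/Hopfield families
    return -0.5 * s @ J @ s

def overlap(s):
    # Macroscopic order parameters m_mu tracked by mean-field equations
    return xi @ s / N

s = xi[0].copy()                         # a retrieval state
print(energy(s), overlap(s)[0])          # overlap with pattern 0 is near 1
```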
SPECT/CT in imaging foot and ankle pathology-the demise of other coregistration techniques.
Mohan, Hosahalli K; Gnanasegaran, Gopinath; Vijayanathan, Sanjay; Fogelman, Ignac
2010-01-01
Disorders of the ankle and foot are common, and given the complex anatomy and function of the foot, they present a significant clinical challenge. Imaging plays a crucial role in the management of these patients, with multiple imaging options available to the clinician. The American College of Radiology has set appropriateness criteria for the use of the available investigative modalities in the management of foot and ankle pathologies. These are broadly classified into anatomical and functional imaging modalities. Recently, combined single-photon emission computed tomography/computed tomography (SPECT/CT) scanners, which can elegantly combine functional and anatomical images, have been introduced, promising an exciting and important development. This review describes our clinical experience with SPECT/CT and discusses potential applications of these techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, H. M. Abdul; Ukkusuri, Satish V.
2017-06-29
EPA-MOVES (Motor Vehicle Emission Simulator) is often integrated with traffic simulators to assess emission levels of large-scale urban networks with signalized intersections. High variations in speed profiles exist in congested urban networks with signalized intersections. The traditional average-speed-based emission estimation technique with EPA-MOVES executes quickly but underestimates emissions in most cases because it ignores the speed variation in congested networks with signalized intersections. In contrast, the atomic second-by-second speed profile (i.e., the trajectory of each vehicle) technique provides accurate emissions at the cost of excessive computational power and time. We addressed this issue by developing a novel method to determine the link-driving-schedules (LDSs) for the EPA-MOVES tool. Our research developed a hierarchical clustering technique with dynamic time warping similarity measures (HC-DTW) to find LDSs for EPA-MOVES that produce emission estimates better than the average-speed-based technique with execution time faster than the atomic speed profile approach. We applied HC-DTW to sample data from a signalized corridor and found that it can significantly reduce computational time without compromising accuracy. The technique developed in this research can substantially contribute to the EPA-MOVES-based emission estimation process for large-scale urban transportation networks by reducing computational time while keeping estimates reasonably accurate. The method is most appropriate for transportation networks with high variation in speed, such as signalized intersections. Experimental results show error differences ranging from 2% to 8% for most pollutants except PM10.
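A minimal sketch of hierarchical clustering with a DTW distance over speed profiles, assuming SciPy and a plain dynamic-programming DTW; the profiles and cluster count are synthetic stand-ins, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def dtw(a, b):
    """Plain O(len(a)*len(b)) dynamic-time-warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

rng = np.random.default_rng(0)
# Stand-ins for second-by-second vehicle speed profiles (m/s)
profiles = [np.clip(np.cumsum(rng.normal(0, 0.5, 120)) + 10, 0, None)
            for _ in range(20)]

n = len(profiles)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = dtw(profiles[i], profiles[j])

# Average-linkage hierarchical clustering on the DTW distances;
# each cluster's representative profile can serve as one LDS.
Z = linkage(squareform(dist), method="average")
groups = fcluster(Z, t=5, criterion="maxclust")
```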
Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang
2016-08-01
Quantitative assessment of the urban thermal environment has become a focus for urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment based on historical weather-station data toward dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing, and numerical simulation. Moreover, the potential advantages and disadvantages, applicability, and development trends of these techniques were summarized, aiming to provide fundamental knowledge for understanding urban thermal environment assessment and optimization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hsien-Hsin S
The overall objective of this research project is to develop novel architectural techniques as well as system software to achieve a highly secure and intrusion-tolerant computing system. Such a system will be autonomous, self-adapting, and introspective, with self-healing capability under the circumstances of improper operations, abnormal workloads, and malicious attacks. The scope of this research includes: (1) system-wide, unified introspection techniques for autonomic systems; (2) secure information-flow microarchitecture; (3) memory-centric security architecture; (4) authentication control and its implication for security; (5) digital rights management; (6) microarchitectural denial-of-service attacks on shared resources. During the period of the project, we developed several architectural techniques and system software for achieving a robust, secure, and reliable computing system toward our goal.
NASA Technical Reports Server (NTRS)
Wong, K. W.
1974-01-01
In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data to the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.
NASA Astrophysics Data System (ADS)
Larsen, J. D.; Schaap, M. G.
2013-12-01
Recent advances in computing technology and experimental techniques have made it possible to observe and characterize fluid dynamics at the micro-scale. Many computational methods exist that can adequately simulate fluid flow in porous media. Lattice Boltzmann methods provide the distinct advantage of tracking particles at the microscopic level and returning macroscopic observations. While experimental methods can accurately measure macroscopic fluid dynamics, computational efforts can be used to predict and gain insight into fluid dynamics by utilizing thin sections or computed micro-tomography (CMT) images of core sections. Although substantial efforts have been made to advance non-invasive imaging methods such as CMT, fluid dynamics simulations, and microscale analysis, a true three-dimensional image segmentation technique has been developed only recently. Many competing segmentation techniques are utilized in industry and research settings with varying results. In this study the lattice Boltzmann method is used to simulate Stokes flow in a macroporous soil column. Two-dimensional CMT images were used to reconstruct a three-dimensional representation of the original sample. Six competing segmentation standards were used to binarize the CMT volumes, which provide distinction between solid phase and pore space. The permeability of the reconstructed samples was calculated, using Darcy's law, from lattice Boltzmann simulations of fluid flow in the samples. We compare simulated permeability from differing segmentation algorithms to experimental findings.
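Back-calculating permeability from a simulated flow field is a one-line application of Darcy's law; the values below are placeholders, not the study's measurements:

```python
# Darcy's law, k = q * mu * L / dP, applied to lattice Boltzmann output:
# q is the mean superficial velocity through the sample.
mu = 1.0e-3        # dynamic viscosity, Pa*s (water-like, illustrative)
L = 5.0e-3         # sample length along the flow axis, m
dP = 2.0e2         # applied pressure drop, Pa
q = 1.3e-5         # mean flux from the LBM simulation, m/s

k = q * mu * L / dP            # intrinsic permeability, m^2
print(f"k = {k:.3e} m^2")
```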
A data compression technique for synthetic aperture radar images
NASA Technical Reports Server (NTRS)
Frost, V. S.; Minden, G. J.
1986-01-01
A data compression technique is developed for synthetic aperture radar (SAR) imagery. The technique is based on an SAR image model and is designed to preserve the local statistics in the image by an adaptive variable rate modification of block truncation coding (BTC). A data rate of approximately 1.6 bit/pixel is achieved with the technique while maintaining the image quality and cultural (pointlike) targets. The algorithm requires no large data storage and is computationally simple.
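The fixed-rate building block that the adaptive SAR scheme modifies is classic 1-bit block truncation coding, which preserves each block's mean and standard deviation; a generic sketch (not the paper's variable-rate variant):

```python
import numpy as np

def btc_block(block):
    """Classic 1-bit block truncation coding of one image block.

    Encodes the block as a bit plane plus two reconstruction levels
    chosen so the block mean and standard deviation are preserved --
    the local-statistics property the SAR scheme builds on.
    """
    m, s = block.mean(), block.std()
    bits = block > m
    n1 = bits.sum()
    n0 = bits.size - n1
    if 0 < n1 < bits.size:
        lo = m - s * np.sqrt(n1 / n0)
        hi = m + s * np.sqrt(n0 / n1)
    else:
        lo = hi = m                     # flat block: single level
    return np.where(bits, hi, lo)

rng = np.random.default_rng(0)
img = rng.gamma(2.0, 50.0, (64, 64))    # speckle-like toy image
out = np.block([[btc_block(img[i:i + 4, j:j + 4])
                 for j in range(0, 64, 4)] for i in range(0, 64, 4)])
```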
Probabilistic analysis algorithm for UA slope software program.
DOT National Transportation Integrated Search
2013-12-01
A reliability-based computational algorithm for using a single row and equally spaced drilled shafts to : stabilize an unstable slope has been developed in this research. The Monte-Carlo simulation (MCS) : technique was used in the previously develop...
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
NASA Technical Reports Server (NTRS)
1971-01-01
Computational techniques were developed and assimilated for the design optimization. The resulting computer program was then used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in Fortran IV for the CDC 6400 but was subsequently converted to the Fortran V language to be used on the Univac 1108. The program allows for improvement and update of the performance prediction techniques. The program logic involves subroutines which handle the following basic functions: (1) a driver which calls for input, output, and communication between program and user and between the subroutines themselves; (2) thermodynamic analysis; (3) thermal stress analysis; (4) acoustic fatigue analysis; and (5) weights/cost analysis. In addition, a system total cost is predicted based on system weight and historical cost data of similar systems. Two basic types of input are provided, both of which are based on trajectory data. These are vehicle attitude (altitude, velocity, and angles of attack and sideslip), for external heat and pressure loads calculation, and heating rates and pressure loads as a function of time.
Near-field three-dimensional radar imaging techniques and applications.
Sheen, David; McMakin, Douglas; Hall, Thomas
2010-07-01
Three-dimensional radio frequency imaging techniques have been developed for a variety of near-field applications, including radar cross-section imaging, concealed weapon detection, ground penetrating radar imaging, through-barrier imaging, and nondestructive evaluation. These methods employ active radar transceivers operating over a wide range of frequencies, from less than 100 MHz to in excess of 350 GHz, with the frequency range customized for each application. Computational wavefront reconstruction imaging techniques have been developed that optimize the resolution and illumination quality of the images. In this paper, rectilinear and cylindrical three-dimensional imaging techniques are described along with several application results.
NASA Technical Reports Server (NTRS)
Garai, Anirban; Diosady, Laslo T.; Murman, Scott M.; Madavan, Nateri K.
2016-01-01
The perfectly matched layer (PML) technique is developed in the context of a high-order spectral-element discontinuous Galerkin (DG) method. The technique is applied to a range of test cases and is shown to be superior to other approaches, such as those based on characteristic boundary conditions and sponge layers, for treating the inflow and outflow boundaries of computational domains. In general, the PML technique improves the quality of the numerical results for simulations of practical flow configurations, but it also exhibits some instabilities for large perturbations. A preliminary analysis that attempts to understand the source of these instabilities is discussed.
Inversion of particle-size distribution from angular light-scattering data with genetic algorithms.
Ye, M; Wang, S; Lu, Y; Hu, T; Zhu, Z; Xu, Y
1999-04-20
A stochastic inverse technique based on a genetic algorithm (GA) to invert particle-size distribution from angular light-scattering data is developed. This inverse technique is independent of any given a priori information of particle-size distribution. Numerical tests show that this technique can be successfully applied to inverse problems with high stability in the presence of random noise and low susceptibility to the shape of distributions. It has also been shown that the GA-based inverse technique is more efficient in use of computing time than the inverse Monte Carlo method recently developed by Ligon et al. [Appl. Opt. 35, 4297 (1996)].
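A minimal GA inversion sketch with selection and mutation only (crossover omitted) and a toy forward model standing in for the angular light-scattering kernel; a real inversion would use Mie or Fraunhofer theory:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(params, angles):
    # Toy stand-in for the scattering forward model: a Gaussian shape
    # parameterized by the distribution's (mu, sigma).
    mu, sigma = params
    return np.exp(-((angles - mu) ** 2) / (2 * sigma**2))

angles = np.linspace(0, np.pi / 2, 50)
data = forward((0.4, 0.15), angles) + rng.normal(0, 0.02, angles.size)

def misfit(p):
    return np.sum((forward(p, angles) - data) ** 2)

# Population of candidate (mu, sigma) pairs
pop = np.c_[rng.uniform(0.1, 1.0, 60), rng.uniform(0.05, 0.5, 60)]
for gen in range(100):
    pop = pop[np.argsort([misfit(p) for p in pop])][:30]   # selection
    children = pop[rng.integers(0, 30, 30)] \
        + rng.normal(0, 0.01, (30, 2))                     # mutation
    pop = np.vstack([pop, children])

best = min(pop, key=misfit)
print(best)   # recovered (mu, sigma), close to (0.4, 0.15)
```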
NASA Technical Reports Server (NTRS)
Celic, Alan; Zilliac, Gregory G.
1998-01-01
The fringe-imaging skin friction (FISF) technique, which was originally developed by D. J. Monson and G. G. Mateer at Ames Research Center and recently extended to 3-D flows, is the most accurate skin friction measurement technique currently available. The principle of this technique is that the skin friction at a point on an aerodynamic surface can be determined by measuring the time-rate-of-change of the thickness of an oil drop placed on the surface under the influence of the external air boundary layer. Lubrication theory is used to relate the oil-patch thickness variation to shear stress. The uncertainty of FISF measurements is estimated to be as low as 4 percent, yet little is known about the effects of surface tension and wall adhesion forces on the measured results. A modified version of the free-surface Navier-Stokes solver RIPPLE, developed at Los Alamos National Laboratory, was used to compute the time development of an oil drop on a surface under a simulated air boundary layer. RIPPLE uses the volume-of-fluid method to track the surface and the continuum surface force approach to model surface tension and wall adhesion effects. The development of an oil drop over a time period of approximately 4 seconds was studied. Under the influence of shear imposed by an air boundary layer, the computed profile of the drop rapidly changes from its initial circular-arc shape to a wedge-like shape. Comparison of the time-varying oil-thickness distributions computed using RIPPLE with those computed using a greatly simplified numerical model of an oil drop (an equation which does not include surface tension and wall adhesion effects) was used to evaluate the effects of surface tension on FISF measurement results. The effects of surface tension were found to be small but not necessarily negligible in some cases.
On the use and computation of the Jordan canonical form in system theory
NASA Technical Reports Server (NTRS)
Sridhar, B.; Jordan, D.
1974-01-01
This paper investigates various aspects of the application of the Jordan canonical form of a matrix in system theory and develops a computational approach to determining the Jordan form for a given matrix. Applications include pole placement, controllability and observability studies, serving as an intermediate step in yielding other canonical forms, and theorem proving. The computational method developed in this paper is both simple and efficient. The method is based on the definition of a generalized eigenvector and a natural extension of Gauss elimination techniques. Examples are included for demonstration purposes.
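For comparison with the Gauss-elimination-based approach developed in the paper, symbolic packages expose the Jordan form directly; a sketch using SymPy's jordan_form (exact arithmetic, practical only for small matrices):

```python
from sympy import Matrix

# Build a matrix similar to a known Jordan form, then recover that form.
J_true = Matrix([[2, 1, 0],
                 [0, 2, 0],
                 [0, 0, 3]])   # one 2x2 block (defective eigenvalue 2)
S = Matrix([[1, 1, 0],
            [0, 1, 1],
            [1, 0, 1]])        # any invertible change of basis
A = S * J_true * S.inv()

P, J = A.jordan_form()         # A = P * J * P**-1
assert (P * J * P.inv() - A) == Matrix.zeros(3, 3)
print(J)                        # recovers the 2x2 and 1x1 Jordan blocks
```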
Real Time Computer Graphics From Body Motion
NASA Astrophysics Data System (ADS)
Fisher, Scott; Marion, Ann
1983-10-01
This paper focuses on the recent emergence and development of real-time, computer-aided body tracking technologies and their use in combination with various computer graphics imaging techniques. The convergence of these technologies in our research results in an interactive display environment in which multiple representations of a given body motion can be displayed in real time. Specific reference to entertainment applications is described in the development of a real-time, interactive stage set in which dancers can 'draw' with their bodies as they move through the space of the stage or manipulate virtual elements of the set with their gestures.
Huang, Hsuan-Ming; Hsiao, Ing-Tsung
2017-01-01
Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing- (CS-) based reconstruction methods. However, these methods have some disadvantages including high computational cost and slow convergence rate. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. First, total difference minimization (TDM) was implemented using the soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm for accelerating the convergence. To further speed up the convergence of the proposed method, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results obtained from simulation and phantom studies showed that many speed-up techniques could be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the increased computation time (≤10%) was minor as compared to the acceleration provided by the proposed method. In this paper, we have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
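Two of the named ingredients are easy to sketch: the soft-thresholding operator used inside TDM-STF and the FISTA momentum sequence used to accelerate it (generic textbook forms, not the paper's full reconstruction loop):

```python
import numpy as np

def soft_threshold(x, t):
    # Soft-thresholding: the proximal operator of the l1 norm,
    # i.e., the filtering step inside TDM-STF.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def next_momentum(t_k):
    # FISTA momentum sequence: t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2
    return (1.0 + np.sqrt(1.0 + 4.0 * t_k * t_k)) / 2.0

x = np.array([-1.5, -0.2, 0.1, 0.9])
print(soft_threshold(x, 0.3))          # [-1.2, 0.0, 0.0, 0.6]

# One accelerated step: extrapolate from the two latest iterates.
t_k, x_k, x_prev = 1.0, x, x
t_next = next_momentum(t_k)
y = x_k + (t_k - 1.0) / t_next * (x_k - x_prev)
```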
Yu, Quan; Gong, Xin; Wang, Guo-Min; Yu, Zhe-Yuan; Qian, Yu-Fen; Shen, Gang
2011-01-01
To establish a new method of presurgical nasoalveolar molding (NAM) using computer-aided reverse engineering and rapid prototyping technique in infants with unilateral cleft lip and palate (UCLP). Five infants (2 males and 3 females with mean age of 1.2 w) with complete UCLP were recruited. All patients were subjected to NAM before the cleft lip repair. The upper denture casts were recorded using a three-dimensional laser scanner within 2 weeks after birth in UCLP infants. A digital model was constructed and analyzed to simulate the NAM procedure with reverse engineering software. The digital geometrical data were exported to print the solid model with rapid prototyping system. The whole set of appliances was fabricated based on these solid models. Laser scanning and digital model construction simplified the NAM procedure and estimated the treatment objective. The appliances were fabricated based on the rapid prototyping technique, and for each patient, the complete set of appliances could be obtained at one time. By the end of presurgical NAM treatment, the cleft was narrowed, and the malformation of nasoalveolar segments was aligned normally. We have developed a novel technique of presurgical NAM based on a computer-aided design. The accurate digital denture model of UCLP infants could be obtained with laser scanning. The treatment design and appliance fabrication could be simplified with a computer-aided reverse engineering and rapid prototyping technique.
Computational Analysis of Stresses Acting on Intermodular Junctions in Thoracic Aortic Endografts
Prasad, Anamika; To, Lillian K.; Gorrepati, Madhu L.; Zarins, Christopher K.; Figueroa, C. Alberto
2011-01-01
Purpose: To evaluate the biomechanical and hemodynamic forces acting on the intermodular junctions of a multi-component thoracic endograft and elucidate their influence on the development of type III endoleak due to disconnection of stent-graft segments. Methods: Three-dimensional computer models of the thoracic aorta and a 4-component thoracic endograft were constructed using postoperative (baseline) and follow-up computed tomography (CT) data from a 69-year-old patient who developed type III endoleak 4 years after stent-graft placement. Computational fluid dynamics (CFD) techniques were used to quantitate the displacement forces acting on the device. The contact stresses between the different modules of the graft were then quantified using computational solid mechanics (CSM) techniques. Lastly, the intermodular junction frictional stability was evaluated using a Coulomb model. Results: The CFD analysis revealed that curvature and length are key determinants of the displacement forces experienced by each endograft and that the first 2 modules were exposed to displacement forces acting in opposite directions in both the lateral and longitudinal axes. The CSM analysis revealed that the highest concentration of stresses occurred at the junction between the first and second modules of the device. Furthermore, the frictional analysis demonstrated that most of the surface area (53%) of this junction had unstable contact. The predicted critical zone of intermodular stress concentration and frictional instability matched the location of the type III endoleak observed in the 4-year follow-up CT image. Conclusion: The region of larger intermodular stresses and highest frictional instability correlated with the zone where a type III endoleak developed 4 years after thoracic stent-graft placement. Computational techniques can be helpful in evaluating the risk of endograft migration and potential for modular disconnection and may be useful in improving device placement strategies and endograft design. PMID:21861748
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estep, Donald
2015-11-30
This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.
Taylor, George C.
1971-01-01
Hydrologic instrumentation and methodology for assessing water-resource potentials have originated largely in the developed countries of the temperate zone. The developing countries lie largely in the tropic zone, which contains the full gamut of the earth's climatic environments, including most of those of the temperate zone. For this reason, most hydrologic techniques have world-wide applicability. Techniques for assessing water-resource potentials for the high-priority goals of economic growth are well established in the developing countries--but much more so in some than in others. Conventional techniques for measurement and evaluation of basic hydrologic parameters are now well understood in the developing countries and are generally adequate for their current needs and those of the immediate future. Institutional and economic constraints, however, inhibit growth of sustained programs of hydrologic data collection and application of the data to problems in engineering technology. Computer-based technology, including processing of hydrologic data and mathematical modelling of hydrologic parameters, is also well begun in many developing countries and has much wider potential application. In some developing countries, however, there is a tendency to look on the computer as a panacea for deficiencies in basic hydrologic data collection programs. This fallacy must be discouraged, as the computer is a tool and not a "magic box." There is no real substitute for sound programs of basic data collection. Nuclear and isotopic techniques are being used increasingly in the developed countries in the measurement and evaluation of virtually all hydrologic parameters for which conventional techniques have been used traditionally. Even in the developed countries, however, many hydrologists are not using nuclear techniques, simply because they lack knowledge of the principles involved and of the potential benefits. Nuclear methodology in hydrologic applications is generally more complex than the conventional and hence requires a high level of technical expertise for effective use. Application of nuclear techniques to hydrologic problems in the developing countries is likely to be marginal for some years to come, owing to the higher costs involved and expertise required. Nuclear techniques, however, would seem to have particular promise in studies of water movement in unsaturated soils and of erosion and sedimentation, where conventional techniques are inadequate, inefficient, and in some cases costly. Remote sensing offers great promise for synoptic evaluations of water resources and hydrologic processes, including the transient phenomena of the hydrologic cycle. Remote sensing is not, however, a panacea for deficiencies in hydrologic data programs in the developing countries. Rather, it is a means for extending and augmenting on-the-ground observations and surveys (ground truth) to evaluate water resources and hydrologic processes on a regional or even continental scale. With respect to economic growth goals in developing countries, there are few identifiable gaps in existing hydrologic instrumentation and methodology insofar as appraisal, development and management of available water resources are concerned. What is needed is acceleration of institutional development and professional motivation toward more effective use of existing and proven methodology.
Moreover, much sophisticated methodology can be applied effectively in the developing countries only when adequate levels of indigenous scientific skills have been reached and supportive institutional frameworks have evolved to viability.
Topics in computational physics
NASA Astrophysics Data System (ADS)
Monville, Maura Edelweiss
Computational Physics spans a broad range of applied fields extending beyond the border of traditional physics tracks. Demonstrated flexibility and the capability to switch to a new project, and to pick up the basics of the new field quickly, are among the essential requirements for a computational physicist. In line with the above-mentioned prerequisites, my thesis describes the development and results of two computational projects belonging to two different applied science areas. The first project is a Materials Science application. It is a prescription for an innovative nano-fabrication technique that is built out of two other known techniques. The preliminary results of the simulation of this novel nano-patterning fabrication method show an average improvement, roughly equal to 18%, with respect to the single techniques it draws on. The second project is a Homeland Security application aimed at preventing smuggling of nuclear material at ports of entry. It is concerned with a simulation of an active material interrogation system based on the analysis of induced photo-nuclear reactions. This project consists of a preliminary evaluation of the photo-fission implementation in the more robust radiation transport Monte Carlo codes, followed by the customization and extension of MCNPX, a Monte Carlo code developed at Los Alamos National Laboratory, and MCNP-PoliMi. The final stage of the project consists of testing the interrogation system against some real world scenarios, for the purpose of determining the system's reliability, material discrimination power, and limitations.
Plank, Gernot; Zhou, Lufang; Greenstein, Joseph L; Cortassa, Sonia; Winslow, Raimond L; O'Rourke, Brian; Trayanova, Natalia A
2008-01-01
Computer simulations of electrical behaviour in the whole ventricles have become commonplace during the last few years. The goals of this article are (i) to review the techniques that are currently employed to model cardiac electrical activity in the heart, discussing the strengths and weaknesses of the various approaches, and (ii) to implement a novel modelling approach, based on physiological reasoning, that lifts some of the restrictions imposed by current state-of-the-art ionic models. To illustrate the latter approach, the present study uses a recently developed ionic model of the ventricular myocyte that incorporates an excitation–contraction coupling and mitochondrial energetics model. A paradigm to bridge the vastly disparate spatial and temporal scales, from subcellular processes to the entire organ, and from sub-microseconds to minutes, is presented. Achieving sufficient computational efficiency is the key to success in the quest to develop multiscale realistic models that are expected to lead to better understanding of the mechanisms of arrhythmia induction following failure at the organelle level, and ultimately to the development of novel therapeutic applications. PMID:18603526
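One common way to organize such reaction-diffusion simulations (not necessarily the scheme used by these authors) is operator splitting: the stiff cellular kinetics are advanced pointwise, and tissue-level diffusion is advanced separately. The sketch below is a generic 1D illustration using a FitzHugh-Nagumo surrogate in place of the detailed ventricular ionic model discussed above; all parameter values and units are illustrative assumptions.

```python
import numpy as np

# Minimal operator-split monodomain sketch (1D cable) with a toy
# FitzHugh-Nagumo "ionic model" standing in for a detailed cell model.
nx, dx, dt = 200, 0.01, 0.005        # grid size, spacing, time step (illustrative)
D = 0.001                            # effective tissue diffusivity
v = np.zeros(nx)                     # transmembrane potential (normalized)
w = np.zeros(nx)                     # recovery variable
v[:10] = 1.0                         # stimulus at one end

for _ in range(4000):
    # reaction step: advance the cell model pointwise
    dv = v * (v - 0.1) * (1.0 - v) - w
    dw = 0.005 * (v - 2.5 * w)
    v += dt * dv
    w += dt * dw
    # diffusion step: explicit finite-difference tissue coupling
    lap = np.zeros(nx)
    lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2
    lap[0], lap[-1] = lap[1], lap[-2]   # no-flux boundaries
    v += dt * D * lap

print(f"wavefront near node {np.argmax(v < 0.5)} of {nx}")
```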
Portable Computer Technology (PCT) Research and Development Program Phase 2
NASA Technical Reports Server (NTRS)
Castillo, Michael; McGuire, Kenyon; Sorgi, Alan
1995-01-01
This project report focuses on: (1) the design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low-power Pentium processor, a high-resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and Ethernet interfaces; (2) the use of these units to integrate and demonstrate advanced wireless network and portable video capabilities; and (3) the qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, with a focus on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.
Advances in computer-aided well-test interpretation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, R.N.
1994-07-01
Despite the feeling expressed several times over the past 40 years that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with future improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques, which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.
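Of the areas listed, the pressure derivative is the most mechanical to illustrate. The sketch below computes a Bourdet-style derivative dp/d(ln t) with a log-cycle smoothing window; the synthetic drawdown data, window length, and implementation details are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bourdet_derivative(t, p, L=0.1):
    """Pressure derivative dp/d(ln t) via a three-point weighted
    difference, using neighbours at least L log-cycles away
    (the smoothing idea behind the Bourdet derivative)."""
    lnt = np.log(t)
    dp = np.full_like(p, np.nan)
    for i in range(1, len(t) - 1):
        j = i - 1
        while j > 0 and lnt[i] - lnt[j] < L:
            j -= 1
        k = i + 1
        while k < len(t) - 1 and lnt[k] - lnt[i] < L:
            k += 1
        dx1, dx2 = lnt[i] - lnt[j], lnt[k] - lnt[i]
        # weighted central difference in ln(t)
        dp[i] = ((p[i] - p[j]) / dx1 * dx2 + (p[k] - p[i]) / dx2 * dx1) / (dx1 + dx2)
    return dp

# synthetic infinite-acting radial flow: p = m*ln(t) + b, derivative -> m
t = np.logspace(-2, 2, 60)
p = 10.0 * np.log(t) + 50.0
print(bourdet_derivative(t, p)[5])  # ~10 away from the ends
```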
ENFIN--A European network for integrative systems biology.
Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan
2009-11-01
Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases, and platforms to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains internally close collaboration between experimental and computational research, enabling a permanent cycling of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods, and a novel platform for protein function analysis, FuncNet.
Multi-scale modeling in cell biology
Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick
2009-01-01
Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808
Urban land use monitoring from computer-implemented processing of airborne multispectral data
NASA Technical Reports Server (NTRS)
Todd, W. J.; Mausel, P. W.; Baumgardner, M. F.
1976-01-01
Machine processing techniques were applied to multispectral data obtained from airborne scanners at an elevation of 600 meters over central Indianapolis in August, 1972. Computer analysis of these spectral data indicate that roads (two types), roof tops (three types), dense grass (two types), sparse grass (two types), trees, bare soil, and water (two types) can be accurately identified. Using computers, it is possible to determine land uses from analysis of type, size, shape, and spatial associations of earth surface images identified from multispectral data. Land use data developed through machine processing techniques can be programmed to monitor land use changes, simulate land use conditions, and provide impact statistics that are required to analyze stresses placed on spatial systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ongari, Daniele; Boyd, Peter G.; Barthel, Senja
Pore volume is one of the main properties for the characterization of microporous crystals. It is experimentally measurable, and it can also be obtained from the refined unit cell by a number of computational techniques. In this work, we assess the accuracy and the discrepancies between the different computational methods which are commonly used for this purpose, i.e., geometric, helium, and probe-center pore volumes, by studying a database of more than 5000 frameworks. We developed a new technique to fully characterize the internal void of a microporous material and to compute the probe-accessible and -occupiable pore volume. Lastly, we show that, unlike the other definitions of pore volume, the occupiable pore volume can be directly related to the experimentally measured pore volumes from nitrogen isotherms.
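As a rough illustration of the geometric and probe-center definitions discussed above (not the authors' code, which handles general cells and the occupiable volume), a Monte Carlo estimate of the void fraction of an orthorhombic cell can be written in a few lines; the toy framework, atom radii, and probe radius below are assumptions.

```python
import numpy as np

def geometric_pore_fraction(atoms, radii, cell, n=40, probe=0.0):
    """Monte Carlo void fraction of an orthorhombic unit cell: random
    points farther than (atom radius + probe radius) from every atom
    count as pore space. Minimum-image convention for periodicity."""
    rng = np.random.default_rng(0)
    pts = rng.random((n**3, 3)) * cell          # random points in the cell
    d = pts[:, None, :] - atoms[None, :, :]     # displacement to each atom
    d -= np.round(d / cell) * cell              # periodic wrap
    dist = np.linalg.norm(d, axis=2)
    void = np.all(dist > (radii + probe), axis=1)
    return void.mean()                          # void volume / cell volume

# toy framework: 4 atoms of radius 1.2 A in a 10 A cubic cell
cell = np.array([10.0, 10.0, 10.0])
atoms = np.array([[1, 1, 1], [5, 5, 5], [8, 2, 7], [2, 8, 4]], float)
radii = np.full(len(atoms), 1.2)
print(geometric_pore_fraction(atoms, radii, cell))              # geometric (probe=0)
print(geometric_pore_fraction(atoms, radii, cell, probe=1.86))  # N2-sized probe center
```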
Prognosis model for stand development
Albert R. Stage
1973-01-01
Describes a set of computer programs for developing prognoses of the development of existing stands under alternative regimes of management. Calibration techniques, modeling procedures, and a procedure for including stochastic variation are described. Implementation of the system for lodgepole pine, including assessment of losses attributed to an infestation of mountain...
Improvements in approaches to forecasting and evaluation techniques
NASA Astrophysics Data System (ADS)
Weatherhead, Elizabeth
2014-05-01
The US is embarking on an experiment to make significant and sustained improvements in weather forecasting. The effort stems from a series of community conversations that recognized the rapid advancements in observations, modeling and computing techniques in the academic, governmental and private sectors. The new directions and initial efforts will be summarized, including information on possibilities for international collaboration. Most new projects are scheduled to start in the last half of 2014. Several advancements include ensemble forecasting with global models, and new sharing of computing resources. Newly developed techniques for evaluating weather forecast models will be presented in detail. The approaches use statistical techniques that incorporate pair-wise comparisons of forecasts with observations and account for daily auto-correlation to assess appropriate uncertainty in forecast changes. Some of the new projects allow for international collaboration, particularly on the research components of the projects.
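A minimal sketch of the evaluation idea described above, assuming daily paired forecast errors and a lag-1 autocorrelation correction to the effective sample size; the synthetic data and the AR(1)-style adjustment are illustrative choices, not the project's published method.

```python
import numpy as np

def forecast_improvement(err_a, err_b):
    """Mean pairwise difference of absolute errors for two forecast
    systems, with a 95% CI widened for lag-1 autocorrelation of the
    daily differences (effective sample size n*(1-r)/(1+r))."""
    d = np.abs(err_a) - np.abs(err_b)       # paired daily differences
    n = len(d)
    r = np.corrcoef(d[:-1], d[1:])[0, 1]    # lag-1 autocorrelation
    n_eff = n * (1 - r) / (1 + r)
    se = d.std(ddof=1) / np.sqrt(max(n_eff, 2))
    return d.mean(), 1.96 * se

rng = np.random.default_rng(1)
base = rng.normal(0, 1, 365)
err_a = base + rng.normal(0, 0.5, 365)          # model A daily errors
err_b = 0.9 * base + rng.normal(0, 0.5, 365)    # model B, slightly better
mean_diff, half_width = forecast_improvement(err_a, err_b)
print(f"improvement {mean_diff:+.3f} +/- {half_width:.3f}")
```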
Giger, Maryellen L.; Chan, Heang-Ping; Boone, John
2008-01-01
The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137
NASA Astrophysics Data System (ADS)
Wang, Jianxiong
2014-06-01
This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, and automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.
The design of aircraft using the decision support problem technique
NASA Technical Reports Server (NTRS)
Mistree, Farrokh; Marinopoulos, Stergios; Jackson, David M.; Shupe, Jon A.
1988-01-01
The Decision Support Problem Technique for unified design, manufacturing and maintenance is being developed at the Systems Design Laboratory at the University of Houston. This involves the development of a domain-independent method (and the associated software) that can be used to process domain-dependent information and thereby provide support for human judgment. In a computer assisted environment, this support is provided in the form of optimal solutions to Decision Support Problems.
An analysis technique for testing log grades
Carl A. Newport; William G. O' Regan
1963-01-01
An analytical technique that may be used in evaluating log-grading systems is described. It also provides a means of comparing two or more grading systems, or a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.
Moisture content calculations for 1000-hour timelag fuels
Michael A. Fosberg; Richard C. Rothermel; Patricia L. Andrews
1981-01-01
Techniques to calculate 1000-hour timelag fuel moistures were developed from the theory of water movement in wood. The 1000-hour timelag fuel moisture is computed from mean daily temperatures and humidities and precipitation duration. Comparison of calculated and observed fuel moistures showed good agreement. Techniques to determine the seasonal starting value of the 1000-...
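A minimal sketch of how a timelag fuel-moisture update of this general kind can work, assuming an exponential relaxation toward a daily boundary condition built from the average equilibrium moisture content (EMC) and precipitation duration; the 35% saturation value and the hour-weighting scheme are illustrative assumptions, not the published coefficients.

```python
import math

def update_1000hr(m_prev, emc_avg, precip_dur_hr, dt_hr=24.0, timelag_hr=1000.0):
    """One daily step of a timelag fuel-moisture model: fuel moisture
    relaxes exponentially toward a boundary value blending the day's
    average EMC with wet hours pushed toward saturation."""
    boundary = ((24.0 - precip_dur_hr) * emc_avg + precip_dur_hr * 35.0) / 24.0
    return m_prev + (boundary - m_prev) * (1.0 - math.exp(-dt_hr / timelag_hr))

m = 15.0  # percent moisture, seasonal starting value
for emc, rain_hr in [(12.0, 0), (14.0, 6), (11.0, 0)]:
    m = update_1000hr(m, emc, rain_hr)
print(f"{m:.2f}%")
```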
Optimization Techniques for Analysis of Biological and Social Networks
2012-03-28
analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine... alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters... systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational
NASA Technical Reports Server (NTRS)
Klumpar, D. M. (Principal Investigator)
1982-01-01
The status of the initial testing of the modeling procedure developed to compute the magnetic fields at satellite orbit due to current distributions in the ionosphere and magnetosphere is reported. The modeling technique utilizes a linear current element representation of the large scale space-current system.
NASA Technical Reports Server (NTRS)
Gillham, J. K.; Stadnicki, S. J.; Hazony, Y.
1974-01-01
The torsional braid experiment has been interfaced with a centralized hierarchical computing system for data acquisition and data processing. Such a system, when matched by the appropriate upgrading of the monitoring techniques, provides high resolution thermomechanical spectra of rigidity and damping, and their derivatives with respect to temperature.
Low Speed Rotor/Fuselage Interactional Aerodynamics
NASA Technical Reports Server (NTRS)
Barnwell, Richard W.; Prichard, Devon S.
2003-01-01
This report presents work performed under a Cooperative Research Agreement between Virginia Tech and the NASA Langley Research Center. The work involved development of computational techniques for modeling helicopter rotor/airframe aerodynamic interaction. A brief overview of the problem is presented, the modeling techniques are described, and selected example calculations are briefly discussed.
Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato
2015-03-08
The objective of this study was to improve the visibility of anatomical details by applying off-line post-image processing in chest computed radiography (CR). Four spatial-domain-based external image processing techniques were developed using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were applied to sample images, and their visual appearances were confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 clinical chest images, which were randomized with another 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The means and ranges of the average scores for the three radiologists were characterized for each developed technique and imaging system. The Mann-Whitney U-test was used to test the difference in the visibility of details between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity-value adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but it should be implemented in consultation with the radiologists.
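A rough Python analogue of the two processing families the study combined (the original work used MATLAB), plus the reader-score comparison: a percentile contrast stretch stands in for intensity-value adjustment, a small mean filter for spatial linear filtering, and SciPy's Mann-Whitney U-test for the significance testing. The kernel size, percentiles, and example scores are assumptions.

```python
import numpy as np
from scipy import ndimage, stats

def enhance(img, low_pct=1, high_pct=99, kernel=3):
    """Intensity-value adjustment (percentile contrast stretch)
    followed by spatial linear filtering (small mean filter)."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    stretched = np.clip((img - lo) / (hi - lo), 0.0, 1.0)
    return ndimage.uniform_filter(stretched, size=kernel)

img = np.random.default_rng(0).normal(1000.0, 200.0, (64, 64))  # stand-in CR image
out = enhance(img)

# reader-score comparison: developed processing vs. default algorithm
scores_developed = [4, 5, 4, 5, 3, 5, 4, 4]
scores_default = [3, 4, 3, 4, 3, 3, 4, 3]
u, p = stats.mannwhitneyu(scores_developed, scores_default, alternative="two-sided")
print(f"Mann-Whitney U={u}, p={p:.3f}")
```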
NASA Astrophysics Data System (ADS)
Hannachi, Ammar; Kohler, Sophie; Lallement, Alex; Hirsch, Ernest
2015-04-01
3D modeling of scene contents is taking on increasing importance for many computer vision based applications. In particular, industrial applications of computer vision require efficient tools for the computation of this 3D information. Stereo-vision is a powerful technique for obtaining the 3D outline of imaged objects from the corresponding 2D images; used alone, however, it provides only a partial description of the scene contents. On the other hand, with structured light based reconstruction techniques, the 3D surfaces of imaged objects can often be computed with high accuracy, but the resulting active range data fail to characterize the object edges. Thus, in order to benefit from the strengths of both acquisition techniques, we introduce in this paper approaches for computing complete 3D reconstructions based on the cooperation of two complementary acquisition and processing techniques, in our case stereoscopic and structured light based methods, providing two 3D data sets describing respectively the outlines and the surfaces of the imaged objects. We present, accordingly, the principles of three fusion techniques and their comparison based on evaluation criteria related to the nature of the workpiece and the type of the tackled application. The proposed fusion methods rely on geometric characteristics of the workpiece, which favour the quality of the registration. Further, the results obtained demonstrate that the developed approaches are well adapted to the 3D modeling of manufactured parts including free-form surfaces and, consequently, to quality control applications using these 3D reconstructions.
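Fusing outline data from stereo with surface data from structured light presupposes that the two 3D data sets are expressed in a common frame. The abstract does not specify the registration method, so the sketch below shows one standard option, a least-squares rigid alignment (Kabsch/SVD) over corresponding points; the correspondences and the test transform are synthetic.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q
    via the Kabsch/SVD method, given known correspondences."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)           # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    return R, cQ - R @ cP

# outline points from stereo vs. the same points in the range-sensor frame
P = np.random.default_rng(2).random((20, 3))
theta = np.pi / 7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([0.3, -0.1, 0.5])
R, t = rigid_align(P, Q)
print(np.allclose(P @ R.T + t, Q))  # True
```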
Advanced technology airfoil research, volume 2. [conferences
NASA Technical Reports Server (NTRS)
1979-01-01
A comprehensive review of airfoil research is presented. The major thrust of the research is in three areas: development of computational aerodynamic codes for airfoil analysis and design, development of experimental facilities and test techniques, and all types of airfoil applications.
The use of virtual environments for percentage view analysis.
Schofield, Damian; Cox, Christopher J B
2005-09-01
It is recognised that Visual Impact Assessment (VIA), unlike many other aspects of Environmental Impact Assessment (EIA), relies less upon measurement than upon experience and judgement. A more structured and consistent approach towards VIA is therefore necessary, reducing the amount of bias and subjectivity. For proposed developments, there are very few quantitative techniques for the evaluation of visibility, and these existing methods can be highly inaccurate and time consuming. Percentage view changes are one of the few quantitative techniques, and the use of computer technology can reduce the inaccuracy and the time spent evaluating the visibility of either existing or proposed developments. For over 10 years, research work undertaken by the authors at the University of Nottingham has employed Computer Graphics (CG) and Virtual Reality (VR) in civilian and industrial contexts for environmental planning, design visualisation, accident reconstruction, risk analysis, data visualisation and training simulators. This paper describes a method to quantitatively assess the visual impact of proposed developments on the landscape using CG techniques. This method allows the determination of accurate percentage view changes with the use of a computer-generated model of the environment and specialist software developed at the University of Nottingham. The principles are easy to understand, and therefore planners, authorisation agencies and members of the public can use and understand the results. A case study is shown to demonstrate the application and the capabilities of the technology.
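The core quantity is simple to state: the fraction of the rendered field of view occupied by, or changed by, the proposed development. A toy sketch, assuming boolean visibility masks rendered from the CG model (the Nottingham software itself is not described in enough detail to reproduce):

```python
import numpy as np

def percentage_view_change(view_before, view_after):
    """Percentage view change from two boolean visibility masks
    (pixels of the rendered view occupied by the development),
    expressed as a fraction of the whole field of view."""
    total = view_before.size
    return 100.0 * (view_after.sum() - view_before.sum()) / total

before = np.zeros((480, 640), bool)          # baseline scene, no development
after = before.copy()
after[200:260, 300:420] = True               # pixels occupied by the proposal
print(f"{percentage_view_change(before, after):+.2f}% of the view")
```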
The Advanced Software Development and Commercialization Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallopoulos, E.; Canfield, T.R.; Minkoff, M.
1990-09-01
This is the first of a series of reports pertaining to progress in the Advanced Software Development and Commercialization Project, a joint collaborative effort between the Center for Supercomputing Research and Development of the University of Illinois and the Computing and Telecommunications Division of Argonne National Laboratory. The purpose of this work is to apply techniques of parallel computing that were pioneered by University of Illinois researchers to mature computational fluid dynamics (CFD) and structural dynamics (SD) computer codes developed at Argonne. The collaboration in this project will bring this unique combination of expertise to bear, for the first time, on industrially important problems. By so doing, it will expose the strengths and weaknesses of existing techniques for parallelizing programs and will identify those problems that need to be solved in order to enable widespread production use of parallel computers. Secondly, the increased efficiency of the CFD and SD codes themselves will enable the simulation of larger, more accurate engineering models that involve fluid and structural dynamics. In order to realize the above two goals, we are considering two production codes that have been developed at ANL and are widely used by both industry and universities. These are COMMIX and WHAMS-3D. The first is a computational fluid dynamics code that is used for both nuclear reactor design and safety and as a design tool for the casting industry. The second is a three-dimensional structural dynamics code used in nuclear reactor safety as well as crashworthiness studies. These codes are currently available for both sequential and vector computers only. Our main goal is to port and optimize these two codes on shared memory multiprocessors. In so doing, we shall establish a process that can be followed in optimizing other sequential or vector engineering codes for parallel processors.
Parallel Visualization Co-Processing of Overnight CFD Propulsion Applications
NASA Technical Reports Server (NTRS)
Edwards, David E.; Haimes, Robert
1999-01-01
An interactive visualization system, pV3, is being developed for the investigation of advanced computational methodologies employing visualization and parallel processing for the extraction of information contained in large-scale transient engineering simulations. Visual techniques for extracting information from the data, in terms of cutting planes, iso-surfaces, particle tracing, and vector fields, are included in this system. This paper discusses improvements to the pV3 system developed under NASA's Affordable High Performance Computing project.
EVA worksite analysis--use of computer analysis for EVA operations development and execution.
Anderson, D
1999-01-01
To sustain the rate of extravehicular activity (EVA) required to assemble and maintain the International Space Station, we must enhance our ability to plan, train for, and execute EVAs. An underlying analysis capability has been developed to ensure EVA access to all external worksites as a starting point for ground training, to generate information needed for on-orbit training, and to react quickly to develop contingency EVA plans, techniques, and procedures. This paper describes the use of computer-based EVA worksite analysis techniques for EVA worksite design. EVA worksite analysis has been used to design 80% of EVA worksites on the U.S. portion of the International Space Station. With the launch of the first U.S. element of the station, EVA worksite analysis is being developed further to support real-time analysis of unplanned EVA operations. This paper describes this development and deployment of EVA worksite analysis for International Space Station (ISS) mission support.
Nonlinear Analysis and Modeling of Tires
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1996-01-01
The objective of the study was to develop efficient modeling techniques and computational strategies for: (1) predicting the nonlinear response of tires subjected to inflation pressure, mechanical and thermal loads; (2) determining the footprint region, and analyzing the tire-pavement contact problem, including the effect of friction; and (3) determining the sensitivity of the tire response (displacements, stresses, strain energy, contact pressures and contact area) to variations in the different material and geometric parameters. Two computational strategies were developed. In the first strategy the tire was modeled by using either two-dimensional shear-flexible mixed shell finite elements or a quasi-three-dimensional solid model. The contact conditions were incorporated into the formulation by using a perturbed Lagrangian approach. A number of model reduction techniques were applied to substantially reduce the number of degrees of freedom used in describing the response outside the contact region. The second strategy exploited the axial symmetry of the undeformed tire, and used cylindrical coordinates in the development of three-dimensional elements for modeling each of the different parts of the tire cross section. Model reduction techniques were also used with this strategy.
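The abstract does not name the specific reduction methods used, but a classical member of that family is Guyan (static) condensation, which eliminates "slave" degrees of freedom away from the region of interest (here, outside the contact region). A minimal sketch with an invented spring-chain stiffness matrix:

```python
import numpy as np

def guyan_reduce(K, masters):
    """Guyan (static) condensation: retain master DOFs m, condense
    out unloaded slave DOFs s via K_red = K_mm - K_ms K_ss^{-1} K_sm."""
    n = K.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    Kmm = K[np.ix_(masters, masters)]
    Kms = K[np.ix_(masters, slaves)]
    Ksm = K[np.ix_(slaves, masters)]
    Kss = K[np.ix_(slaves, slaves)]
    return Kmm - Kms @ np.linalg.solve(Kss, Ksm)

# 4-DOF spring-chain stiffness matrix; keep only the end DOFs
k = 1000.0
K = k * np.array([[ 2, -1,  0,  0],
                  [-1,  2, -1,  0],
                  [ 0, -1,  2, -1],
                  [ 0,  0, -1,  2]], float)
print(guyan_reduce(K, [0, 3]))
```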
Image-guided tissue engineering
Ballyns, Jeffrey J; Bonassar, Lawrence J
2009-01-01
Replication of anatomic shape is a significant challenge in developing implants for regenerative medicine. This has led to significant interest in using medical imaging techniques such as magnetic resonance imaging and computed tomography to design tissue engineered constructs. Implementation of medical imaging and computer-aided design in combination with technologies for rapid prototyping of living implants enables the generation of highly reproducible constructs with spatial resolution up to 25 μm. In this paper, we review the medical imaging modalities available and a paradigm for choosing a particular imaging technique. We also present fabrication techniques and methodologies for producing cellular engineered constructs. Finally, we comment on future challenges involved with image-guided tissue engineering and efforts to generate engineered constructs ready for implantation. PMID:19583811
Use of a Computer-Mediated Delphi Process to Validate a Mass Casualty Conceptual Model
CULLEY, JOAN M.
2012-01-01
Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters. PMID:21076283
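The consensus and stability criteria themselves are easy to operationalize. The sketch below is an illustrative formalization, assuming a 7-point scale, an agreement quota, and a between-round shift tolerance; the actual thresholds used in the study are not given in the abstract.

```python
import numpy as np

def consensus(ratings, threshold=6, quota=0.8):
    """An item reaches consensus when at least `quota` of experts
    rate it at or above `threshold` on the 7-point scale."""
    ratings = np.asarray(ratings)
    return (ratings >= threshold).mean() >= quota

def stability(round1, round2, tol=0.5):
    """An item is stable when its mean rating shifts by less than
    `tol` between successive Delphi rounds."""
    return abs(np.mean(round2) - np.mean(round1)) < tol

r1 = [6, 7, 5, 6, 6, 7, 6, 4]   # round-1 expert ratings for one construct
r2 = [6, 7, 6, 6, 6, 7, 6, 5]   # round-2 ratings after feedback
print(consensus(r2), stability(r1, r2))
```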
Mendoza, Patricia; d'Anjou, Marc-André; Carmel, Eric N; Fournier, Eric; Mai, Wilfried; Alexander, Kate; Winter, Matthew D; Zwingenberger, Allison L; Thrall, Donald E; Theoret, Christine
2014-01-01
Understanding radiographic anatomy and the effects of varying patient and radiographic tube positioning on image quality can be a challenge for students. The purposes of this study were to develop and validate a novel technique for creating simulated radiographs using computed tomography (CT) datasets. A DICOM viewer (ORS Visual) plug-in was developed with the ability to move and deform cuboidal volumetric CT datasets, and to produce images simulating the effects of tube-patient-detector distance and angulation. Computed tomographic datasets were acquired from two dogs, one cat, and one horse. Simulated radiographs of different body parts (n = 9) were produced using different angles to mimic conventional projections, before actual digital radiographs were obtained using the same projections. These studies (n = 18) were then submitted to 10 board-certified radiologists who were asked to score visualization of anatomical landmarks, depiction of patient positioning, realism of distortion/magnification, and image quality. No significant differences between simulated and actual radiographs were found for anatomic structure visualization and patient positioning in the majority of body parts. For the assessment of radiographic realism, no significant differences were found between simulated and digital radiographs for canine pelvis, equine tarsus, and feline abdomen body parts. Overall, image quality and contrast resolution of simulated radiographs were considered satisfactory. Findings from the current study indicated that radiographs simulated using this new technique are comparable to actual digital radiographs. Further studies are needed to apply this technique in developing interactive tools for teaching radiographic anatomy and the effects of varying patient and tube positioning.
Genome wide approaches to identify protein-DNA interactions.
Ma, Tao; Ye, Zhenqing; Wang, Liguo
2018-05-29
Transcription factors are DNA-binding proteins that play key roles in many fundamental biological processes. Unraveling their interactions with DNA is essential to identify their target genes and understand the regulatory network. Genome-wide identification of their binding sites became feasible thanks to recent progress in experimental and computational approaches. ChIP-chip, ChIP-seq, and ChIP-exo are three widely used techniques to demarcate genome-wide transcription factor binding sites. This review aims to provide an overview of these three techniques, including their experimental procedures, computational approaches, and popular analytic tools. ChIP-chip, ChIP-seq, and ChIP-exo have been the major techniques for studying genome-wide in vivo protein-DNA interaction. Due to the rapid development of next-generation sequencing technology, array-based ChIP-chip is deprecated and ChIP-seq has become the most widely used technique to identify transcription factor binding sites genome-wide. The newly developed ChIP-exo further improves the spatial resolution to single nucleotide. Numerous tools have been developed to analyze ChIP-chip, ChIP-seq and ChIP-exo data. However, different programs may employ different mechanisms or underlying algorithms, and thus each will inherently include its own set of statistical assumptions and biases. Choosing the most appropriate analytic program for a given experiment therefore needs careful consideration. Moreover, most programs have only a command-line interface, so their installation and usage require basic computational expertise in Unix/Linux.
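At the computational core of all three assays is the same primitive: turning mapped reads into per-base coverage and asking where ChIP coverage is enriched over an input control. A deliberately simplified sketch follows; real peak callers such as MACS model local background, fragment size, and statistics far more carefully, so the fold threshold and pseudo-count here are illustrative.

```python
import numpy as np

def coverage(read_starts, read_len, chrom_len):
    """Per-base coverage from read start positions (single strand,
    fixed read length), built with a difference array."""
    diff = np.zeros(chrom_len + 1)
    for s in read_starts:
        diff[s] += 1
        diff[min(s + read_len, chrom_len)] -= 1
    return np.cumsum(diff)[:-1]

def candidate_peaks(chip, ctrl, fold=2.5, pseudo=1.0):
    """Bases where pseudo-count-smoothed ChIP/control enrichment
    reaches the fold threshold."""
    return np.where((chip + pseudo) / (ctrl + pseudo) >= fold)[0]

chip = coverage([100, 105, 110, 112, 500], read_len=50, chrom_len=1000)
ctrl = coverage([100, 600], read_len=50, chrom_len=1000)
print(candidate_peaks(chip, ctrl)[:5])  # leftmost enriched bases
```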
Tan, Germaine Xin Yi; Jamil, Muhammad; Tee, Nicole Gui Zhen; Zhong, Liang; Yap, Choon Hwai
2015-11-01
Recent animal studies have provided evidence that prenatal blood flow fluid mechanics may play a role in the pathogenesis of congenital cardiovascular malformations. To further this research, it is important to have an imaging technique for small animal embryos with sufficient resolution to support computational fluid dynamics studies, and that is also non-invasive and non-destructive to allow for subject-specific, longitudinal studies. In the current study, we developed such a technique, based on ultrasound biomicroscopy scans of chick embryos. Our technique included a motion cancelation algorithm to negate embryonic body motion, a temporal averaging algorithm to differentiate blood spaces from tissue spaces, and 3D reconstruction of blood volumes in the embryo. The accuracy of the reconstructed models was validated with direct stereoscopic measurements. A computational fluid dynamics simulation was performed to model fluid flow in the generated construct of a Hamburger-Hamilton (HH) stage 27 embryo. Simulation results showed that there were divergent streamlines and a low shear region at the carotid duct, which may be linked to the carotid duct's eventual regression and disappearance by HH stage 34. We show that our technique has sufficient resolution to produce accurate geometries for computational fluid dynamics simulations to quantify embryonic cardiovascular fluid mechanics.
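The temporal-averaging idea can be sketched compactly: once body motion is canceled, tissue is nearly static across frames while blood speckle decorrelates, so per-pixel temporal variance separates the two. The threshold choice and the synthetic cine below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def blood_mask(frames, thresh=None):
    """Separate blood from tissue in a motion-compensated ultrasound
    cine: moving blood decorrelates between frames, so high temporal
    variance marks blood spaces; static tissue averages out."""
    frames = np.asarray(frames, float)          # (time, rows, cols)
    var = frames.var(axis=0)
    if thresh is None:
        thresh = np.percentile(var, 90)         # keep the top decile
    return var >= thresh

rng = np.random.default_rng(3)
cine = np.full((30, 64, 64), 100.0) + rng.normal(0, 2, (30, 64, 64))  # tissue
cine[:, 20:40, 20:40] += rng.normal(0, 25, (30, 20, 20))  # decorrelating blood pool
mask = blood_mask(cine)
print(mask[30, 30], mask[5, 5])  # True False (blood vs. tissue pixel)
```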
Effects of cosmic rays on single event upsets
NASA Technical Reports Server (NTRS)
Venable, D. D.; Zajic, V.; Lowe, C. W.; Olidapupo, A.; Fogarty, T. N.
1989-01-01
Assistance was provided to the Brookhaven Single Event Upset (SEU) Test Facility. Computer codes were developed for fragmentation and secondary radiation affecting Very Large Scale Integration (VLSI) circuits in space. A computer-controlled CV (HP4192) test was developed for Terman analysis. Also developed were high-speed parametric tests, which are independent of operator judgment, and a charge-pumping technique for measurement of D(sub it)(E). X-ray secondary effects and parametric degradation as a function of dose rate were simulated. The SPICE simulation of static RAMs with various resistor filters was tested.
NASA Technical Reports Server (NTRS)
Rummler, D. R.
1976-01-01
The results of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis are presented. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.
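For context, the parametric methods that regression approaches are compared against typically collapse stress-rupture data onto a single master curve. A minimal sketch of one such method, a Larson-Miller analysis with an assumed constant C = 20 and invented data, regressed with an ordinary polynomial fit:

```python
import numpy as np

# Larson-Miller parametric analysis: LMP = T*(C + log10 t_r); regress
# log10(stress) on a polynomial in LMP to obtain a master curve.
T = np.array([811, 811, 866, 866, 922, 922], float)      # temperature, K
t_r = np.array([3200, 900, 1100, 300, 350, 90], float)   # rupture life, hours
stress = np.array([210, 240, 180, 210, 150, 180], float) # applied stress, MPa
C = 20.0                                                  # common assumed constant
lmp = T * (C + np.log10(t_r))
coef = np.polyfit(lmp, np.log10(stress), 2)               # quadratic master curve
pred = 10 ** np.polyval(coef, lmp)
print(np.round(pred, 1))  # stresses back-predicted along the master curve
```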
ERIC Educational Resources Information Center
Podell, Harold J.
An introduction to the foundations of constructing a marketing data base is presented for systems and marketing executives who are familiar with basic computer technology methods. The techniques and concepts presented are now being implemented by major organizations in the development of Management Information Systems (MIS). A marketing data…
An Intelligent Computer-Based System for Sign Language Tutoring
ERIC Educational Resources Information Center
Ritchings, Tim; Khadragi, Ahmed; Saeb, Magdy
2012-01-01
A computer-based system for sign language tutoring has been developed using a low-cost data glove and a software application that processes the movement signals for signs in real-time and uses Pattern Matching techniques to decide if a trainee has closely replicated a teacher's recorded movements. The data glove provides 17 movement signals from…
ERIC Educational Resources Information Center
Berney, Tomi D.; Keyes, Jose L.
The Computer Writing Skills for Limited English Proficient Students project (Project COMPUGRAFIA.LEP), which served bilingual special education classes totalling 375 Spanish-speaking students at 10 elementary schools in the Bronx, is evaluated here. The project proposed to assist site teachers in developing appropriate lesson plans and effective teaching techniques and…
Planning and Scheduling of Software Manufacturing Projects
1991-03-01
Based on previous results in social analysis of computing, operations research in manufacturing, artificial intelligence in manufacturing planning and scheduling, and traditional approaches to planning in artificial intelligence, this work extends the techniques that have been developed by these fields...
Computer image processing - The Viking experience. [digital enhancement techniques
NASA Technical Reports Server (NTRS)
Green, W. B.
1977-01-01
Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.
NASA Technical Reports Server (NTRS)
Berendes, Todd; Sengupta, Sailes K.; Welch, Ron M.; Wielicki, Bruce A.; Navar, Murgesh
1992-01-01
A semiautomated methodology is developed for estimating cumulus cloud base heights on the basis of high spatial resolution Landsat MSS data, using various image-processing techniques to match cloud edges with their corresponding shadow edges. The cloud base height is then estimated by computing the separation distance between the corresponding generalized Hough transform reference points. The differences between the cloud base heights computed by these means and a manual verification technique are of the order of 100 m or less; accuracies of 50-70 m may soon be possible via EOS instruments.
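Once a cloud edge and its shadow edge have been matched, the height computation itself is elementary trigonometry under a flat-terrain assumption; the numbers below are illustrative.

```python
import math

def cloud_base_height(separation_m, sun_elevation_deg):
    """Cloud base height from the horizontal cloud-to-shadow edge
    separation and the solar elevation angle (flat-terrain geometry):
    h = d * tan(elevation)."""
    return separation_m * math.tan(math.radians(sun_elevation_deg))

# e.g., edges matched 2.3 km apart with the sun 40 degrees above the horizon
print(f"{cloud_base_height(2300.0, 40.0):.0f} m")
```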
Study to design and develop remote manipulator system
NASA Technical Reports Server (NTRS)
Hill, J. W.; Sword, A. J.
1973-01-01
Human performance measurement techniques for remote manipulation tasks and remote sensing techniques for manipulators are described. For common manipulation tasks, performance is monitored by means of an on-line computer capable of measuring the joint angles of both master and slave arms as a function of time. The computer programs allow measurement of the operator's strategy and of physical quantities such as task time and power consumed. The results are printed out after a test run to compare different experimental conditions. For tracking tasks, a method of displaying errors in three dimensions and measuring the end-effector position in three dimensions is described.
A computer graphics display technique for the examination of aircraft design data
NASA Technical Reports Server (NTRS)
Talcott, N. A., Jr.
1981-01-01
An interactive computer graphics technique has been developed for quickly sorting and interpreting large amounts of aerodynamic data. It utilizes a graphic representation rather than numbers. The geometry package represents the vehicle as a set of panels. These panels are ordered in groups of ascending values (e.g., equilibrium temperatures). The groups are then displayed successively on a CRT, building up to the complete vehicle. A zoom feature allows displaying only the panels with values between certain limits. The addition of color allows a one-time display, thus eliminating the need for a display build-up.
Knowledge Sharing through Pair Programming in Learning Environments: An Empirical Study
ERIC Educational Resources Information Center
Kavitha, R. K.; Ahmed, M. S.
2015-01-01
Agile software development is an iterative and incremental methodology, where solutions evolve from self-organizing, cross-functional teams. Pair programming is a type of agile software development technique where two programmers work together with one computer for developing software. This paper reports the results of the pair programming…
NASA Astrophysics Data System (ADS)
Kubina, Stanley J.
1989-09-01
The review of the status of computational electromagnetics by Miller and the exposition by Burke of the developments in one of the more important computer codes in the application of the electric field integral equation method, the Numerical Electromagnetic Code (NEC), coupled with Molinet's summary of progress in techniques based on the Geometrical Theory of Diffraction (GTD), provide a clear perspective on the maturity of the modern discipline of computational electromagnetics and its potential. Audone's exposition of the application to the computation of Radar Scattering Cross-section (RCS) is an indication of the breadth of practical applications, and his exploitation of modern near-field measurement techniques reminds one of progress in the measurement discipline, which is essential to the validation or calibration of computational modeling methodology when applied to complex structures such as aircraft and ships. The latter monograph also presents some comparison results with computational models. Some of the results presented for scale-model and flight measurements show serious disagreements in the lobe structure which would require detailed examination. This also applies to the radiation patterns obtained by flight measurement compared with those obtained using wire-grid models and integral equation modeling methods. In the examples which follow, an attempt is made to match measurement results completely over the entire 2 to 30 MHz HF range for antennas on a large patrol aircraft. The problems of validating computer models of HF antennas on a helicopter and of using computer models to generate radiation pattern information that cannot be obtained by measurement are discussed. The use of NEC computer models to analyze top-side ship configurations, where measurement results are not available and only self-validation measures or, at best, comparisons with an alternate GTD computer modeling technique are possible, is also discussed.
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
An object-oriented approach to data display and storage: 3 years experience, 25,000 cases.
Sainsbury, D A
1993-11-01
Object-oriented programming techniques were used to develop computer-based data display and storage systems. These have been operating in the 8 anaesthetising areas of the Adelaide Children's Hospital for 3 years. The analogue and serial outputs from an array of patient monitors are connected to IBM-compatible PC-XT computers. The information is displayed on a colour screen as wave-form and trend graphs and in digital format in 'real time'. The trend data is printed simultaneously on a dot matrix printer. This data is also stored for 24 hours on 'hard' disk. The major benefit has been the provision of a single visual focus for all monitored variables. The automatic logging of data has been invaluable in the analysis of critical incidents. The systems were made possible by recent, rapid improvements in computer hardware and software. This paper traces the development of the program and demonstrates the advantages of object-oriented programming techniques.
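A minimal object-oriented sketch of the kind of structure such a system suggests: one object per monitored channel, holding a bounded in-memory trend buffer and appending each sample to disk (a stand-in for the 24-hour hard-disk store). This illustrates the design style only; it is not the hospital system's actual code.

```python
import collections, time

class TrendChannel:
    """One monitored variable: a bounded in-memory trend plus an
    append-only log file for later critical-incident review."""
    def __init__(self, name, unit, maxlen=86400, logfile=None):
        self.name, self.unit = name, unit
        self.trend = collections.deque(maxlen=maxlen)   # ring buffer of samples
        self.logfile = logfile or f"{name}.log"         # writes to the working dir
    def sample(self, value, t=None):
        t = t if t is not None else time.time()
        self.trend.append((t, value))
        with open(self.logfile, "a") as f:
            f.write(f"{t:.1f}\t{value}\n")
    def latest(self):
        return self.trend[-1] if self.trend else None

hr = TrendChannel("heart_rate", "bpm")
hr.sample(112)
print(hr.latest())
```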
Reconfigurable Computing As an Enabling Technology for Single-Photon-Counting Laser Altimetry
NASA Technical Reports Server (NTRS)
Powell, Wesley; Hicks, Edward; Pinchinat, Maxime; Dabney, Philip; McGarry, Jan; Murray, Paul
2003-01-01
Single-photon-counting laser altimetry is a new measurement technique offering significant advantages in vertical resolution, reducing instrument size, mass, and power, and reducing laser complexity as compared to analog or threshold-detection laser altimetry techniques. However, these improvements come at the cost of a dramatically increased requirement for onboard real-time data processing. Reconfigurable computing has been shown to offer considerable performance advantages in performing this processing. These advantages have been demonstrated on the Multi-KiloHertz Micro-Laser Altimeter (MMLA), an aircraft-based single-photon-counting laser altimeter developed by NASA Goddard Space Flight Center with several potential spaceflight applications. This paper describes how reconfigurable computing technology was employed to perform MMLA data processing in real-time under realistic operating constraints, along with the results observed. This paper also expands on these prior results to identify concepts for using reconfigurable computing to enable spaceflight single-photon-counting laser altimeter instruments.
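The real-time processing burden comes from sifting sparse signal photons out of a high background. A common software illustration of the underlying algorithm (sketched here in Python, whereas the MMLA work targeted reconfigurable hardware) is a range histogram with a Poisson noise gate; all rates, ranges, and bin widths below are invented.

```python
import numpy as np

# Toy range-histogram filter for single-photon-counting altimetry:
# photon time-of-flight events are binned, and bins exceeding a
# Poisson background threshold are kept as candidate surface returns.
rng = np.random.default_rng(5)
c = 3.0e8                                           # m/s
noise_tof = rng.uniform(0, 200e-6, 5000)            # solar background photons
signal_tof = rng.normal(66.7e-6, 5e-9, 300)         # ground return near 10 km
tof = np.concatenate([noise_tof, signal_tof])

bins = np.arange(0, 200e-6, 15e-9)                  # ~2.2 m range bins
hist, edges = np.histogram(tof, bins=bins)
lam = hist.mean()                                   # background rate per bin
thresh = lam + 5 * np.sqrt(lam)                     # ~5-sigma Poisson gate
hit = np.argmax(hist)
if hist[hit] > thresh:
    range_m = 0.5 * c * 0.5 * (edges[hit] + edges[hit + 1])
    print(f"surface return at ~{range_m:.1f} m range, {hist[hit]} photons")
```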
Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems
NASA Technical Reports Server (NTRS)
Terrile, Richard J.; Guillaume, Alexandre
2009-01-01
Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.
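A toy illustration of the clustering step described above, assuming a made-up parameter-to-behaviour mapping in place of a real mission simulation: solutions surviving an evolutionary run are clustered by their behavioural metrics, and unusually small, isolated clusters are flagged as candidate emergent behaviours.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def behaviour(params):
    """Toy stand-in for a simulated trajectory summary: maps controller
    parameters to a few behavioural metrics."""
    x, y = params
    return np.array([np.sin(3 * x) + y, x * y, np.cos(4 * y)])

# Pretend these are the retained solutions of an evolutionary run.
population = rng.uniform(-1, 1, size=(500, 2))
metrics = np.array([behaviour(p) for p in population])

# Cluster the behaviours; small clusters are candidate emergent
# behaviours worth inspecting by hand.
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(metrics)
for k in range(8):
    size = int(np.sum(labels == k))
    if size < 0.05 * len(population):      # unusually small population
        print(f"cluster {k}: {size} solutions -- possible emergent behaviour")
```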
Three-dimensional computer visualization of forensic pathology data.
March, Jack; Schofield, Damian; Evison, Martin; Woodford, Noel
2004-03-01
Despite a decade of use in US courtrooms, it is only recently that forensic computer animations have become an increasingly important form of communication in legal spheres within the United Kingdom. Aims Research at the University of Nottingham has been influential in the critical investigation of forensic computer graphics reconstruction methodologies and techniques and in raising the profile of this novel form of data visualization within the United Kingdom. The case study presented demonstrates research undertaken by Aims Research and the Department of Forensic Pathology at the University of Sheffield, which aims to apply, evaluate, and develop novel 3-dimensional computer graphics (CG) visualization and virtual reality (VR) techniques in the presentation and investigation of forensic information concerning the human body. The inclusion of such visualizations within other CG or VR environments may ultimately provide the potential for alternative exploratory directions, processes, and results within forensic pathology investigations.
Dugdale, Stephanie; Ward, Jonathan; Hernen, Jan; Elison, Sarah; Davies, Glyn; Donkor, Daniel
2016-07-22
In recent years, research within the field of health psychology has made significant progress in terms of advancing and standardizing the science of developing, evaluating and reporting complex behavioral change interventions. A major part of this work has involved the development of an evidence-based Behavior Change Technique Taxonomy v1 (BCTTv1), as a means of describing the active components contained within such complex interventions. To date, however, this standardized approach derived from health psychology research has not been applied to the development of complex interventions for the treatment of substance use disorders (SUD). Therefore, this paper uses Breaking Free Online (BFO), a computer-assisted therapy program for SUD, as an example of how the clinical techniques contained within such an intervention might be mapped onto the BCTTv1. The developers of BFO were able to produce a full list of the clinical techniques contained within BFO. Exploratory mapping of the BCTTv1 onto the clinical content of the BFO program was conducted separately by the authors of the paper. This included the developers of the BFO program and psychology professionals working within the SUD field. These coded techniques were reviewed by the authors and any discrepancies in the coding were discussed between all authors until an agreement was reached. The BCTTv1 was mapped onto the clinical content of the BFO program. At least one behavioral change technique was found in 12 out of 16 grouping categories within the BCTTv1. A total of 26 out of 93 behavior change techniques were identified across the clinical content of the program. This exploratory mapping exercise has identified the specific behavior change techniques contained within BFO, and has provided a means of describing these techniques in a standardized way using the BCTTv1 terminology. It has also provided an opportunity for the BCTTv1 mapping process to be reported to the wider SUD treatment community, as it may have real utility in the development and evaluation of other psychosocial and behavioral change interventions within this field.
Advanced DPSM approach for modeling ultrasonic wave scattering in an arbitrary geometry
NASA Astrophysics Data System (ADS)
Yadav, Susheel K.; Banerjee, Sourav; Kundu, Tribikram
2011-04-01
Several techniques are used to diagnose structural damage. In the ultrasonic technique, structures are tested by analyzing ultrasonic signals scattered by damage. The interpretation of these signals requires a good understanding of the interaction between ultrasonic waves and structures; researchers therefore need analytical or numerical techniques to model this interaction clearly. However, modeling the wave scattering phenomenon with conventional numerical techniques such as the finite element method requires a very fine mesh at high frequencies, necessitating heavy computational power. The distributed point source method (DPSM) is a newly developed, robust, mesh-free technique for simulating ultrasonic, electrostatic, and electromagnetic fields. In most previous studies the DPSM technique has been applied to two-dimensional surface geometries and simple three-dimensional scatterer geometries, and it was difficult to perform the analysis for complex three-dimensional geometries. The technique has now been extended to model wave scattering in an arbitrary geometry. In this paper a channel section, idealized as a thin solid plate with several rivet holes, is formulated. The simulation has been carried out with and without cracks near the rivet holes. Further, a comparison study has also been carried out to characterize the crack. A computer code has been developed in C for modeling the ultrasonic field in a solid plate with and without cracks near the rivet holes.
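A minimal sketch of the DPSM idea under simplifying assumptions (a pressure rather than velocity boundary condition, and a 1-D line of collocation points standing in for a transducer face): point sources retracted slightly behind the surface are assigned strengths that reproduce the boundary condition, and the field anywhere in the medium is their superposition.

```python
import numpy as np

k = 2 * np.pi * 1e6 / 1500.0          # wavenumber: 1 MHz in a water-like medium

def greens(src, obs):
    """Free-space Green's function matrix exp(ikr)/(4*pi*r)."""
    r = np.linalg.norm(obs[:, None, :] - src[None, :, :], axis=2)
    return np.exp(1j * k * r) / (4 * np.pi * r)

# Collocation points on the transducer face; point sources retracted
# behind it by roughly one point spacing (the DPSM convention).
xs = np.linspace(-5e-3, 5e-3, 40)
face = np.stack([xs, np.zeros_like(xs), np.zeros_like(xs)], axis=1)
sources = face - np.array([0.0, 0.0, 0.5e-3])

# Solve for source strengths that enforce a uniform pressure on the face.
A = greens(sources, face)
p_target = np.ones(len(face), dtype=complex)
strengths, *_ = np.linalg.lstsq(A, p_target, rcond=None)

# The field at any observation point is a superposition of the sources.
grid = np.stack([np.zeros(100), np.zeros(100),
                 np.linspace(1e-3, 50e-3, 100)], axis=1)
p_field = greens(sources, grid) @ strengths
print(np.abs(p_field[:5]))
```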
Phase-locked-loop interferometry applied to aspheric testing with a computer-stored compensator.
Servin, M; Malacara, D; Rodriguez-Vera, R
1994-05-01
A recently developed technique for continuous-phase determination of interferograms with a digital phase-locked loop (PLL) is applied to the null testing of aspheres. Although this PLL demodulating scheme is also a synchronous or direct interferometric technique, a separate unwrapping process is not explicitly required; the unwrapping and phase-detection processes are achieved simultaneously within the PLL. The proposed method uses a computer-generated holographic compensator. The holographic compensator does not need to be printed out by any means; it is calculated and used from the computer. This computer-stored compensator is used as the reference signal to phase demodulate a sample interferogram obtained from the asphere being tested. Consequently the demodulated phase contains information about the wave-front departures from the ideal computer-stored aspheric interferogram. Wave-front differences of ~1 λ are handled easily by the proposed PLL scheme. The maximum recorded frequencies in the template's interferogram and in the sampled interferogram are assumed to be below the Nyquist frequency.
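A sketch of the digital-PLL demodulation idea on a synthetic 1-D fringe signal; the carrier, loop gain, and aspheric departure are invented for illustration. The numerically controlled oscillator plays the role of the computer-stored reference, and the tracked phase comes out continuous, so no separate unwrapping pass is needed.

```python
import numpy as np

# Synthetic interferogram row: carrier fringes plus an aspheric phase term.
N = 2048
x = np.linspace(-1, 1, N)
true_phase = 2 * np.pi * 60 * (x + 1) + 8 * x**4      # carrier + departure
signal = np.cos(true_phase)

# First-order digital PLL: the low-passed product of the signal and the
# quadrature NCO output drives the NCO phase toward the fringe phase.
w0 = 2 * np.pi * 60 * (x[1] - x[0])   # nominal carrier increment per sample
phase_est = np.zeros(N)
phi, gain = 0.0, 0.4
for n in range(1, N):
    error = signal[n] * -np.sin(phi)  # phase-detector (mixer) output
    phi += w0 + gain * error          # NCO advances and corrects
    phase_est[n] = phi

# The estimate follows true_phase continuously, to within a fraction of
# a radian once locked -- no wrapping discontinuities to repair.
print(np.max(np.abs((phase_est - true_phase)[N // 4:])))
```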
Development of hybrid computer plasma models for different pressure regimes
NASA Astrophysics Data System (ADS)
Hromadka, Jakub; Ibehej, Tomas; Hrach, Rudolf
2016-09-01
With the increased performance of contemporary computers over the last decades, numerical simulation has become a very powerful tool, applicable also in plasma physics research. Plasma is generally an ensemble of mutually interacting particles that is out of thermodynamic equilibrium, and for this reason fluid computer plasma models give results of only limited accuracy. On the other hand, much more precise particle models are often limited to 2D problems because of their huge demands on computer resources. Our contribution is devoted to hybrid modelling techniques that combine the advantages of both approaches, particularly to their so-called iterative version. The study focuses on the mutual relations between fluid and particle models, which are demonstrated by calculations of the sheath structure of a low-temperature argon plasma near a cylindrical Langmuir probe at medium and higher pressures. Results of a simple iterative hybrid plasma computer model are also given. The authors acknowledge the support of the Grant Agency of Charles University in Prague (project 220215).
Privacy-preserving search for chemical compound databases.
Shimizu, Kana; Nuida, Koji; Arai, Hiromi; Mitsunari, Shigeo; Attrapadung, Nuttapong; Hamada, Michiaki; Tsuda, Koji; Hirokawa, Takatsugu; Sakuma, Jun; Hanaoka, Goichiro; Asai, Kiyoshi
2015-01-01
Searching for similar compounds in a database is the most important process for in-silico drug screening. Since a query compound is an important starting point for the new drug, a query holder, who is afraid of the query being monitored by the database server, usually downloads all the records in the database and uses them in a closed network. However, a serious dilemma arises when the database holder also wants to output no information except for the search results, and such a dilemma prevents the use of many important data resources. In order to overcome this dilemma, we developed a novel cryptographic protocol that enables database searching while keeping both the query holder's privacy and database holder's privacy. Generally, the application of cryptographic techniques to practical problems is difficult because versatile techniques are computationally expensive while computationally inexpensive techniques can perform only trivial computation tasks. In this study, our protocol is successfully built only from an additive-homomorphic cryptosystem, which allows only addition performed on encrypted values but is computationally efficient compared with versatile techniques such as general purpose multi-party computation. In an experiment searching ChEMBL, which consists of more than 1,200,000 compounds, the proposed method was 36,900 times faster in CPU time and 12,000 times as efficient in communication size compared with general purpose multi-party computation. We proposed a novel privacy-preserving protocol for searching chemical compound databases. The proposed method, easily scaling for large-scale databases, may help to accelerate drug discovery research by making full use of unused but valuable data that includes sensitive information.
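The additive-homomorphic building block can be sketched with the open-source python-paillier (phe) package; this illustrates the primitive only, not the authors' full protocol. The database side combines the encrypted query bits using nothing but ciphertext additions and multiplications by its own plaintext bits, so the query is never revealed to it.

```python
# pip install phe  (python-paillier)
from phe import paillier

# Query holder encrypts the bits of its compound fingerprint.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)
query_bits = [1, 0, 1, 1, 0, 1, 0, 0]
enc_query = [public_key.encrypt(b) for b in query_bits]

# Database holder computes the encrypted count of shared on-bits (the
# numerator of a Tanimoto-style similarity) using only homomorphic
# additions and plaintext scalar multiplications.
db_bits = [1, 0, 0, 1, 0, 1, 1, 0]
enc_common = sum(c * b for c, b in zip(enc_query, db_bits) if b)

# Only the query holder can decrypt the result.
print(private_key.decrypt(enc_common))   # -> 3 shared on-bits
```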
TBGG- INTERACTIVE ALGEBRAIC GRID GENERATION
NASA Technical Reports Server (NTRS)
Smith, R. E.
1994-01-01
TBGG, Two-Boundary Grid Generation, applies an interactive algebraic grid generation technique in two dimensions. The program incorporates mathematical equations that relate the computational domain to the physical domain. TBGG has application to a variety of problems using finite difference techniques, such as computational fluid dynamics. Examples include the creation of a C-type grid about an airfoil and a nozzle configuration in which no left or right boundaries are specified. The underlying two-boundary technique of grid generation is based on Hermite cubic interpolation between two fixed, nonintersecting boundaries. The boundaries are defined by two ordered sets of points, referred to as the top and bottom. Left and right side boundaries may also be specified, and call upon linear blending functions to conform interior interpolation to the side boundaries. Spacing between physical grid coordinates is determined as a function of boundary data and uniformly spaced computational coordinates. Control functions relating computational coordinates to parametric intermediate variables that affect the distance between grid points are embedded in the interpolation formulas. A versatile control function technique with smooth cubic spline functions is also presented. The TBGG program is written in FORTRAN 77. It works best in an interactive graphics environment where computational displays and user responses are quickly exchanged. The program has been implemented on a CDC Cyber 170 series computer using NOS 2.4 operating system, with a central memory requirement of 151,700 (octal) 60 bit words. TBGG requires a Tektronix 4015 terminal and the DI-3000 Graphics Library of Precision Visuals, Inc. TBGG was developed in 1986.
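A sketch of two-boundary Hermite interpolation in the spirit of TBGG, assuming (for illustration) that grid lines leave each boundary along its normal with a gap-scaled magnitude; TBGG's control functions and side-boundary blending are omitted.

```python
import numpy as np

def normals(curve):
    """Unit normals of a 2-D curve given as (n, 2) ordered points."""
    tang = np.gradient(curve, axis=0)
    nrm = np.stack([-tang[:, 1], tang[:, 0]], axis=1)
    return nrm / np.linalg.norm(nrm, axis=1, keepdims=True)

def two_boundary_grid(bottom, top, n_t, stiffness=1.0):
    """Hermite cubic interpolation between two fixed, nonintersecting
    boundaries; grid lines leave each boundary along its normal."""
    gap = np.linalg.norm(top - bottom, axis=1, keepdims=True)
    db = stiffness * gap * normals(bottom)    # derivative at bottom boundary
    dt = stiffness * gap * normals(top)       # derivative at top boundary
    t = np.linspace(0.0, 1.0, n_t)[:, None, None]
    h00 = 2*t**3 - 3*t**2 + 1                 # Hermite cubic basis functions
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return h00*bottom + h10*db + h01*top + h11*dt

# Example: a flat bottom boundary under an arc-shaped top boundary.
s = np.linspace(0, np.pi, 41)
bottom = np.stack([np.linspace(-2, 2, 41), np.zeros(41)], axis=1)
top = np.stack([2*np.cos(s[::-1]), 1 + np.sin(s[::-1])], axis=1)
grid = two_boundary_grid(bottom, top, n_t=15)   # shape (15, 41, 2)
print(grid.shape)
```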
Development of a computational model for astronaut reorientation.
Stirling, Leia; Willcox, Karen; Newman, Dava
2010-08-26
The ability to model astronaut reorientations computationally provides a simple way to develop and study human motion control strategies. Since the cost of experimenting in microgravity is high, and underwater training can lead to motions inappropriate for microgravity, these techniques allow for motions to be developed and well-understood prior to any microgravity exposure. By including a model of the current space suit, we have the ability to study both intravehicular and extravehicular activities. We present several techniques for rotating about the axes of the body and show that motions performed by the legs create a greater net rotation than those performed by the arms. Adding a space suit to the motions was seen to increase the resistance torque and limit the available range of motion. While rotations about the body axes can be performed in the current space suit, the resulting motions generated a reduced rotation when compared to the unsuited configuration.
Execution environment for intelligent real-time control systems
NASA Technical Reports Server (NTRS)
Sztipanovits, Janos
1987-01-01
Modern telerobot control technology requires the integration of symbolic and non-symbolic programming techniques, different models of parallel computations, and various programming paradigms. The Multigraph Architecture, which has been developed for the implementation of intelligent real-time control systems is described. The layered architecture includes specific computational models, integrated execution environment and various high-level tools. A special feature of the architecture is the tight coupling between the symbolic and non-symbolic computations. It supports not only a data interface, but also the integration of the control structures in a parallel computing environment.
Dynamic VM Provisioning for TORQUE in a Cloud Environment
NASA Astrophysics Data System (ADS)
Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.
2014-06-01
Cloud computing, in the form of Infrastructure-as-a-Service (IaaS), is attracting more interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high throughput computing clusters.
Modern Computational Techniques for the HMMER Sequence Analysis
2013-01-01
This paper focuses on the latest research and critical reviews of modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
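For concreteness, the computational kernel that most of these accelerators target is the Viterbi recursion; below is a generic log-space toy version, not HMMER's profile-HMM implementation.

```python
import numpy as np

def viterbi(obs, log_pi, log_A, log_B):
    """Most likely hidden-state path for one observation sequence.

    log_pi : (S,)   log initial state probabilities
    log_A  : (S, S) log transition matrix
    log_B  : (S, V) log emission matrix
    """
    S, T = log_pi.shape[0], len(obs)
    delta = np.empty((T, S))              # best log-score ending in each state
    psi = np.empty((T, S), dtype=int)     # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A          # (prev, cur)
        psi[t] = np.argmax(scores, axis=0)
        delta[t] = scores[psi[t], np.arange(S)] + log_B[:, obs[t]]
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):         # trace back the best path
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# Toy 2-state HMM over a 3-symbol alphabet.
log = np.log
pi = log(np.array([0.6, 0.4]))
A = log(np.array([[0.7, 0.3], [0.4, 0.6]]))
B = log(np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]))
print(viterbi([0, 1, 2, 2, 0], pi, A, B))
```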
Computation offloading for real-time health-monitoring devices.
Kalantarian, Haik; Sideris, Costas; Tuan Le; Hosseini, Anahita; Sarrafzadeh, Majid
2016-08-01
Among the major challenges in the development of real-time wearable health monitoring systems is optimizing battery life. One of the major techniques for achieving this objective is computation offloading, in which portions of the computation are partitioned between the device and other resources such as a server or cloud. In this paper, we describe a novel dynamic computation offloading scheme for real-time wearable health monitoring devices that adjusts the partitioning of data between the wearable device and the mobile application as a function of desired classification accuracy.
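A minimal sketch of the offloading decision under an invented energy model: compare the energy to compute locally against the energy to transmit the data. All parameter values are hypothetical, and the paper's scheme additionally folds desired classification accuracy into the partition.

```python
def should_offload(cycles, data_bytes, f_local_hz, p_compute_w,
                   uplink_bps, p_radio_w):
    """Offload when shipping the data costs less energy than computing
    the result on-device (illustrative model only)."""
    e_local = p_compute_w * (cycles / f_local_hz)          # J on-device
    e_offload = p_radio_w * (8 * data_bytes / uplink_bps)  # J to transmit
    return e_offload < e_local

# Example: a windowed feature block from a wearable monitor (made-up numbers).
print(should_offload(cycles=40e6, data_bytes=2048,
                     f_local_hz=80e6, p_compute_w=0.10,
                     uplink_bps=1e6, p_radio_w=0.30))   # -> True: offload
```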
Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping
2015-09-15
Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
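A toy stand-in for the statistical IR idea, assuming a random matrix in place of the ray-driven cone-beam projector and plain gradient descent in place of the authors' optimization-transfer algorithm: minimize a data-fidelity term plus a λ-weighted voxel-pair smoothness prior on a highly undersampled system.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy, highly incomplete linear system standing in for the cone-beam
# projector: 60 measurements of a 16x16 image (256 unknowns).
n = 16
A = rng.normal(size=(60, n * n))
x_true = np.zeros((n, n))
x_true[5:11, 5:11] = 1.0
y = A @ x_true.ravel() + 0.01 * rng.normal(size=60)

lam = 0.5                                 # precomputed prior weight ("lambda")

def neighbor_grad(x):
    """Gradient of a quadratic voxel-pair (horizontal/vertical) prior."""
    g = np.zeros_like(x)
    dx = x[:, 1:] - x[:, :-1]
    dy = x[1:, :] - x[:-1, :]
    g[:, 1:] += dx
    g[:, :-1] -= dx
    g[1:, :] += dy
    g[:-1, :] -= dy
    return g

# Gradient descent on: 0.5*||Ax - y||^2 + lam * prior(x)
x = np.zeros((n, n))
step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 8 * lam)
for _ in range(500):
    grad = (A.T @ (A @ x.ravel() - y)).reshape(n, n) + lam * neighbor_grad(x)
    x -= step * grad

print(np.round(x[4:9, 4:9], 2))   # smoothed estimate of the bright block
```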
1976-07-01
Research and development is directed toward new and improved materials, equipment, techniques, systems, and related operational procedures for the Navy. Within areas of technological expertise, prototype systems applicable to specific projects are developed.
Macroeconomic Activity Module - NEMS Documentation
2016-01-01
Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Macroeconomic Activity Module (MAM) used to develop the Annual Energy Outlook for 2016 (AEO2016). The report catalogues and describes the module assumptions, computations, methodology, parameter estimation techniques, and mainframe source code.
NASA Technical Reports Server (NTRS)
Roskam, J.
1983-01-01
The measurement of the transmission loss characteristics of panels using the acoustic intensity technique is presented. The theoretical formulation, installation of hardware, modifications to the test facility, and development of computer programs and test procedures are described. A listing of all the programs is also provided. The initial test results indicate that the acoustic intensity technique is easily adapted to measure transmission loss characteristics of panels. Use of this method will give average transmission loss values. The fixtures developed to position the microphones along the grid points are very useful in plotting the intensity maps of vibrating panels.
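The core of the two-microphone acoustic intensity method can be sketched as below, assuming the standard cross-spectral estimate I(ω) ∝ Im{G12}/(ρωΔr) (sign conventions vary with the FFT definition); transmission loss then follows from the ratio of incident to transmitted intensity.

```python
import numpy as np
from scipy.signal import csd

rho, dr, fs = 1.21, 0.012, 25600   # air density (kg/m^3), mic spacing (m), Hz
rng = np.random.default_rng(5)

def intensity_spectrum(p1, p2):
    """Two-microphone intensity estimate: I(f) = Im{G12}/(rho*omega*dr)."""
    f, G12 = csd(p1, p2, fs=fs, nperseg=4096)
    omega = 2 * np.pi * np.maximum(f, f[1])    # avoid divide-by-zero at DC
    return f, np.imag(G12) / (rho * omega * dr)

# Synthetic microphone pair: a 500 Hz wave travelling from mic 1 to mic 2,
# so mic 2 lags by the propagation time dr/c.
t = np.arange(fs) / fs
c = 343.0
p1 = np.cos(2 * np.pi * 500 * t) + 0.01 * rng.normal(size=fs)
p2 = np.cos(2 * np.pi * 500 * (t - dr / c)) + 0.01 * rng.normal(size=fs)

f, I = intensity_spectrum(p1, p2)
print(f[np.argmax(np.abs(I))])                 # peaks near 500 Hz
```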
Improved Slip Casting Of Ceramic Models
NASA Technical Reports Server (NTRS)
Buck, Gregory M.; Vasquez, Peter; Hicks, Lana P.
1994-01-01
Improved technique of investment slip casting developed for making precise ceramic wind-tunnel models. Needed in wind-tunnel experiments to verify predictions of aerothermodynamical computer codes. Ceramic materials used because of their low heat conductivities and ability to survive high temperatures. Present improved slip-casting technique enables casting of highly detailed models from aqueous or nonaqueous solutions. Wet shell molds peeled off models to ensure precise and undamaged details. Used at NASA Langley Research Center to form superconducting ceramic components from nonaqueous slip solutions. Technique has many more applications when ceramic materials developed further for such high-strength/ temperature components as engine parts.
NASA Technical Reports Server (NTRS)
Kim, B. F.; Moorjani, K.; Phillips, T. E.; Adrian, F. J.; Bohandy, J.; Dolecek, Q. E.
1993-01-01
A method for characterization of granular superconducting thin films has been developed which encompasses both the morphological state of the sample and its fabrication process parameters. The broad scope of this technique is due to the synergism between experimental measurements and their interpretation using numerical simulation. Two novel technologies form the substance of this system: the magnetically modulated resistance method for characterizing superconductors; and a powerful new computer peripheral, the Parallel Information Processor card, which provides enhanced computing capability for PC computers. This enhancement allows PC computers to operate at speeds approaching those of supercomputers, making atomic-scale simulations possible on low-cost machines. The present development of this system involves the integration of these two technologies using mesoscale simulations of thin film growth. A future stage of development will incorporate atomic scale modeling.
Light reflection models for computer graphics.
Greenberg, D P
1989-04-14
During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
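As an example of the early direct-lighting models the article starts from, here is a Lambert-plus-Blinn-Phong shading function; global illumination methods (ray tracing, radiosity, Monte Carlo) add the interreflections and shadows this ignores.

```python
import numpy as np

def blinn_phong(normal, light_dir, view_dir, kd, ks, shininess):
    """Direct-lighting reflection model: Lambert diffuse term plus a
    Blinn-Phong specular lobe (no interreflections or shadows)."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    h = (l + v) / np.linalg.norm(l + v)        # half-vector
    diffuse = kd * max(np.dot(n, l), 0.0)
    specular = ks * max(np.dot(n, h), 0.0) ** shininess
    return diffuse + specular

# Reflected intensity for one surface point (illustrative values).
print(blinn_phong(np.array([0, 0, 1.0]), np.array([1, 1, 1.0]),
                  np.array([0, 0, 1.0]), kd=0.7, ks=0.3, shininess=32))
```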
Restricted access processor - An application of computer security technology
NASA Technical Reports Server (NTRS)
Mcmahon, E. M.
1985-01-01
This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.
Investigation of safety analysis methods using computer vision techniques
NASA Astrophysics Data System (ADS)
Shirazi, Mohammad Shokrolah; Morris, Brendan Tran
2017-09-01
This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and postencroachment time (PET), two important safety measurements. The corresponding algorithms are presented, and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1 h of monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values for two different intersections are estimated for 1 day from 8:00 a.m. to 6:00 p.m.
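Minimal sketches of the two safety measures, assuming constant-velocity motion for TTC and tracked conflict-zone timestamps for PET; the collision radius and coordinates are illustrative, not the paper's parameters.

```python
import numpy as np

def time_to_collision(p1, v1, p2, v2, radius=2.0):
    """Constant-velocity TTC: first time the two road users come within
    `radius` metres of each other; inf if they never do."""
    dp = np.asarray(p1, float) - np.asarray(p2, float)
    dv = np.asarray(v1, float) - np.asarray(v2, float)
    a, b, c = dv @ dv, 2 * dp @ dv, dp @ dp - radius**2
    if a == 0:
        return np.inf
    disc = b * b - 4 * a * c
    if disc < 0:
        return np.inf                      # closest approach stays outside
    t = (-b - np.sqrt(disc)) / (2 * a)
    return t if t >= 0 else np.inf

def post_encroachment_time(t_first_leaves, t_second_arrives):
    """PET: gap between the first user leaving the conflict area and the
    second user entering it, from tracked trajectory timestamps."""
    return t_second_arrives - t_first_leaves

print(time_to_collision([0, 0], [10, 0], [50, 0], [-5, 0]))  # -> 3.2 s
print(post_encroachment_time(14.2, 15.1))                    # -> 0.9 s
```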
Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation
NASA Technical Reports Server (NTRS)
Ross, James C.
2016-01-01
Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.
Constructing a patient-specific computer model of the upper airway in sleep apnea patients.
Dhaliwal, Sandeep S; Hesabgar, Seyyed M; Haddad, Seyyed M H; Ladak, Hanif; Samani, Abbas; Rotenberg, Brian W
2018-01-01
The use of computer simulation to develop a high-fidelity model has been proposed as a novel and cost-effective alternative to help guide therapeutic intervention in sleep apnea surgery. We describe a computer model based on patient-specific anatomy of obstructive sleep apnea (OSA) subjects wherein the percentage and sites of upper airway collapse are compared to findings on drug-induced sleep endoscopy (DISE). Basic science computer model generation. Three-dimensional finite element techniques were undertaken for model development in a pilot study of four OSA patients. Magnetic resonance imaging was used to capture patient anatomy and software employed to outline critical anatomical structures. A finite-element mesh was applied to the volume enclosed by each structure. Linear and hyperelastic soft-tissue properties for various subsites (tonsils, uvula, soft palate, and tongue base) were derived using an inverse finite-element technique from surgical specimens. Each model underwent computer simulation to determine the degree of displacement on various structures within the upper airway, and these findings were compared to DISE exams performed on the four study patients. Computer simulation predictions for percentage of airway collapse and site of maximal collapse show agreement with observed results seen on endoscopic visualization. Modeling the upper airway in OSA patients is feasible and holds promise in aiding patient-specific surgical treatment.
Virtual reality neurosurgery: a simulator blueprint.
Spicer, Mark A; van Velsen, Martin; Caffrey, John P; Apuzzo, Michael L J
2004-04-01
This article details preliminary studies undertaken to integrate the most relevant advancements across multiple disciplines in an effort to construct a highly realistic neurosurgical simulator based on a distributed computer architecture. Techniques based on modified computational modeling paradigms incorporating finite element analysis are presented, as are current and projected efforts directed toward the implementation of a novel bidirectional haptic device. Patient-specific data derived from noninvasive magnetic resonance imaging sequences are used to construct a computational model of the surgical region of interest. Magnetic resonance images of the brain may be coregistered with those obtained from magnetic resonance angiography, magnetic resonance venography, and diffusion tensor imaging to formulate models of varying anatomic complexity. The majority of the computational burden is encountered in the presimulation reduction of the computational model, which allows the threshold rates required for accurate and realistic real-time visual animation to be achieved. Intracranial neurosurgical procedures offer an ideal testing site for the development of a totally immersive virtual reality surgical simulator when compared with the simulations required in other surgical subspecialties. The material properties of the brain, as well as the typically small volumes of tissue exposed in the surgical field, coupled with techniques and strategies to minimize computational demands, provide unique opportunities for the development of such a simulator. Incorporation of real-time haptic and visual feedback is approached here and likely will be accomplished soon.
Aeroelastic Model Structure Computation for Envelope Expansion
NASA Technical Reports Server (NTRS)
Kukreja, Sunil L.
2007-01-01
Structure detection is a procedure for selecting a subset of candidate terms, from a full model description, that best describes the observed output. This is a necessary procedure to compute an efficient system description which may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modelling may be of critical importance in the development of robust, parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion which may save significant development time and costs. In this study, a least absolute shrinkage and selection operator (LASSO) technique is investigated for computing efficient model descriptions of nonlinear aeroelastic systems. The LASSO minimises the residual sum of squares by the addition of an l(sub 1) penalty term on the parameter vector of the traditional l(sub 2) minimisation problem. Its use for structure detection is a natural extension of this constrained minimisation approach to pseudolinear regression problems which produces some model parameters that are exactly zero and, therefore, yields a parsimonious system description. Applicability of this technique for model structure computation for the F/A-18 Active Aeroelastic Wing using flight test data is shown for several flight conditions (Mach numbers) by identifying a parsimonious system description with a high percent fit for cross-validated data.
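A sketch of LASSO-based structure detection on an invented pseudolinear system with eight candidate terms, three of which are active; the l(sub 1) penalty drives the rest exactly to zero. This uses scikit-learn's Lasso, not the flight-test data or model basis of the study.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)

# Simulate a system whose true structure uses 3 of the 8 candidate terms.
N = 400
u = rng.normal(size=N)
y = np.zeros(N)
for k in range(2, N):
    y[k] = 0.6*y[k-1] - 0.2*y[k-2] + 0.8*u[k-1] + 0.005*rng.normal()

# Candidate regressor dictionary: lags plus simple polynomial cross-terms.
X = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2],
                     y[1:-1]**2, u[1:-1]**2, y[1:-1]*u[1:-1], u[:-2]**2])
target = y[2:]

# The l1 penalty zeros out irrelevant terms, leaving a parsimonious model.
model = Lasso(alpha=1e-3, max_iter=50000).fit(X, target)
terms = ["y[k-1]", "y[k-2]", "u[k-1]", "u[k-2]",
         "y[k-1]^2", "u[k-1]^2", "y*u", "u[k-2]^2"]
for name, c in zip(terms, model.coef_):
    if abs(c) > 1e-3:
        print(f"{name}: {c:+.3f}")   # only the true structure survives
```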
NASA Astrophysics Data System (ADS)
Guo, Jie; Zhu, Chang'an
2016-01-01
The development of optics and computer technologies enables the application of vision-based techniques that use digital cameras to the displacement measurement of large-scale structures. Compared with traditional contact measurements, the vision-based technique allows remote measurement, is non-intrusive, and adds no mass to the structure. In this study, a high-speed camera system is developed to perform the displacement measurement in real time. The system consists of a high-speed camera and a notebook computer. The high-speed camera can capture images at a speed of hundreds of frames per second. To process the captured images in the computer, the Lucas-Kanade template tracking algorithm from the field of computer vision is introduced. Additionally, a modified inverse compositional algorithm is proposed to reduce the computing time of the original algorithm and further improve its efficiency. The modified algorithm can accomplish one displacement extraction within 1 ms without having to install any pre-designed target panel on the structure in advance. The accuracy and efficiency of the system in the remote measurement of dynamic displacement are demonstrated in experiments on a motion platform and on a sound barrier on a suspension viaduct. Experimental results show that the proposed algorithm can extract an accurate displacement signal and accomplish the vibration measurement of large-scale structures.
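A sketch of the tracking loop using OpenCV's stock pyramidal Lucas-Kanade tracker in place of the authors' modified inverse compositional algorithm; the file name and point coordinates are hypothetical, and pixel displacements still need a calibration scale factor to become physical units.

```python
import cv2
import numpy as np

# Pyramidal Lucas-Kanade point tracking on a recorded video.
cap = cv2.VideoCapture("structure.avi")      # hypothetical recording
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Track a natural-texture point on the structure (no target panel needed).
p0 = np.array([[[412.0, 288.0]]], dtype=np.float32)
p_ref = p0.copy()                            # reference position, frame 0
displacement = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    p1, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, p0, None, winSize=(31, 31), maxLevel=3)
    if status[0, 0]:
        displacement.append((p1 - p_ref).ravel())  # pixels vs. frame 0
    prev_gray, p0 = gray, p1

print(np.array(displacement)[:5])
```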
Determination of high temperature strains using a PC based vision system
NASA Astrophysics Data System (ADS)
McNeill, Stephen R.; Sutton, Michael A.; Russell, Samuel S.
1992-09-01
With the widespread availability of video digitizers and cheap personal computers, the use of computer vision as an experimental tool is becoming commonplace. These systems are being used to make a wide variety of measurements that range from simple surface characterization to velocity profiles. The Sub-Pixel Digital Image Correlation technique has been developed to measure full-field displacements and gradients of the surface of an object subjected to a driving force. The technique has shown its utility in measurements ranging from simple translation to fluid velocity profiles to crack-tip deformation of solid rocket fuel. It has recently been improved and used to measure the surface displacement field of an object at high temperature. The development of a PC-based Sub-Pixel Digital Image Correlation system has yielded an accurate and easy-to-use system for measuring surface displacements and gradients. Experiments have been performed to show the system is viable for measuring thermal strain.
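The sub-pixel correlation step can be sketched with an off-the-shelf upsampled cross-correlation (scikit-image), standing in for the authors' Sub-Pixel Digital Image Correlation implementation; the speckle images and imposed shift are synthetic.

```python
import numpy as np
from scipy.ndimage import shift as subpixel_shift
from skimage.registration import phase_cross_correlation

rng = np.random.default_rng(3)

# Reference speckle image and a copy displaced by a known sub-pixel amount,
# standing in for surface images before and after thermal loading.
ref = rng.normal(size=(128, 128))
true_shift = (0.37, -1.24)                        # pixels (row, col)
deformed = subpixel_shift(ref, true_shift, order=3)

# Cross-correlation with local upsampling recovers the displacement to a
# small fraction of a pixel -- the essence of sub-pixel DIC.
est, error, _ = phase_cross_correlation(ref, deformed, upsample_factor=100)
print(est)   # ~ (-0.37, +1.24): the shift registering `deformed` back to `ref`
```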
Automated diagnosis of fetal alcohol syndrome using 3D facial image analysis
Fang, Shiaofen; McLaughlin, Jason; Fang, Jiandong; Huang, Jeffrey; Autti-Rämö, Ilona; Fagerlund, Åse; Jacobson, Sandra W.; Robinson, Luther K.; Hoyme, H. Eugene; Mattson, Sarah N.; Riley, Edward; Zhou, Feng; Ward, Richard; Moore, Elizabeth S.; Foroud, Tatiana
2012-01-01
Objectives Use three-dimensional (3D) facial laser scanned images from children with fetal alcohol syndrome (FAS) and controls to develop an automated diagnosis technique that can reliably and accurately identify individuals prenatally exposed to alcohol. Methods A detailed dysmorphology evaluation, history of prenatal alcohol exposure, and 3D facial laser scans were obtained from 149 individuals (86 FAS; 63 Control) recruited from two study sites (Cape Town, South Africa and Helsinki, Finland). Computer graphics, machine learning, and pattern recognition techniques were used to automatically identify a set of facial features that best discriminated individuals with FAS from controls in each sample. Results An automated feature detection and analysis technique was developed and applied to the two study populations. A unique set of facial regions and features were identified for each population that accurately discriminated FAS and control faces without any human intervention. Conclusion Our results demonstrate that computer algorithms can be used to automatically detect facial features that can discriminate FAS and control faces. PMID:18713153
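An illustrative analogue of the discrimination step, with synthetic stand-in measurements (the study's actual features came from 3D laser scans): univariate feature selection picks the discriminative facial features and a linear classifier is cross-validated.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC

rng = np.random.default_rng(4)

# Stand-in data: per-face geometric measurements; only a few features
# actually separate the groups, mimicking discriminative-region selection.
n, d = 149, 40
X = rng.normal(size=(n, d))
y = np.r_[np.ones(86), np.zeros(63)].astype(int)   # 86 FAS, 63 controls
X[y == 1, :5] += 1.2                               # informative features

clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=5),   # pick discriminative features
                    SVC(kernel="linear"))
print(cross_val_score(clf, X, y, cv=5).mean())     # cross-validated accuracy
```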
Advancements in optical techniques and imaging in the diagnosis and management of bladder cancer.
Rose, Tracy L; Lotan, Yair
2018-03-01
Accurate detection and staging is critical to the appropriate management of urothelial cancer (UC). The use of advanced optical techniques during cystoscopy is becoming more widespread to prevent recurrent nonmuscle invasive bladder cancer. Standard of care for muscle-invasive UC includes the use of computed tomography and/or magnetic resonance imaging, but the staging accuracy of these tests remains imperfect. Novel imaging modalities are being developed to improve current test performance. Positron emission tomography/computed tomography has a role in the initial evaluation of select patients with muscle-invasive bladder cancer and in disease recurrence in some cases. Several novel immuno-positron emission tomography tracers are currently in development to address the inadequacy of current imaging modalities for monitoring tumor response to newer immune-based treatments. This review summarizes the current standards and recent advances in optical techniques and imaging modalities in localized and metastatic UC.