Advanced techniques for determining long term compatibility of materials with propellants
NASA Technical Reports Server (NTRS)
Green, R. L.
1972-01-01
The search for advanced measurement techniques for determining long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques that were determined to meet, or to be adaptable to meet, the requirements. Areas of refinement or change were recommended to improve others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of the new technique, the volatile metal chelate analysis. Rivaling neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.
ANALYSIS OF RADON MITIGATION TECHNIQUES USED IN EXISTING U.S. HOUSES
This paper reviews the full range of techniques that have been installed in existing US houses for the purpose of reducing indoor radon concentrations resulting from soil gas entry. The review addresses the performance, installation and operating costs, applicability, mechanisms,...
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability-based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
NASA Technical Reports Server (NTRS)
Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.
1977-01-01
The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques for forecasting air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis, and analytical. The differences between these methods arise, in part, from the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
2011-07-22
Allara, David L., Pennsylvania State University: Upgrading of Existing X-Ray Photoelectron Spectrometer Capabilities for Development and Analysis of Novel Energetic NanoCluster Materials (DURIP...). References from the Technical Reports database. Scanning probe, X-ray: of these techniques, the most popularly used is the scanning probe, also known as the Dip-Pen Nanolithography (DPN) technique.
A Structural and Content-Based Analysis for Web Filtering.
ERIC Educational Resources Information Center
Lee, P. Y.; Hui, S. C.; Fong, A. C. M.
2003-01-01
Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)
A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.
ERIC Educational Resources Information Center
Kim, Jin Eun
A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…
A linear circuit analysis program with stiff systems capability
NASA Technical Reports Server (NTRS)
Cook, C. H.; Bavuso, S. J.
1973-01-01
Several existing network analysis programs have been modified and combined to employ a variable topological approach to circuit translation. Efficient numerical integration techniques are used for transient analysis.
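The abstract mentions efficient numerical integration for transient analysis of stiff circuits; the sketch below is a minimal, hedged illustration of that idea using an implicit (BDF) integrator on a small state-space circuit model. The network, component values, and tolerances are assumptions, not the program described above.

```python
# Hedged sketch: transient analysis of a stiff two-capacitor circuit using an
# implicit (BDF) integrator.  The state-space matrices are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

# State-space form x' = A x + B u for two capacitor voltages with widely
# separated time constants (this separation is what makes the system "stiff").
R1, R2, C1, C2 = 1.0, 1.0e4, 1.0e-6, 1.0e-3   # ohms / farads (illustrative)
A = np.array([[-1.0 / (R1 * C1), 1.0 / (R1 * C1)],
              [ 1.0 / (R1 * C2), -(1.0 / R1 + 1.0 / R2) / C2]])
B = np.array([1.0 / (R1 * C1), 0.0])

def rhs(t, x, vin=5.0):
    return A @ x + B * vin          # 5 V step input applied at t = 0

sol = solve_ivp(rhs, (0.0, 20.0), y0=[0.0, 0.0], method="BDF",
                jac=lambda t, x: A, rtol=1e-8, atol=1e-10)
print("final capacitor voltages:", sol.y[:, -1])
```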
Multidimensional chromatography in food analysis.
Herrero, Miguel; Ibáñez, Elena; Cifuentes, Alejandro; Bernal, Jose
2009-10-23
In this work, the main developments and applications of multidimensional chromatographic techniques in food analysis are reviewed. Different aspects related to the existing couplings involving chromatographic techniques are examined. These couplings include multidimensional GC, multidimensional LC, multidimensional SFC as well as all their possible combinations. Main advantages and drawbacks of each coupling are critically discussed and their key applications in food analysis described.
Image Analysis Technique for Material Behavior Evaluation in Civil Structures.
Speranzini, Emanuela; Marsili, Roberto; Moretti, Michele; Rossi, Gianluca
2017-07-08
The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures that can be removed without damaging existing structures as the historical masonry. The digital image analysis was done using software specifically designed in Matlab to follow the tracking of the markers and determine the evolution of the deformation state. The method can be used in any type of structure but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) have allowed the validation of the procedure elaborated by comparing the results with those derived from traditional measuring techniques.
NASA Technical Reports Server (NTRS)
Djorgovski, George
1993-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resource.
NASA Technical Reports Server (NTRS)
Djorgovski, Stanislav
1992-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
Comparison of existing digital image analysis systems for the analysis of Thematic Mapper data
NASA Technical Reports Server (NTRS)
Likens, W. C.; Wrigley, R. C.
1984-01-01
Most existing image analysis systems were designed with the Landsat Multi-Spectral Scanner in mind, leaving open the question of whether or not these systems could adequately process Thematic Mapper data. In this report, both hardware and software systems have been evaluated for compatibility with TM data. Lack of spectral analysis capability was not found to be a problem, though techniques for spatial filtering and texture varied. Computer processing speed and data storage of currently existing mini-computer based systems may be less than adequate. Upgrading to more powerful hardware may be required for many TM applications.
Using the Delphi Technique to Support Curriculum Development
ERIC Educational Resources Information Center
Sitlington, Helen Barbara; Coetzer, Alan John
2015-01-01
Purpose: The purpose of this paper is to present an analysis of the use of the Delphi technique to support curriculum development with a view to enhancing existing literature on use of the technique for renewal of business course curricula. Design/methodology/approach: The authors outline the Delphi process for obtaining consensus amongst a…
Application of a substructuring technique to the problem of crack extension and closure
NASA Technical Reports Server (NTRS)
Armen, H., Jr.
1974-01-01
A substructuring technique, originally developed for the efficient reanalysis of structures, is incorporated into the methodology associated with the plastic analysis of structures. An existing finite-element computer program that accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing kinematic constraint conditions - crack growth and intermittent contact of crack surfaces in two-dimensional regions. Application of the analysis is presented for a problem of a center-crack panel to demonstrate the efficiency and accuracy of the technique.
Looking at Fossils in New Ways
ERIC Educational Resources Information Center
Flannery, Maura C.
2005-01-01
Existing fossils could be studied from a different perspective with the use of new methods of analysis for gathering more information. The new techniques of studying fossils bind the new and the old techniques and information and provide another way to look at fossils.
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
Application of pattern recognition techniques to crime analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.
1976-08-15
The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)
Fringe pattern demodulation with a two-dimensional digital phase-locked loop algorithm.
Gdeisat, Munther A; Burton, David R; Lalor, Michael J
2002-09-10
A novel technique called a two-dimensional digital phase-locked loop (DPLL) for fringe pattern demodulation is presented. This algorithm is more suitable for demodulation of fringe patterns with varying phase in two directions than the existing DPLL techniques that assume that the phase of the fringe patterns varies only in one direction. The two-dimensional DPLL technique assumes that the phase of a fringe pattern is continuous in both directions and takes advantage of the phase continuity; consequently, the algorithm has better noise performance than the existing DPLL schemes. The two-dimensional DPLL algorithm is also suitable for demodulation of fringe patterns with low sampling rates, and it outperforms the Fourier fringe analysis technique in this aspect.
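As a hedged illustration of the phase-locked-loop feedback idea (not the paper's two-dimensional algorithm, which handles phase variation in both image directions), the following one-dimensional sketch tracks the phase of a synthetic fringe signal; the loop gains and the test fringe are assumptions.

```python
# Minimal 1-D digital phase-locked loop sketch for fringe demodulation.
# Illustrates the feedback principle only; the paper's 2-D DPLL extends this
# idea to both image directions.  Gains and signal are tuning assumptions.
import numpy as np

N = 2048
x = np.arange(N)
true_phase = 2 * np.pi * 0.05 * x + 3.0 * np.sin(2 * np.pi * x / 500.0)
fringe = np.cos(true_phase)                 # unit-amplitude, zero-mean fringe

phi_hat = np.zeros(N)                       # estimated phase
freq_hat = 2 * np.pi * 0.05                 # initial guess of carrier frequency
k1, k2 = 0.3, 0.01                          # loop gains (assumptions)
for n in range(1, N):
    pred = phi_hat[n - 1] + freq_hat        # NCO prediction
    err = -fringe[n] * np.sin(pred)         # phase detector ~ 0.5*sin(phase error)
    phi_hat[n] = pred + k1 * err            # first-order phase correction
    freq_hat += k2 * err                    # slow frequency tracking

rms = np.sqrt(np.mean(np.unwrap(true_phase - phi_hat) ** 2))
print("RMS phase-tracking error (rad):", rms)
```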
Efficient morse decompositions of vector fields.
Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene
2008-01-01
Existing topology-based vector field analysis techniques rely on the ability to extract the individual trajectories such as fixed points, periodic orbits, and separatrices that are sensitive to noise and errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretations. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful for the applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational costs. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach in constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces including engine simulation data sets.
Heuristics to Facilitate Understanding of Discriminant Analysis.
ERIC Educational Resources Information Center
Van Epps, Pamela D.
This paper discusses the principles underlying discriminant analysis and constructs a simulated data set to illustrate its methods. Discriminant analysis is a multivariate technique for identifying the best combination of variables to maximally discriminate between groups. Discriminant functions are established on existing groups and used to…
Reliability analysis of a robotic system using hybridized technique
NASA Astrophysics Data System (ADS)
Kumar, Naveen; Komal; Lather, J. S.
2017-09-01
In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, then the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique. With this technique, fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for the system modeling, the lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow an exponential distribution, i.e., constant failure and repair rates. Sensitivity analysis is also performed and the impact on system mean time between failures (MTBF) is addressed by varying other reliability parameters. Based on the analysis, some influential suggestions are given to improve the system performance.
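A minimal sketch of the fuzzy side of the lambda-tau idea, assuming triangular fuzzy failure rates and a simple series structure: alpha-cuts give intervals, and interval arithmetic propagates them to a fuzzy MTBF. The rates, spreads, and structure below are illustrative assumptions, not the robotic-system model of the paper.

```python
# Hedged sketch: triangular fuzzy failure rates, alpha-cut interval arithmetic,
# and a series-system MTBF.  All numbers are assumptions for illustration.

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# Triangular fuzzy failure rates (per hour) for three components in series.
rates = [(1.0e-4, 1.2e-4, 1.4e-4),
         (2.0e-4, 2.5e-4, 3.0e-4),
         (0.5e-4, 0.6e-4, 0.7e-4)]

for alpha in (0.0, 0.5, 1.0):
    cuts = [alpha_cut(r, alpha) for r in rates]
    lam_lo = sum(lo for lo, _ in cuts)       # series system: rates add
    lam_hi = sum(hi for _, hi in cuts)
    # MTBF = 1/lambda, so the interval endpoints swap.
    print(f"alpha={alpha:.1f}  MTBF in [{1/lam_hi:,.0f}, {1/lam_lo:,.0f}] h")
```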
Wan, Yong; Otsuna, Hideo; Holman, Holly A; Bagley, Brig; Ito, Masayoshi; Lewis, A Kelsey; Colasanto, Mary; Kardon, Gabrielle; Ito, Kei; Hansen, Charles
2017-05-26
Image segmentation and registration techniques have enabled biologists to place large amounts of volume data from fluorescence microscopy, morphed three-dimensionally, onto a common spatial frame. Existing tools built on volume visualization pipelines for single channel or red-green-blue (RGB) channels have become inadequate for the new challenges of fluorescence microscopy. For a three-dimensional atlas of the insect nervous system, hundreds of volume channels are rendered simultaneously, whereas fluorescence intensity values from each channel need to be preserved for versatile adjustment and analysis. Although several existing tools have incorporated support of multichannel data using various strategies, the lack of a flexible design has made true many-channel visualization and analysis unavailable. The most common practice for many-channel volume data presentation is still converting and rendering pseudosurfaces, which are inaccurate for both qualitative and quantitative evaluations. Here, we present an alternative design strategy that accommodates the visualization and analysis of about 100 volume channels, each of which can be interactively adjusted, selected, and segmented using freehand tools. Our multichannel visualization includes a multilevel streaming pipeline plus a triple-buffer compositing technique. Our method also preserves original fluorescence intensity values on graphics hardware, a crucial feature that allows graphics-processing-unit (GPU)-based processing for interactive data analysis, such as freehand segmentation. We have implemented the design strategies as a thorough restructuring of our original tool, FluoRender. The redesign of FluoRender not only maintains the existing multichannel capabilities for a greatly extended number of volume channels, but also enables new analysis functions for many-channel data from emerging biomedical-imaging techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yonggang
In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.
2013-09-01
existing MR scanning systems, providing the ability to visualize structures that are impossible with current methods. Using techniques to concurrently stain... and unique system for analysis of affected brain regions, coupled with other imaging techniques and molecular measurements, holds significant...
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
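As a hedged illustration of one metamodeling technique named above, the sketch fits a quadratic response surface to a handful of runs of a stand-in "expensive" analysis; the test function, design, and sample sizes are assumptions.

```python
# Hedged sketch: quadratic response-surface metamodel fitted by least squares
# on a small design of experiments.  The test function stands in for the
# expensive analysis code and is an assumption.
import numpy as np

rng = np.random.default_rng(0)

def expensive_analysis(x1, x2):
    # Placeholder for a costly simulation code (cheap here on purpose).
    return (x1 - 1.0)**2 + 2.0 * (x2 + 0.5)**2 + 0.3 * x1 * x2

# Design of experiments: random samples here; a real study might use
# factorial, central composite, or Latin hypercube designs.
X = rng.uniform(-2, 2, size=(30, 2))
y = expensive_analysis(X[:, 0], X[:, 1])

# Full quadratic basis: 1, x1, x2, x1^2, x1*x2, x2^2.
def basis(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1 * x2, x2**2])

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# Use the cheap surrogate in place of the expensive code at new points.
X_new = rng.uniform(-2, 2, size=(5, 2))
print("surrogate:", basis(X_new) @ coef)
print("truth    :", expensive_analysis(X_new[:, 0], X_new[:, 1]))
```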
Magnetic separation techniques in sample preparation for biological analysis: a review.
He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke
2014-12-01
Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existing problems and possible future trends of magnetic separation techniques for biological analysis were proposed.
Muller, E; Gargani, D; Banuls, A L; Tibayrenc, M; Dollet, M
1997-10-01
The genetic polymorphism of 30 isolates of plant trypanosomatids colloquially referred to as plant trypanosomes was assayed by means of RAPD. The principal objectives of this study were to assess the discriminative power of RAPD analysis for studying plant trypanosomes and to determine whether the results obtained were comparable with those from a previous isoenzyme (MLEE) study. The principal groups of plant trypanosomes identified previously by isoenzyme analysis--intraphloemic trypanosomes, intralaticiferous trypanosomes and trypanosomes isolated from fruits--were also clearly separated by the RAPD technique. Moreover, the results showed a fair parity between MLEE and RAPD data (coefficient of correlation = 0.84) and the two techniques have comparable discriminative ability. Most of the separation revealed by the two techniques between the clusters was associated with major biological properties. However, the RAPD technique gave a more coherent separation than MLEE because the intraphloemic isolates, which were biologically similar in terms of their specific localization in the sieve tubes of the plant, were found to be in closer groups by the RAPD. For both techniques, the existence of the main clusters was correlated with the existence of synapomorphic characters, which could be used as powerful tools in taxonomy and epidemiology.
pyNSMC: A Python Module for Null-Space Monte Carlo Uncertainty Analysis
NASA Astrophysics Data System (ADS)
White, J.; Brakefield, L. K.
2015-12-01
The null-space Monte Carlo technique is a nonlinear uncertainty analysis technique that is well-suited to high-dimensional inverse problems. While the technique is powerful, the existing workflow for completing null-space Monte Carlo is cumbersome, requiring the use of multiple command-line utilities, several sets of intermediate files and even a text editor. pyNSMC is an open-source python module that automates the workflow of null-space Monte Carlo uncertainty analyses. The module is fully compatible with the PEST and PEST++ software suites and leverages existing functionality of pyEMU, a python framework for linear-based uncertainty analyses. pyNSMC greatly simplifies the existing workflow for null-space Monte Carlo by taking advantage of object-oriented design facilities in python. The core of pyNSMC is the ensemble class, which draws and stores realized random vectors and also provides functionality for exporting and visualizing results. By relieving users of the tedium associated with file handling and command-line utility execution, pyNSMC instead focuses the user on the important steps and assumptions of null-space Monte Carlo analysis. Furthermore, pyNSMC facilitates learning through flow charts and results visualization, which are available at many points in the algorithm. The ease-of-use of the pyNSMC workflow is compared to the existing workflow for null-space Monte Carlo for a synthetic groundwater model with hundreds of estimable parameters.
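A minimal sketch of the null-space Monte Carlo idea, assuming a known Jacobian in raw array form (pyNSMC itself works with PEST/PEST++ files rather than arrays): random parameter perturbations are projected onto the approximate null space of the Jacobian so that each realization still reproduces the calibration observations to first order. The dimensions and threshold are illustrative assumptions.

```python
# Hedged sketch of null-space Monte Carlo on synthetic arrays.
import numpy as np

rng = np.random.default_rng(1)

n_obs, n_par = 20, 100                    # many more parameters than observations
J = rng.standard_normal((n_obs, n_par))   # stand-in sensitivity (Jacobian) matrix
p_cal = rng.standard_normal(n_par)        # calibrated parameter vector

# SVD: right singular vectors beyond the numerical rank span the null space.
U, s, Vt = np.linalg.svd(J, full_matrices=True)
rank = int(np.sum(s > 1e-8 * s[0]))
V_null = Vt[rank:].T                      # shape (n_par, n_par - rank)

# Draw realizations and keep only their null-space component.
n_real = 200
draws = rng.standard_normal((n_par, n_real))
null_part = V_null @ (V_null.T @ draws)
ensemble = p_cal[:, None] + null_part

# Each realization matches the calibrated model to first order,
# because J @ (ensemble - p_cal) ~ 0.
print("max |J dp| over ensemble:", np.abs(J @ (ensemble - p_cal[:, None])).max())
```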
Analysis of structural response data using discrete modal filters. M.S. Thesis
NASA Technical Reports Server (NTRS)
Freudinger, Lawrence C.
1991-01-01
The application of reciprocal modal vectors to the analysis of structural response data is described. Reciprocal modal vectors are constructed using an existing experimental modal model and an existing frequency response matrix of a structure, and can be assembled into a matrix that effectively transforms the data from the physical space to a modal space within a particular frequency range. In other words, the weighting matrix necessary for modal vector orthogonality (typically the mass matrix) is contained within the reciprocal modal matrix. The underlying goal of this work is mostly directed toward observing the modal state responses in the presence of unknown, possibly closed-loop forcing functions, thus having an impact on both operating data analysis techniques and independent modal space control techniques. This study investigates the behavior of reciprocal modal vectors as modal filters with respect to certain calculation parameters and their performance with perturbed system frequency response data.
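A hedged sketch of reciprocal modal vectors acting as modal filters, assuming the mass and stiffness matrices of a small 3-DOF system are known (in the thesis the reciprocal vectors come from measured frequency response data instead); the matrices below are illustrative.

```python
# Hedged sketch: reciprocal modal matrix Psi = M Phi (Phi^T M Phi)^{-1}
# satisfies Psi^T Phi = I, so Psi^T projects a physical response onto
# modal coordinates (modal filtering).  The 3-DOF model is an assumption.
import numpy as np

M = np.diag([2.0, 1.0, 1.5])                       # mass matrix
K = np.array([[ 400., -200.,    0.],
              [-200.,  500., -300.],
              [   0., -300.,  300.]])              # stiffness matrix

# Undamped modes from the generalized eigenproblem K phi = w^2 M phi.
w2, Phi = np.linalg.eig(np.linalg.solve(M, K))
order = np.argsort(w2)
w2, Phi = w2[order], Phi[:, order]

# Reciprocal modal matrix: Psi^T Phi = I by construction.
Psi = M @ Phi @ np.linalg.inv(Phi.T @ M @ Phi)
print("Psi^T Phi =\n", np.round(Psi.T @ Phi, 10))

# Modal filtering: physical response -> modal coordinates.
x = Phi @ np.array([1.0, 0.3, -0.2])               # response with known modal content
print("recovered modal coordinates:", Psi.T @ x)
```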
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
Criteria for the use of regression analysis for remote sensing of sediment and pollutants
NASA Technical Reports Server (NTRS)
Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R. (Principal Investigator)
1982-01-01
Data analysis procedures for quantification of water quality parameters that are already identified and are known to exist within the water body are considered. The linear multiple-regression technique was examined as a procedure for defining and calibrating data analysis algorithms for such instruments as spectrometers and multispectral scanners.
NASA Astrophysics Data System (ADS)
Donato, M. B.; Milasi, M.; Vitanza, C.
2010-09-01
An existence result of a Walrasian equilibrium for an integrated model of exchange, consumption and production is obtained. The equilibrium model is characterized in terms of a suitable generalized quasi-variational inequality, so the existence result follows from an original technique which takes into account tools of convex and set-valued analysis.
MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER: PART 1. PROTOCOLS
A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...
Unsupervised classification of earth resources data.
NASA Technical Reports Server (NTRS)
Su, M. Y.; Jayroe, R. R., Jr.; Cummings, R. E.
1972-01-01
A new clustering technique is presented. It consists of two parts: (a) a sequential statistical clustering which is essentially a sequential variance analysis and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by the existing supervised maximum likelihood classification technique.
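A hedged sketch of the two-part idea on synthetic two-band data: a crude variance-based first pass supplies initial cluster centers that seed a K-means refinement. The data, the split rule, and the use of scikit-learn are assumptions; the original sequential statistical clustering is more elaborate.

```python
# Hedged sketch: (a) cheap initial clusters, (b) K-means refinement.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
pixels = np.vstack([rng.normal([20, 40], 3, (300, 2)),
                    rng.normal([60, 30], 4, (300, 2)),
                    rng.normal([45, 70], 3, (300, 2))])   # synthetic two-band pixels

# Part (a): crude initial clusters by splitting along the highest-variance band.
band = np.argmax(pixels.var(axis=0))
edges = np.quantile(pixels[:, band], [1/3, 2/3])
labels0 = np.digitize(pixels[:, band], edges)
init_centers = np.array([pixels[labels0 == k].mean(axis=0) for k in range(3)])

# Part (b): generalized K-means refinement started from those centers.
km = KMeans(n_clusters=3, init=init_centers, n_init=1).fit(pixels)
print("refined cluster centers:\n", km.cluster_centers_)
```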
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
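A minimal sketch of one building block of substructuring: static (Guyan) condensation of interior degrees of freedom onto boundary degrees of freedom for a small spring chain. The model and the interior/boundary split are assumptions; dynamic multilevel substructuring adds mass condensation and nesting on top of this step.

```python
# Hedged sketch: static (Guyan) condensation K_red = Kbb - Kbi Kii^{-1} Kib.
import numpy as np

# Stiffness of a 4-DOF chain of three unit springs (free-free ends).
K = np.array([[ 1., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])

boundary = [0, 3]      # DOFs kept at the next level up
interior = [1, 2]      # DOFs condensed out

Kbb = K[np.ix_(boundary, boundary)]
Kbi = K[np.ix_(boundary, interior)]
Kii = K[np.ix_(interior, interior)]

K_red = Kbb - Kbi @ np.linalg.solve(Kii, Kbi.T)
print("condensed boundary stiffness:\n", K_red)
# For three unit springs in series this reduces to a single equivalent
# spring of stiffness 1/3 acting between the two end DOFs.
```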
Flow analysis techniques for phosphorus: an overview.
Estela, José Manuel; Cerdà, Víctor
2005-04-15
A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus together with several aspects regarding the analysis and terminology used in the determination of this element are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely; segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA) is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the detection instrumental technique used with the aim to facilitate their study and obtain an overall scope. Finally, the analytical characteristics of numerous flow-methods reported in the literature are provided in the form of a table and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soils leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soils extracts and cyanobacterial biofilms are tabulated.
ERIC Educational Resources Information Center
Hanson, James H.; Brophy, Patrick D.
2012-01-01
Not all knowledge and skills that educators want to pass to students exist yet in textbooks. Some still reside only in the experiences of practicing engineers (e.g., how engineers create new products, how designers identify errors in calculations). The critical incident technique, CIT, is an established method for cognitive task analysis. It is…
Sonic Fatigue Design Techniques for Advanced Composite Aircraft Structures
1980-04-01
AFWAL-TR-80.3019 (AD A090553), Sonic Fatigue Design Techniques for Advanced Composite Aircraft Structures, Final Report, Ian Holehouse, Rohr Industries. Contents include: General Sonic Fatigue Theory; Composite Laminate Analysis; Preliminary Sonic Fatigue... overall sonic fatigue design guides. These existing design methods have been developed for metal structures. However, recent advanced composite
MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER. PART 2. APPENDICES TO PROTOCOLS
A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...
NASA Astrophysics Data System (ADS)
Woolfitt, Adrian R.; Boyer, Anne E.; Quinn, Conrad P.; Hoffmaster, Alex R.; Kozel, Thomas R.; de, Barun K.; Gallegos, Maribel; Moura, Hercules; Pirkle, James L.; Barr, John R.
A range of mass spectrometry-based techniques have been used to identify, characterize and differentiate Bacillus anthracis, both in culture for forensic applications and for diagnosis during infection. This range of techniques could usefully be considered to exist as a continuum, based on the degrees of specificity involved. We show two examples here, a whole-organism fingerprinting method and a high-specificity assay for one unique protein, anthrax lethal factor.
NASA Astrophysics Data System (ADS)
Doležel, Jiří; Novák, Drahomír; Petrů, Jan
2017-09-01
Transportation routes for oversize and excessive loads are currently planned to ensure that a vehicle can transit the critical points on the road. Critical points are level intersections of roads, bridges, etc. This article presents a comprehensive procedure to determine the reliability and load-bearing capacity level of existing bridges on highways and roads using advanced reliability analysis methods based on simulation techniques of the Monte Carlo type in combination with nonlinear finite element method analysis. The safety index is considered the main criterion of the reliability level of existing structures and is described in current structural design standards, e.g. ISO and Eurocode. An example is given of a single-span slab bridge made of precast prestressed concrete girders, currently 60 years old, whose load-bearing capacity is determined for the ultimate limit state and the serviceability limit state. The structure's design load capacity was estimated by fully probabilistic nonlinear FEM analysis using the Latin Hypercube Sampling (LHS) simulation technique. Load-bearing capacity values based on the fully probabilistic analysis are compared with the load-bearing capacity levels estimated by deterministic methods for a critical section of the most heavily loaded girders.
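A hedged sketch of the simulation side of such a study: Latin Hypercube Sampling of a resistance and a load effect, followed by failure-probability and safety-index estimates. The lognormal/normal distributions and their parameters are assumptions standing in for the nonlinear FEM-based resistance model of the bridge.

```python
# Hedged sketch: LHS sampling of a limit state g = R - S and a safety index.
import numpy as np
from scipy.stats import qmc, norm, lognorm

n = 10_000
sampler = qmc.LatinHypercube(d=2, seed=3)
u = sampler.random(n)                               # uniform LHS samples in [0,1)^2

# Resistance R ~ lognormal (median 1500 kNm, ~10 % COV); load effect S ~ normal.
R = lognorm.ppf(u[:, 0], s=0.10, scale=1500.0)
S = norm.ppf(u[:, 1], loc=1000.0, scale=150.0)

g = R - S                                           # limit-state margin
pf = np.mean(g <= 0.0)                              # failure probability estimate
beta_cornell = g.mean() / g.std(ddof=1)             # simple safety-index estimate
print(f"Pf ~ {pf:.2e}, beta (Cornell) ~ {beta_cornell:.2f}")
```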
ERIC Educational Resources Information Center
Morris, Phillip; Thrall, Grant
2010-01-01
Geographic analysis has been adopted by businesses, especially the retail sector, since the early 1990s (Thrall, 2002). Institutional research can receive the same benefits businesses have by adopting geographic analysis and technology. The commonalities between businesses and higher education institutions include the existence of trade areas, the…
NECAP 4.1: NASA's Energy-Cost Analysis Program input manual
NASA Technical Reports Server (NTRS)
Jensen, R. N.
1982-01-01
The computer program NECAP (NASA's Energy Cost Analysis Program) is described. The program is a versatile building design and energy analysis tool which has embodied within it state-of-the-art techniques for performing thermal load calculations and energy use predictions. With the program, comparisons of building designs and operational alternatives for new or existing buildings can be made. The major feature of the program is the response factor technique for calculating the heat transfer through the building surfaces, which accounts for the building's mass. The program expands the response factor technique into a space response factor to account for internal building temperature swings; this is extremely important in determining true building loads and energy consumption when internal temperatures are allowed to swing.
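A minimal sketch of the response-factor idea, assuming invented factor values and temperature histories (NECAP derives its factors from the actual wall construction): the wall heat gain at each hour is a convolution of precomputed factors with past outside and room temperatures.

```python
# Hedged sketch: hourly heat gain as a response-factor convolution.
# Factor values and temperature series are invented for illustration.
import numpy as np

X = np.array([ 4.0, -5.2,  1.9, -0.55, 0.12])   # outside-temperature factors (W/m^2 K)
Y = np.array([ 3.6, -4.8,  1.8, -0.50, 0.11])   # room-temperature factors  (W/m^2 K)

hours = np.arange(48)
T_out = 25 + 8 * np.sin(2 * np.pi * (hours - 15) / 24)   # sol-air temperature, deg C
T_in = np.full_like(T_out, 22.0)                          # room air held at 22 deg C

def heat_gain(t):
    """q(t) = sum_j X_j*T_out(t-j) - sum_j Y_j*T_in(t-j), truncated at history start."""
    j = np.arange(min(t + 1, len(X)))
    return X[j] @ T_out[t - j] - Y[j] @ T_in[t - j]

q = np.array([heat_gain(t) for t in hours])
print("peak wall heat gain (W/m^2):", q.max().round(2))
```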
DOT National Transportation Integrated Search
1973-12-01
Various forms of Dual Mode transportation were analyzed in order to assess the economic viability of the dual mode concept. Specially designed new small Dual Mode vehicles, modifications of existing automobiles, and pallet systems, all operating in c...
A Theory of Term Importance in Automatic Text Analysis.
ERIC Educational Resources Information Center
Salton, G.; And Others
Most existing automatic content analysis and indexing techniques are based on word frequency characteristics applied largely in an ad hoc manner. Contradictory requirements arise in this connection, in that terms exhibiting high occurrence frequencies in individual documents are often useful for high recall performance (to retrieve many relevant…
Statistical Analysis For Nucleus/Nucleus Collisions
NASA Technical Reports Server (NTRS)
Mcguire, Stephen C.
1989-01-01
Report describes use of several statistical techniques to characterize angular distributions of secondary particles emitted in collisions of atomic nuclei in energy range of 24 to 61 GeV per nucleon. Purpose of statistical analysis is to determine correlations between intensities of emitted particles and angles confirming existence of quark/gluon plasma.
IGA: A Simplified Introduction and Implementation Details for Finite Element Users
NASA Astrophysics Data System (ADS)
Agrawal, Vishal; Gautam, Sachin S.
2018-05-01
Isogeometric analysis (IGA) is a recently introduced technique that employs the Computer Aided Design (CAD) concept of Non-uniform Rational B-splines (NURBS) to bridge the substantial bottleneck between the CAD and finite element analysis (FEA) fields. The simplified transition of exact CAD models into the analysis alleviates the issues originating from geometrical discontinuities and thus significantly reduces the design-to-analysis time in comparison to the traditional FEA technique. Since its origination, research in the field of IGA has been accelerating, and the method has been applied to various problems. However, the employment of CAD tools in the area of FEA invokes the need to adapt the existing implementation procedure to the framework of IGA. Also, the usage of IGA requires in-depth knowledge of both the CAD and FEA fields. This can be overwhelming for a beginner in IGA. Hence, in this paper, a simplified introduction and implementation details for the incorporation of the NURBS-based IGA technique within an existing FEA code are presented. It is shown that with little modification, the available standard code structure of FEA can be adapted for IGA. For the clear and concise explanation of these modifications, step-by-step implementation of a benchmark plate with a circular hole under the action of in-plane tension is included.
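A hedged sketch of the NURBS machinery IGA borrows from CAD: B-spline basis functions from the Cox-de Boor recursion, then rational (NURBS) basis values from control-point weights. The knot vector, degree, and weights are assumptions; a real IGA code would additionally map these functions to element quadrature points.

```python
# Hedged sketch: B-spline basis via Cox-de Boor, then rational NURBS basis.
import numpy as np

def bspline_basis(i, p, u, U):
    """Cox-de Boor recursion for basis function N_{i,p} at parameter u."""
    if p == 0:
        return 1.0 if (U[i] <= u < U[i + 1]) else 0.0
    left = 0.0 if U[i + p] == U[i] else \
        (u - U[i]) / (U[i + p] - U[i]) * bspline_basis(i, p - 1, u, U)
    right = 0.0 if U[i + p + 1] == U[i + 1] else \
        (U[i + p + 1] - u) / (U[i + p + 1] - U[i + 1]) * bspline_basis(i + 1, p - 1, u, U)
    return left + right

p = 2                                        # quadratic basis
U = np.array([0, 0, 0, 0.5, 1, 1, 1.0])      # open knot vector -> 4 basis functions
w = np.array([1.0, 0.8, 0.8, 1.0])           # control-point weights (assumed)

u = 0.3
N = np.array([bspline_basis(i, p, u, U) for i in range(len(w))])
R = w * N / (w @ N)                          # rational NURBS basis functions
print("B-spline basis:", N.round(4), " sum =", N.sum().round(4))
print("NURBS basis   :", R.round(4), " sum =", R.sum().round(4))
```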
Cheng, Ching-Min; Hwang, Sheue-Ling
2015-03-01
This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to develop much safer operational processes and can be used to predict human error rates on critical tasks in the plant.
Change Detection Analysis of Water Pollution in Coimbatore Region using Different Color Models
NASA Astrophysics Data System (ADS)
Jiji, G. Wiselin; Devi, R. Naveena
2017-12-01
The data acquired through remote sensing satellites furnish facts about the land and water at varying resolutions and have been widely used for several change detection studies. Although many change detection methodologies and techniques already exist, new ones continue to emerge. Existing change detection techniques exploit images that are either in gray scale or in the RGB color model. In this paper we introduce other color models for performing change detection for water pollution. Here, polluted lakes are classified, post-classification change detection techniques are applied to the RGB images, and the results are analysed to determine whether changes exist. Furthermore, RGB images obtained after classification, when converted to either of the two color models YCbCr and YIQ, are found to produce the same results as the RGB model images. Thus it can be concluded that other color models like YCbCr and YIQ can be used as substitutes for the RGB color model when analysing change detection with regard to water pollution.
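A minimal sketch of the color-model step, assuming BT.601 coefficients for the RGB-to-YCbCr conversion and synthetic before/after images: two dates are converted and differenced to flag changed pixels. The images and the change threshold are assumptions.

```python
# Hedged sketch: RGB -> YCbCr (BT.601, full range) and simple image differencing.
import numpy as np

def rgb_to_ycbcr(img):
    """BT.601 full-range RGB (0-255) -> YCbCr."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

rng = np.random.default_rng(4)
before = rng.integers(0, 256, (64, 64, 3)).astype(float)
after = before.copy()
after[20:40, 20:40] = [30, 80, 200]          # simulate a patch of "polluted" water

diff = np.abs(rgb_to_ycbcr(after) - rgb_to_ycbcr(before)).sum(axis=-1)
changed = diff > 25.0                        # threshold is an assumption
print("changed pixels:", int(changed.sum()), "of", changed.size)
```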
A multiple technique approach to the analysis of urinary calculi.
Rodgers, A L; Nassimbeni, L R; Mulder, K J
1982-01-01
Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features which often go undetected due to limitations in the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt as to the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed and the multiple technique approach has been evaluated as a whole.
NASA Technical Reports Server (NTRS)
Hong, Jaesub; Allen, Branden; Grindlay, Jonathan; Barthelmy, Scott D.
2016-01-01
Wide-field (greater than or approximately equal to 100 degrees squared) hard X-ray coded-aperture telescopes with high angular resolution (approximately 2 arcminutes) will enable a wide range of time domain astrophysics. For instance, transient sources such as gamma-ray bursts can be precisely localized without the assistance of secondary focusing X-ray telescopes to enable rapid follow-up studies. On the other hand, high angular resolution in coded-aperture imaging introduces a new challenge in handling the systematic uncertainty: the average photon count per pixel is often too small to establish a proper background pattern or model the systematic uncertainty in a timescale where the model remains invariant. We introduce two new techniques to improve detection sensitivity, which are designed for, but not limited to, a high-resolution coded-aperture system: a self-background modeling scheme which utilizes continuous scan or dithering operations, and a Poisson-statistics based probabilistic approach to evaluate the significance of source detection without subtraction in handling the background. We illustrate these new imaging analysis techniques for a high-resolution coded-aperture telescope using the data acquired by the wide-field hard X-ray telescope ProtoEXIST2 during a high-altitude balloon flight in fall 2012. We review the imaging sensitivity of ProtoEXIST2 during the flight, and demonstrate the performance of the new techniques using our balloon flight data in comparison with a simulated ideal Poisson background.
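A hedged sketch of the Poisson-statistics detection test described above: given a modeled background expectation in a sky pixel, the significance of the observed counts is a Poisson survival probability, with no background subtraction. The counts and the trials correction are illustrative assumptions.

```python
# Hedged sketch: Poisson significance of a candidate source, no subtraction.
from scipy.stats import poisson, norm

mu_bkg = 7.3          # modeled background counts in this sky pixel (assumed)
observed = 21         # counts imaged at the candidate source position (assumed)

# Probability of seeing >= observed counts from background alone.
p_value = poisson.sf(observed - 1, mu_bkg)

# Convert to an equivalent Gaussian sigma, optionally penalizing for the
# number of independent sky pixels searched (a crude trials factor).
n_pixels = 40_000
sigma_single = norm.isf(p_value)
sigma_post_trials = norm.isf(min(1.0, p_value * n_pixels))
print(f"p = {p_value:.2e}, {sigma_single:.1f} sigma (single trial), "
      f"{sigma_post_trials:.1f} sigma after trials")
```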
A review of risk management process in construction projects of developing countries
NASA Astrophysics Data System (ADS)
Bahamid, R. A.; Doh, S. I.
2017-11-01
In the construction industry, the risk management concept is a less popular technique. There are three main stages in the systematic approach to risk management in the construction industry: a) risk response; b) risk analysis and evaluation; and c) risk identification. The high risk associated with the construction business affects each of its participants, while operational analysis and management of construction-related risks remain an enormous task for practitioners of the industry. This paper reviews the existing literature on construction project risk management in developing countries, with a specific focus on the risk management process. The literature lacks an ample risk management process approach capable of capturing risk impact on diverse project objectives. This literature review aims at discovering the techniques frequently used in risk identification and analysis. It also attempts to clarify the different classifications of risk sources in the existing literature of developing countries, and to identify future research directions on project risks in the area of construction in developing countries.
Restructuring the rotor analysis program C-60
NASA Technical Reports Server (NTRS)
1985-01-01
The continuing evolution of the rotary wing industry demands increasing analytical capabilities. To keep up with this demand, software must be structured to accommodate change. The approach discussed for meeting this demand is to restructure an existing analysis. The motivational factors, basic principles, application techniques, and practical lessons from experience with this restructuring effort are reviewed.
Publication Bias in Research Synthesis: Sensitivity Analysis Using A Priori Weight Functions
ERIC Educational Resources Information Center
Vevea, Jack L.; Woods, Carol M.
2005-01-01
Publication bias, sometimes known as the "file-drawer problem" or "funnel-plot asymmetry," is common in empirical research. The authors review the implications of publication bias for quantitative research synthesis (meta-analysis) and describe existing techniques for detecting and correcting it. A new approach is proposed that is suitable for…
Seeking Social Capital and Expertise in a Newly-Formed Research Community: A Co-Author Analysis
ERIC Educational Resources Information Center
Forte, Christine E.
2017-01-01
This exploratory study applies social network analysis techniques to existing, publicly available data to understand collaboration patterns within the co-author network of a federally-funded, interdisciplinary research program. The central questions asked: What underlying social capital structures can be determined about a group of researchers…
Evolutionary computing for the design search and optimization of space vehicle power subsystems
NASA Technical Reports Server (NTRS)
Kordon, M.; Klimeck, G.; Hanks, D.
2004-01-01
Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment.
USDA-ARS?s Scientific Manuscript database
A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...
Olivieri, Alejandro C
2005-08-01
Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
NASA Technical Reports Server (NTRS)
Miles, J. H.; Stevens, G. H.; Leininger, G. G.
1975-01-01
Ground reflections generate undesirable effects on acoustic measurements such as those conducted outdoors for jet noise research, aircraft certification, and motor vehicle regulation. Cepstral techniques developed in speech processing are adapted to identify echo delay time and to correct for ground reflection effects. A sample result is presented using an actual narrowband sound pressure level spectrum. The technique can readily be adapted to existing fast Fourier transform type spectrum measurement instrumentation to provide field measurements of echo time delays.
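A minimal sketch of the cepstral idea on synthetic data: an echo multiplies the spectrum by a ripple whose period is set by the delay, so the real cepstrum (inverse transform of the log-magnitude spectrum) shows a peak at the echo delay. The signal, echo gain, and delay below are assumptions.

```python
# Hedged sketch: estimating an echo delay from the real cepstrum.
import numpy as np

rng = np.random.default_rng(5)
fs = 10_000.0                                 # sample rate, Hz (assumed)
n = 8192
direct = rng.standard_normal(n)               # broadband "jet noise" stand-in
tau_true, a = 0.012, 0.6                      # 12 ms ground reflection, gain 0.6

lag = int(round(tau_true * fs))
signal = direct.copy()
signal[lag:] += a * direct[:-lag]             # direct sound plus delayed echo

spectrum = np.fft.rfft(signal * np.hanning(n))
cepstrum = np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))

# Search for the cepstral peak away from the low-quefrency region.
qmin = 50
peak = qmin + int(np.argmax(cepstrum[qmin:n // 2]))
print(f"estimated echo delay: {peak / fs * 1000:.2f} ms (true {tau_true * 1000:.1f} ms)")
```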
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, Cory Thomas
2008-01-01
The focus of this dissertation is the development of techniques with which to enhance the existing abilities of inductively coupled plasma mass spectrometry (ICP-MS). ICP-MS is a powerful technique for trace metal analysis in samples of many types, but like any technique it has certain strengths and weaknesses. Attempts are made to improve upon those strengths and to overcome certain weaknesses.
The workload book: Assessment of operator workload to engineering systems
NASA Technical Reports Server (NTRS)
Gopher, D.
1983-01-01
The structure and initial work performed toward the creation of a handbook for workload analysis directed at the operational community of engineers and human factors psychologists are described. The goal, when the handbook is complete, will be to make accessible to such individuals the results of theoretically-based research that are of practical interest and utility in the analysis and prediction of operator workload in advanced and existing systems. In addition, the results of a laboratory study focused on the development of a subjective rating technique for workload that is based on psychophysical scaling techniques are described.
Geophysical monitoring in a hydrocarbon reservoir
NASA Astrophysics Data System (ADS)
Caffagni, Enrico; Bokelmann, Goetz
2016-04-01
Extraction of hydrocarbons from reservoirs demands ever-increasing technological effort, and there is a need for geophysical monitoring to better understand phenomena occurring within the reservoir. Significant deformation processes happen when man-made stimulation is performed, in combination with effects deriving from the existing natural conditions such as the in situ stress regime or pre-existing fracturing. Keeping track of such changes in the reservoir is important, on one hand for improving recovery of hydrocarbons, and on the other hand to assure a safe and proper mode of operation. Monitoring becomes particularly important when hydraulic fracturing (HF) is used, especially in the form of the much-discussed "fracking". HF is a sophisticated technique that is widely applied in low-porosity geological formations to enhance the production of natural hydrocarbons. In principle, similar HF techniques have been applied in Europe for a long time in conventional reservoirs, and they will probably be intensified in the near future; this suggests an increasing demand for technological development, also for updating and adapting the existing monitoring techniques in applied geophysics. We review currently available geophysical techniques for reservoir monitoring, which appear in the different fields of reservoir analysis. First, the properties of the hydrocarbon reservoir are identified; here we consider geophysical monitoring exclusively. The second step is to define the quantities that can be monitored, associated with those properties. We then describe the geophysical monitoring techniques, including the oldest ones, namely those in practical usage from 40-50 years ago, and the most recent developments in technology, within distinct groups according to the field of reservoir analysis to which they apply. This work is performed as part of the FracRisk consortium (www.fracrisk.eu); this project, funded by the Horizon2020 research programme, aims to help minimize the environmental footprint of shale-gas exploration and exploitation.
System Identification of Mistuned Bladed Disks from Traveling Wave Response Measurements
NASA Technical Reports Server (NTRS)
Feiner, D. M.; Griffin, J. H.; Jones, K. W.; Kenyon, J. A.; Mehmed, O.; Kurkov, A. P.
2003-01-01
A new approach to modal analysis is presented. By applying this technique to bladed disk system identification methods, one can determine the mistuning in a rotor based on its response to a traveling wave excitation. This allows system identification to be performed under rotating conditions, and thus expands the applicability of existing mistuning identification techniques from integrally bladed rotors to conventional bladed disks.
Quantifying short-lived events in multistate ionic current measurements.
Balijepalli, Arvind; Ettedgui, Jessica; Cornio, Andrew T; Robertson, Joseph W F; Cheung, Kin P; Kasianowicz, John J; Vaz, Canute
2014-02-25
We developed a generalized technique to characterize polymer-nanopore interactions via single channel ionic current measurements. Physical interactions between analytes, such as DNA, proteins, or synthetic polymers, and a nanopore cause multiple discrete states in the current. We modeled the transitions of the current to individual states with an equivalent electrical circuit, which allowed us to describe the system response. This enabled the estimation of short-lived states that are presently not characterized by existing analysis techniques. Our approach considerably improves the range and resolution of single-molecule characterization with nanopores. For example, we characterized the residence times of synthetic polymers that are three times shorter than those estimated with existing algorithms. Because the molecule's residence time follows an exponential distribution, we recover nearly 20-fold more events per unit time that can be used for analysis. Furthermore, the measurement range was extended from 11 monomers to as few as 8. Finally, we applied this technique to recover a known sequence of single-stranded DNA from previously published ion channel recordings, identifying discrete current states with subpicoampere resolution.
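Because the abstract notes that residence times follow an exponential distribution, a simple way to see the gain from recovering short-lived events is to fit the exponential mean by maximum likelihood above a detection cutoff. The cutoff values and the true mean below are illustrative assumptions, not the paper's numbers.

import numpy as np

rng = np.random.default_rng(1)
tau_true = 50e-6                       # assumed true mean residence time (s)
dwells = rng.exponential(tau_true, 100_000)

def fraction_detected(dwells, cutoff):
    """Fraction of events longer than the analysis/detection cutoff."""
    return np.mean(dwells > cutoff)

for cutoff in (150e-6, 50e-6):         # e.g. older vs. improved analysis threshold
    kept = dwells[dwells > cutoff]
    # MLE for an exponential truncated at the cutoff: tau = mean(t - cutoff)
    tau_hat = np.mean(kept - cutoff)
    print(f"cutoff={cutoff*1e6:.0f} us: {fraction_detected(dwells, cutoff):.2%} "
          f"of events kept, tau estimate {tau_hat*1e6:.1f} us")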
Paper simulation techniques in user requirements analysis for interactive computer systems
NASA Technical Reports Server (NTRS)
Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.
1979-01-01
This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist, but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.
Automatic differentiation evaluated as a tool for rotorcraft design and optimization
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.; Young, Katherine C.
1995-01-01
This paper investigates the use of automatic differentiation (AD) as a means for generating sensitivity analyses in rotorcraft design and optimization. This technique transforms an existing computer program into a new program that performs sensitivity analysis in addition to the original analysis. The original FORTRAN program calculates a set of dependent (output) variables from a set of independent (input) variables; the new FORTRAN program calculates the partial derivatives of the dependent variables with respect to the independent variables. The AD technique is a systematic implementation of the chain rule of differentiation; this method produces derivatives to machine accuracy at a cost that is comparable with that of finite-differencing methods. For this study, an analysis code that consists of the Langley-developed hover analysis HOVT, the comprehensive rotor analysis CAMRAD/JA, and associated preprocessors is processed through the AD preprocessor ADIFOR 2.0. The resulting derivatives are compared with derivatives obtained from finite-differencing techniques. The derivatives obtained with ADIFOR 2.0 are exact within machine accuracy and, unlike the derivatives obtained with finite-differencing techniques, do not depend on the selection of step size.
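The chain-rule idea behind AD can be illustrated with a small forward-mode sketch using dual numbers. This is a generic Python illustration, not ADIFOR itself, and the test function f is an assumption made only for the comparison with a central finite difference.

import math

class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.der)

def f(x):
    # x*sin(x) + 3x, written so it accepts either a float or a Dual
    return x * x.sin() + 3.0 * x if isinstance(x, Dual) else x * math.sin(x) + 3.0 * x

x0 = 1.3
ad_deriv = f(Dual(x0, 1.0)).der               # forward-mode AD derivative
h = 1e-6
fd_deriv = (f(x0 + h) - f(x0 - h)) / (2 * h)  # central finite difference, step-size dependent
print(ad_deriv, fd_deriv)                     # AD value is exact to machine precision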
Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis.
Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris
2017-03-09
Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg−1 and 8 μg kg−1, respectively.
A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...
Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co
Overview of Sparse Graph for Multiple Access in Future Mobile Networks
NASA Astrophysics Data System (ADS)
Lei, Jing; Li, Baoguo; Li, Erbao; Gong, Zhenghui
2017-10-01
Multiple access via sparse graph, such as low density signature (LDS) and sparse code multiple access (SCMA), is a promising technique for future wireless communications. This survey presents an overview of developments in this burgeoning field, including transmitter structures, extrinsic information transfer (EXIT) chart analysis and comparisons with existing multiple access techniques. The technique enables multiple access under overloaded conditions with satisfactory performance. A message passing algorithm is utilized for multi-user detection in the receiver, and structures of the sparse graph are illustrated in detail. Outlooks and challenges of this technique are also presented.
Cost Analysis of Instructional Technology.
ERIC Educational Resources Information Center
Johnson, F. Craig; Dietrich, John E.
Although some serious limitations in the cost analysis technique do exist, the need for cost data in decision making is so great that every effort should be made to obtain accurate estimates. This paper discusses the several issues which arise when an attempt is made to make quality, trade-off, or scope decisions based on cost data. Three methods…
Digression and Value Concatenation to Enable Privacy-Preserving Regression.
Li, Xiao-Bai; Sarkar, Sumit
2014-09-01
Regression techniques can be used not only for legitimate data analysis, but also to infer private information about individuals. In this paper, we demonstrate that regression trees, a popular data-analysis and data-mining technique, can be used to effectively reveal individuals' sensitive data. This problem, which we call a "regression attack," has not been addressed in the data privacy literature, and existing privacy-preserving techniques are not appropriate in coping with this problem. We propose a new approach to counter regression attacks. To protect against privacy disclosure, our approach introduces a novel measure, called digression , which assesses the sensitive value disclosure risk in the process of building a regression tree model. Specifically, we develop an algorithm that uses the measure for pruning the tree to limit disclosure of sensitive data. We also propose a dynamic value-concatenation method for anonymizing data, which better preserves data utility than a user-defined generalization scheme commonly used in existing approaches. Our approach can be used for anonymizing both numeric and categorical data. An experimental study is conducted using real-world financial, economic and healthcare data. The results of the experiments demonstrate that the proposed approach is very effective in protecting data privacy while preserving data quality for research and analysis.
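To make the "regression attack" concrete, a hedged sketch: a regression tree trained on quasi-identifiers in a released table can predict a sensitive numeric attribute for a targeted individual. scikit-learn and the column names are assumptions for illustration, not the authors' code or data.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 2000
# Quasi-identifiers an adversary may already know (age, region code), assumed columns
age = rng.integers(20, 80, n)
region = rng.integers(0, 10, n)
# Sensitive attribute (e.g., income) correlated with the quasi-identifiers
income = 20_000 + 900 * age + 3_000 * region + rng.normal(0, 5_000, n)

X = np.column_stack([age, region])
attack_model = DecisionTreeRegressor(max_depth=6).fit(X, income)

# The adversary predicts the sensitive value of a target record they can link
target = np.array([[45, 7]])
print(attack_model.predict(target))   # close to 20000 + 900*45 + 3000*7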
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
Technique for Early Reliability Prediction of Software Components Using Behaviour Models
Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad
2016-01-01
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
New Ground Truth Capability from InSAR Time Series Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, S; Vincent, P; Yang, D
2005-07-13
We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter per year surface movements when sufficient data exists for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
Biometric Authentication for Gender Classification Techniques: A Review
NASA Astrophysics Data System (ADS)
Mathivanan, P.; Poornima, K.
2017-12-01
One of the challenging biometric authentication applications is gender identification and age classification, which captures gait from a far distance and analyzes physical information about the subject, such as gender, race and emotional state. It is found that most gender identification techniques have focused only on the frontal pose of the human subject, the image size and the type of database used in the process. The study also classifies different feature extraction processes, such as Principal Component Analysis (PCA) and Local Directional Pattern (LDP), that are used to extract the authentication features of a person. This paper aims to analyze different gender classification techniques and thereby evaluate the strengths and weaknesses of existing gender identification algorithms, which helps in developing a novel gender classification algorithm with lower computation cost and higher accuracy. In this paper, an overview and classification of different gender identification techniques are first presented and then compared with other existing human identification systems in terms of performance.
Runtime Speculative Software-Only Fault Tolerance
2012-06-01
reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance...propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of...affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing
Radar fall detection using principal component analysis
NASA Astrophysics Data System (ADS)
Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem
2016-05-01
Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA-based technique provides performance improvement over the conventional feature extraction methods.
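A hedged sketch of the idea of classifying motions from principal components: project feature "images" of fall and non-fall motions onto a PCA basis and classify with a simple nearest-neighbour rule. The data shapes, class separation, and scikit-learn usage are assumptions for illustration, not the authors' radar pipeline.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Assume each observed motion is a flattened 64x64 time-frequency image
n_per_class = 100
falls = rng.normal(1.0, 1.0, (n_per_class, 64 * 64))
non_falls = rng.normal(0.0, 1.0, (n_per_class, 64 * 64))
X = np.vstack([falls, non_falls])
y = np.array([1] * n_per_class + [0] * n_per_class)

# Eigen-image representation followed by a simple classifier
clf = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=3))
clf.fit(X, y)
print(clf.score(X, y))   # training accuracy of the PCA-based classifier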
Nazarzadeh, Kimia; Arjunan, Sridhar P; Kumar, Dinesh K; Das, Debi Prasad
2016-08-01
In this study, we analyzed accelerometer data recorded during gait analysis of Parkinson disease patients to detect freezing of gait (FOG) episodes. The proposed method filters the recordings to reduce noise in the leg movement signals and computes wavelet coefficients to detect FOG events. A publicly available FOG database was used, and the technique was evaluated using receiver operating characteristic (ROC) analysis. Results show that the wavelet feature discriminates FOG events from background activity better than the existing technique.
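A minimal sketch of wavelet-feature screening of accelerometer windows, assuming the PyWavelets and scikit-learn packages are available; the window length, wavelet, sampling rate, and synthetic FOG-band tremor are illustrative assumptions, not the study's settings.

import numpy as np
import pywt
from sklearn.metrics import roc_auc_score

def wavelet_feature(window, wavelet="db4", level=4):
    """Energy of the detail coefficients of one accelerometer window."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return sum(np.sum(c ** 2) for c in coeffs[1:])    # skip the approximation band

rng = np.random.default_rng(0)
fs, win = 64, 256                                      # assumed sampling rate / window size
normal = [rng.normal(0, 1.0, win) for _ in range(200)]
fog = [rng.normal(0, 1.0, win) + np.sin(2 * np.pi * 6 * np.arange(win) / fs)
       for _ in range(200)]                            # added tremor in the FOG band

scores = [wavelet_feature(w) for w in normal + fog]
labels = [0] * len(normal) + [1] * len(fog)
print("ROC AUC:", roc_auc_score(labels, scores))       # ROC analysis of the feature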
Influences on corporate executive decision behavior in government acquisitions
NASA Technical Reports Server (NTRS)
Wetherington, J. R.
1986-01-01
This paper presents extensive exploratory research which had as its primary objective the discovery and determination of major areas of concern exhibited by U.S. corporate executives in the preparation and submittal of proposals and bids to the Federal government. The existence of numerous unique concerns inherent in corporate strategies within the government market environment was established. A determination of the relationship of these concerns to each other was accomplished utilizing statistical factor analysis techniques, resulting in the identification of major groupings of management concerns. Finally, using analysis of variance, an analysis and discovery of the interrelationship of the factors to corporate demographics was accomplished. The existence of separate and distinct concerns exhibited by corporate executives when contemplating sales and operations in the government marketplace was established. It was also demonstrated that quantifiable relationships exist between such variables and that the decision behavior exhibited by the responsible executives is interrelated with their company's demographics.
Analysis of Extracellular Vesicles in the Tumor Microenvironment.
Al-Nedawi, Khalid; Read, Jolene
2016-01-01
Extracellular vesicles (ECV) are membrane compartments shed from all types of cells in various physiological and pathological states. In recent years, ECV have gained increasing interest from the scientific community for their role as intercellular communicators that play important roles in modifying the tumor microenvironment. Multiple techniques have been established to collect ECV from conditioned cell culture media or physiological fluids. The gold standard methodology is differential centrifugation. Although alternative techniques exist to collect ECV, they have not proven suitable as substitutes for the ultracentrifugation procedure.
Double Density Dual Tree Discrete Wavelet Transform implementation for Degraded Image Enhancement
NASA Astrophysics Data System (ADS)
Vimala, C.; Aruna Priya, P.
2018-04-01
The wavelet transform is a principal tool for modern image processing applications. A Double Density Dual Tree Discrete Wavelet Transform is used and investigated for image denoising. Images are analyzed and the performance is compared with the discrete wavelet transform and the Double Density DWT. Peak Signal to Noise Ratio (PSNR) and Root Mean Square Error (RMSE) values are calculated for the denoised images from all three wavelet techniques, and the performance is evaluated. The proposed technique gives better performance than the other two wavelet techniques.
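For reference, the two quality metrics used in the comparison can be computed directly; this is a generic sketch, and the 8-bit peak value of 255 is an assumption.

import numpy as np

def rmse(original, denoised):
    return np.sqrt(np.mean((original.astype(float) - denoised.astype(float)) ** 2))

def psnr(original, denoised, peak=255.0):
    e = rmse(original, denoised)
    return np.inf if e == 0 else 20.0 * np.log10(peak / e)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (128, 128))                       # stand-in reference image
noisy = np.clip(img + rng.normal(0, 10, img.shape), 0, 255)  # stand-in "denoised" result
print("RMSE:", rmse(img, noisy), "PSNR (dB):", psnr(img, noisy))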
Extending enterprise architecture modelling with business goals and requirements
NASA Astrophysics Data System (ADS)
Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten
2011-02-01
The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.
Rahaman, Mijanur; Pang, Chin-Tzong; Ishtyak, Mohd; Ahmad, Rais
2017-01-01
In this article, we introduce a perturbed system of generalized mixed quasi-equilibrium-like problems involving multi-valued mappings in Hilbert spaces. To calculate the approximate solutions of the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems, firstly we develop a perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems, and then by using the celebrated Fan-KKM technique, we establish the existence and uniqueness of solutions of the perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems. By deploying an auxiliary principle technique and an existence result, we formulate an iterative algorithm for solving the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems. Lastly, we study the strong convergence analysis of the proposed iterative sequences under monotonicity and some mild conditions. These results are new and generalize some known results in this field.
Structure identification methods for atomistic simulations of crystalline materials
Stukowski, Alexander
2012-05-28
Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.
NASA Technical Reports Server (NTRS)
Bao, Xiaoqi; Badescu, Mircea; Bar-Cohen, Yoseph
2015-01-01
The potential to return Martian samples to Earth for extensive analysis is in great interest of the planetary science community. It is important to make sure the mission would securely contain any microbes that may possibly exist on Mars so that they would not be able to cause any adverse effects on Earth's environment. A brazing sealing and sterilizing technique has been proposed to break the Mars-to-Earth contamination chain. Thermal analysis of the brazing process was conducted for several conceptual designs that apply the technique. Control of the increase of the temperature of the Martian samples is a challenge. The temperature profiles of the Martian samples being sealed in the container were predicted by finite element thermal models. The results show that the sealing and sterilization process can be controlled such that the samples' temperature is maintained below the potentially required level, and that the brazing technique is a feasible approach to break the contamination chain.
Robust volcano plot: identification of differential metabolites in the presence of outliers.
Kumar, Nishith; Hoque, Md Aminul; Sugimoto, Masahiro
2018-04-11
The identification of differential metabolites in metabolomics is still a big challenge and plays a prominent role in metabolomics data analyses. Metabolomics datasets often contain outliers because of analytical, experimental, and biological ambiguity, but the currently available differential metabolite identification techniques are sensitive to outliers. We propose a kernel weight based outlier-robust volcano plot for identifying differential metabolites from noisy metabolomics datasets. Two numerical experiments are used to evaluate the performance of the proposed technique against nine existing techniques, including the t-test and the Kruskal-Wallis test. Artificially generated data with outliers reveal that the proposed method results in a lower misclassification error rate and a greater area under the receiver operating characteristic curve compared with existing methods. An experimentally measured breast cancer dataset to which outliers were artificially added reveals that our proposed method produces only two non-overlapping differential metabolites whereas the other nine methods produced between seven and 57 non-overlapping differential metabolites. Our data analyses show that the performance of the proposed differential metabolite identification technique is better than that of existing methods. Thus, the proposed method can contribute to analysis of metabolomics data with outliers. The R package and user manual of the proposed method are available at https://github.com/nishithkumarpaul/Rvolcano .
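As context for the method comparison, a standard (non-robust) volcano plot simply combines a per-metabolite fold change with a t-test p-value. The sketch below is that baseline, not the authors' kernel-weighted robust variant, and the data and cutoffs are illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_metab, n_ctrl, n_case = 500, 20, 20
ctrl = rng.lognormal(0.0, 0.5, (n_metab, n_ctrl))
case = rng.lognormal(0.0, 0.5, (n_metab, n_case))
case[:25] *= 2.5                        # 25 truly differential metabolites

log2_fc = np.log2(case.mean(axis=1) / ctrl.mean(axis=1))
_, p = stats.ttest_ind(case, ctrl, axis=1)

# Volcano-plot decision rule: |log2 FC| > 1 and p < 0.05 (illustrative cutoffs)
differential = (np.abs(log2_fc) > 1.0) & (p < 0.05)
print("flagged metabolites:", np.flatnonzero(differential))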
Using cognitive task analysis to develop simulation-based training for medical tasks.
Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette
2013-10-01
Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks-cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
A Bio Medical Waste Identification and Classification Algorithm Using MLTrP and RVM.
Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi
2016-10-01
We aimed to extract histogram features for texture analysis and to classify the types of Bio Medical Waste (BMW) for garbage disposal and management. The given BMW image was preprocessed using a median filtering technique that efficiently reduced the noise in the image. After that, the histogram features of the filtered image were extracted with the help of the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, the Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton and liquids. The BMW images were collected from a garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate and accuracy with the help of MATLAB. When compared to the existing techniques, the proposed technique provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal. It can be used in many real-time applications, such as hospital and healthcare management systems, for proper BMW disposal.
Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.
Ritz, Christian; Van der Vliet, Leana
2009-09-01
The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions (variance homogeneity and normality) that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions is often caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable alone is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
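A hedged sketch of how the Box-Cox remedy can be applied with SciPy: estimate the transformation parameter, transform the response, and re-check normality of the within-group residuals. The concentrations, response model, and cutoffs are fabricated placeholders, not the study's endpoints or species.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
conc = np.repeat([0.0, 1.0, 3.2, 10.0, 32.0], 10)          # exposure concentrations
# Simulated response whose variance shrinks at high-effect concentrations
mean = 100.0 / (1.0 + (conc / 5.0) ** 2)
response = rng.normal(mean, 0.15 * mean) + 0.5             # heteroscedastic, positive data

# Box-Cox: estimates lambda and returns the transformed response, which can then
# be fit with a nonlinear regression model on the transformed scale
transformed, lam = stats.boxcox(response)
print("estimated Box-Cox lambda:", lam)

# Normality check before vs. after (Shapiro-Wilk on residuals about group means)
def residuals(y):
    groups = [y[conc == c] - y[conc == c].mean() for c in np.unique(conc)]
    return np.concatenate(groups)

print("raw p =", stats.shapiro(residuals(response)).pvalue,
      "transformed p =", stats.shapiro(residuals(transformed)).pvalue)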
NASA Technical Reports Server (NTRS)
Fomenkova, M. N.
1997-01-01
The computer-intensive project consisted of the analysis and synthesis of existing data on composition of comet Halley dust particles. The main objective was to obtain a complete inventory of sulfur containing compounds in the comet Halley dust by building upon the existing classification of organic and inorganic compounds and applying a variety of statistical techniques for cluster and cross-correlational analyses. A student hired for this project wrote and tested the software to perform cluster analysis. The following tasks were carried out: (1) selecting the data from existing database for the proposed project; (2) finding access to a standard library of statistical routines for cluster analysis; (3) reformatting the data as necessary for input into the library routines; (4) performing cluster analysis and constructing hierarchical cluster trees using three methods to define the proximity of clusters; (5) presenting the output results in different formats to facilitate the interpretation of the obtained cluster trees; (6) selecting groups of data points common for all three trees as stable clusters. We have also considered the chemistry of sulfur in inorganic compounds.
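Task (4) of the list, building hierarchical cluster trees with three different definitions of cluster proximity, corresponds directly to standard linkage methods; below is a generic SciPy sketch, in which the feature matrix is a stand-in for the particle-composition data rather than the actual Halley dust measurements.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Stand-in for element-abundance vectors of dust particles (rows = particles)
particles = np.vstack([rng.normal(0, 1, (30, 6)),
                       rng.normal(4, 1, (30, 6)),
                       rng.normal(8, 1, (30, 6))])

# Three proximity definitions give three hierarchical cluster trees
trees = {method: linkage(particles, method=method)
         for method in ("single", "complete", "average")}

# Cut each tree into 3 clusters; points grouped identically in all trees are "stable"
labels = {m: fcluster(Z, t=3, criterion="maxclust") for m, Z in trees.items()}
for m, lab in labels.items():
    print(m, np.bincount(lab)[1:])   # cluster sizes under each linkage method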
DOT National Transportation Integrated Search
2010-01-18
This research demonstrated the application of gel permeation chromatography (GPC) as an analytical tool to ascertain the amounts of polymer modifiers in polymer modified asphalt cements, which are soluble in eluting GPC solvents. The technique was ap...
NASA software specification and evaluation system design, part 2
NASA Technical Reports Server (NTRS)
1976-01-01
A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for the software specification language and the data base verifier are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMillan, J.D.
This paper reviews existing and proposed pretreatment processes for biomass. The focus is on the mechanisms by which the various pretreatments act and the influence of biomass structure and composition on the efficacy of particular pretreatment techniques. This analysis is used to identify pretreatment technologies and issues that warrant further research.
Monopulse azimuth measurement in the ATC Radar Beacon System
DOT National Transportation Integrated Search
1971-12-01
A review is made of the application of sum-difference beam techniques to the ATC Radar Beacon System. A detailed error analysis is presented for the case of a monopulse azimuth measurement based on the existing beacon antenna with a modified fe...
From Phenomena to Objects: Segmentation of Fuzzy Objects and its Application to Oceanic Eddies
NASA Astrophysics Data System (ADS)
Wu, Qingling
A challenging image analysis problem that has received limited attention to date is the isolation of fuzzy objects---i.e. those with inherently indeterminate boundaries---from continuous field data. This dissertation seeks to bridge the gap between, on the one hand, the recognized need for Object-Based Image Analysis of fuzzy remotely sensed features, and on the other, the optimization of existing image segmentation techniques for the extraction of more discretely bounded features. Using mesoscale oceanic eddies as a case study of a fuzzy object class evident in Sea Surface Height Anomaly (SSHA) imagery, the dissertation demonstrates firstly, that the widely used region-growing and watershed segmentation techniques can be optimized and made comparable in the absence of ground truth data using the principle of parsimony. However, they both have significant shortcomings, with the region growing procedure creating contour polygons that do not follow the shape of eddies while the watershed technique frequently subdivides eddies or groups together separate eddy objects. Secondly, it was determined that these problems can be remedied by using a novel Non-Euclidian Voronoi (NEV) tessellation technique. NEV is effective in isolating the extrema associated with eddies in SSHA data while using a non-Euclidian cost-distance based procedure (based on cumulative gradients in ocean height) to define the boundaries between fuzzy objects. Using this procedure as the first stage in isolating candidate eddy objects, a novel "region-shrinking" multicriteria eddy identification algorithm was developed that includes consideration of shape and vorticity. Eddies identified by this region-shrinking technique compare favorably with those identified by existing techniques, while simplifying and improving existing automated eddy detection algorithms. However, it also tends to find a larger number of eddies as a result of its ability to separate what other techniques identify as connected eddies. The research presented here is of significance not only to eddy research in oceanography, but also to other areas of Earth System Science for which the automated detection of features lacking rigid boundary definitions is of importance.
Transgender Phonosurgery: A Systematic Review and Meta-analysis.
Song, Tara Elena; Jiang, Nancy
2017-05-01
Objectives Different surgical techniques have been described in the literature to increase vocal pitch. The purpose of this study is to systematically review these surgeries and perform a meta-analysis to determine which technique increases pitch the most. Data Sources CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct. Review Methods A systematic review and meta-analysis of the literature was performed using the CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct databases. Studies were eligible for inclusion if they evaluated pitch-elevating phonosurgical techniques in live humans and performed pre- and postoperative acoustic analysis. Data were gathered regarding surgical technique, pre- and postoperative fundamental frequencies, perioperative care measures, and complications. Results Twenty-nine studies were identified. After applying inclusion and exclusion criteria, a total of 13 studies were included in the meta-analysis. Mechanisms of pitch elevation included increasing vocal cord tension (cricothyroid approximation), shortening the vocal cord length (cold knife glottoplasty, laser-shortening glottoplasty), and decreasing mass (laser reduction glottoplasty). The most common interventions were shortening techniques and cricothyroid approximation (6 studies each). The largest increase in fundamental frequency was seen with techniques that shortened the vocal cords. Preoperative speech therapy, postoperative voice rest, and reporting of patient satisfaction were inconsistent. Many of the studies were limited by low power and short length of follow-up. Conclusions Multiple techniques for elevation of vocal pitch exist, but vocal cord shortening procedures appear to result in the largest increase in fundamental frequency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ziemer, B; Hubbard, L; Groves, E
2015-06-15
Purpose: To evaluate a first pass analysis (FPA) technique for CT perfusion measurement in a swine animal and its validation using fractional flow reserve (FFR) as a reference standard. Methods: Swine were placed under anesthesia and relevant physiologic parameters were continuously recorded. Intra-coronary adenosine was administered to induce maximum hyperemia. A pressure wire was advanced distal to the first diagonal branch of the left anterior descending (LAD) artery for FFR measurements and a balloon dilation catheter was inserted over the pressure wire into the proximal LAD to create varying levels of stenosis. Images were acquired with a 320-row wide volume CT scanner. Three main coronary perfusion beds were delineated in the myocardium using arteries extracted from CT angiography images using a minimum energy hypothesis. The integrated density in the perfusion bed was used to calculate perfusion using the FPA technique. The perfusion in the LAD bed over a range of stenosis severity was measured. The measured fractional perfusion was compared to FFR and linear regression was performed. Results: The measured fractional perfusion using the FPA technique (P-FPA) and FFR were related as P-FPA = 1.06 × FFR − 0.06 (r² = 0.86). The perfusion measurements were calculated with only three to five total CT volume scans, which drastically reduces the radiation dose as compared with the existing techniques requiring 15–20 volume scans. Conclusion: The measured perfusion using the first pass analysis technique showed good correlation with FFR measurements as a reference standard. The technique for perfusion measurement can potentially make a substantial reduction in radiation dose as compared with the existing techniques.
Mediratta, Anuj; Addetia, Karima; Medvedofsky, Diego; Schneider, Robert J; Kruse, Eric; Shah, Atman P; Nathan, Sandeep; Paul, Jonathan D; Blair, John E; Ota, Takeyoshi; Balkhy, Husam H; Patel, Amit R; Mor-Avi, Victor; Lang, Roberto M
2017-05-01
With the increasing use of transcatheter aortic valve replacement (TAVR) in patients with aortic stenosis (AS), computed tomography (CT) remains the standard for annulus sizing. However, 3D transesophageal echocardiography (TEE) has been an alternative in patients with contraindications to CT. We sought to (1) test the feasibility, accuracy, and reproducibility of prototype 3DTEE analysis software (Philips) for aortic annular measurements and (2) compare the new approach to the existing echocardiographic techniques. We prospectively studied 52 patients who underwent gated contrast CT, procedural 3DTEE, and TAVR. 3DTEE images were analyzed using novel semi-automated software designed for 3D measurements of the aortic root, which uses multiplanar reconstruction, similar to CT analysis. Aortic annulus measurements included area, perimeter, and diameters calculated from them. The results were compared to CT-derived values. Additionally, 3D echocardiographic measurements (3D planimetry and mitral valve analysis software adapted for the aortic valve) were also compared to the CT reference values. 3DTEE image quality was sufficient in 90% of patients for aortic annulus measurements using the new software, which were in good agreement with CT (r-values: 0.89-0.91), with small (<4%), nonsignificant inter-modality biases. Repeated measurements showed <10% measurement variability. The new 3D analysis was more accurate and reproducible than the existing echocardiographic techniques. Novel semi-automated 3DTEE analysis software can accurately measure the aortic annulus in patients with severe AS undergoing TAVR, in better agreement with CT than the existing methodology. Accordingly, intra-procedural TEE could potentially replace CT in patients where CT carries significant risk. © 2017, Wiley Periodicals, Inc.
15 CFR 292.3 - Technical tools, techniques, practices, and analyses projects.
Code of Federal Regulations, 2011 CFR
2011-01-01
... understanding of existing organizations and resources relevant to the proposed project; adequate linkages and... the governing or managing organization to conduct the proposed activities; qualifications of the... objective. The purpose of these projects is to support the initial development, implementation, and analysis...
Innovative techniques with multi-purpose survey vehicle for automated analysis of cross-slope data.
DOT National Transportation Integrated Search
2007-11-02
Manual surveying methods have long been used in the field of highway engineering to determine the cross-slope and longitudinal grade of an existing roadway. However, these methods are slow, tedious and labor intensive. Moreover, manual survey me...
STS-1 environmental control and life support system. Consumables and thermal analysis
NASA Technical Reports Server (NTRS)
Steines, G.
1980-01-01
The Environmental Control and Life Support Systems (ECLSS)/thermal systems analysis for the Space Transportation System 1 Flight (STS-1) was performed using the shuttle environmental consumables usage requirements evaluation (SECURE) computer program. This program employs a nodal technique utilizing the Fortran Environmental Analysis Routines (FEAR). The output parameters evaluated were consumable quantities, fluid temperatures, heat transfer and rejection, and cabin atmospheric pressure. Analysis of these indicated that adequate margins exist for the nonpropulsive consumables and related thermal environment.
Which causal structures might support a quantum-classical gap?
NASA Astrophysics Data System (ADS)
Pienaar, Jacques
2017-04-01
A causal scenario is a graph that describes the cause and effect relationships between all relevant variables in an experiment. A scenario is deemed 'not interesting' if there is no device-independent way to distinguish the predictions of classical physics from any generalised probabilistic theory (including quantum mechanics). Conversely, an interesting scenario is one in which there exists a gap between the predictions of different operational probabilistic theories, as occurs for example in Bell-type experiments. Henson, Lal and Pusey (HLP) recently proposed a sufficient condition for a causal scenario to not be interesting. In this paper we supplement their analysis with some new techniques and results. We first show that existing graphical techniques due to Evans can be used to confirm by inspection that many graphs are interesting without having to explicitly search for inequality violations. For three exceptional cases (the graphs numbered 15, 16 and 20 in HLP), we show that there exist non-Shannon type entropic inequalities that imply these graphs are interesting. In doing so, we find that existing methods of entropic inequalities can be greatly enhanced by conditioning on the specific values of certain variables.
Surface and Thin Film Analysis during Metal Organic Vapour Phase Epitaxial Growth
NASA Astrophysics Data System (ADS)
Richter, Wolfgang
2007-06-01
In-situ analysis of epitaxial growth is the essential ingredient in order to understand the growth process, to optimize growth and last but not least to monitor or even control the epitaxial growth on a microscopic scale. In MBE (molecular beam epitaxy) in-situ analysis tools existed right from the beginning because this technique developed from Surface Science technology with all its electron based analysis tools (LEED, RHEED, PES etc). Vapour Phase Epitaxy, in contrast, remained for a long time in an empirical stage ("alchemy") because only post growth characterisations like photoluminescence, Hall effect and electrical conductivity were available. Within the last two decades, however, optical techniques were developed which provide similar capabilities as in MBE for Vapour Phase growth. I will discuss in this paper the potential of Reflectance Anisotropy Spectroscopy (RAS) and Spectroscopic Ellipsometry (SE) for the growth of thin epitaxial semiconductor layers with zincblende (GaAs etc) and wurtzite structure (GaN etc). Other techniques and materials will be also mentioned.
Correlating Detergent Fiber Analysis and Dietary Fiber Analysis Data for Corn Stover
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfrum, E. J.; Lorenz, A. J.; deLeon, N.
There exist large amounts of detergent fiber analysis data [neutral detergent fiber (NDF), acid detergent fiber (ADF), acid detergent lignin (ADL)] for many different potential cellulosic ethanol feedstocks, since these techniques are widely used for the analysis of forages. Researchers working in the area of cellulosic ethanol are interested in the structural carbohydrates in a feedstock (principally glucan and xylan), which are typically determined by acid hydrolysis of the structural fraction after multiple extractions of the biomass. These so-called dietary fiber analysis methods are significantly more involved than detergent fiber analysis methods. The purpose of this study was to determine whether it is feasible to correlate detergent fiber analysis values to glucan and xylan content determined by dietary fiber analysis methods for corn stover. In the detergent fiber analysis literature cellulose is often estimated as the difference between ADF and ADL, while hemicellulose is often estimated as the difference between NDF and ADF. Examination of a corn stover dataset containing both detergent fiber analysis data and dietary fiber analysis data predicted using near infrared spectroscopy shows that correlations between structural glucan measured using dietary fiber techniques and cellulose estimated using detergent techniques, and between structural xylan measured using dietary fiber techniques and hemicellulose estimated using detergent techniques are high, but are driven largely by the underlying correlation between total extractives measured by fiber analysis and NDF/ADF. That is, detergent analysis data is correlated to dietary fiber analysis data for structural carbohydrates, but only indirectly; the main correlation is between detergent analysis data and solvent extraction data produced during the dietary fiber analysis procedure.
Detection of Erroneous Payments Utilizing Supervised And Unsupervised Data Mining Techniques
2004-09-01
will look at which statistical analysis technique will work best in developing and enhancing existing erroneous payment models. Chapters I and II... payment models that are used for selection of records to be audited. The models are set up such that if two or more records have the same payment...Identification Number, Invoice Number and Delivery Order Number are not compared. The DM0102 Duplicate Payment Model will be analyzed in this thesis.
Application of neural networks and sensitivity analysis to improved prediction of trauma survival.
Hunter, A; Kennedy, L; Henry, J; Ferguson, I
2000-05-01
The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
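As a stand-in for the idea (not the paper's specific sensitivity measure), a permutation-style sensitivity analysis ranks inputs by how much shuffling each one degrades a trained network's performance, and it handles numeric and nominal variables alike. scikit-learn and the variable names are assumptions made for the sketch.

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1500
# Assumed trauma-style predictors: two informative, one irrelevant
severity = rng.normal(0, 1, n)
age = rng.normal(0, 1, n)
noise = rng.normal(0, 1, n)
survived = ((-1.5 * severity - 0.7 * age + rng.normal(0, 1, n)) > 0).astype(int)

X = np.column_stack([severity, age, noise])
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, survived)
base = accuracy_score(survived, net.predict(X))

for i, name in enumerate(["severity", "age", "noise"]):
    Xp = X.copy()
    Xp[:, i] = rng.permutation(Xp[:, i])        # break this input's relationship to the outcome
    drop = base - accuracy_score(survived, net.predict(Xp))
    print(f"{name}: accuracy drop {drop:.3f}")  # larger drop = more influential input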
Near-Field Magnetic Dipole Moment Analysis
NASA Technical Reports Server (NTRS)
Harris, Patrick K.
2003-01-01
This paper describes the data analysis technique used for magnetic testing at the NASA Goddard Space Flight Center (GSFC). Excellent results have been obtained using this technique to convert a spacecraft's measured magnetic field data into its respective magnetic dipole moment model. The model is most accurate with the earth's geomagnetic field cancelled in a spherical region bounded by the measurement magnetometers with a minimum radius large enough to enclose the magnetic source. Considerably enhanced spacecraft magnetic testing is offered by using this technique in conjunction with a computer-controlled magnetic field measurement system. Such a system, with real-time magnetic field display capabilities, has been incorporated into other existing magnetic measurement facilities and is also used at remote locations where transport to a magnetics test facility is impractical.
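A hedged sketch of one way such a conversion can be posed (not necessarily the GSFC procedure): the field of a point dipole is linear in the moment vector m, so measurements at several magnetometer positions can be inverted for m by linear least squares. The magnetometer positions and the "true" moment are assumptions used only to generate the synthetic data.

import numpy as np

MU0_4PI = 1e-7   # mu_0 / (4*pi) in T*m/A

def dipole_field(m, r):
    """Magnetic field (T) of a point dipole m (A*m^2) at position r (m)."""
    rn = np.linalg.norm(r)
    rhat = r / rn
    return MU0_4PI * (3.0 * np.dot(m, rhat) * rhat - m) / rn ** 3

def design_row(r):
    """3x3 block expressing B at position r as a linear function of m."""
    rn = np.linalg.norm(r)
    rhat = (r / rn).reshape(3, 1)
    return MU0_4PI * (3.0 * rhat @ rhat.T - np.eye(3)) / rn ** 3

# Simulated near-field measurements around an assumed true moment
m_true = np.array([0.8, -0.3, 1.2])
positions = [np.array(p) for p in ((1.5, 0, 0), (0, 1.5, 0), (0, 0, 1.5), (1.0, 1.0, 0.5))]
B_meas = np.concatenate([dipole_field(m_true, r) for r in positions])

A = np.vstack([design_row(r) for r in positions])
m_est, *_ = np.linalg.lstsq(A, B_meas, rcond=None)
print(m_est)    # recovers m_true from the near-field measurements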
High efficiency processing for reduced amplitude zones detection in the HRECG signal
NASA Astrophysics Data System (ADS)
Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.
2016-04-01
Summary - This article presents part of more detailed research proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that combines some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic (HRECG) signal. The algorithm consists of three stages: an efficient QRS detection step, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of these new techniques into signal analysis involving 12 leads.
NASA Technical Reports Server (NTRS)
Doyle, James D.; Warner, Thomas T.
1987-01-01
Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis, were used in successive-correction and variational objective-analysis procedures. Analyses are produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by the existence of cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques was discussed.
NASA Astrophysics Data System (ADS)
Larkin, Serguey Y.; Anischenko, Serguei E.; Kamyshin, Vladimir A.
1996-12-01
The frequency and power measurement technique using the ac Josephson effect is founded on the deviation of the voltage-current (V-I) curve of an irradiated Josephson junction from its autonomous V-I curve [1]. In the case of harmonic incident radiation, the technique may be characterized as follows: to measure the frequency of the harmonic microwave signal irradiating the Josephson junction and to estimate its intensity using functional processing of the voltage-current curves, one should identify the "special feature existence" zone on the V-I curves. This zone results from the junction's response to the incident radiation. It is then necessary to determine the coordinate of the central point of the "special feature existence" zone on the curve and to estimate the deviation of the irradiated junction's V-I curve from its autonomous V-I curve. The practical implementation of this technique places at one's disposal a number of algorithms that enable frequency measurement and intensity estimation with a particular accuracy for the incident radiation. This paper presents two candidate algorithms, weighs their respective merits and disadvantages, and selects the more suitable one.
Progress in multidisciplinary design optimization at NASA Langley
NASA Technical Reports Server (NTRS)
Padula, Sharon L.
1993-01-01
Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.
Search automation of the generalized method of device operational characteristics improvement
NASA Astrophysics Data System (ADS)
Petrova, I. Yu; Puchkova, A. A.; Zaripova, V. M.
2017-01-01
The article presents brief results of an analysis of existing methods for searching the closest patents, which can be applied to determine generalized methods of improving device operational characteristics. The most widespread clustering algorithms and the metrics for determining the degree of proximity between two documents are reviewed. The article proposes a technique for determining generalized methods; it has two implementation variants and consists of seven steps. The technique has been implemented in the “Patents search” subsystem of the “Intellect” system, and the article gives an example of its use.
Translating Current Bioanalytical Techniques for Studying Corona Activity.
Wang, Chunming; Wang, Zhenzhen; Dong, Lei
2018-07-01
The recent discovery of the biological corona is revolutionising our understanding of the in vivo behaviour of nanomaterials. Accurate analysis of corona bioactivity is essential for predicting the fate of nanomaterials and thereby improving nanomedicine design. Nevertheless, current biotechniques for protein analysis are not readily adaptable for analysing corona proteins, given that their conformation, activity, and interaction may largely differ from those of the native proteins. Here, we introduce and propose tailor-made modifications to five types of mainstream bioanalytical methodologies. We specifically illustrate how these modifications can translate existing techniques for protein analysis into competent tools for dissecting the composition, bioactivity, and interaction (with both nanomaterials and the tissue) of corona formed on specific nanomaterial surfaces. Copyright © 2018 Elsevier Ltd. All rights reserved.
Comparison of analysis and flight test data for a drone aircraft with active flutter suppression
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Pototzky, A. S.
1981-01-01
A drone aircraft equipped with an active flutter suppression system is considered with emphasis on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are given for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. The mathematical models are included and existing analytical techniques are described as well as an alternative analytical technique for obtaining closed-loop results.
Baró, Jordi; Martín-Olalla, José-María; Romero, Francisco Javier; Gallardo, María Carmen; Salje, Ekhard K H; Vives, Eduard; Planes, Antoni
2014-03-26
The existence of temporal correlations during the intermittent dynamics of a thermally driven structural phase transition is studied in a Cu-Zn-Al alloy. The sequence of avalanches is observed by means of two techniques: acoustic emission and high sensitivity calorimetry. Both methods reveal the existence of event clustering in a way that is equivalent to the Omori correlations between aftershocks in earthquakes as are commonly used in seismology.
A Practical Approach to Vocabulary Reinforcement.
ERIC Educational Resources Information Center
Stieglitz, Ezra L.
1983-01-01
Techniques of semantic feature analysis are applied to exploration and reinforcement of vocabulary. Students are presented with categories of familiar items and asked to describe their characteristics. The method can be used to elicit sentences, reinforce existing vocabulary, and begin discussion. Sample exercises for several difficulty levels are…
A SINDA thermal model using CAD/CAE technologies
NASA Technical Reports Server (NTRS)
Rodriguez, Jose A.; Spencer, Steve
1992-01-01
The approach to thermal analysis described by this paper is a technique that incorporates Computer Aided Design (CAD) and Computer Aided Engineering (CAE) to develop a thermal model that has the advantages of Finite Element Methods (FEM) without abandoning the unique advantages of Finite Difference Methods (FDM) in the analysis of thermal systems. The incorporation of existing CAD geometry, the powerful use of a pre- and post-processor, and the ability to perform interdisciplinary analysis are described.
Statistical analysis of fNIRS data: a comprehensive review.
Tak, Sungho; Ye, Jong Chul
2014-01-15
Functional near-infrared spectroscopy (fNIRS) is a non-invasive method to measure brain activities using the changes of optical absorption in the brain through the intact skull. fNIRS has many advantages over other neuroimaging modalities such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), or magnetoencephalography (MEG), since it can directly measure blood oxygenation level changes related to neural activation with high temporal resolution. However, fNIRS signals are highly corrupted by measurement noises and physiology-based systemic interference. Careful statistical analyses are therefore required to extract neuronal activity-related signals from fNIRS data. In this paper, we provide an extensive review of historical developments of statistical analyses of fNIRS signal, which include motion artifact correction, short source-detector separation correction, principal component analysis (PCA)/independent component analysis (ICA), false discovery rate (FDR), serially-correlated errors, as well as inference techniques such as the standard t-test, F-test, analysis of variance (ANOVA), and statistical parameter mapping (SPM) framework. In addition, to provide a unified view of various existing inference techniques, we explain a linear mixed effect model with restricted maximum likelihood (ReML) variance estimation, and show that most of the existing inference methods for fNIRS analysis can be derived as special cases. Some of the open issues in statistical analysis are also described. Copyright © 2013 Elsevier Inc. All rights reserved.
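At its simplest, the general-linear-model step underlying much of this literature amounts to regressing a channel's time series on a task regressor convolved with a canonical haemodynamic response and testing the regressor's coefficient. The sketch below shows only that basic step with fabricated data; serial-correlation handling, ReML variance estimation and the SPM-style corrections reviewed in the paper are omitted.

```python
# Minimal single-channel GLM t-test sketch (fabricated data, assumed task timing).
import numpy as np
from scipy.stats import gamma, t

fs, n = 10.0, 600                       # 10 Hz sampling, 60 s of data (illustrative)
time = np.arange(n) / fs
box = ((time % 30) < 15).astype(float)  # hypothetical 15 s on / 15 s off task

hrf_t = np.arange(0, 20, 1 / fs)
hrf = gamma.pdf(hrf_t, a=6)             # simple canonical HRF shape
task = np.convolve(box, hrf)[:n]

X = np.column_stack([task, np.ones(n)])         # design matrix: task + constant
y = 0.5 * task + np.random.randn(n)             # fake channel data
beta = np.linalg.lstsq(X, y, rcond=None)[0]

dof = n - X.shape[1]
sigma2 = np.sum((y - X @ beta) ** 2) / dof
c = np.array([1.0, 0.0])                        # contrast for the task effect
se = np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
t_stat = (c @ beta) / se
print("task beta t-statistic:", t_stat, "p =", 2 * t.sf(abs(t_stat), dof))
```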
From air to rubber: New techniques for measuring and replicating mouthpieces, bocals, and bores
NASA Astrophysics Data System (ADS)
Fuks, Leonardo
2002-11-01
The history of musical instruments comprises a long genealogy of models and prototypes that results from a combination of copying existing specimens with the change in constructive parameters, and the addition of new devices. In making wind instruments, several techniques have been traditionally employed for extracting the external and internal dimensions of toneholes, air columns, bells, and mouthpieces. In the twentieth century, methods such as pulse reflectometry, x-ray, magnetic resonance, and ultrasound imaging have been made available for bore measurement. Advantages and drawbacks of the existing methods are discussed and a new method is presented that makes use of the injection and coating of silicone rubber, for accurate molding of the instrument. This technique is harmless to all traditional materials, being indicated also for measurements of historical instruments. The paper presents dimensional data obtained from clarinet and saxophone mouthpieces. A set of replicas of top quality clarinet and saxophone mouthpieces, trombone bocals, and flute headjoints is shown, with comparative acoustical and performance analyses. The application of such techniques for historical and modern instrument analysis, restoration, and manufacturing is proposed.
NASA Technical Reports Server (NTRS)
Kreinovich, Vladik
1996-01-01
For a space mission to be successful it is vitally important to have a good control strategy. For example, with the Space Shuttle it is necessary to guarantee the success and smoothness of docking, the smoothness and fuel efficiency of trajectory control, etc. For an automated planetary mission it is important to control the spacecraft's trajectory, and after that, to control the planetary rover so that it would be operable for the longest possible period of time. In many complicated control situations, traditional methods of control theory are difficult or even impossible to apply. In general, in uncertain situations, where no routine methods are directly applicable, we must rely on the creativity and skill of the human operators. In order to simulate these experts, an intelligent control methodology must be developed. The research objectives of this project were: to analyze existing control techniques; to find out which of these techniques is the best with respect to the basic optimality criteria (stability, smoothness, robustness); and, if for some problems, none of the existing techniques is satisfactory, to design new, better intelligent control techniques.
Understanding and Optimizing Asynchronous Low-Precision Stochastic Gradient Descent
De Sa, Christopher; Feldman, Matthew; Ré, Christopher; Olukotun, Kunle
2018-01-01
Stochastic gradient descent (SGD) is one of the most popular numerical algorithms used in machine learning and other domains. Since this is likely to continue for the foreseeable future, it is important to study techniques that can make it run fast on parallel hardware. In this paper, we provide the first analysis of a technique called Buckwild! that uses both asynchronous execution and low-precision computation. We introduce the DMGC model, the first conceptualization of the parameter space that exists when implementing low-precision SGD, and show that it provides a way to both classify these algorithms and model their performance. We leverage this insight to propose and analyze techniques to improve the speed of low-precision SGD. First, we propose software optimizations that can increase throughput on existing CPUs by up to 11×. Second, we propose architectural changes, including a new cache technique we call an obstinate cache, that increase throughput beyond the limits of current-generation hardware. We also implement and analyze low-precision SGD on the FPGA, which is a promising alternative to the CPU for future SGD systems. PMID:29391770
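The low-precision ingredient can be illustrated with an SGD update that keeps the iterate on a coarse fixed-point grid via stochastic (unbiased) rounding. The 8-bit-style scale, the least-squares objective and the synchronous loop below are assumptions for the sketch; the DMGC model, asynchronous execution and the cache techniques from the paper are not reproduced.

```python
# Low-precision SGD with stochastic rounding (illustrative, synchronous sketch).
import numpy as np

SCALE = 2 ** -6          # fixed-point step for an assumed 8-bit-style representation

def stochastic_round(x, scale=SCALE):
    """Round each entry to the grid {k*scale}, up or down with probability
    proportional to proximity, so rounding is unbiased in expectation."""
    lo = np.floor(x / scale)
    frac = x / scale - lo
    return (lo + (np.random.rand(*x.shape) < frac)) * scale

def sgd_step(w, xi, yi, lr=0.05):
    grad = 2 * (w @ xi - yi) * xi           # gradient of a squared-error term
    return stochastic_round(w - lr * grad)  # keep the iterate on the low-precision grid

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
for _ in range(2000):
    xi = rng.standard_normal(3)
    yi = w_true @ xi
    w = sgd_step(w, xi, yi)
print("low-precision estimate:", w)
```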
Martino, Piera Di; Magnoni, Federico; Peregrina, Dolores Vargas; Gigliobianco, Maria Rosa; Censi, Roberta; Malaj, Ledjan
2016-01-01
Drugs and excipients used for pharmaceutical applications generally exist in the solid (crystalline or amorphous) state, more rarely as liquid materials. In some cases, according to the physicochemical nature of the molecule, or as a consequence of specific technological processes, a compound may exist exclusively in the amorphous state. In other cases, as a consequence of specific treatments (freezing and spray drying, melting and co-melting, grinding and compression), the crystalline form may convert into a completely or partially amorphous form. An amorphous material shows physical and thermodynamic properties different from the corresponding crystalline form, with profound repercussions on its technological performance and biopharmaceutical properties. Several physicochemical techniques such as X-ray powder diffraction, thermal methods of analysis, spectroscopic techniques, gravimetric techniques, and inverse gas chromatography can be applied to characterize the amorphous form of a compound (drug or excipient), and to evaluate its thermodynamic stability. This review offers a survey of the technologies used to convert a crystalline solid into an amorphous form, and describes the most important techniques for characterizing the amorphous state of compounds of pharmaceutical interest.
An R package for the integrated analysis of metabolomics and spectral data.
Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel
2016-06-01
Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks as biological and biomedical discovery, biotechnology and drug development. However, as it happens with other omics data, the analysis of metabolomics datasets provides multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, from the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple to use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Mashayekhi, Mohammad Jalali; Behdinan, Kamran
2017-10-01
The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated a renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most of the existing vibration transfer path analysis techniques are empirical and are suitable for diagnosis and troubleshooting purposes. The lack of an analytical transfer path analysis that can be used in the design stage is the main motivation behind this research. In this paper an analytical transfer path analysis based on the four-pole theory is proposed for multi-energy-domain systems. Bond graph modeling, an effective approach to modeling multi-energy-domain systems, is used to develop the system model. An electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical system from the corresponding bond graph model is also presented.
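In four-pole (transmission-matrix) form, each element of a series transfer path is a 2x2 matrix relating force and velocity at its input to force and velocity at its output, and the path is the product of the element matrices. The sketch below cascades textbook mass and spring four-poles with made-up parameter values; it does not implement the paper's bond-graph-to-four-pole algorithm.

```python
# Cascading mechanical four-pole (transmission) matrices for a simple path.
import numpy as np

def four_pole_mass(m, omega):
    # Rigid mass: F1 = F2 + j*omega*m*v2, v1 = v2
    return np.array([[1.0, 1j * omega * m],
                     [0.0, 1.0]])

def four_pole_spring(k, omega):
    # Massless spring: F1 = F2, v1 = v2 + j*omega*F2/k
    return np.array([[1.0, 0.0],
                     [1j * omega / k, 1.0]])

omega = 2 * np.pi * 50.0                  # assumed 50 Hz excitation
path = four_pole_mass(2.0, omega) @ four_pole_spring(1e5, omega) @ four_pole_mass(5.0, omega)
A, B, C, D = path.ravel()

# With the output terminated in a known mechanical impedance Z_L (v2 = F2/Z_L),
# the transmitted-to-input force ratio follows from the cascaded parameters.
Z_L = 200.0 + 0j
force_ratio = 1.0 / (A + B / Z_L)
print("force transmissibility magnitude:", abs(force_ratio))
```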
Nahar, Jesmin; Imam, Tasadduq; Tickle, Kevin S; Garcia-Alonso, Debora
2013-01-01
This chapter is a review of data mining techniques used in medical research. It will cover the existing applications of these techniques in the identification of diseases, and also present the authors' research experiences in medical disease diagnosis and analysis. A computational diagnosis approach can have a significant impact on accurate diagnosis and result in time and cost effective solutions. The chapter will begin with an overview of computational intelligence concepts, followed by details on different classification algorithms. Use of association learning, a well recognised data mining procedure, will also be discussed. Many of the datasets considered in existing medical data mining research are imbalanced, and the chapter focuses on this issue as well. Lastly, the chapter outlines the need of data governance in this research domain.
Study of advanced techniques for determining the long term performance of components
NASA Technical Reports Server (NTRS)
1973-01-01
The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed. The long term performance of metal-to-metal valve seats in a liquid propellant fuel system is stressed. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques and (2) improving the understanding of the physics of degradation.
Study of advanced techniques for determining the long-term performance of components
NASA Technical Reports Server (NTRS)
1972-01-01
A study was conducted of techniques having the capability of determining the performance and reliability of components for spacecraft liquid propulsion applications for long term missions. The study utilized two major approaches: improvement of the existing technology and the evolution of new technology. The criteria established and methods evolved are applicable to valve components. Primary emphasis was placed on the propellant combination of oxygen difluoride and diborane. The investigation included analysis, fabrication, and tests of experimental equipment to provide data and performance criteria.
Research Using Government Data Sets: An Underutilised Resource
ERIC Educational Resources Information Center
Knipe, Sally
2011-01-01
The use of existing data for education research activities can be a valuable resource. Improvement in statistical analysis and data management and retrieval techniques, as well as access to government data bases, has expanded opportunities for researchers seeking to investigate issues that are institutional in nature, such as participation…
Most of the existing arsenic dietary databases were developed from the analysis of total arsenic in water and dietary samples. These databases have been used to estimate arsenic exposure and in turn human health risk. However, these dietary databases are becoming obsolete as the ...
ERIC Educational Resources Information Center
Fritzen, Anny
2011-01-01
The term "sheltered instruction" (SI) has become a widely used metaphor representing a common pedagogical intervention intended to help English language learners simultaneously gain English proficiency and academic content knowledge. While existing research places considerable emphasis on observable pedagogical techniques that characterize SI,…
Making Definitions Explicit and Capturing Evaluation Policies.
ERIC Educational Resources Information Center
Houston, Samuel R.
Judgment ANalysis (JAN) is described as a technique for identifying the rating policies that exist within a group of judges. Studies are presented in which JAN has been used in evaluating teacher effectiveness by capturing both student and faculty policies of teacher effectiveness at the University of Northern Colorado. In addition, research…
Accounting for Cheating: An Evolving Theory and Emergent Themes
ERIC Educational Resources Information Center
Brent, Edward; Atkisson, Curtis
2011-01-01
This study examines student responses to the question, "What circumstances, if any, could make cheating justified?" It then assesses how well those responses can be classified by existing theories and categories that emerge from a qualitative analysis of the data. Results show considerable support for techniques of neutralization, partial support…
Testing a Conceptual Change Model Framework for Visual Data
ERIC Educational Resources Information Center
Finson, Kevin D.; Pedersen, Jon E.
2015-01-01
An emergent data analysis technique was employed to test the veracity of a conceptual framework constructed around visual data use and instruction in science classrooms. The framework incorporated all five key components Vosniadou (2007a, 2007b) described as existing in a learner's schema: framework theory, presuppositions, conceptual domains,…
Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout RH; Stewart-Knox, Barbara J; Mathers, John C
2018-01-01
Background To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. Objective The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. Methods The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype–based, and intake+phenotype+gene–based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Results Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. Conclusions The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. Trial Registration ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1) PMID:29631993
Development of Novel Noninvasive Methods of Stress Assessment in Baleen Whales
2014-09-30
large whales. Few methods exist for assessment of physiological stress levels of free-swimming cetaceans (Amaral 2010, ONR 2010, Hunt et al. 2013...hormone aldosterone. Our aim in this project is to further develop both techniques - respiratory hormone analysis and fecal hormone analysis - for use...noninvasive aldosterone assay (for both feces and blow) that can be used as an alternative measure of adrenal gland activation relative to stress
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
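One of the points compared in the abstracted work is that different linear least-squares conventions give different slopes on the same data. The sketch below shows three common variants on synthetic data (OLS of y on x, the inverse of OLS of x on y, and the reduced-major-axis slope); it is only a reminder of the issue and does not reproduce the six-method comparison.

```python
# Three linear least-squares conventions on the same synthetic data.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 3.0, 200)   # noisy linear relation

sx, sy = x - x.mean(), y - y.mean()
slope_yx = np.dot(sx, sy) / np.dot(sx, sx)    # OLS(Y|X)
slope_xy = np.dot(sy, sy) / np.dot(sx, sy)    # inverse of OLS(X|Y)
slope_rma = np.sign(slope_yx) * np.sqrt(slope_yx * slope_xy)  # reduced major axis

print(f"OLS(Y|X): {slope_yx:.2f}  inverse OLS(X|Y): {slope_xy:.2f}  RMA: {slope_rma:.2f}")
```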
Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz
2010-08-06
Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling. These methods are known collectively as multiway component analysis. Two archetypal techniques of multimode factor analysis, the Parallel factor analysis and the Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed as are the substantive multimode component models obtained.
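A three-way PARAFAC model can be fitted with a short alternating-least-squares loop, which makes the multiway idea concrete. The sketch below uses random data and a hand-rolled ALS for illustration only; it is not the authors' analysis of the experience-sampling data, and a real application would normally rely on a dedicated multiway package.

```python
# Bare-bones PARAFAC (CANDECOMP) fit of a 3-way array by alternating least squares.
import numpy as np

def khatri_rao(P, Q):
    """Column-wise Kronecker product: (K,R) and (J,R) -> (J*K, R)."""
    return (P[:, None, :] * Q[None, :, :]).reshape(-1, P.shape[1])

def parafac(X, rank, n_iter=200, seed=0):
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
    X1 = X.reshape(I, -1, order="F")                       # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, -1, order="F")    # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, -1, order="F")    # mode-3 unfolding
    for _ in range(n_iter):
        A = X1 @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = X2 @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = X3 @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

# Synthetic persons x behaviors x occasions array, then a rank-2 decomposition.
X = np.random.default_rng(1).standard_normal((20, 8, 15))
A, B, C = parafac(X, rank=2)
print(A.shape, B.shape, C.shape)
```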
Comparison of analysis and flight test data for a drone aircraft with active flutter suppression
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Pototzky, A. S.
1981-01-01
This paper presents a comparison of analysis and flight test data for a drone aircraft equipped with an active flutter suppression system. Emphasis is placed on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are presented for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. In addition to presenting the mathematical models and a brief description of existing analytical techniques, an alternative analytical technique for obtaining closed-loop results is presented.
Aerosol Index Dynamics over Athens and Beijing
NASA Astrophysics Data System (ADS)
Christodoulakis, J.; Varotsos, C.; Tzanis, C.; Xue, Y.
2014-11-01
We present the analysis of monthly mean Aerosol Index (AI) values over Athens, Greece, and Beijing, China, for the period 1979-2012. The aim of the analysis is the identification of time scaling in the AI time series, using a data analysis technique that is not affected by the non-stationarity of the data. The appropriate technique satisfying this criterion is the Detrended Fluctuation Analysis (DFA). For the deseasonalization of the time series, the classic Wiener method was applied, filtering out the seasonal (3-month), semiannual (6-month) and annual (12-month) periods. The data analysis for both Athens and Beijing revealed that the exponents α for both time periods are greater than 0.5, indicating that persistence of the correlations in the fluctuations of the deseasonalized AI values exists for time scales between about 4 months and 3.5 years (for the period 1979-1993) or 4 years (for the period 1996-2012).
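Detrended Fluctuation Analysis itself reduces to a short computation: integrate the mean-removed series, detrend it within windows of increasing size, and regress the log of the root-mean-square fluctuation on the log of the window size. The sketch below illustrates that computation on white noise (expected exponent near 0.5); the window sizes are arbitrary and the AI data are not reproduced.

```python
# Minimal DFA sketch on white noise (illustrative window sizes).
import numpy as np

def dfa_alpha(series, scales):
    profile = np.cumsum(series - np.mean(series))
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        segs = profile[:n_win * s].reshape(n_win, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)              # local linear trend
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(rms)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

x = np.random.default_rng(0).standard_normal(4096)    # white noise: alpha ~ 0.5
print("DFA exponent:", dfa_alpha(x, scales=[8, 16, 32, 64, 128, 256]))
```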
Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara
2016-05-04
Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76 %) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50 %) and uniquely accounting for 15 (44 %). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.
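The two QCA quantities used above, sufficiency and necessity consistency, can be illustrated with a toy crisp-set calculation over binary technique indicators and a binary outcome. The data in the sketch are fabricated and the technique names are placeholders; the actual calibration and truth-table minimization of the study are not shown.

```python
# Toy crisp-set QCA consistency calculations (fabricated data).
import numpy as np

# Rows: studies; columns: knowledge, self_efficacy, facilitation (1 = technique used)
X = np.array([[1, 1, 0],
              [1, 1, 1],
              [1, 0, 1],
              [0, 1, 0],
              [1, 1, 0]])
outcome = np.array([1, 1, 0, 0, 1])     # 1 = adherence improved

def sufficiency_consistency(config_mask, outcome):
    """How often the outcome occurs when the configuration is present."""
    present = config_mask.astype(bool)
    return outcome[present].mean() if present.any() else np.nan

def necessity_consistency(condition, outcome):
    """How often the condition is present when the outcome occurs."""
    return condition[outcome.astype(bool)].mean()

combo = X[:, 0] & X[:, 1]               # knowledge AND self-efficacy
print("sufficiency consistency of knowledge*self_efficacy:",
      sufficiency_consistency(combo, outcome))
print("necessity consistency of knowledge:",
      necessity_consistency(X[:, 0], outcome))
```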
Subsynchronous instability of a geared centrifugal compressor of overhung design
NASA Technical Reports Server (NTRS)
Hudson, J. H.; Wittman, L. J.
1980-01-01
The original design analysis and shop test data are presented for a three stage (booster) air compressor with impellers mounted on the extensions of a twin pinion gear, and driven by an 8000 hp synchronous motor. Also included are field test data, subsequent rotor dynamics analysis, modifications, and final rotor behavior. A subsynchronous instability existed on a geared, overhung rotor. State-of-the-art rotor dynamics analysis techniques provided a reasonable analytical model of the rotor. A bearing modification arrived at analytically eliminated the instability.
Bridging the gap between high and low acceleration for planetary escape
NASA Astrophysics Data System (ADS)
Indrikis, Janis; Preble, Jeffrey C.
With the exception of the often time consuming analysis by numerical optimization, no single orbit transfer analysis technique exists that can be applied over a wide range of accelerations. Using the simple planetary escape (parabolic trajectory) mission some of the more common techniques are considered as the limiting bastions at the high and the extremely low acceleration regimes. The brachistochrone, the minimum time of flight path, is proposed as the technique to bridge the gap between the high and low acceleration regions, providing a smooth bridge over the entire acceleration spectrum. A smooth and continuous velocity requirement is established for the planetary escape mission. By using these results, it becomes possible to determine the effect of finite accelerations on mission performance and target propulsion and power system designs which are consistent with a desired mission objective.
Validation of protein carbonyl measurement: A multi-centre study
Augustyniak, Edyta; Adam, Aisha; Wojdyla, Katarzyna; Rogowska-Wrzesinska, Adelina; Willetts, Rachel; Korkmaz, Ayhan; Atalay, Mustafa; Weber, Daniela; Grune, Tilman; Borsa, Claudia; Gradinaru, Daniela; Chand Bollineni, Ravi; Fedorova, Maria; Griffiths, Helen R.
2014-01-01
Protein carbonyls are widely analysed as a measure of protein oxidation. Several different methods exist for their determination. A previous study had described orders of magnitude variance that existed when protein carbonyls were analysed in a single laboratory by ELISA using different commercial kits. We have further explored the potential causes of variance in carbonyl analysis in a ring study. A soluble protein fraction was prepared from rat liver and exposed to 0, 5 and 15 min of UV irradiation. Lyophilised preparations were distributed to six different laboratories that routinely undertook protein carbonyl analysis across Europe. ELISA and Western blotting techniques detected an increase in protein carbonyl formation between 0 and 5 min of UV irradiation irrespective of method used. After irradiation for 15 min, less oxidation was detected by half of the laboratories than after 5 min irradiation. Three of the four ELISA carbonyl results fell within 95% confidence intervals. Likely errors in calculating absolute carbonyl values may be attributed to differences in standardisation. Out of up to 88 proteins identified as containing carbonyl groups after tryptic cleavage of irradiated and control liver proteins, only seven were common in all three liver preparations. Lysine and arginine residues modified by carbonyls are likely to be resistant to tryptic proteolysis. Use of a cocktail of proteases may increase the recovery of oxidised peptides. In conclusion, standardisation is critical for carbonyl analysis and heavily oxidised proteins may not be effectively analysed by any existing technique. PMID:25560243
A Passive System Reliability Analysis for a Station Blackout
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia; Bucknor, Matthew; Grabaskas, David
2015-05-03
The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to a reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.
FAST: A multi-processed environment for visualization of computational fluid dynamics
NASA Technical Reports Server (NTRS)
Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin
1991-01-01
Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY2 and CRAY-YMP supercomputers. With multiple processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.
Prediction of light aircraft interior noise
NASA Technical Reports Server (NTRS)
Howlett, J. T.; Morales, D. A.
1976-01-01
At the present time, predictions of aircraft interior noise depend heavily on empirical correction factors derived from previous flight measurements. However, to design for acceptable interior noise levels and to optimize acoustic treatments, analytical techniques which do not depend on empirical data are needed. This paper describes a computerized interior noise prediction method for light aircraft. An existing analytical program (developed for commercial jets by Cockburn and Jolly in 1968) forms the basis of some modal analysis work which is described. The accuracy of this modal analysis technique for predicting low-frequency coupled acoustic-structural natural frequencies is discussed along with trends indicating the effects of varying parameters such as fuselage length and diameter, structural stiffness, and interior acoustic absorption.
NASA Technical Reports Server (NTRS)
Doyle, James D.; Warner, Thomas T.
1988-01-01
Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis, were used in successive-correction and variational objective-analysis procedures. Analyses are produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by the existence of cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques was discussed.
Analysis of XFEL serial diffraction data from individual crystalline fibrils
Wojtas, David H.; Ayyer, Kartik; Liang, Mengning; Mossou, Estelle; Romoli, Filippo; Seuring, Carolin; Beyerlein, Kenneth R.; Bean, Richard J.; Morgan, Andrew J.; Oberthuer, Dominik; Fleckenstein, Holger; Heymann, Michael; Gati, Cornelius; Yefanov, Oleksandr; Barthelmess, Miriam; Ornithopoulou, Eirini; Galli, Lorenzo; Xavier, P. Lourdu; Ling, Wai Li; Frank, Matthias; Yoon, Chun Hong; White, Thomas A.; Bajt, Saša; Mitraki, Anna; Boutet, Sebastien; Aquila, Andrew; Barty, Anton; Forsyth, V. Trevor; Chapman, Henry N.; Millane, Rick P.
2017-01-01
Serial diffraction data collected at the Linac Coherent Light Source from crystalline amyloid fibrils delivered in a liquid jet show that the fibrils are well oriented in the jet. At low fibril concentrations, diffraction patterns are recorded from single fibrils; these patterns are weak and contain only a few reflections. Methods are developed for determining the orientation of patterns in reciprocal space and merging them in three dimensions. This allows the individual structure amplitudes to be calculated, thus overcoming the limitations of orientation and cylindrical averaging in conventional fibre diffraction analysis. The advantages of this technique should allow structural studies of fibrous systems in biology that are inaccessible using existing techniques. PMID:29123682
Adaptation of the Practice Environment Scale for military nurses: a psychometric analysis.
Swiger, Pauline A; Raju, Dheeraj; Breckenridge-Sproat, Sara; Patrician, Patricia A
2017-09-01
The aim of this study was to confirm the psychometric properties of Practice Environment Scale of the Nursing Work Index in a military population. This study also demonstrates association rule analysis, a contemporary exploratory technique. One of the instruments most commonly used to evaluate the nursing practice environment is the Practice Environment Scale of the Nursing Work Index. Although the instrument has been widely used, the reliability, validity and individual item function are not commonly evaluated. Gaps exist with regard to confirmatory evaluation of the subscale factors, individual item analysis and evaluation in the outpatient setting and with non-registered nursing staff. This was a secondary data analysis of existing survey data. Multiple psychometric methods were used for this analysis using survey data collected in 2014. First, descriptive analyses were conducted, including exploration using association rules. Next, internal consistency was tested and confirmatory factor analysis was performed to test the factor structure. The specified factor structure did not hold; therefore, exploratory factor analysis was performed. Finally, item analysis was executed using item response theory. The differential item functioning technique allowed the comparison of responses by care setting and nurse type. The results of this study indicate that responses differ between groups and that several individual items could be removed without altering the psychometric properties of the instrument. The instrument functions moderately well in a military population; however, researchers may want to consider nurse type and care setting during analysis to identify any meaningful variation in responses. © 2017 John Wiley & Sons Ltd.
Novel measurement techniques (development and analysis of silicon solar cells near 20% efficiency)
NASA Technical Reports Server (NTRS)
Wolf, M.; Newhouse, M.
1986-01-01
Work in identifying, developing, and analyzing techniques for measuring bulk recombination rates, and surface recombination velocities and rates in all regions of high-efficiency silicon solar cells is presented. The accuracy of the previously developed DC measurement system was improved by adding blocked interference filters. The system was further automated by writing software that completely samples the unknown solar cell regions with data of numerous recombination velocity and lifetime pairs. The results can be displayed in three dimensions and the best fit can be found numerically using the simplex minimization algorithm. Also described is a theoretical methodology to analyze and compare existing dynamic measurement techniques.
Report of the panel on international programs
NASA Technical Reports Server (NTRS)
Anderson, Allen Joel; Fuchs, Karl W.; Ganeka, Yasuhiro; Gaur, Vinod; Green, Andrew A.; Siegfried, W.; Lambert, Anthony; Rais, Jacub; Reighber, Christopher; Seeger, Herman
1991-01-01
The panel recommends that NASA participate and take an active role in the continuous monitoring of existing regional networks, the realization of high resolution geopotential and topographic missions, the establishment of interconnection of the reference frames as defined by different space techniques, the development and implementation of automation for all ground-to-space observing systems, calibration and validation experiments for measuring techniques and data, the establishment of international space-based networks for real-time transmission of high density space data in standardized formats, tracking and support for non-NASA missions, and the extension of state-of-the art observing and analysis techniques to developing nations.
Fault Tree Analysis Application for Safety and Reliability
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies a root cause in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
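The core computation that automated FTA (or SFTA) tool support performs can be illustrated with a tiny evaluator that propagates basic-event probabilities through AND/OR gates to a top-event probability, assuming independent events. The tree, event names and probabilities below are hypothetical.

```python
# Tiny fault tree evaluator: AND/OR gates over independent basic events.
def evaluate(node, probs):
    """node is either a basic-event name or a tuple ('AND'|'OR', [children])."""
    if isinstance(node, str):
        return probs[node]
    gate, children = node
    p = [evaluate(c, probs) for c in children]
    if gate == "AND":
        out = 1.0
        for pi in p:
            out *= pi
        return out
    # OR gate: complement of all children failing to occur
    out = 1.0
    for pi in p:
        out *= (1.0 - pi)
    return 1.0 - out

# Hypothetical tree: top event occurs if both sensor_fault and watchdog_miss
# occur, or if a null-pointer dereference occurs on its own.
tree = ("OR", [("AND", ["sensor_fault", "watchdog_miss"]), "null_pointer_deref"])
probs = {"sensor_fault": 1e-3, "watchdog_miss": 1e-2, "null_pointer_deref": 1e-5}
print("top-event probability:", evaluate(tree, probs))
```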
Consistent detection and identification of individuals in a large camera network
NASA Astrophysics Data System (ADS)
Colombo, Alberto; Leung, Valerie; Orwell, James; Velastin, Sergio A.
2007-10-01
In the wake of an increasing number of terrorist attacks, counter-terrorism measures are now a main focus of many research programmes. An important issue for the police is the ability to track individuals and groups reliably through underground stations, and in the case of post-event analysis, to be able to ascertain whether specific individuals have been at the station previously. While many motion detection and tracking algorithms exist, their reliable deployment in a large network is still ongoing research. Specifically, to track individuals through multiple views, on multiple levels and between levels, consistent detection and labelling of individuals is crucial. In view of these issues, we have developed a change detection algorithm that works reliably in the presence of periodic movements, e.g. escalators and scrolling advertisements, as well as a content-based retrieval technique for identification. The change detection technique automatically extracts periodically varying elements in the scene using Fourier analysis, and constructs a Markov model for the process. Training is performed online, and no manual intervention is required, making this system suitable for deployment in large networks. Experiments on real data show significant improvement over existing techniques. The content-based retrieval technique uses MPEG-7 descriptors to identify individuals. Given the environment under which the system operates, i.e. at relatively low resolution, this approach is suitable for short timescales. For longer timescales, other forms of identification such as gait, or if the resolution allows, face recognition, will be required.
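The Fourier step described above can be sketched as follows: for each pixel's intensity history, a strong narrow peak in the magnitude spectrum (excluding the DC term) flags a periodically varying scene element such as an escalator or a scrolling advertisement. The peak-to-mean threshold is an illustrative assumption, and the subsequent Markov modelling stage is not shown.

```python
# Per-pixel Fourier periodicity detection sketch (synthetic frames, assumed threshold).
import numpy as np

def periodic_pixel_mask(frames, ratio_threshold=8.0):
    """frames: (T, H, W) grayscale sequence -> boolean (H, W) mask of periodic pixels."""
    spectrum = np.abs(np.fft.rfft(frames - frames.mean(axis=0), axis=0))
    spectrum = spectrum[1:]                       # drop the DC bin
    peak = spectrum.max(axis=0)
    mean = spectrum.mean(axis=0) + 1e-9
    return (peak / mean) > ratio_threshold

# Synthetic example: one block of pixels oscillates at a fixed rate.
T, H, W = 128, 16, 16
frames = np.random.rand(T, H, W) * 0.1
frames[:, 4:8, 4:8] += 0.5 * np.sin(2 * np.pi * 0.125 * np.arange(T))[:, None, None]
print(periodic_pixel_mask(frames)[4:8, 4:8].all())   # expected: True
```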
NASA Astrophysics Data System (ADS)
Ravindra, H. J.; John Kiran, A.; Nooji, Satheesha Rai; Dharmaprakash, S. M.; Chandrasekharan, K.; Kalluraya, Balakrishna; Rotermund, Fabian
2008-05-01
Good quality single crystals of p-chloro dibenzylideneacetone (CDBA) of size 13 mm×8 mm×2 mm were grown by the slow evaporation solution growth technique. The grown crystals were confirmed by elemental analysis, Fourier transform infrared (FTIR) analysis and single crystal X-ray diffraction techniques. From the thermogravimetric/differential thermal (TG/DT) analysis, the CDBA was found to be thermally stable up to 250 °C. The mechanical stability of the crystal is comparable with that of the other reported chalcones. The lower optical cut-off wavelength for this crystal was observed at 440 nm. The laser damage threshold of the crystal was 0.6 GW/cm² at 532 nm. The second harmonic generation conversion efficiency of the powder sample of CDBA was found to be 4.5 times greater than that of urea. We also demonstrate the existence of the phase matching property in this crystal using the Kurtz powder technique.
Development of Novel Noninvasive Methods of Stress Assessment in Baleen Whales
2015-09-30
large whales. Few methods exist for assessment of physiological stress levels of free-swimming cetaceans (Amaral 2010, ONR 2010, Hunt et al. 2013...adrenal hormone aldosterone. Our aim in this project is to further develop both techniques - respiratory hormone analysis and fecal hormone analysis...development of a noninvasive aldosterone assay (for both feces and blow) that can be used as an alternative measure of adrenal gland activation relative to
DeLong, Jeffrey M; Waterman, Brian R
2015-11-01
To systematically review reconstruction techniques of the medial collateral ligament (MCL) and associated medial structures of the knee (e.g., posterior oblique ligament). A systematic review of Medline/PubMed Database (1966 to November 2013), reference list scanning and citation searches of included articles, and manual searches of high-impact journals (2000 to July 2013) and conference proceedings (2009 to July 2013) were performed to identify publications describing MCL reconstruction techniques of the knee. Exclusion criteria included (1) MCL primary repair techniques or advancement procedures, (2) lack of clear description of MCL reconstruction technique, (3) animal models, (4) nonrelevant study design, (5) and foreign language articles without available translation. After review of 4,600 references, 25 publications with 359 of 388 patients (92.5%) were isolated for analysis, including 18 single-bundle MCL and 10 double-bundle reconstruction techniques. Only 2 techniques were classified as anatomic reconstructions, and clinical and objective outcomes (n = 28; 100% <3 mm side-to-side difference [SSD]) were superior to those with nonanatomic reconstruction (n = 182; 79.1% <3 mm SSD) and tendon transfer techniques (n = 114; 52.6% <3 mm SSD). This systematic review demonstrated that numerous medial reconstruction techniques have been used in the treatment of isolated and combined medial knee injuries in the existent literature. Many variations exist among reconstruction techniques and may differ by graft choices, method of fixation, number of bundles, tensioning protocol, and degree of anatomic restoration of medial and posteromedial corner knee restraints. Further studies are required to better ascertain the comparative clinical outcomes with anatomic, non-anatomic, and tendon transfer techniques for medial knee reconstruction. Level IV, systematic review of level IV studies and surgical techniques. Published by Elsevier Inc.
Fabrication Materials for a Closed Cycle Brayton Turbine Wheel
NASA Technical Reports Server (NTRS)
Khandelwal, Suresh; Hah, Chunill; Powers, Lynn M.; Stewart, Mark E.; Suresh, Ambady; Owen, Albert K.
2006-01-01
A multidisciplinary analysis of a radial inflow turbine rotor is presented. This work couples high-fidelity fluid, structural, and thermal simulations in a seamless multidisciplinary analysis to investigate the consequences of material selection. This analysis extends multidisciplinary techniques previously demonstrated on rocket turbopumps and hypersonic engines. Since no design information is available for the anticipated Brayton rotating machinery, an existing rotor design (the Brayton Rotating Unit (BRU)) was used in the analysis. Steady state analysis results of a notional turbine rotor indicate that stress levels are easily manageable at the turbine inlet temperature, and stress levels anticipated using either superalloys or ceramics.
Histology image analysis for carcinoma detection and grading
He, Lei; Long, L. Rodney; Antani, Sameer; Thoma, George R.
2012-01-01
This paper presents an overview of the image analysis techniques in the domain of histopathology, specifically for the objective of automated carcinoma detection and classification. As in other biomedical imaging areas such as radiology, many computer assisted diagnosis (CAD) systems have been implemented to aid histopathologists and clinicians in cancer diagnosis and research; these systems attempt to significantly reduce the labor and subjectivity of traditional manual intervention with histology images. The task of automated histology image analysis is usually not simple due to the unique characteristics of histology imaging, including the variability in image preparation techniques, clinical interpretation protocols, and the complex structures and very large size of the images themselves. In this paper we discuss those characteristics, provide relevant background information about slide preparation and interpretation, and review the application of digital image processing techniques to the field of histology image analysis. In particular, emphasis is given to state-of-the-art image segmentation methods for feature extraction and disease classification. Four major carcinomas of cervix, prostate, breast, and lung are selected to illustrate the functions and capabilities of existing CAD systems. PMID:22436890
ERIC Educational Resources Information Center
Landmesser, John Andrew
2014-01-01
Information technology (IT) investment decision makers are required to process large volumes of complex data. An existing body of knowledge relevant to IT portfolio management (PfM), decision analysis, visual comprehension of large volumes of information, and IT investment decision making suggest Multi-Criteria Decision Making (MCDM) and…
ERIC Educational Resources Information Center
Smith, Lindsey J. Wolff; Beretvas, S. Natasha
2017-01-01
Conventional multilevel modeling works well with purely hierarchical data; however, pure hierarchies rarely exist in real datasets. Applied researchers employ ad hoc procedures to create purely hierarchical data. For example, applied educational researchers either delete mobile participants' data from the analysis or identify the student only with…
ERIC Educational Resources Information Center
Mattern, Krista D.; Marini, Jessica P.; Shaw, Emily J.
2015-01-01
Throughout the college retention literature, there is a recurring theme that students leave college for a variety of reasons making retention a difficult phenomenon to model. In the current study, cluster analysis techniques were employed to investigate whether multiple empirically based profiles of nonreturning students existed to more fully…
A summary and evaluation of semi-empirical methods for the prediction of helicopter rotor noise
NASA Technical Reports Server (NTRS)
Pegg, R. J.
1979-01-01
Existing prediction techniques are compiled and described. The descriptions include input and output parameter lists, required equations and graphs, and the range of validity for each part of the prediction procedures. Examples are provided illustrating the analysis procedure and the degree of agreement with experimental results.
Organizational Training across Cultures: Variations in Practices and Attitudes
ERIC Educational Resources Information Center
Hassi, Abderrahman; Storti, Giovanna
2011-01-01
Purpose: The purpose of this paper is to provide a synthesis based on a review of the existing literature with respect to the variations in training practices and attitudes across national cultures. Design/methodology/approach: A content analysis technique was adopted with a comparative cross-cultural management perspective as a backdrop to…
Strategic Long Range Planning for Universities. AIR Forum 1980 Paper.
ERIC Educational Resources Information Center
Baker, Michael E.
The use of strategic long-range planning at Carnegie-Mellon University (CMU) is discussed. A structure for strategic planning analysis that integrates existing techniques is presented, and examples of planning activities at CMU are included. The key concept in strategic planning is competitive advantage: if a university has a competitive…
An Abstraction-Based Data Model for Information Retrieval
NASA Astrophysics Data System (ADS)
McAllister, Richard A.; Angryk, Rafal A.
Language ontologies provide an avenue for automated lexical analysis that may be used to supplement existing information retrieval methods. This paper presents a method of information retrieval that takes advantage of WordNet, a lexical database, to generate paths of abstraction, and uses them as the basis for an inverted index structure to be used in the retrieval of documents from an indexed corpus. We present this method as an entree to a line of research on using ontologies to perform word-sense disambiguation and improve the precision of existing information retrieval techniques.
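The abstract does not give implementation details; the following Python sketch illustrates one way the described idea could be realized with NLTK's WordNet interface. Every synset along a term's hypernym ("abstraction") paths is added to an inverted index, so a query can match documents at any level of abstraction. The function names and the toy corpus are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: index documents by the WordNet hypernym paths of
# their terms, so queries can match at any level of abstraction.
# Requires: pip install nltk; then nltk.download('wordnet').
from collections import defaultdict
from nltk.corpus import wordnet as wn

def build_abstraction_index(docs):
    """docs: dict mapping doc_id -> list of tokens."""
    index = defaultdict(set)  # synset name -> set of doc ids
    for doc_id, tokens in docs.items():
        for token in tokens:
            for synset in wn.synsets(token):
                # Each hypernym path runs from a root concept down to the synset.
                for path in synset.hypernym_paths():
                    for node in path:
                        index[node.name()].add(doc_id)
    return index

def query(index, term):
    """Return doc ids whose terms fall under any sense of `term`."""
    hits = set()
    for synset in wn.synsets(term):
        hits |= index.get(synset.name(), set())
    return hits

if __name__ == "__main__":
    docs = {1: ["dog", "leash"], 2: ["cat"], 3: ["bicycle"]}
    idx = build_abstraction_index(docs)
    # "animal" matches documents 1 and 2 through the hypernym paths of dog/cat.
    print(query(idx, "animal"))
```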
Regional reconstruction of flash flood history in the Guadarrama range (Central System, Spain).
Rodriguez-Morata, C; Ballesteros-Cánovas, J A; Trappmann, D; Beniston, M; Stoffel, M
2016-04-15
Flash floods are a common natural hazard in Mediterranean mountain environments and responsible for serious economic and human disasters. The study of flash flood dynamics and their triggers is a key issue; however, the retrieval of historical data is often limited in mountain regions as a result of short time series and the systematic lack of historical data. In this study, we attempt to overcome data deficiency by supplementing existing records with dendrogeomorphic techniques which were employed in seven mountain streams along the northern slopes of the Guadarrama Mountain range. Here we present results derived from the tree-ring analysis of 117 samples from 63 Pinus sylvestris L. trees injured by flash floods, to complement existing flash flood records covering the last ~200 years and comment on their hydro-meteorological triggers. To understand the varying number of reconstructed flash flood events in each of the catchments, we also performed a comparative analysis of geomorphic catchment characteristics, land use evolution and forest management. Furthermore, we discuss the limitations of dendrogeomorphic techniques applied in managed forests. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Chen, Xi; Ogasawara, Nagahisa; Zhao, Manhong; Chiba, Norimasa
2007-08-01
Indentation is widely used to extract material elastoplastic properties from the measured force-displacement curves. One of the most well-established indentation techniques utilizes dual (or plural) sharp indenters (which have different apex angles) to deduce key parameters such as the elastic modulus, yield stress, and work-hardening exponent for materials that obey the power-law constitutive relationship. However, the uniqueness of such analysis is not yet systematically studied or challenged. Here we show the existence of "mystical materials", which have distinct elastoplastic properties yet they yield almost identical indentation behaviors, even when the indenter angle is varied in a large range. These mystical materials are, therefore, indistinguishable by many existing indentation analyses unless extreme (and often impractical) indenter angles are used. Explicit procedures of deriving these mystical materials are established, and the general characteristics of the mystical materials are discussed. In many cases, for a given indenter angle range, a material would have infinite numbers of mystical siblings, and the existence maps of the mystical materials are also obtained. Furthermore, we propose two alternative techniques to effectively distinguish these mystical materials. The study in this paper addresses the important question of the uniqueness of indentation test, as well as providing useful guidelines to properly use the indentation technique to measure material elastoplastic properties.
Particle Streak Anemometry: A New Method for Proximal Flow Sensing from Aircraft
NASA Astrophysics Data System (ADS)
Nichols, T. W.
Accurate sensing of relative air flow direction from fixed-wing small unmanned aircraft (sUAS) is challenging with existing multi-hole pitot-static and vane systems. Sub-degree direction accuracy is generally not available on such systems and disturbances to the local flow field, induced by the airframe, introduce an additional error source. An optical imaging approach to make a relative air velocity measurement with high-directional accuracy is presented. Optical methods offer the capability to make a proximal measurement in undisturbed air outside of the local flow field without the need to place sensors on vulnerable probes extended ahead of the aircraft. Current imaging flow analysis techniques for laboratory use rely on relatively thin imaged volumes and sophisticated hardware and intensity thresholding in low-background conditions. A new method is derived and assessed using a particle streak imaging technique that can be implemented with low-cost commercial cameras and illumination systems, and can function in imaged volumes of arbitrary depth with complex background signal. The new technique, referred to as particle streak anemometry (PSA) (to differentiate from particle streak velocimetry which makes a field measurement rather than a single bulk flow measurement) utilizes a modified Canny edge detection algorithm with a connected component analysis and principal component analysis to detect streak ends in complex imaging conditions. A linear solution for the air velocity direction is then implemented with a random sample consensus (RANSAC) solution approach. A single DOF non-linear, non-convex optimization problem is then solved for the air speed through an iterative approach. The technique was tested through simulation and wind tunnel tests yielding angular accuracies under 0.2 degrees, superior to the performance of existing commercial systems. Air speed error standard deviations varied from 1.6 to 2.2 m/s depending on the implementation technique. While air speed sensing is secondary to accurate flow direction measurement, the air speed results were in line with commercial pitot static systems at low speeds.
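The abstract outlines the processing chain (edge detection, connected components, per-streak principal axes, and a consensus direction estimate). The Python/OpenCV sketch below follows that chain in a simplified form; the thresholds, the modified Canny stage, and the non-convex airspeed optimization are omitted, and all names and parameter values are assumptions for illustration only.

```python
# Minimal sketch of a streak-direction chain loosely following the abstract:
# Canny edges -> connected components -> per-component PCA for streak
# orientation -> RANSAC-style consensus on the dominant direction.
import numpy as np
import cv2

def streak_directions(image, min_pixels=30):
    edges = cv2.Canny(image, 50, 150)            # placeholder thresholds
    n, labels = cv2.connectedComponents(edges)
    directions = []
    for lab in range(1, n):
        ys, xs = np.nonzero(labels == lab)
        if xs.size < min_pixels:
            continue
        pts = np.column_stack([xs, ys]).astype(float)
        pts -= pts.mean(axis=0)
        # Principal axis of the component approximates the streak direction.
        _, _, vt = np.linalg.svd(pts, full_matrices=False)
        directions.append(vt[0])
    return np.array(directions)

def consensus_direction(directions, n_iter=200, tol_deg=2.0):
    """RANSAC-style vote: keep the direction supported by the most streaks."""
    best, best_count = None, -1
    rng = np.random.default_rng(0)
    for _ in range(n_iter):
        cand = directions[rng.integers(len(directions))]
        cosines = np.abs(directions @ cand)      # sign-invariant comparison
        inliers = cosines > np.cos(np.deg2rad(tol_deg))
        if inliers.sum() > best_count:
            sel = directions[inliers]
            sel = sel * np.sign(sel @ cand)[:, None]  # align signs before averaging
            best_count = inliers.sum()
            best = sel.mean(axis=0)
    return best / np.linalg.norm(best)
```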
Open source Modeling and optimization tools for Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peles, S.
The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.
NASA Astrophysics Data System (ADS)
Cheruku, Rajesh; Govindaraj, G.; Vijayan, Lakshmi
2017-12-01
The nanocrystalline lithium ferrite was synthesized by wet chemical methods such as solution combustion technique, sol-gel, and hydrothermal for a comparative study. Different characterization techniques like x-ray powder diffraction and thermal analysis were employed to confirm the structure and phase. Temperature-dependent Raman analysis was employed to classify the phonon modes associated with precise atomic motions existing in the synthesized materials. Morphology of sample surface was explored by scanning electron microscopy, and elemental analysis was done by energy dispersive spectroscopy analysis. The nanocrystalline nature of the materials was confirmed through transmission electron microscopy. Magnetic properties of these samples were explored through a vibrating sample magnetometer. Ac electrical impedance spectroscopy data were investigated using two Cole-Cole functions, and activation energies were calculated for all materials. Among them, solution combustion prepared lithium ferrite shows the highest conductivity and lowest activation energy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaBelle, S.J.; Smith, A.E.; Seymour, D.A.
1977-02-01
The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique. This type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or looking only at projects individually. The project-by-project reviews could be used to examine each project in the same way as the airport projects are examined for their impact on desired regional emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process. That planning process is not well-suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to change in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method by state or regional reviewing agencies.
Characterization of agricultural land using singular value decomposition
NASA Astrophysics Data System (ADS)
Herries, Graham M.; Danaher, Sean; Selige, Thomas
1995-11-01
A method is defined and tested for the characterization of agricultural land from multi-spectral imagery, based on singular value decomposition (SVD) and key vector analysis. The SVD technique, which bears a close resemblance to multivariate statistical techniques, has previously been successfully applied to problems of signal extraction for marine data and forestry species classification. In this study the SVD technique is used as a classifier for agricultural regions, using airborne Daedalus ATM data, with 1 m resolution. The specific region chosen is an experimental research farm in Bavaria, Germany. This farm has a large number of crops within a very small region and hence is not amenable to existing techniques. There are a number of other significant factors which render existing techniques such as the maximum likelihood algorithm less suitable for this area. These include a very dynamic terrain and tessellated pattern soil differences, which together cause large variations in the growth characteristics of the crops. The SVD technique is applied to this data set using a multi-stage classification approach, removing unwanted land-cover classes one step at a time. Typical classification accuracies for SVD are of the order of 85-100%. Preliminary results indicate that it is a fast and efficient classifier with the ability to differentiate between crop types such as wheat, rye, potatoes and clover. The results of characterizing 3 sub-classes of Winter Wheat are also shown.
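The abstract does not specify the algorithm in detail; the sketch below shows one common reading of an SVD/key-vector classifier, in which the leading singular vectors of each class's training spectra define a low-rank subspace and a pixel is assigned to the class whose subspace reconstructs it best. The rank, data shapes, and function names are assumptions, not values from the paper.

```python
# Hedged sketch of an SVD "key vector" classifier for multispectral pixels.
import numpy as np

def fit_class_subspaces(training, rank=3):
    """training: dict class_name -> (n_samples, n_bands) array of spectra."""
    subspaces = {}
    for name, spectra in training.items():
        mean = spectra.mean(axis=0)
        # Rows are samples; columns are spectral bands.
        _, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
        subspaces[name] = (mean, vt[:rank])   # leading "key vectors" in band space
    return subspaces

def classify(pixels, subspaces):
    """pixels: (n_pixels, n_bands). Returns one class label per pixel."""
    names = list(subspaces)
    labels = []
    for x in pixels:
        residuals = []
        for name in names:
            mean, basis = subspaces[name]
            centred = x - mean
            recon = basis.T @ (basis @ centred)   # projection onto class subspace
            residuals.append(np.linalg.norm(centred - recon))
        labels.append(names[int(np.argmin(residuals))])
    return labels
```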
Study of TEC and foF2 with the Help of GPS and Ionosonde Data over Maitri, Antarctica
NASA Astrophysics Data System (ADS)
Khatarkar, Prakash; Gwal, Ashok Kumar
The behavior of the ionosphere can be diagnosed by a number of techniques. The common techniques used are the space-based Global Positioning System (GPS) and the ground-based Ionosonde. We have compared the variability of ionospheric parameters obtained with these two techniques, GPS and Ionosonde, during December 2009 to November 2010 at the Indian base station Maitri (11.45E, 70.45S). The comparison between the measurements of the two techniques was realized through the Total Electron Content (TEC) parameters derived by using different methods. The comparison was made for diurnal, seasonal, polar day, polar night, and annual variations. From our analysis we found that a strong correlation exists between the GPS-derived TEC and the Ionosonde-derived foF2 during the day, while during the night the correlation is insignificant. At the same time we found that a strong correlation exists between the Ionosonde- and GPS-derived TEC. The pattern of variation of the ionospheric parameters derived from the two techniques is strikingly similar, indicating a high degree of synchronization between them. This has practical applicability, as it allows the error in one technique to be estimated by comparison with the other. Keywords: Ionosphere, Ionosonde, GPS, foF2, TEC.
Illias, Hazlee Azil; Chai, Xin Rui; Abu Bakar, Ab Halim; Mokhlis, Hazlie
2015-01-01
It is important to predict the incipient fault in transformer oil accurately so that the maintenance of transformer oil can be performed correctly, reducing the cost of maintenance and minimising errors. Dissolved gas analysis (DGA) has been widely used to predict the incipient fault in power transformers. However, sometimes the existing DGA methods yield inaccurate prediction of the incipient fault in transformer oil because each method is only suitable for certain conditions. Many previous works have reported on the use of intelligent methods to predict transformer faults. However, it is believed that the accuracy of the previously proposed methods can still be improved. Since artificial neural network (ANN) and particle swarm optimisation (PSO) techniques have never been used in the previously reported work, this work proposes a combination of ANN and various PSO techniques to predict the transformer incipient fault. The advantages of PSO are simplicity and easy implementation. The effectiveness of various PSO techniques in combination with ANN is validated by comparison with the results from the actual fault diagnosis, an existing diagnosis method and ANN alone. Comparison of the results from the proposed methods with the previously reported work was also performed to show the improvement of the proposed methods. It was found that the proposed ANN-Evolutionary PSO method yields a higher percentage of correct transformer fault type identification than the existing diagnosis method and previously reported works. PMID:26103634
NASA Astrophysics Data System (ADS)
Prilianti, K. R.; Setiawan, Y.; Indriatmoko, Adhiwibawa, M. A. S.; Limantara, L.; Brotosudarmo, T. H. P.
2014-02-01
Environmental and health problems caused by artificial colorants encourage the increasing usage of natural colorants nowadays. Natural colorant refers to a colorant derived from living organisms or minerals. Extensive research has been done to exploit these colorants, but recent data show that only 0.5% of the wide range of plant pigments on earth has been exhaustively used. Hence, development of pigment characterization techniques is an important consideration. High-performance liquid chromatography (HPLC) is a widely used technique to separate the pigments in a mixture and identify them. In former HPLC fingerprinting, pigment characterization was based on a single chromatogram from a fixed wavelength (one dimensional) and discarded the information contained at other wavelengths. Therefore, two-dimensional fingerprints have been proposed to use more chromatographic information. Unfortunately, this method leads to data processing problems due to the size of its data matrix. The other common problem in chromatogram analysis is the subjectivity of the researcher in recognizing the chromatogram pattern. In this research an automated analysis method for multi-wavelength chromatographic data is proposed. Principal component analysis (PCA) was used to compress the data matrix and Maximum Likelihood (ML) classification was applied to identify the chromatogram pattern of the existing pigments in a mixture. Three photosynthetic pigments were selected to demonstrate the proposed method: β-carotene, fucoxanthin and zeaxanthin. The results suggest that the method can reliably indicate the existence of the pigments in a particular mixture. A simple computer application was also developed to facilitate real-time analysis. The input of the application is a multi-wavelength chromatographic data matrix and the output is information about the existence of the three pigments.
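A compact sketch of the described pipeline (PCA compression of the flattened wavelength-by-time data matrix followed by a Gaussian maximum-likelihood classifier) is given below using scikit-learn and SciPy. The component count, data shapes, and class labels are assumptions for illustration; the authors' exact feature layout and training data are not reproduced.

```python
# Hedged sketch: compress multi-wavelength chromatograms with PCA, then
# classify them with a Gaussian maximum-likelihood rule.
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import multivariate_normal

def train(chromatograms, labels, n_components=5):
    """chromatograms: (n_samples, n_wavelengths * n_times), flattened per sample."""
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(chromatograms)
    models = {}
    for lab in np.unique(labels):
        cls = scores[labels == lab]
        # One Gaussian per pigment class in the compressed score space.
        models[lab] = multivariate_normal(mean=cls.mean(axis=0),
                                          cov=np.cov(cls, rowvar=False),
                                          allow_singular=True)
    return pca, models

def predict(pca, models, chromatograms):
    scores = pca.transform(chromatograms)
    names = list(models)
    loglik = np.column_stack([models[n].logpdf(scores) for n in names])
    return [names[i] for i in np.argmax(loglik, axis=1)]
```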
Gutiérrez, Alfonso; Prieto, Iván; Cancela, José M.
2009-01-01
The purpose of this study is to provide a tool, based on the knowledge of technical errors, which helps to improve the teaching and learning process of the Uki Goshi technique. With this aim, we set out to determine the most frequent errors made by 44 students when performing this technique and how these mistakes relate. In order to do so, an observational analysis was carried out using the OSJUDO-UKG instrument and the data were registered using Match Vision Studio (Castellano, Perea, Alday and Hernández, 2008). The results, analyzed through descriptive statistics, show that the absence of a correct initial unbalancing movement (45.5%), the lack of proper right-arm pull (56.8%), not blocking the faller's body (Uke) against the thrower's hip (Tori) (54.5%) and throwing the Uke through the Tori's side (72.7%) are the most usual mistakes. Through the sequential analysis of T-Patterns obtained with the THÈME program (Magnusson, 1996, 2000) we have concluded that not blocking the body with the Tori's hip provokes the Uke's throw through the Tori's side during the final phase of the technique (95.8%), and positioning the right arm on the dorsal region of the Uke's back during the Tsukuri entails the absence of a subsequent pull of the Uke's body (73.3%). Key Points In this study, the most frequent errors in the performance of the Uki Goshi technique have been determined and the existing relations among these mistakes have been shown through T-Patterns. The SOBJUDO-UKG is an observation instrument for detecting mistakes in the aforementioned technique. The results show that those mistakes related to the initial unbalancing movement and the main driving action of the technique are the most frequent. The use of T-Patterns turns out to be effective in order to obtain the most important relations among the observed errors. PMID:24474885
Extracting Loop Bounds for WCET Analysis Using the Instrumentation Point Graph
NASA Astrophysics Data System (ADS)
Betts, A.; Bernat, G.
2009-05-01
Every calculation engine proposed in the literature of Worst-Case Execution Time (WCET) analysis requires upper bounds on loop iterations. Existing mechanisms to procure this information are either error prone, because they are gathered from the end-user, or limited in scope, because automatic analyses target very specific loop structures. In this paper, we present a technique that obtains bounds completely automatically for arbitrary loop structures. In particular, we show how to employ the Instrumentation Point Graph (IPG) to parse traces of execution (generated by an instrumented program) in order to extract bounds relative to any loop-nesting level. With this technique, therefore, non-rectangular dependencies between loops can be captured, allowing more accurate WCET estimates to be calculated. We demonstrate the improvement in accuracy by comparing WCET estimates computed through our HMB framework against those computed with state-of-the-art techniques.
NASA Technical Reports Server (NTRS)
Bemra, R. S.; Rastogi, P. K.; Balsley, B. B.
1986-01-01
An analysis of frequency spectra at periods of about 5 days to 5 min from two 20-day sets of velocity measurements in the stratosphere and troposphere region obtained with the Poker Flat mesosphere-stratosphere-troposphere (MST) radar during January and June, 1984 is presented. A technique based on median filtering and averaged order statistics for automatic editing, smoothing and spectral analysis of velocity time series contaminated with spurious data points or outliers is outlined. The validity of this technique and its effects on the inferred spectral index was tested through simulation. Spectra obtained with this technique are discussed. The measured spectral indices show variability with season and height, especially across the tropopause. The discussion briefly outlines the need for obtaining better climatologies of velocity spectra and for the refinements of the existing theories to explain their behavior.
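A minimal sketch of the kind of editing step described above (median-filter despiking of a velocity time series followed by spectral analysis) is shown below with SciPy. The window length, threshold, and sampling interval are placeholders, not the values used with the Poker Flat MST radar data, and the averaged order-statistics refinement is not reproduced.

```python
# Hedged sketch: median-filter despiking followed by a periodogram, in the
# spirit of the automatic editing and spectral analysis described above.
import numpy as np
from scipy.signal import medfilt, periodogram

def despike(series, kernel=7, n_mad=5.0):
    """Replace outliers with the running median of their neighbourhood."""
    series = np.asarray(series, dtype=float)
    baseline = medfilt(series, kernel_size=kernel)
    resid = series - baseline
    mad = np.median(np.abs(resid - np.median(resid))) + 1e-12
    spikes = np.abs(resid) > n_mad * 1.4826 * mad   # robust sigma estimate
    cleaned = series.copy()
    cleaned[spikes] = baseline[spikes]
    return cleaned, spikes

def velocity_spectrum(series, dt):
    """Power spectrum of the edited series; dt is the sampling interval."""
    cleaned, _ = despike(series)
    freqs, power = periodogram(cleaned - cleaned.mean(), fs=1.0 / dt)
    return freqs, power
```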
NASA Technical Reports Server (NTRS)
Brooks, David E.; Gassman, Holly; Beering, Dave R.; Welch, Arun; Hoder, Douglas J.; Ivancic, William D.
1999-01-01
Transmission Control Protocol (TCP) is the underlying protocol used within the Internet for reliable information transfer. As such, there is great interest to have all implementations of TCP efficiently interoperate. This is particularly important for links exhibiting long bandwidth-delay products. The tools exist to perform TCP analysis at low rates and low delays. However, for extremely high-rate and long-delay links such as 622 Mbps over geosynchronous satellites, new tools and testing techniques are required. This paper describes the tools and techniques used to analyze and debug various TCP implementations over high-speed, long-delay links.
Evaluation of automobiles with alternative fuels utilizing multicriteria techniques
NASA Astrophysics Data System (ADS)
Brey, J. J.; Contreras, I.; Carazo, A. F.; Brey, R.; Hernández-Díaz, A. G.; Castro, A.
This work applies the non-parametric technique of Data Envelopment Analysis (DEA) to conduct a multicriteria comparison of some existing and under development technologies in the automotive sector. The results indicate that some of the technologies under development, such as hydrogen fuel cell vehicles, can be classified as efficient when evaluated in function of environmental and economic criteria, with greater importance being given to the environmental criteria. The article also demonstrates the need to improve the hydrogen-based technology, in comparison with the others, in aspects such as vehicle sale costs and fuel price.
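The abstract names DEA but not the specific formulation; the sketch below implements the standard input-oriented CCR envelopment model with SciPy's linear-programming routine, which is one common DEA variant and not necessarily the exact one used by the authors. The input/output choices in the docstring are illustrative assumptions.

```python
# Hedged sketch: input-oriented CCR efficiency scores via linear programming,
# one standard DEA formulation. Units with a score of 1.0 lie on the frontier.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """X: (n_units, n_inputs), e.g. cost and emissions; Y: (n_units, n_outputs)."""
    n, m = X.shape
    _, s = Y.shape
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1 .. lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Outputs: -sum_j lambda_j * y_rj <= -y_ro   (i.e. outputs at least y_o)
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        scores.append(res.x[0])
    return np.array(scores)
```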
[Meta-analyses of quarks, baryons and mesons--a "Cochrane Collaboration" in particle physics].
Sauerland, Stefan; Sauerland, Thankmar; Antes, Gerd; Barnett, R Michael
2002-02-01
Within the last 20 years meta-analysis has become an important research technique in medicine for integrating the results of independent studies. Meta-analytical techniques, however, are much older. In particle physics for 50 years now the properties of huge numbers of particles have been assessed in meta-analyses. The Cochrane Collaboration's counterpart in physics is the Particle Data Group. This article compares methodological and organisational aspects of meta-analyses in medicine and physics. Several interesting parallels exist, especially with regard to methodology.
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
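A minimal sketch of the metamodeling idea reviewed above is given below: a quadratic response surface (one of the families the paper covers) is fit to a small number of runs of an "expensive" analysis code and then used as a cheap surrogate. The analysis function, sample sizes, and design of experiments are stand-ins, not anything from the paper.

```python
# Hedged sketch of response-surface metamodeling: fit a quadratic polynomial
# to a handful of runs of a costly analysis code, then evaluate the cheap
# surrogate instead. The "expensive_analysis" function is a stand-in.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

def expensive_analysis(x):
    """Stand-in for a costly simulation (e.g. an FEA or CFD run)."""
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(-1, 1, size=(30, 2))     # simple random design of experiments
y_train = expensive_analysis(X_train)

surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surrogate.fit(X_train, y_train)

# The metamodel is now orders of magnitude cheaper to evaluate than the code.
X_new = rng.uniform(-1, 1, size=(5, 2))
print(np.c_[surrogate.predict(X_new), expensive_analysis(X_new)])
```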
Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.
2014-01-01
Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is on-going. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of hierarchy and on the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize different hybrid methods – combination of endoscopic and laparoscopic instruments/optics. Results of our hierarchical task analysis yielded an identification of three different hybrid methods to perform cholecystectomy with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks leading to an increase in the surgical time. The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811
Webster, Victoria A; Nieto, Santiago G; Grosberg, Anna; Akkus, Ozan; Chiel, Hillel J; Quinn, Roger D
2016-10-01
In this study, new techniques for approximating the contractile properties of cells in biohybrid devices using Finite Element Analysis (FEA) have been investigated. Many current techniques for modeling biohybrid devices use individual cell forces to simulate the cellular contraction. However, such techniques result in long simulation runtimes. In this study we investigated the effect of the use of thermal contraction on simulation runtime. The thermal contraction model was significantly faster than models using individual cell forces, making it beneficial for rapidly designing or optimizing devices. Three techniques, Stoney's Approximation, a Modified Stoney's Approximation, and a Thermostat Model, were explored for calibrating thermal expansion/contraction parameters (TECPs) needed to simulate cellular contraction using thermal contraction. The TECP values were calibrated by using published data on the deflections of muscular thin films (MTFs). Using these techniques, TECP values that suitably approximate experimental deflections can be determined by using experimental data obtained from cardiomyocyte MTFs. Furthermore, a sensitivity analysis was performed in order to investigate the contribution of individual variables, such as elastic modulus and layer thickness, to the final calibrated TECP for each calibration technique. Additionally, the TECP values are applicable to other types of biohybrid devices. Two non-MTF models were simulated based on devices reported in the existing literature. Copyright © 2016 Elsevier Ltd. All rights reserved.
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
2017-12-01
carefully to ensure only the minimum information needed for effective management control is requested. Requires cost-benefit analysis and PM...baseline offers metrics that highlight performance trends and program variances. This information provides Program Managers and higher levels of...The existing training philosophy is effective only if the managers using the information have well-trained and experienced personnel that can
NASA Astrophysics Data System (ADS)
Bezmaternykh, P. V.; Nikolaev, D. P.; Arlazarov, V. L.
2018-04-01
Rectification, or slant correction, of textual blocks is an important stage of document image processing in OCR systems. This paper considers existing methods and introduces an approach for the construction of such algorithms based on Fast Hough Transform analysis. A quality measurement technique is proposed and results are shown for both printed and handwritten textual block processing as a part of an industrial system for identity document recognition on mobile devices.
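The paper's method is built on the Fast Hough Transform, which is not reproduced here; the sketch below uses a simpler brute-force projection-profile search over candidate angles as a stand-in that illustrates the same goal (estimating and undoing the slant of a text block). Angle range, step, and thresholds are assumptions.

```python
# Hedged stand-in for the FHT-based analysis: estimate text-block skew by the
# angle that maximises the variance of the row projection, then rotate back.
import numpy as np
from scipy.ndimage import rotate

def estimate_skew(binary_block, angles=np.linspace(-10, 10, 81)):
    """binary_block: 2-D array with text pixels = 1. Returns angle in degrees."""
    best_angle, best_score = 0.0, -np.inf
    for a in angles:
        rotated = rotate(binary_block, a, reshape=False, order=0)
        profile = rotated.sum(axis=1)        # row projection
        score = np.var(profile)              # peaks sharply when lines are level
        if score > best_score:
            best_angle, best_score = a, score
    return best_angle

def rectify(binary_block):
    return rotate(binary_block, estimate_skew(binary_block), reshape=False, order=0)
```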
Chen, Z; Ngo, H H; Guo, W S; Listowski, A; O'Halloran, K; Thompson, M; Muthukaruppan, M
2012-11-01
This paper aims to put forward several management alternatives regarding the application of recycled water for household laundry in Sydney. Based on different recycled water treatment techniques such as microfiltration (MF), granular activated carbon (GAC) or reverse osmosis (RO), and types of washing machines (WMs), five alternatives were proposed as follows: (1) do nothing scenario; (2) MF+existing WMs; (3) MF+new WMs; (4) MF-GAC+existing WMs; and (5) MF-RO+existing WMs. Accordingly, a comprehensive quantitative assessment on the trade-off among a variety of issues (e.g., engineering feasibility, initial cost, energy consumption, supply flexibility and water savings) was performed over the alternatives. This was achieved by a computer-based multi-criteria analysis (MCA) using the rank order weight generation together with preference ranking organization method for enrichment evaluation (PROMETHEE) outranking techniques. Particularly, the generated 10,000 combinations of weights via Monte Carlo simulation were able to significantly reduce the man-made errors of single fixed set of weights because of its objectivity and high efficiency. To illustrate the methodology, a case study on Rouse Hill Development Area (RHDA), Sydney, Australia was carried out afterwards. The study was concluded by highlighting the feasibility of using highly treated recycled water for existing and new washing machines. This could provide a powerful guidance for sustainable water reuse management in the long term. However, more detailed field trials and investigations are still needed to effectively understand, predict and manage the impact of selected recycled water for new end use alternatives. Copyright © 2012 Elsevier B.V. All rights reserved.
Clogging of Manifolds with Evaporatively Frozen Propellants. Part 2; Analysis
NASA Technical Reports Server (NTRS)
Simmon, J. A.; Gift, R. D.; Spurlock, J. M.
1966-01-01
The mechanisms of evaporative freezing of leaking propellant and the creation of flow stoppages within injector manifolds is discussed. A quantitative analysis of the conditions, including the existence of minimum and maximum leak rates, for the accumulation of evaporatively frozen propellant is presented. Clogging of the injector manifolds of the Apollo SPS and the Gemini OAMS engines by the freezing of leaking propellant is predicted and the seriousness of the consequences are discussed. Based on the analysis a realistic evaluation of selected techniques to eliminate flow stoppages by frozen propellant is made.
Susong, David D.; Gallegos, Tanya J.; Oelsner, Gretchen P.
2012-01-01
The U.S. Geological Survey (USGS) John Wesley Powell Center for Analysis and Synthesis is hosting an interdisciplinary working group of USGS scientists to conduct a temporal and spatial analysis of surface-water and groundwater quality in areas of unconventional oil and gas development. The analysis uses existing national and regional datasets to describe water quality, evaluate water-quality changes over time where there are sufficient data, and evaluate spatial and temporal data gaps.
NASA Technical Reports Server (NTRS)
Jones, Kenneth M.; Biedron, Robert T.; Whitlock, Mark
1995-01-01
A computational study was performed to determine the predictive capability of a Reynolds averaged Navier-Stokes code (CFL3D) for two-dimensional and three-dimensional multielement high-lift systems. Three configurations were analyzed: a three-element airfoil, a wing with a full span flap and a wing with a partial span flap. In order to accurately model these complex geometries, two different multizonal structured grid techniques were employed. For the airfoil and full span wing configurations, a chimera or overset grid technique was used. The results of the airfoil analysis illustrated that although the absolute values of lift were somewhat in error, the code was able to predict reasonably well the variation with Reynolds number and flap position. The full span flap analysis demonstrated good agreement with experimental surface pressure data over the wing and flap. Multiblock patched grids were used to model the partial span flap wing. A modification to an existing patched-grid algorithm was required to analyze the configuration as modeled. Comparisons with experimental data were very good, indicating the applicability of the patched-grid technique to analyses of these complex geometries.
Boundary formulations for sensitivity analysis without matrix derivatives
NASA Technical Reports Server (NTRS)
Kane, J. H.; Guru Prasad, K.
1993-01-01
A new hybrid approach to continuum structural shape sensitivity analysis employing boundary element analysis (BEA) is presented. The approach uses iterative reanalysis to obviate the need to factor perturbed matrices in the determination of surface displacement and traction sensitivities via a univariate perturbation/finite difference (UPFD) step. The UPFD approach makes it possible to immediately reuse existing subroutines for computation of BEA matrix coefficients in the design sensitivity analysis process. The reanalysis technique computes economical response of univariately perturbed models without factoring perturbed matrices. The approach provides substantial computational economy without the burden of a large-scale reprogramming effort.
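A minimal sketch of the univariate perturbation/finite difference (UPFD) idea is given below: each design variable is perturbed in turn and the responses of an existing analysis routine are differenced. The iterative reanalysis that avoids refactoring perturbed matrices is not reproduced, and the analysis function is a stand-in for a BEA/FEA response code.

```python
# Hedged sketch of univariate perturbation/finite-difference (UPFD) sensitivities.
import numpy as np

def analysis(design):
    """Stand-in for an existing response routine (e.g. a BEA displacement solve)."""
    x, y = design
    return np.array([x ** 2 + y, np.sin(x * y)])

def upfd_sensitivities(design, h=1e-6):
    """Forward-difference d(response)/d(design); one column per design variable."""
    base = analysis(design)
    cols = []
    for i in range(len(design)):
        perturbed = np.array(design, dtype=float)
        perturbed[i] += h                      # univariate perturbation
        cols.append((analysis(perturbed) - base) / h)
    return np.column_stack(cols)

print(upfd_sensitivities([1.0, 2.0]))
```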
Enhanced vasomotion of cerebral arterioles in spontaneously hypertensive rats
NASA Technical Reports Server (NTRS)
Lefer, D. J.; Lynch, C. D.; Lapinski, K. C.; Hutchins, P. M.
1990-01-01
Intrinsic rhythmic changes in the diameter of pial cerebral arterioles (30-70 microns) in anesthetized normotensive and hypertensive rats were assessed in vivo to determine if any significant differences exist between the two strains. All diameter measurements were analyzed using a traditional graphic analysis technique and a new frequency spectrum analysis technique known as the Prony Spectral Line Estimator. Graphic analysis of the data revealed that spontaneously hypertensive rats (SHR) possess a significantly greater fundamental frequency (5.57 +/- 0.28 cycles/min) of vasomotion compared to the control Wistar-Kyoto normotensive rats (WKY) (1.95 +/- 0.37 cycles/min). Furthermore, the SHR cerebral arterioles exhibited a significantly greater amplitude of vasomotion (10.07 +/- 0.70 microns) when compared to the WKY cerebral arterioles of the same diameter (8.10 +/- 0.70 microns). Diameter measurements processed with the Prony technique revealed that the fundamental frequency of vasomotion in SHR cerebral arterioles (6.14 +/- 0.39 cycles/min) was also significantly greater than that of the WKY cerebral arterioles (2.99 +/- 0.42 cycles/min). The mean amplitudes of vasomotion in the SHR and WKY strains obtained by the Prony analysis were found not to be statistically significant in contrast to the graphic analysis of the vasomotion amplitude of the arterioles. In addition, the Prony system was able to consistently uncover a very low frequency of vasomotion in both strains of rats that was typically less than 1 cycle/min and was not significantly different between the two strains. The amplitude of this slow frequency was also not significantly different between the two strains. The amplitude of the slow frequency of vasomotion (less than 1 cycle/min) was not different from the amplitude of the higher frequency (2-6 cycles/min) vasomotion by Prony or graphic analysis. These data suggest that a fundamental intrinsic defect exists in the spontaneously hypertensive rat that may contribute to the pathogenesis of hypertension in these animals.
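The study used the Prony Spectral Line Estimator; the sketch below implements a basic least-squares Prony analysis, which is in the same family but not the identical algorithm, to show how frequency, damping, and amplitude of vasomotion components could be extracted from a diameter time series. Model order and sampling interval are placeholders.

```python
# Hedged sketch of a basic least-squares Prony analysis (same family as the
# Prony Spectral Line Estimator, not the identical algorithm).
import numpy as np

def prony(signal, dt, order=4):
    x = np.asarray(signal, dtype=float)
    n = len(x)
    # 1) Linear prediction: x[k] = -(a1*x[k-1] + ... + ap*x[k-p]) for k >= p.
    A = np.column_stack([x[order - 1 - j: n - 1 - j] for j in range(order)])
    b = x[order:]
    a = np.linalg.lstsq(A, -b, rcond=None)[0]
    # 2) Roots of the characteristic polynomial give the complex modes
    #    (for real data they appear in conjugate pairs).
    roots = np.roots(np.r_[1.0, a])
    freqs = np.angle(roots) / (2 * np.pi * dt)   # cycles per unit time
    damping = np.log(np.abs(roots)) / dt
    # 3) Amplitudes from a Vandermonde least-squares fit of the modes to the data.
    V = np.vander(roots, N=n, increasing=True).T
    h = np.linalg.lstsq(V, x.astype(complex), rcond=None)[0]
    return freqs, damping, np.abs(h)
```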
Analysis of XFEL serial diffraction data from individual crystalline fibrils
Wojtas, David H.; Ayyer, Kartik; Liang, Mengning; ...
2017-10-20
Serial diffraction data collected at the Linac Coherent Light Source from crystalline amyloid fibrils delivered in a liquid jet show that the fibrils are well oriented in the jet. At low fibril concentrations, diffraction patterns are recorded from single fibrils; these patterns are weak and contain only a few reflections. Methods are developed for determining the orientation of patterns in reciprocal space and merging them in three dimensions. This allows the individual structure amplitudes to be calculated, thus overcoming the limitations of orientation and cylindrical averaging in conventional fibre diffraction analysis. In conclusion, the advantages of this technique should allow structural studies of fibrous systems in biology that are inaccessible using existing techniques.
Tau-independent Phase Analysis: A Novel Method for Accurately Determining Phase Shifts.
Tackenberg, Michael C; Jones, Jeff R; Page, Terry L; Hughey, Jacob J
2018-06-01
Estimations of period and phase are essential in circadian biology. While many techniques exist for estimating period, comparatively few methods are available for estimating phase. Current approaches to analyzing phase often vary between studies and are sensitive to coincident changes in period and the stage of the circadian cycle at which the stimulus occurs. Here we propose a new technique, tau-independent phase analysis (TIPA), for quantifying phase shifts in multiple types of circadian time-course data. Through comprehensive simulations, we show that TIPA is both more accurate and more precise than the standard actogram approach. TIPA is computationally simple and therefore will enable accurate and reproducible quantification of phase shifts across multiple subfields of chronobiology.
Reimer, G.M.
1990-01-01
Radon reconnaissance requires some special considerations because a large area must be covered in a short period of time and analyses must be made soon after collection because of Rn decay. A simple approach to collection and field analysis consists of a small-diameter probe pounded into the ground to a depth of at least 0.75 m. Analysis is by an alpha-scintillometer. Soil-gas samples collected along a traverse in Prince Georges County, Maryland, demonstrate the utility of the technique. The reconnaissance sampling revealed Rn soil-gas concentrations of up to 2500 pCi/L (picocuries per liter) indicating that the potential exists for indoor accumulations in excess of 4 pCi/L. -from Author
Analysis of airfoil transitional separation bubbles
NASA Technical Reports Server (NTRS)
Davis, R. L.; Carter, J. E.
1984-01-01
A previously developed local inviscid-viscous interaction technique for the analysis of airfoil transitional separation bubbles, ALESEP (Airfoil Leading Edge Separation) has been modified to utilize a more accurate windward finite difference procedure in the reversed flow region, and a natural transition/turbulence model has been incorporated for the prediction of transition within the separation bubble. Numerous calculations and experimental comparisons are presented to demonstrate the effects of the windward differencing scheme and the natural transition/turbulence model. Grid sensitivity and convergence capabilities of this inviscid-viscous interaction technique are briefly addressed. A major contribution of this report is that with the use of windward differencing, a second, counter-rotating eddy has been found to exist in the wall layer of the primary separation bubble.
[The future of forensic DNA analysis for criminal justice].
Laurent, François-Xavier; Vibrac, Geoffrey; Rubio, Aurélien; Thévenot, Marie-Thérèse; Pène, Laurent
2017-11-01
In the criminal framework, the analysis of approximately 20 DNA microsatellites enables the establishment of a genetic profile with a high statistical power of discrimination. This technique makes it possible to establish or exclude a match between a biological trace detected at a crime scene and a suspect whose DNA was collected via an oral swab. However, conventional techniques tend to complicate the interpretation of complex DNA samples, such as degraded DNA and DNA mixtures. The aim of this review is to highlight the power of new forensic DNA methods (including high-throughput sequencing or single-cell sequencing) to facilitate the expert's interpretation in full compliance with existing French legislation. © 2017 médecine/sciences – Inserm.
Mathematical models for exploring different aspects of genotoxicity and carcinogenicity databases.
Benigni, R; Giuliani, A
1991-12-01
One great obstacle to understanding and using the information contained in the genotoxicity and carcinogenicity databases is the very size of such databases. Their vastness makes them difficult to read; this leads to inadequate exploitation of the information, which becomes costly in terms of time, labor, and money. In its search for adequate approaches to the problem, the scientific community has, curiously, almost entirely neglected an existent series of very powerful methods of data analysis: the multivariate data analysis techniques. These methods were specifically designed for exploring large data sets. This paper presents the multivariate techniques and reports a number of applications to genotoxicity problems. These studies show how biology and mathematical modeling can be combined and how successful this combination is.
Recent Advances in the Measurement of Arsenic, Cadmium, and Mercury in Rice and Other Foods
Punshon, Tracy
2015-01-01
Trace element analysis of foods is of increasing importance because of raised consumer awareness and the need to evaluate and establish regulatory guidelines for toxic trace metals and metalloids. This paper reviews recent advances in the analysis of trace elements in food, including challenges, state-of-the art methods, and use of spatially resolved techniques for localizing the distribution of As and Hg within rice grains. Total elemental analysis of foods is relatively well-established but the push for ever lower detection limits requires that methods be robust from potential matrix interferences which can be particularly severe for food. Inductively coupled plasma mass spectrometry (ICP-MS) is the method of choice, allowing for multi-element and highly sensitive analyses. For arsenic, speciation analysis is necessary because the inorganic forms are more likely to be subject to regulatory limits. Chromatographic techniques coupled to ICP-MS are most often used for arsenic speciation and a range of methods now exist for a variety of different arsenic species in different food matrices. Speciation and spatial analysis of foods, especially rice, can also be achieved with synchrotron techniques. Sensitive analytical techniques and methodological advances provide robust methods for the assessment of several metals in animal and plant-based foods, in particular for arsenic, cadmium and mercury in rice and arsenic speciation in foodstuffs. PMID:25938012
NASA Astrophysics Data System (ADS)
Inc, Mustafa; Yusuf, Abdullahi; Isa Aliyu, Aliyu; Hashemi, M. S.
2018-05-01
This paper studies the brusselator reaction diffusion model (BRDM) with time- and constant-dependent coefficients. The soliton solutions for BRDM with time-dependent coefficients are obtained via first integral (FIM), ansatz, and sine-Gordon expansion (SGEM) methods. Moreover, it is well known that stability analysis (SA), symmetry analysis and conservation laws (CLs) provide useful information for modelling a system of differential equations (SDE). This is because they can be used for investigating the internal properties, existence, uniqueness and integrability of different SDE. For this reason, we investigate the SA via the linear stability technique, symmetry analysis and CLs for BRDM with constant-dependent coefficients in order to extract more physics and information on the governing equation. The constraint conditions for the existence of the solutions are also examined. The new solutions obtained in this paper can be useful for describing the concentrations of diffusion problems of the BRDM. It is shown that the examined dependent coefficients are among the factors affecting the diffusion rate. So, the present paper provides much motivational information in comparison to the existing results in the literature.
Middle Atmosphere Program. Handbook for MAP, volume 9
NASA Technical Reports Server (NTRS)
Bowhill, S. A. (Editor); Edwards, B. (Editor)
1983-01-01
The term Mesosphere-Stratosphere-Troposphere radar (MST) was invented to describe the use of a high power radar transmitter together with a large vertically, or near vertically, pointing antenna to study the dynamics and structure of the atmosphere from about 10 to 100 km, using the very weak coherently scattered radiation returned from small scale irregularities in refractive index. Nine topics were addressed including: meteorological and dynamic requirements for MST radar networks; interpretation of radar returns for clear air; techniques for the measurement of horizontal and vertical velocities; techniques for studying gravity waves and turbulence; capabilities and limitations of existing MST radar; design considerations for high power VHF radar transceivers; optimum radar antenna configurations; and data analysis techniques.
Comparison of Content Structure and Cognitive Structure in the Learning of Probability.
ERIC Educational Resources Information Center
Geeslin, William E.
Digraphs, graphs, and task analysis were used to map out the content structure of a programed text (SMSG) in elementary probability. Mathematical structure was defined as the relationship between concepts within a set of abstract systems. The word association technique was used to measure the existing relations (cognitive structure) in S's memory…
Paint Analysis Using Visible Reflectance Spectroscopy: An Undergraduate Forensic Lab
ERIC Educational Resources Information Center
Hoffman, Erin M.; Beussman, Douglas J.
2007-01-01
The study of forensic science is found throughout undergraduate programs in growing numbers, both as stand-alone courses as well as specific examples within existing courses. Part of the driving force for this trend is the ability to apply common chemistry techniques to everyday situations, all couched in the context of a mystery that must be…
Predictive Cache Modeling and Analysis
2011-11-01
metaheuristic/bin-packing algorithm to optimize task placement based on task communication characterization. Our previous work on task allocation showed...Cache Miss Minimization Technology: To efficiently explore combinations and discover nearly-optimal task-assignment algorithms, we extended to our...it was possible to use our algorithmic techniques to decrease network bandwidth consumption by ~25%. In this effort, we adapted these existing
USDA-ARS's Scientific Manuscript database
Proteins exist in every plant cell wall. Certain protein residues interfere with lignin characterization and quantification. The current solution-state 2D-NMR technique (gel-NMR) for whole plant cell wall structural profiling provides detailed information regarding cell walls and proteins. However, ...
Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process
2012-10-01
involves early use of systems engi- neering and technical analyses to supplement the existing operational analysis techniques currently used in...complexity, and costs of systems now being developed require tight coupling between operational requirements stated in the CDD, system requirements...Fleischer » Keywords: Capability Development, Competitive Prototyping, Knowledge Points, Early Systems Engineering Applying Early Systems
An overview of computer vision
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1982-01-01
An overview of computer vision is provided. Image understanding and scene analysis are emphasized, and pertinent aspects of pattern recognition are treated. The basic approach to computer vision systems, the techniques utilized, applications, the current existing systems and state-of-the-art issues and research requirements, who is doing it and who is funding it, and future trends and expectations are reviewed.
Floating-point system quantization errors in digital control systems
NASA Technical Reports Server (NTRS)
Phillips, C. L.; Vallely, D. P.
1978-01-01
This paper considers digital controllers (filters) operating in floating-point arithmetic in either open-loop or closed-loop systems. A quantization error analysis technique is developed, and is implemented by a digital computer program that is based on a digital simulation of the system. The program can be integrated into existing digital simulations of a system.
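A minimal sketch of the simulation-based idea described above is given below: the same digital filter is run in double and single precision and the accumulated floating-point quantization error is examined. The first-order filter and its coefficients are placeholders, and the original program's statistical error-analysis machinery is not reproduced.

```python
# Hedged sketch of simulation-based quantization-error analysis: run the same
# digital controller/filter in float64 and float32 and compare the outputs.
import numpy as np

def run_filter(u, a=0.95, b=0.05, dtype=np.float64):
    """y[k] = a*y[k-1] + b*u[k], computed entirely in the requested precision."""
    a, b = dtype(a), dtype(b)
    y = np.zeros(len(u), dtype=dtype)
    for k in range(1, len(u)):
        y[k] = a * y[k - 1] + b * dtype(u[k])
    return y

rng = np.random.default_rng(0)
u = rng.standard_normal(10_000)

y64 = run_filter(u, dtype=np.float64)   # reference ("exact") arithmetic
y32 = run_filter(u, dtype=np.float32)   # quantized arithmetic under test

err = y64 - y32.astype(np.float64)
print("RMS quantization error:", np.sqrt(np.mean(err ** 2)))
```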
a Method for the Measurements of Children's Feet
NASA Astrophysics Data System (ADS)
Bernard, M.; Buffevant, B.; Querio, R.; Rigal, R.
1980-07-01
The Centre Technique du Cuir (Leather Technical Center) has been entrusted with the task of measuring children's feet. A new piece of equipment has been devised that ensures precise measurements and provides information quickly. The paper presents: (1) the existing techniques, (2) the research and analysis methodology, and (3) the CTC apparatus currently used in schools.
Analysis of a Knowledge-Management-Based Process of Transferring Project Management Skills
ERIC Educational Resources Information Center
Ioi, Toshihiro; Ono, Masakazu; Ishii, Kota; Kato, Kazuhiko
2012-01-01
Purpose: The purpose of this paper is to propose a method for the transfer of knowledge and skills in project management (PM) based on techniques in knowledge management (KM). Design/methodology/approach: The literature contains studies on methods to extract experiential knowledge in PM, but few studies exist that focus on methods to convert…
Modern Data Analysis techniques in Noise and Vibration Problems
1981-11-01
Hilbert transforms of one another. This property is found again in the study of causality: this defines a practical criterion characterizing a signal, thus, by... between the direct field and the reflected field are characterized locally by the existence of frequencies for which the interference is total
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
Nitrogen dioxide/oxides of nitrogen (NO2/NOX) ratios are an important surrogate for nitric oxide (NO)-to-NO2 chemistry in dispersion models when estimating NOX emissions in a near-road environment. Existing dispersion models use different techniques and assumptions to represe...
Analysis of Weibull Grading Test for Solid Tantalum Capacitors
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2010-01-01
The Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and inadequate acceleration factors can result in significant errors, up to three orders of magnitude, in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculation. A physical model presenting failures of tantalum capacitors as time-dependent dielectric breakdown is used to determine voltage and temperature acceleration factors and to select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
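For illustration, a minimal sketch of such a general log-linear life-stress model for the Weibull characteristic life and the acceleration factor it implies between use and test conditions; the coefficients and stress levels below are hypothetical placeholders, not values fitted in the paper.

```python
import numpy as np

# Log-linear life-stress model: eta(V, T) = exp(b0 + b1*ln(V) + b2/T).
# All numbers below are assumed for illustration only.
def characteristic_life(V, T_kelvin, b0, b1, b2):
    """Characteristic life (arbitrary time units) at voltage V and absolute temperature T."""
    return np.exp(b0 + b1 * np.log(V) + b2 / T_kelvin)

def acceleration_factor(V_use, T_use, V_test, T_test, b0, b1, b2):
    """How much longer parts last at use conditions than at the accelerated test conditions."""
    return (characteristic_life(V_use, T_use, b0, b1, b2) /
            characteristic_life(V_test, T_test, b0, b1, b2))

b0, b1, b2 = 2.0, -3.0, 5000.0                                   # hypothetical model coefficients
af = acceleration_factor(6.3, 318.0, 10.0, 358.0, b0, b1, b2)    # derated use vs. grading-test stress
print(f"acceleration factor: {af:.1f}")
```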
A Survey of UML Based Regression Testing
NASA Astrophysics Data System (ADS)
Fahad, Muhammad; Nadeem, Aamer
Regression testing is the process of ensuring software quality by analyzing whether changed parts behave as intended and unchanged parts are not affected by the modifications. Since it is a costly process, many techniques have been proposed in the research literature that suggest to testers how to build a regression test suite from an existing test suite at minimum cost. In this paper, we discuss the advantages and drawbacks of using UML diagrams for regression testing and show that UML models help in identifying changes for regression test selection effectively. We survey the existing UML-based regression testing techniques and provide an analysis matrix to give a quick insight into the prominent features of the literature. We also discuss open research issues that remain to be addressed for UML-based regression testing, such as managing and reducing the size of the regression test suite and prioritizing test cases under tight schedules and limited resources.
Ahmed, Mohammed M; Otto, Thomas J; Moed, Berton R
2016-04-22
Limited-incision total hip arthroplasty (THA) preserves hip abductors, posterior capsule, and external rotators potentially diminishing dislocation risk. However, potential complications also exist, such as component malposition. Specific implants have been manufactured that enhance compatibility with this technique, while preserving metaphyseal bone; however, little data exists documenting early complications and component position. The purpose was to evaluate primary THA using a curved, bone-sparing stem inserted through the anterior approach with respect to component alignment and early complications. In a retrospective analysis of 108 cases, the surgical technique was outlined and the occurrence of intraoperative fractures, postoperative dislocations, infection, and limb length inequality was determined. Femoral stem and acetabular cup alignment was quantified using the initial postoperative radiographs. Patient follow-up averaged 12.9 (range 2 to 36) months. There were eight (7.4 %) complications requiring revision surgery in three (2.8 %) patients with three (2.8 %) infections and three (2.8 %) dislocations. Intraoperative complications included one calcar fracture above the lesser trochanter. Leg length inequality >5 mm was present in three (2.8 %) patients. Radiographic analysis showed that femoral neutral alignment was achieved in 95 hips (88.0 %). All femoral stems demonstrated satisfactory fit and fill and no evidence of subsidence, osteolysis, or loosening. An average abduction angle of 44.8° (± 5.3) and average cup anteversion of 16.2° (± 4.2) were also noted. Although the technique with this implant and approach is promising, it does not appear to offer important advantages over standard techniques. However, the findings merit further, long-term study.
A comparison of TSS and TRASYS in form factor calculation
NASA Technical Reports Server (NTRS)
Golliher, Eric
1993-01-01
As the workstation and personal computer become more popular than a centralized mainframe to perform thermal analysis, the methods for space vehicle thermal analysis will change. Already, many thermal analysis codes are now available for workstations, which were not in existence just five years ago. As these changes occur, some organizations will adopt the new codes and analysis techniques, while others will not. This might lead to misunderstandings between thermal shops in different organizations. If thermal analysts make an effort to understand the major differences between the new and old methods, a smoother transition to a more efficient and more versatile thermal analysis environment will be realized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradonjic, Milan; Hagberg, Aric; Hengartner, Nick
We analyze component evolution in general random intersection graphs (RIGs) and give conditions on the existence and uniqueness of the giant component. Our techniques generalize the existing methods for analysis of component evolution in RIGs. That is, we analyze survival and extinction properties of a dependent, inhomogeneous Galton-Watson branching process on general RIGs. Our analysis relies on bounding the branching processes and inherits the fundamental concepts from the study of component evolution in Erdos-Renyi graphs. The main challenge comes from the underlying structure of RIGs, where the number of offspring follows a binomial distribution with a different number of nodes and a different rate at each step during the evolution. RIGs can be interpreted as a model for large, randomly formed non-metric data sets. Besides the mathematical analysis of component evolution, which we provide in this work, we perceive RIGs as an important random structure which has already found applications in social networks, epidemic networks, blog readership, and wireless sensor networks.
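A minimal simulation sketch (not the paper's analytical derivation) of component evolution in a binomial random intersection graph, using networkx; the graph parameters are illustrative.

```python
import random
from collections import defaultdict
import networkx as nx

# Binomial RIG G(n, m, p): each of n nodes picks each of m attributes with
# probability p; two nodes are adjacent when they share at least one attribute.
def random_intersection_graph(n, m, p, seed=0):
    rng = random.Random(seed)
    owners = defaultdict(list)                 # attribute -> nodes holding it
    for node in range(n):
        for attr in range(m):
            if rng.random() < p:
                owners[attr].append(node)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for nodes in owners.values():              # clique among nodes sharing an attribute
        for i, u in enumerate(nodes):
            for v in nodes[i + 1:]:
                g.add_edge(u, v)
    return g

g = random_intersection_graph(n=2000, m=500, p=0.002)
giant = max(nx.connected_components(g), key=len)
print(len(giant) / g.number_of_nodes())        # fraction of nodes in the largest component
```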
Multilingual Sentiment Analysis: State of the Art and Independent Comparison of Techniques.
Dashtipour, Kia; Poria, Soujanya; Hussain, Amir; Cambria, Erik; Hawalah, Ahmad Y A; Gelbukh, Alexander; Zhou, Qiang
With the advent of the Internet, people actively express their opinions about products, services, events, political parties, etc., in social media, blogs, and website comments. The amount of research work on sentiment analysis is growing explosively. However, the majority of research efforts are devoted to English-language data, while a great share of information is available in other languages. We present a state-of-the-art review on multilingual sentiment analysis. More importantly, we compare our own implementation of existing approaches on common data. Precision observed in our experiments is typically lower than that reported by the original authors, which we attribute to the lack of detail in the original presentation of those approaches. Thus, we compare the existing works by what they really offer to the reader, including whether they allow for accurate implementation and for reliable reproduction of the reported results.
Flood Detection/Monitoring Using Adjustable Histogram Equalization Technique
Riaz, Muhammad Mohsin; Ghafoor, Abdul
2014-01-01
A flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (overenhancement, artifacts, and an unnatural look) of the existing technique by adjusting the contrast of images. The proposed technique takes pre- and post-event images and applies different processing steps to generate a flood map without user interaction. The resultant flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality than the existing state-of-the-art technique. PMID:24558332
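The abstract does not give algorithmic details, so the following is only a hedged sketch of the general idea: blend histogram-equalized and original pre/post images with an adjustable weight, then threshold their difference into a change/flood mask. All images, the alpha weight, and the threshold are placeholders, not the paper's method.

```python
import numpy as np

def adjustable_hist_eq(img, alpha=0.7):
    """img: 2-D uint8 array; alpha in [0, 1] controls how strongly equalization is applied."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    equalized = cdf[img] * 255.0
    return (alpha * equalized + (1.0 - alpha) * img).astype(np.uint8)

def flood_map(pre, post, alpha=0.7, threshold=40):
    diff = np.abs(adjustable_hist_eq(post, alpha).astype(int) -
                  adjustable_hist_eq(pre, alpha).astype(int))
    return diff > threshold                      # boolean change/flood mask

rng = np.random.default_rng(0)
pre = rng.integers(0, 256, (128, 128), dtype=np.uint8)    # placeholder pre-event image
post = rng.integers(0, 256, (128, 128), dtype=np.uint8)   # placeholder post-event image
mask = flood_map(pre, post)
```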
Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis
Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.
2011-01-01
Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples, the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93, for the analysis of a certified reference material, using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
Automatic dilution gaging of rapidly varying flow
Duerk, M.D.
1983-01-01
The analysis showed that the discharges measured by dye-dilution techniques were generally within ± 10 percent of the discharges determined from ratings established by current-meter measurements. Larger differences were noted at the start of and on the rising limb of four hydrographs. Of the 20 storms monitored, dilution measurements on 17 were of acceptable accuracy. Peak discharges from the open-channel site ranged from 0 to 12 percent departures from the existing rating whereas the comparison of peak discharge at the storm sewer site ranged from 0 to 5 percent departures from the existing rating.
Expanding access and choice for health care consumers through tax reform.
Butler, S; Kendall, D B
1999-01-01
A refundable tax credit for the uninsured would complement the existing job-based health insurance system while letting people keep their job-based coverage if they wish. Among the wide variety of design options for a tax credit, policy and political analysis does not reveal an obvious choice, but a tax credit based on a percentage of spending may have a slight advantage. Congress should give states maximum flexibility to use existing funding sources to supplement the value of a federal tax credit and encourage the use of techniques to create stable insurance pools.
NASA Technical Reports Server (NTRS)
Langland, R. A.; Stephens, P. L.; Pihos, G. G.
1980-01-01
The techniques used for ingesting SEASAT-A SASS wind retrievals into the existing operational software are described. The intent is to assess the impact of SEASAT data on the marine wind fields produced by the global marine wind/sea level pressure analysis. This analysis is performed on a 2.5 deg latitude/longitude global grid and executes at three-hour time increments. Wind fields with and without SASS winds are being compared. The problems of data volume reduction and aliased wind retrieval ambiguity are treated.
NASA Astrophysics Data System (ADS)
Allain, Rhett; Williams, Richard
2009-02-01
Suppose we had a brand new world to study—a world that possibly works with a different set of principles, a non-Newtonian world. Maybe this world is Newtonian, maybe it isn't. This world exists in video games, and it is open for exploration. Most video games try to incorporate realistic physics, but sometimes this does not happen. The obvious approach is to look at the source code for the game, but this would not allow students to apply analysis techniques.
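As one hedged example of the kind of analysis students might apply to frame-by-frame position data read from a game: fit a quadratic to vertical position versus time and check whether the implied acceleration is constant and Newtonian. The sample data below are fabricated for illustration.

```python
import numpy as np

t = np.arange(0, 1.0, 1 / 30)                           # 30 frames per second
y = 2.0 + 5.0 * t - 0.5 * 9.8 * t**2                    # hypothetical tracked positions
y += np.random.default_rng(1).normal(0, 0.01, t.size)   # tracking noise

coeffs = np.polyfit(t, y, 2)                            # [0.5*a, v0, y0]
a_est = 2.0 * coeffs[0]
print(f"estimated acceleration: {a_est:.2f} m/s^2")     # ~ -9.8 if the game is Newtonian
```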
From fields to objects: A review of geographic boundary analysis
NASA Astrophysics Data System (ADS)
Jacquez, G. M.; Maruca, S.; Fortin, M.-J.
Geographic boundary analysis is a relatively new approach unfamiliar to many spatial analysts. It is best viewed as a technique for defining objects - geographic boundaries - on spatial fields, and for evaluating the statistical significance of characteristics of those boundary objects. This is accomplished using null spatial models representative of the spatial processes expected in the absence of boundary-generating phenomena. Close ties to the object-field dialectic eminently suit boundary analysis to GIS data. The majority of existing spatial methods are field-based in that they describe, estimate, or predict how attributes (variables defining the field) vary through geographic space. Such methods are appropriate for field representations but not object representations. As the object-field paradigm gains currency in geographic information science, appropriate techniques for the statistical analysis of objects are required. The methods reviewed in this paper are a promising foundation. Geographic boundary analysis is clearly a valuable addition to the spatial statistical toolbox. This paper presents the philosophy of, and motivations for geographic boundary analysis. It defines commonly used statistics for quantifying boundaries and their characteristics, as well as simulation procedures for evaluating their significance. We review applications of these techniques, with the objective of making this promising approach accessible to the GIS-spatial analysis community. We also describe the implementation of these methods within geographic boundary analysis software: GEM.
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Schmauch, Preston
2012-01-01
Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. The standard technique for forced response analysis to assess structural integrity is to decompose a CFD generated flow field into its harmonic components, and to then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicates that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. These complications suggest the question of whether frequency domain analysis is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis, therefore, have been performed. The first is of a bladed disk with each blade modeled by simple beam elements. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed-disk excited by the same CFD used in the J2X engine program. The results showed that the transient analysis results were up to 10% higher for "clean" nodal diameter excitations and six times larger for "messy" excitations, where substantial Fourier content around the main harmonic exists.
Cavitation in liquid cryogens. 2: Hydrofoil
NASA Technical Reports Server (NTRS)
Hord, J.
1973-01-01
Boundary layer principles, along with two-phase concepts, are used to improve existing correlative theory for developed cavity data. Details concerning cavity instrumentation, data analysis, correlative techniques, and experimental and theoretical aspects of a cavitating hydrofoil are given. Both desinent and thermodynamic data, using liquid hydrogen and liquid nitrogen, are reported. The thermodynamic data indicated that stable thermodynamic equilibrium exists throughout the vaporous cryogen cavities. The improved correlative formulas were used to evaluate these data. A new correlating parameter, based on consideration of mass-limiting two-phase flow flux across the cavity interface, is proposed. This correlating parameter appears attractive for future correlative and predictive applications. Agreement between theory and experiment is discussed, and directions for future analysis are suggested. The front half of the cavities, developed on the hydrofoil, may be considered as parabolically shaped.
A Regev-type fully homomorphic encryption scheme using modulus switching.
Chen, Zhigang; Wang, Jian; Chen, Liqun; Song, Xinxia
2014-01-01
A critical challenge in a fully homomorphic encryption (FHE) scheme is to manage noise. The modulus switching technique is currently the most efficient noise management technique. When using modulus switching to design and implement an FHE scheme, choosing concrete parameters is an important step, but to the best of our knowledge, this step has drawn very little attention in the existing FHE research literature. The contributions of this paper are twofold. On the one hand, we propose a function for the lower bound of the dimension value in the modulus switching technique, depending on LWE-specific security levels. On the other hand, as a case study, we modify the Brakerski FHE scheme (Crypto 2012) by using the modulus switching technique. We recommend concrete parameter values for our proposed scheme and provide a security analysis. Our result shows that the modified FHE scheme is more efficient than the original Brakerski scheme at the same security level.
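A toy sketch of the modulus switching step itself, assuming the least-significant-bit message encoding: a ciphertext coefficient modulo q is rescaled to a smaller modulus p, rounding to the nearest integer that preserves its parity so the encrypted bit survives. The moduli and coefficients are toy values, far smaller than any secure parameter set and unrelated to the concrete parameters recommended in the paper.

```python
def switch_modulus(c, q, p):
    scaled = c * p / q
    c_prime = round(scaled)
    if c_prime % 2 != c % 2:                       # fix parity, moving toward the true scaled value
        c_prime += 1 if scaled > c_prime else -1
    return c_prime

q, p = 2**20 + 7, 2**12 + 3                        # toy ciphertext moduli
ciphertext = [913842, 40321, 650007]               # hypothetical ciphertext coefficients mod q
switched = [switch_modulus(c, q, p) for c in ciphertext]
print(switched)                                    # smaller coefficients, same plaintext parity
```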
Bioimaging of cells and tissues using accelerator-based sources.
Petibois, Cyril; Cestelli Guidi, Mariangela
2008-07-01
A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotrons) particle accelerators have been constructed worldwide to provide to the scientific community unprecedented analytical performances. Now, these facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single cell (<1 μm) or tissue (<1 mm) analyses, (2) a temporal resolution to follow molecular dynamics, and (3) a sensitivity in the micromolar to nanomolar range, thus allowing true investigations on biological dynamics. Third-generation synchrotrons now offer the opportunity of bioanalytical measurements at nanometer resolutions with incredible sensitivity. Linear accelerators are more specialized in their physical features but may exceed synchrotron performances. All these techniques have become irreplaceable tools for developing knowledge in biology. This review highlights the pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens.
Novel Passive Clearing Methods for the Rapid Production of Optical Transparency in Whole CNS Tissue.
Woo, Jiwon; Lee, Eunice Yoojin; Park, Hyo-Suk; Park, Jeong Yoon; Cho, Yong Eun
2018-05-08
Since the development of CLARITY, a bioelectrochemical clearing technique that allows for three-dimensional phenotype mapping within transparent tissues, a multitude of novel clearing methodologies including CUBIC (clear, unobstructed brain imaging cocktails and computational analysis), SWITCH (system-wide control of interaction time and kinetics of chemicals), MAP (magnified analysis of the proteome), and PACT (passive clarity technique), have been established to further expand the existing toolkit for the microscopic analysis of biological tissues. The present study aims to improve upon and optimize the original PACT procedure for an array of intact rodent tissues, including the whole central nervous system (CNS), kidneys, spleen, and whole mouse embryos. Termed psPACT (process-separate PACT) and mPACT (modified PACT), these novel techniques provide highly efficacious means of mapping cell circuitry and visualizing subcellular structures in intact normal and pathological tissues. In the following protocol, we provide a detailed, step-by-step outline on how to achieve maximal tissue clearance with minimal invasion of their structural integrity via psPACT and mPACT.
Leijdekkers, A G M; Sanders, M G; Schols, H A; Gruppen, H
2011-12-23
Analysis of complex mixtures of plant cell wall derived oligosaccharides is still challenging and multiple analytical techniques are often required for separation and characterization of these mixtures. In this work it is demonstrated that hydrophilic interaction chromatography coupled with evaporative light scattering and mass spectrometry detection (HILIC-ELSD-MS(n)) is a valuable tool for identification of a wide range of neutral and acidic cell wall derived oligosaccharides. The separation potential for acidic oligosaccharides observed with HILIC is much better compared to other existing techniques, like capillary electrophoresis, reversed phase and porous-graphitized carbon chromatography. Important structural information, such as presence of methyl esters and acetyl groups, is retained during analysis. Separation of acidic oligosaccharides with equal charge yet with different degrees of polymerization can be obtained. The efficient coupling of HILIC with ELSD and MS(n)-detection enables characterization and quantification of many different oligosaccharide structures present in complex mixtures. This makes HILIC-ELSD-MS(n) a versatile and powerful additional technique in plant cell wall analysis. Copyright © 2011 Elsevier B.V. All rights reserved.
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards is done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
Reiner, Bruce I
2018-02-01
One method for addressing existing peer review limitations is the assignment of peer review cases on a completely blinded basis, in which the peer reviewer would create an independent report which can then be cross-referenced with the primary reader report of record. By leveraging existing computerized data mining techniques, one could in theory automate and objectify the process of report data extraction, classification, and analysis, while reducing time and resource requirements intrinsic to manual peer review report analysis. Once inter-report analysis has been performed, resulting inter-report discrepancies can be presented to the radiologist of record for review, along with the option to directly communicate with the peer reviewer through an electronic data reconciliation tool aimed at collaboratively resolving inter-report discrepancies and improving report accuracy. All associated report and reconciled data could in turn be recorded in a referenceable peer review database, which provides opportunity for context and user-specific education and decision support.
Analytical Characterization of Erythritol Tetranitrate, an Improvised Explosive.
Matyáš, Robert; Lyčka, Antonín; Jirásko, Robert; Jakový, Zdeněk; Maixner, Jaroslav; Mišková, Linda; Künzel, Martin
2016-05-01
Erythritol tetranitrate (ETN), an ester of nitric acid and erythritol, is a solid crystalline explosive with high explosive performance. Although it has never been used in any industrial or military application, it has become one of the most prepared and misused improvised explosives. In this study, several analytical techniques were explored to facilitate analysis in forensic laboratories. FTIR and Raman spectrometry measurements expand existing data and bring more detailed assignment of bands through the parallel study of erythritol [15N4] tetranitrate. In the case of powder diffraction, recently published data were verified, and 1H, 13C, and 15N NMR spectra are discussed in detail. The technique of electrospray ionization tandem mass spectrometry was successfully used for the analysis of ETN. The described methods allow fast, versatile, and reliable detection or analysis of samples containing erythritol tetranitrate in forensic laboratories. © 2016 American Academy of Forensic Sciences.
NASA Astrophysics Data System (ADS)
Sait, Abdulrahman S.
This dissertation presents a reliable technique for monitoring the condition of rotating machinery by applying instantaneous angular speed (IAS) analysis. A new analysis of the effects of changes in the orientation of the line of action and the pressure angle of the resultant force acting on the spur gear tooth profile under different levels of tooth damage is utilized. The analysis and experimental work discussed in this dissertation provide a clear understanding of the effects of damage on the IAS by analyzing the digital signal output of a rotary incremental optical encoder. A comprehensive literature review of the state of knowledge in condition monitoring and fault diagnostics of rotating machinery, including gearbox systems, is presented. Progress and new developments over the past 30 years in failure detection techniques for rotating machinery, including engines, bearings, and gearboxes, are thoroughly reviewed. This work is limited to the analysis of a gear train system with gear tooth surface faults utilizing the angular motion analysis technique. Angular motion data were acquired using an incremental optical encoder. Results are compared to a vibration-based technique. The vibration data were acquired using an accelerometer. The signals were obtained and analyzed in the phase domain using signal averaging to determine the existence and position of faults on the gear train system. Forces between the mating teeth surfaces are analyzed and simulated to validate the influence of the presence of damage on the pressure angle and the IAS. National Instruments hardware is used and NI LabVIEW software code is developed for real-time, online condition monitoring systems and fault detection techniques. The sensitivity of optical encoders for gear fault detection is experimentally investigated by applying IAS analysis under different gear damage levels and different operating conditions. A reliable methodology is developed for selecting appropriate testing/operating conditions of a rotating system to generate an alarm system for damage detection.
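A minimal sketch of the basic IAS computation from incremental-encoder pulse timestamps; the encoder resolution, shaft speed, and once-per-revolution fault modulation below are hypothetical, not values from the dissertation.

```python
import numpy as np

PULSES_PER_REV = 1024

def instantaneous_angular_speed(pulse_times):
    """IAS per pulse interval: 2*pi / (N * dt_i), in rad/s."""
    dt = np.diff(pulse_times)                          # time between successive encoder pulses
    return 2.0 * np.pi / (PULSES_PER_REV * dt)

# Synthetic shaft: nominally 25 Hz with a small once-per-revolution speed
# fluctuation such as a local gear-tooth fault might induce.
n_pulses = 10 * PULSES_PER_REV
angles = np.arange(n_pulses) * 2 * np.pi / PULSES_PER_REV
speed = 2 * np.pi * 25 * (1 + 0.01 * np.sin(angles))   # rad/s at each pulse position
pulse_times = np.concatenate(([0.0],
                              np.cumsum((2 * np.pi / PULSES_PER_REV) / speed[:-1])))
ias = instantaneous_angular_speed(pulse_times)          # fluctuation appears once per revolution
```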
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction and decomposition based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: (1) a regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure; (2) a proof that our approach is sound and complete with respect to the depth bound of symbolic execution; (3) an implementation of our technique using the LLVM compiler infrastructure, the KLEE symbolic virtual machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]; and (4) an empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
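As a toy illustration of the final equivalence-checking step, two behavior summaries can be encoded as SMT expressions and discharged with an off-the-shelf solver such as Z3's Python bindings; the example programs below are invented, not artifacts from the paper's evaluation.

```python
from z3 import Int, Solver, If, unsat

# Two syntactically different "versions" of the same function, encoded as
# SMT expressions over a symbolic input.
x = Int('x')
v1 = If(x >= 0, x, -x)          # version 1: branch-based absolute value
v2 = If(x < 0, -x, x)           # version 2: refactored absolute value

s = Solver()
s.add(v1 != v2)                  # search for an input that distinguishes the two versions
print("equivalent" if s.check() == unsat else f"counterexample: {s.model()}")
```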
Observations of large parallel electric fields in the auroral ionosphere
NASA Technical Reports Server (NTRS)
Mozer, F. S.
1976-01-01
Rocket borne measurements employing a double probe technique were used to gather evidence for the existence of electric fields in the auroral ionosphere having components parallel to the magnetic field direction. An analysis of possible experimental errors leads to the conclusion that no known uncertainties can account for the roughly 10 mV/m parallel electric fields that are observed.
Rastislav Jakus; Wojciech Grodzki; Marek Jezik; Marcin Jachym
2003-01-01
The spread of bark beetle outbreaks in the Tatra Mountains was explored by using both terrestrial and remote sensing techniques. Both approaches have proven to be useful for studying spatial patterns of bark beetle population dynamics. The terrestrial methods were applied to existing forestry databases. Vegetation change analysis (image differentiation), digital...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobs, J.A.
1987-01-01
The latest attempt to summarise the wealth of knowledge now available on geomagnetic phenomena has resulted in this multi-volume treatise, with contributions and reviews from many scientists. The first volume in the series contains a thorough review of all existing information on measuring the Earth's magnetic field, both on land and at sea, and includes a comparative analysis of the techniques available for this purpose.
ERIC Educational Resources Information Center
Bouffard, Laura Annie; Sarkar, Mela
2008-01-01
Most research on language awareness in a second language (L2) has been carried out with adult learners. This research presents data showing that pedagogical techniques can be devised enabling children as young as 8 to develop metalinguistic awareness of their emerging L2 system. Building on existing work by Canadian researchers, this…
The problem of resonance in technology usage
NASA Technical Reports Server (NTRS)
Sayani, H. H.; Svoboda, C. P.
1981-01-01
Various information system tools and techniques are analyzed. A case study is presented which draws together the issues raised in three distinct cases. This case study shows a typical progression from the selection of an analysis methodology, to the adoption of an automated tool for specification and documentation, and the difficulty of fitting these into an existing life cycle development methodology.
Jackson, Brian A; Faith, Kay Sullivan
2013-02-01
Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis, to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears to be an attractive way to integrate information from the substantial investment in detailed assessments for stockpile delivery and dispensing to provide a view of likely future response performance.
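A hedged sketch of the kind of simple reliability simulation described: a serial chain of stockpile response steps, each with a success probability that in practice would be informed by assessment data. The step names and probabilities below are invented for illustration.

```python
import random

steps = {
    "request_and_approval": 0.98,
    "transport_to_receiving_site": 0.95,
    "warehouse_operations": 0.93,
    "distribution_to_dispensing_sites": 0.90,
    "dispensing_to_public": 0.88,
}

def estimate_reliability(n_trials=100_000, seed=0):
    """Monte Carlo estimate of end-to-end success for a serial chain of steps."""
    rng = random.Random(seed)
    successes = sum(
        all(rng.random() < p for p in steps.values()) for _ in range(n_trials)
    )
    return successes / n_trials

print(f"estimated end-to-end reliability: {estimate_reliability():.3f}")
```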
NASA Astrophysics Data System (ADS)
Li, Dong-xia; Ye, Qian-wen
Out-of-band radiation suppression algorithms must be used efficiently in broadband aeronautical communication systems so as not to interfere with the operation of existing systems in the aviation L-band. After a brief introduction of the broadband aeronautical multi-carrier communication (B-AMC) system model, several sidelobe suppression techniques for orthogonal frequency division multiplexing (OFDM) systems are presented and analyzed in this paper in order to find a suitable algorithm for the B-AMC system. Simulation results show that raised-cosine windowing can suppress the out-of-band radiation of the B-AMC system effectively.
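A minimal sketch of raised-cosine edge windowing of OFDM symbols, the generic technique analyzed here; the subcarrier count, cyclic prefix, and rolloff length are illustrative and are not B-AMC system parameters.

```python
import numpy as np
from scipy.signal import welch

N, CP, ROLL = 64, 16, 8                      # subcarriers, cyclic prefix, taper length (assumed)
rng = np.random.default_rng(0)

def ofdm_symbol(windowed):
    data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N)     # QPSK on all subcarriers
    time = np.fft.ifft(data)
    sym = np.concatenate([time[-(CP + ROLL):], time, time[:ROLL]])  # prefix + cyclic suffix
    if windowed:
        k = np.arange(1, ROLL + 1)
        ramp = 0.5 * (1 - np.cos(np.pi * k / (ROLL + 1)))        # rising raised-cosine taper
        sym[:ROLL] *= ramp                                       # smooth leading edge
        sym[-ROLL:] *= ramp[::-1]                                # smooth trailing edge
    return sym

def spectrum(windowed, n_symbols=200):
    sig = np.concatenate([ofdm_symbol(windowed) for _ in range(n_symbols)])
    return welch(sig, nperseg=256, return_onesided=False)

f, psd_plain = spectrum(False)
f, psd_rc = spectrum(True)
# Comparing psd_rc with psd_plain outside the occupied band shows the sidelobe reduction.
```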
Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout Rh; Stewart-Knox, Barbara J; Mathers, John C; Lovegrove, Julie A
2018-04-09
To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype-based, and intake+phenotype+gene-based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1). ©Anna L Macready, Rosalind Fallaize, Laurie T Butler, Judi A Ellis, Sharron Kuznesof, Lynn J Frewer, Carlos Celis-Morales, Katherine M Livingstone, Vera Araújo-Soares, Arnout RH Fischer, Barbara J Stewart-Knox, John C Mathers, Julie A Lovegrove. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.04.2018.
2014-01-01
The linear algebraic concept of subspace plays a significant role in recent spectrum estimation techniques. In this article, the authors utilize the noise subspace concept to find hidden periodicities in DNA sequences. With the vast growth of genomic sequences, the demand to accurately identify protein-coding regions in DNA is rising. Several DNA feature extraction techniques spanning various fields have emerged in the recent past, among which the application of digital signal processing tools is of prime importance. It is known that coding segments have a 3-base periodicity, while non-coding regions do not have this unique feature. One of the most important subspace-based spectrum analysis techniques is the least-norm method. The least-norm estimator developed in this paper shows sharp period-3 peaks in coding regions while completely eliminating background noise. The proposed method is compared with the existing sliding discrete Fourier transform (SDFT) method, popularly known as the modified periodogram method, on several genes from various organisms, and the results show that the proposed method offers a better and more effective approach to gene prediction. Resolution, quality factor, sensitivity, specificity, miss rate, and wrong rate are used to establish the superiority of the least-norm gene prediction method over the existing method. PMID:24386895
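For context, a sketch of the standard sliding-window period-3 spectral measure that such gene predictors build on (the periodogram-style baseline, not the paper's least-norm estimator); the sequence and window length are placeholders.

```python
import numpy as np

def period3_power(seq, window=351, step=3):
    """Combined DFT power of the four base-indicator signals at frequency 2*pi/3."""
    seq = seq.upper()
    kernel = np.exp(-2j * np.pi / 3 * np.arange(window))   # DFT kernel at the period-3 frequency
    positions, scores = [], []
    for start in range(0, len(seq) - window + 1, step):
        win = seq[start:start + window]
        power = sum(abs(np.dot([c == base for c in win], kernel)) ** 2
                    for base in "ACGT")
        positions.append(start + window // 2)
        scores.append(power)
    return np.array(positions), np.array(scores)

toy_seq = "ATG" * 200 + "ACGTTGCA" * 75                     # crude coding-like region, then non-coding-like
pos, score = period3_power(toy_seq)                         # score rises over the repetitive period-3 region
```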
Choi, D J; Park, H
2001-11-01
For control and automation of biological treatment processes, lack of reliable on-line sensors to measure water quality parameters is one of the most important problems to overcome. Many parameters cannot be measured directly with on-line sensors. The accuracy of existing hardware sensors is also not sufficient and maintenance problems such as electrode fouling often cause trouble. This paper deals with the development of software sensor techniques that estimate the target water quality parameter from other parameters using the correlation between water quality parameters. We focus our attention on the preprocessing of noisy data and the selection of the best model feasible to the situation. Problems of existing approaches are also discussed. We propose a hybrid neural network as a software sensor inferring wastewater quality parameter. Multivariate regression, artificial neural networks (ANN), and a hybrid technique that combines principal component analysis as a preprocessing stage are applied to data from industrial wastewater processes. The hybrid ANN technique shows an enhancement of prediction capability and reduces the overfitting problem of neural networks. The result shows that the hybrid ANN technique can be used to extract information from noisy data and to describe the nonlinearity of complex wastewater treatment processes.
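A minimal sketch in the spirit of the hybrid approach described, with principal component analysis as a preprocessing stage ahead of a neural network regressor; the input variables, data, and target parameter are synthetic stand-ins for plant measurements, not the authors' dataset or model settings.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))          # e.g., pH, ORP, DO, flow, temperature, turbidity (assumed)
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.5 * X[:, 4] + rng.normal(0, 0.3, 500)   # hard-to-measure target

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=4),               # keep the dominant, less noisy directions
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X[:400], y[:400])
print("held-out R^2:", model.score(X[400:], y[400:]))
```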
Kearney, Philip E; Carson, Howie J; Collins, Dave
2018-05-01
This paper explores the approaches adopted by high-level field athletics coaches when attempting to refine an athlete's already well-established technique (long and triple jump and javelin throwing). Six coaches, who had all coached multiple athletes to multiple major championships, took part in semi-structured interviews focused upon a recent example of technique refinement. Data were analysed using a thematic content analysis. The coaching tools reported were generally consistent with those advised by the existing literature, focusing on attaining "buy-in", utilising part-practice, restoring movement automaticity and securing performance under pressure. Five of the six coaches reported using a systematic sequence of stages to implement the refinement, although the number and content of these stages varied between them. Notably, however, there were no formal sources of knowledge (e.g., coach education or training) provided to inform coaches' decision making. Instead, coaches' decisions were largely based on experience both within and outside the sporting domain. Data offer a useful stimulus for reflection amongst sport practitioners confronted by the problem of technique refinement. Certainly the limited awareness of existing guidelines on technique refinement expressed by the coaches emphasises a need for further collaborative work by researchers and coach educators to disseminate best practice.
Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission
NASA Astrophysics Data System (ADS)
Hampton, Jesse Clay
The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 x 15 x 25 cm3 to 30 x 30 x 25 cm 3 in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Individual microcrack characterization weighting cloud based techniques correlated well with post-test damage evaluations.
McCulloch, G; Dawson, L A; Ross, J M; Morgan, R M
2018-07-01
There is a need to develop a wider empirical research base to expand the scope for utilising the organic fraction of soil in forensic geoscience, and to demonstrate the capability of the analytical techniques used in forensic geoscience to discriminate samples at close proximity locations. The determination of wax markers from soil samples by GC analysis has been used extensively in court and is known to be effective in discriminating samples from different land use types. A new HPLC method for the analysis of the organic fraction of forensic sediment samples has also been shown recently to add value in conjunction with existing inorganic techniques for the discrimination of samples derived from close proximity locations. This study compares the ability of these two organic techniques to discriminate samples derived from close proximity locations and finds the GC technique to provide good discrimination at this scale, providing quantification of known compounds, whilst the HPLC technique offered a shorter and simpler sample preparation method and provided very good discrimination between groups of samples of different provenance in most cases. The use of both data sets together gave further improved accuracy rates in some cases, suggesting that a combined organic approach can provide added benefits in certain case scenarios and crime reconstruction contexts. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Power flow as a complement to statistical energy analysis and finite element analysis
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly; on the other hand, SEA methods can only predict an average level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to form a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data
Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan
2009-01-01
Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561
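A generic sketch of the classifier-based decoding analysis such toolboxes automate, written here with scikit-learn rather than PyMVPA's own API; the voxel patterns, labels, and run structure are synthetic.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 300
labels = np.repeat([0, 1], n_trials // 2)                 # two cognitive states
patterns = rng.normal(size=(n_trials, n_voxels))          # placeholder voxel patterns
patterns[labels == 1, :20] += 0.5                         # weak information in 20 voxels

runs = np.tile(np.arange(6), n_trials // 6)               # group trials into 6 "runs"
acc = cross_val_score(LinearSVC(), patterns, labels,
                      groups=runs, cv=GroupKFold(n_splits=6))   # leave-one-run-out decoding
print("mean decoding accuracy:", acc.mean())
```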
NASA Technical Reports Server (NTRS)
Podwysocki, M. H.
1974-01-01
Two study areas in a cratonic platform underlain by flat-lying sedimentary rocks were analyzed to determine if a quantitative relationship exists between fracture trace patterns and their frequency distributions and subsurface structural closures which might contain petroleum. Fracture trace lengths and frequency (number of fracture traces per unit area) were analyzed by trend surface analysis and length frequency distributions also were compared to a standard Gaussian distribution. Composite rose diagrams of fracture traces were analyzed using a multivariate analysis method which grouped or clustered the rose diagrams and their respective areas on the basis of the behavior of the rays of the rose diagram. Analysis indicates that the lengths of fracture traces are log-normally distributed according to the mapping technique used. Fracture trace frequency appeared higher on the flanks of active structures and lower around passive reef structures. Fracture trace log-mean lengths were shorter over several types of structures, perhaps due to increased fracturing and subsequent erosion. Analysis of rose diagrams using a multivariate technique indicated lithology as the primary control for the lower grouping levels. Groupings at higher levels indicated that areas overlying active structures may be isolated from their neighbors by this technique while passive structures showed no differences which could be isolated.
Analytical methods for determination of mycotoxins: a review.
Turner, Nicholas W; Subrahmanyam, Sreenath; Piletsky, Sergey A
2009-01-26
Mycotoxins are small (MW approximately 700), toxic chemical products formed as secondary metabolites by a few fungal species that readily colonise crops and contaminate them with toxins in the field or after harvest. Ochratoxins and aflatoxins are mycotoxins of major significance, and hence there has been significant research on a broad range of analytical and detection techniques that could be useful and practical. Due to the variety of structures of these toxins, it is impossible to use one standard technique for analysis and/or detection. Practical requirements for high-sensitivity analysis and the need for a specialist laboratory setting create challenges for routine analysis. Several existing analytical techniques, which offer flexible and broad-based methods of analysis and in some cases detection, are discussed in this manuscript. There are a number of methods used, of which many are lab-based, but to our knowledge there seems to be no single technique that stands out above the rest, although analytical liquid chromatography, commonly linked with mass spectrometry, is likely to be popular. This review manuscript discusses (a) sample pre-treatment methods such as liquid-liquid extraction (LLE), supercritical fluid extraction (SFE) and solid phase extraction (SPE); (b) separation methods such as thin layer chromatography (TLC), high performance liquid chromatography (HPLC), gas chromatography (GC) and capillary electrophoresis (CE); and (c) others such as ELISA. Current trends, advantages and disadvantages, and future prospects of these methods are also discussed.
DNA-based techniques for authentication of processed food and food supplements.
Lo, Yat-Tung; Shaw, Pang-Chui
2018-02-01
Authentication of food or food supplements with medicinal value is important to avoid adverse toxic effects, to protect consumer rights, and for certification purposes. Compared to morphological and spectrometric techniques, molecular authentication is found to be accurate, sensitive and reliable. However, DNA degradation and the inclusion of inhibitors may lead to failure in PCR amplification. This paper reviews the existing DNA extraction and PCR protocols, and the use of small-size DNA markers with sufficient discriminative power for molecular authentication. Various emerging molecular techniques such as isothermal amplification for on-site diagnosis, next-generation sequencing for high-throughput species identification, high resolution melting analysis for quick species differentiation, and DNA array techniques for rapid detection and quantitative determination in food products are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-02-01
Work on energy consumption in a large office building is reported, including the following tasks: (1) evaluating and testing the effectiveness of the existing ASHRAE 90-75 and 90-80 standards; (2) evaluating the effectiveness of the BEPS; (3) evaluating the effectiveness of some envelope and lighting design variables towards achieving the BEPS budgets; and (4) comparing the computer energy analysis technique, DOE-2.1, with manual calculation procedures. These tasks are the initial activities in the energy analysis of the Park Plaza Building and will serve as the basis for further understanding the results of ongoing data collection and analysis.
Strain gage measurement errors in the transient heating of structural components
NASA Technical Reports Server (NTRS)
Richards, W. Lance
1993-01-01
Significant strain-gage errors may exist in measurements acquired in transient thermal environments if conventional correction methods are applied. Conventional correction theory was modified and a new experimental method was developed to correct indicated strain data for errors created in radiant heating environments ranging from 0.6 C/sec (1 F/sec) to over 56 C/sec (100 F/sec). In some cases the new and conventional methods differed by as much as 30 percent. Experimental and analytical results were compared to demonstrate the new technique. For heating conditions greater than 6 C/sec (10 F/sec), the indicated strain data corrected with the developed technique compared much better to analysis than the same data corrected with the conventional technique.
Neufeld, E; Chavannes, N; Samaras, T; Kuster, N
2007-08-07
The modeling of thermal effects, often based on the Pennes Bioheat Equation, is becoming increasingly popular. The FDTD technique commonly used in this context suffers considerably from staircasing errors at boundaries. A new conformal technique is proposed that can easily be integrated into existing implementations without requiring a special update scheme. It scales fluxes at interfaces with factors derived from the local surface normal. The new scheme is validated using an analytical solution, and an error analysis is performed to understand its behavior. The new scheme behaves considerably better than the standard scheme. Furthermore, in contrast to the standard scheme, more accurate solutions can be obtained with it by increasing the grid resolution.
A data analysis expert system for large established distributed databases
NASA Technical Reports Server (NTRS)
Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick
1987-01-01
A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Vanderberg, J. D.; Woodbury, N. W.
1974-01-01
A method for rapidly examining the probable applicability of weight estimating formulae to a specific aerospace vehicle design is presented. The Multivariate Analysis Retrieval and Storage System (MARS) comprises three computer programs which sequentially operate on the weight and geometry characteristics of past aerospace vehicle designs. Weight and geometric characteristics are stored in a set of data bases which are fully computerized. Additional data bases are readily added to the MARS system, and/or the existing data bases may be easily expanded to include additional vehicles or vehicle characteristics.
Paper-based analytical devices for environmental analysis.
Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S
2016-03-21
The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; ongoing research in this area complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.
NASA Technical Reports Server (NTRS)
Walter, L. S.; Doan, A. S., Jr.; Wood, F. M., Jr.; Bredekamp, J. H.
1972-01-01
A combined WDS-EDS system obviates the severe X-ray peak overlap problems encountered with Na, Mg, Al and Si common to pure EDS systems. By application of easily measured empirical correction factors for pulse pile-up and peak overlaps which are normally observed in the analysis of silicate minerals, the accuracy of analysis is comparable with that expected for WDS electron microprobe analyses. The continuum backgrounds are subtracted from the spectra by a spline fitting technique based on integrated intensities between the peaks. The preprocessed data are then reduced to chemical analyses by existing data reduction programs.
Methods for geochemical analysis
Baedecker, Philip A.
1987-01-01
The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.
3D shape measurement of automotive glass by using a fringe reflection technique
NASA Astrophysics Data System (ADS)
Skydan, O. A.; Lalor, M. J.; Burton, D. R.
2007-01-01
In the automotive and glass making industries, there is a need for accurately measuring the 3D shapes of reflective surfaces to speed up and ensure product development and manufacturing quality by using non-contact techniques. This paper describes a technique for the measurement of non-full-field reflective surfaces of automotive glass by using a fringe reflection technique. Physical properties of the measurement surfaces do not allow us to apply optical geometries used in existing techniques for surface measurement based upon direct fringe pattern illumination. However, this property of surface reflectivity can be used to implement similar ideas from existing techniques in a new, improved method. In other words, the reflective surface can be used as a mirror to reflect illuminated fringe patterns onto a screen behind. It has been found that, in the case of implementing the reflective fringe technique, the phase-shift distribution depends not only on the height of the object but also on the slope at each measurement point. This requires solving differential equations to find the surface slope and height distributions in the x and y directions and the development of additional height reconstruction algorithms. The main focus has been on developing a mathematical model of the optical sub-system and discussing ways for its practical implementation, including calibration procedures. A number of implemented image processing algorithms for system calibration and data analysis are discussed, and two experimental results are given for automotive glass surfaces with different shapes and defects. The proposed technique showed the ability to provide accurate non-destructive measurement of 3D shapes of reflective automotive glass surfaces and can be used as a key element of a glass shape quality control system on-line or in a laboratory environment.
Periodic solution of neutral Lotka-Volterra system with periodic delays
NASA Astrophysics Data System (ADS)
Liu, Zhijun; Chen, Lansun
2006-12-01
A nonautonomous n-species Lotka-Volterra system with neutral delays is investigated. A set of verifiable sufficient conditions is derived for the existence of at least one strictly positive periodic solution of this Lotka-Volterra system by applying an existence theorem and some analysis techniques, where the assumptions of the existence theorem are different from those of Gaines and Mawhin's continuation theorem [R.E. Gaines, J.L. Mawhin, Coincidence Degree and Nonlinear Differential Equations, Springer-Verlag, Berlin, 1977] and those of the abstract continuation theory for k-set contractions [W. Petryshyn, Z. Yu, Existence theorem for periodic solutions of higher order nonlinear periodic boundary value problems, Nonlinear Anal. 6 (1982) 943-969]. Moreover, a problem proposed by Freedman and Wu [H.I. Freedman, J. Wu, Periodic solution of single species models with periodic delay, SIAM J. Math. Anal. 23 (1992) 689-701] is answered.
NASA Astrophysics Data System (ADS)
Zhang, Ji; Ding, Mingyue; Yuchi, Ming; Hou, Wenguang; Ye, Huashan; Qiu, Wu
2010-03-01
Factor analysis is an efficient technique for the analysis of dynamic structures in medical image sequences and has recently been used in contrast-enhanced ultrasound (CEUS) of hepatic perfusion. Time-intensity curves (TICs) extracted by factor analysis can provide much more diagnostic information for radiologists and improve the diagnostic rate of focal liver lesions (FLLs). However, one of the major drawbacks of factor analysis of dynamic structures (FADS) is the nonuniqueness of the result when only the non-negativity criterion is used. In this paper, we propose a new replace-approximation method based on apex-seeking for ambiguous FADS solutions. Due to a partial overlap of different structures, factor curves are assumed to be approximately replaced by curves existing in the medical image sequences. Therefore, how to find optimal curves is the key point of the technique. No matter how many structures are assumed, our method always starts seeking apexes from the one-dimensional space onto which the original high-dimensional data are mapped. By finding two stable apexes in one-dimensional space, the method can ascertain the third one. The process can be continued until all structures are found. This technique was tested on two phantoms of blood perfusion and compared to two variants of the apex-seeking method. The results showed that the technique outperformed the two variants in the comparison of region-of-interest measurements from phantom data. It can be applied to the estimation of TICs derived from CEUS images and the separation of different physiological regions in hepatic perfusion.
Evidence-based surgery: barriers, solutions, and the role of evidence synthesis.
Garas, George; Ibrahim, Amel; Ashrafian, Hutan; Ahmed, Kamran; Patel, Vanash; Okabayashi, Koji; Skapinakis, Petros; Darzi, Ara; Athanasiou, Thanos
2012-08-01
Surgery is a rapidly evolving field, making the rigorous testing of emerging innovations vital. However, most surgical research fails to employ randomized controlled trials (RCTs) and has largely been based on low-quality study designs. Subsequently, the analysis of data through meta-analysis and evidence synthesis is particularly difficult. Through a systematic review of the literature, this article explores the barriers to achieving a strong evidence base in surgery and offers potential solutions to overcome them. Many barriers to evidence-based surgical research exist. They include enabling factors, such as funding, time, infrastructure, patient preference and ethical issues, as well as barriers associated with specific attributes of researchers, methodologies, or interventions. Novel evidence synthesis techniques in surgery are discussed, including graphics synthesis, treatment networks, and network meta-analyses that help overcome many of the limitations associated with existing techniques. They offer the opportunity to assess gaps and quantitatively present inconsistencies within the existing evidence of RCTs. Poorly or inadequately performed RCTs and meta-analyses can give rise to incorrect results and thus fail to inform clinical practice or revise policy. The above barriers can be overcome by providing academic leadership and good organizational support to ensure that adequate personnel, resources, and funding are allocated to the researcher. Training in research methodology and data interpretation can ensure that trials are conducted correctly and evidence is adequately synthesized and disseminated. The ultimate goal of overcoming the barriers to evidence-based surgery is improved quality of patient care in addition to enhanced patient outcomes.
Realising the knowledge spiral in healthcare: the role of data mining and knowledge management.
Wickramasinghe, Nilmini; Bali, Rajeev K; Gibbons, M Chris; Schaffer, Jonathan
2008-01-01
Knowledge Management (KM) is an emerging business approach aimed at solving current problems, such as competitiveness and the need to innovate, which are faced by businesses today. The premise for the need for KM is based on a paradigm shift in the business environment where knowledge is central to organizational performance. Organizations trying to embrace KM have many tools, techniques and strategies at their disposal. A vital technique in KM is data mining, which enables critical knowledge to be gained from the analysis of large amounts of data and information. The healthcare industry is a very information-rich industry. The collecting of data and information permeates most, if not all, areas of this industry; however, the healthcare industry has yet to fully embrace KM, let alone the new evolving techniques of data mining. In this paper, we demonstrate the ubiquitous benefits of data mining and KM to healthcare by highlighting their potential to enable and facilitate superior clinical practice and administrative management. Specifically, we show how data mining can realize the knowledge spiral by effecting the four key transformations identified by Nonaka of turning: (1) existing explicit knowledge to new explicit knowledge, (2) existing explicit knowledge to new tacit knowledge, (3) existing tacit knowledge to new explicit knowledge and (4) existing tacit knowledge to new tacit knowledge. This is done through the establishment of theoretical models that respectively identify the function of the knowledge spiral and the powers of data mining, both exploratory and predictive, in the knowledge discovery process. Our models are then applied to a healthcare data set to demonstrate the potential of this approach as well as the implications of such an approach for the clinical and administrative aspects of healthcare. Further, we demonstrate how these techniques can help hospitals address the six healthcare quality dimensions identified by the Committee for Quality Healthcare.
Concrete Condition Assessment Using Impact-Echo Method and Extreme Learning Machines
Zhang, Jing-Kui; Yan, Weizhong; Cui, De-Mi
2016-01-01
The impact-echo (IE) method is a popular non-destructive testing (NDT) technique widely used for measuring the thickness of plate-like structures and for detecting certain defects inside concrete elements or structures. However, the IE method is not effective for full condition assessment (i.e., defect detection, defect diagnosis, defect sizing and location), because the simple frequency spectrum analysis involved in the existing IE method is not sufficient to capture the IE signal patterns associated with different conditions. In this paper, we attempt to enhance the IE technique and enable it for full condition assessment of concrete elements by introducing advanced machine learning techniques for performing comprehensive analysis and pattern recognition of IE signals. Specifically, we use wavelet decomposition for extracting signatures or features out of the raw IE signals and apply extreme learning machine, one of the recently developed machine learning techniques, as classification models for full condition assessment. To validate the capabilities of the proposed method, we build a number of specimens with various types, sizes, and locations of defects and perform IE testing on these specimens in a lab environment. Based on analysis of the collected IE signals using the proposed machine learning based IE method, we demonstrate that the proposed method is effective in performing full condition assessment of concrete elements or structures. PMID:27023563
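A minimal sketch of the pipeline described above, under the assumption of synthetic stand-ins for the recorded IE signals and condition labels: wavelet sub-band energies (via PyWavelets) serve as features, and a small extreme learning machine written directly in NumPy (random hidden weights, least-squares output weights) acts as the classifier. The wavelet choice, hidden-layer size and label set are illustrative, not the paper's settings.

```python
# Wavelet feature extraction + extreme learning machine, as a hedged sketch.
import numpy as np
import pywt

def wavelet_features(signal, wavelet="db4", level=4):
    # Energy of each wavelet sub-band as a compact signature of the IE signal.
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

class ELM:
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        # Random, untrained input weights; only the output weights are solved for.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        self.beta = np.linalg.pinv(H) @ np.eye(y.max() + 1)[y]  # one-hot targets
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)

# Usage with synthetic stand-ins for recorded IE signals and condition labels.
signals = np.random.default_rng(1).normal(size=(40, 2048))
labels = np.random.default_rng(2).integers(0, 3, size=40)  # e.g. sound / void / delamination
X = np.vstack([wavelet_features(s) for s in signals])
clf = ELM().fit(X, labels)
print(clf.predict(X[:5]))
```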
Multiplex gas chromatography for use in space craft
NASA Technical Reports Server (NTRS)
Valentin, J. R.
1985-01-01
Gas chromatography is a powerful technique for the analysis of gaseous mixtures. Some limitations in this technique still exist which can be alleviated with multiplex gas chromatography (MGC). In MGC, rapid multiple sample injections are made into the column without having to wait for one determination to be finished before taking a new sample. The resulting data must then be reduced using computational methods such as cross correlation. In order to efficiently perform multiplex gas chromatography experiments in the laboratory and on board future spacecraft, skills, equipment, and computer software were developed. Three new techniques for modulating, i.e., changing, sample concentrations were demonstrated by using desorption, decomposition, and catalytic modulators. In all of them, the need for a separate gas stream as the carrier was avoided by placing the modulator at the head of the column to directly modulate a sample stream. Finally, the analysis of an environmental sample by multiplex chromatography was accomplished by employing silver oxide to catalytically modulate methane in ambient air.
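The cross-correlation data reduction mentioned above can be illustrated numerically. In the hedged sketch below, a hypothetical single-injection chromatogram is convolved with a pseudo-random binary injection sequence to mimic the overlapped detector record, and circular cross-correlation with the known sequence recovers an estimate of that chromatogram (up to a scale factor and a baseline offset); peak positions, noise level and sequence length are assumptions.

```python
# Cross-correlation recovery of a chromatogram from multiplexed injections.
import numpy as np

rng = np.random.default_rng(0)
n = 512
t = np.arange(n)

# Hypothetical single-injection response: two Gaussian peaks.
chromatogram = np.exp(-0.5 * ((t - 120) / 8) ** 2) + 0.6 * np.exp(-0.5 * ((t - 300) / 15) ** 2)

# Pseudo-random binary injection sequence (1 = sample injected at that instant).
injections = rng.integers(0, 2, size=n)

# Detector output: superposition of all overlapping injections, plus noise.
detector = np.real(np.fft.ifft(np.fft.fft(injections) * np.fft.fft(chromatogram)))
detector += rng.normal(scale=0.05, size=n)

# Circular cross-correlation of the detector record with the injection sequence
# recovers the single-injection response up to a scale factor and a baseline.
recovered = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(injections))))
recovered /= injections.sum()
print("recovered main peak near index", int(np.argmax(recovered)))
```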
Prakash, M; Geetha, D; Lydia Caroline, M
2013-04-15
Single crystals of L-phenylalanine-benzoic acid (LPBA) were successfully grown from aqueous solution by the solvent evaporation technique. The purity of the crystals was increased by recrystallization. XRD analysis confirms that the crystal belongs to the monoclinic system with the noncentrosymmetric space group P21. The chemical structure of the compound was established by the FT-NMR technique. The presence of functional groups was estimated qualitatively by Fourier transform infrared (FT-IR) analysis. Ultraviolet-visible spectral analyses showed that the crystal has a low UV cut-off at 254 nm combined with very good transparency of 90% over a wide range. The optical band gap was estimated to be 6.91 eV. Thermal behavior has been studied with TGA/DTA analyses. The second harmonic generation (SHG) efficiency was found to be 0.56 times that of KDP. The dielectric behavior of the sample was also studied for the first time. Copyright © 2013 Elsevier B.V. All rights reserved.
Machin, Laura
2011-11-01
The aim of the paper is to examine how those working in, using and regulating assisted conception clinics discussed infertility counselling and its provision within the context of embryo donation and in vitro fertilisation. 35 participants were recruited for semi-structured, face-to-face interviews. All data were analysed using thematic analysis. The thematic analysis revealed recurring themes based upon the portrayals of infertility counselling, embryo donation and in vitro fertilisation. This paper suggests that an implicit hierarchy exists around those using assisted conception techniques and their infertility counselling requirements, which was dependent upon the assisted conception technique used. As a result, some people using assisted conception techniques felt that their needs had been overlooked due to this covert hierarchy. Those working in, using or regulating assisted conception clinics should not view infertility counselling as restricted to treatments involving donation, or solely for people within the clinical system. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Fast probabilistic file fingerprinting for big data
2013-01-01
Background: Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. Results: We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Conclusions: Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff. PMID:23445565
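The sampling idea behind probabilistic fingerprinting can be sketched in a few lines: hash a fixed number of pseudo-randomly chosen blocks (plus the file size) instead of reading the file in full. This is not the pfff tool itself; the block count, block size, seed handling and use of SHA-1 are illustrative assumptions.

```python
# Hedged sketch of a sampling-based file fingerprint (not the pfff implementation).
import hashlib
import os
import random

def probabilistic_fingerprint(path, n_blocks=64, block_size=1024, seed=0):
    size = os.path.getsize(path)
    h = hashlib.sha1()
    h.update(str(size).encode())              # file size is part of the fingerprint
    rng = random.Random(seed)                 # same seed -> same sampled offsets
    offsets = sorted(rng.randrange(max(size - block_size, 1)) for _ in range(n_blocks))
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            h.update(f.read(block_size))      # hash only the sampled blocks
    return h.hexdigest()

# Two files are "probably identical" if their fingerprints match for the same seed.
# print(probabilistic_fingerprint("reads.fastq"))   # hypothetical file name
```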
A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.
The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph, job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis is discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
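A minimal sketch of the graph synthesis step described above, with jobs as vertices and edges weighted by the number of shared compute nodes; the job records are invented stand-ins for what a real system would pull from the queuing database, and NetworkX is used only as a convenient graph container.

```python
# Jobs become vertices; an edge weighted by shared compute nodes links any two
# jobs that ran on overlapping hardware. Job records here are illustrative.
import itertools
import networkx as nx

jobs = {
    "job101": {"user": "alice", "nodes": {"n01", "n02", "n03"}},
    "job102": {"user": "bob",   "nodes": {"n03", "n04"}},
    "job103": {"user": "alice", "nodes": {"n05"}},
}

G = nx.Graph()
for job, attrs in jobs.items():
    G.add_node(job, user=attrs["user"])

for (a, da), (b, db) in itertools.combinations(jobs.items(), 2):
    shared = da["nodes"] & db["nodes"]
    if shared:
        G.add_edge(a, b, weight=len(shared))   # relationship: shared hardware

# Clusters of jobs connected through shared nodes, e.g. for failure analysis.
print([sorted(c) for c in nx.connected_components(G)])
```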
The use of virtual environments for percentage view analysis.
Schofield, Damian; Cox, Christopher J B
2005-09-01
It is recognised that Visual Impact Assessment (VIA), unlike many other aspects of Environmental Impact Assessment (EIA), relies less upon measurement than upon experience and judgement. Hence, a more structured and consistent approach towards VIA is necessary, reducing the amount of bias and subjectivity. For proposed developments, there are very few quantitative techniques for the evaluation of visibility, and these existing methods can be highly inaccurate and time consuming. Percentage view changes are one of the few quantitative techniques, and the use of computer technology can reduce the inaccuracy and the time spent evaluating the visibility of either existing or proposed developments. For over 10 years, research work undertaken by the authors at the University of Nottingham has employed Computer Graphics (CG) and Virtual Reality (VR) in civilian and industrial contexts for environmental planning, design visualisation, accident reconstruction, risk analysis, data visualisation and training simulators. This paper describes a method to quantitatively assess the visual impact of proposed developments on the landscape using CG techniques. This method allows the determination of accurate percentage view changes with the use of a computer-generated model of the environment and the application of specialist software that has been developed at the University of Nottingham. The principles are easy to understand, and therefore planners, authorisation agencies and members of the public can use and understand the results. A case study is shown to demonstrate the application and the capabilities of the technology.
An Experimental Study on Strengthening of Reinforced Concrete Flexural Members using Steel Wire Mesh
NASA Astrophysics Data System (ADS)
Al Saadi, Hamza Salim Mohammed; Mohandas, Hoby P.; Namasivayam, Aravind
2017-01-01
One of the major challenges and contemporary research topics in the field of structural engineering is the strengthening of existing structural elements using materials readily available in the market. Several investigations have been conducted on strengthening various structural components using traditional and advanced materials. Many researchers have tried to enhance the strength of reinforced concrete (RC) beams using steel plates and Glass and Carbon Fibre Reinforced Polymers (GFRP and CFRP). Because of the high weight-to-strength ratio and issues of strength compatibility between FRP composites and steel bars, steel plates and GFRP and CFRP composites are not practically used for strengthening works. Hence, in the present work the suitability of using wire mesh for strengthening RC flexural members is studied through experimental work. A new strengthening technique using wire mesh, with a view to improving sectional properties and subsequently the flexural strength of RC beams, is adopted in this work. The experimental and theoretical results were compared, and good correlation was found between them. The experimental results indicate that strengthening RC beams with steel wire mesh is an easy technique for strengthening existing flexural members.
Structural design using equilibrium programming formulations
NASA Technical Reports Server (NTRS)
Scotti, Stephen J.
1995-01-01
Solutions to increasingly larger structural optimization problems are desired. However, computational resources are strained to meet this need. New methods will be required to solve increasingly larger problems. The present approaches to solving large-scale problems involve approximations for the constraints of structural optimization problems and/or decomposition of the problem into multiple subproblems that can be solved in parallel. An area of game theory, equilibrium programming (also known as noncooperative game theory), can be used to unify these existing approaches from a theoretical point of view (considering the existence and optimality of solutions), and be used as a framework for the development of new methods for solving large-scale optimization problems. Equilibrium programming theory is described, and existing design techniques such as fully stressed design and constraint approximations are shown to fit within its framework. Two new structural design formulations are also derived. The first new formulation is another approximation technique which is a general updating scheme for the sensitivity derivatives of design constraints. The second new formulation uses a substructure-based decomposition of the structure for analysis and sensitivity calculations. Significant computational benefits of the new formulations compared with a conventional method are demonstrated.
NASA Astrophysics Data System (ADS)
Hu, Nan; Chen, Dajing; Wang, Dong; Huang, Shicheng; Trase, Ian; Grover, Hannah M.; Yu, Xiaojiao; Zhang, John X. J.; Chen, Zi
2018-02-01
Kirigami, a modified form of origami which includes cutting, has been used to improve material stretchability and compliance. However, this technique is, so far, underexplored in patterning piezoelectric materials towards developing efficient and mechanically flexible thin-film energy generators. Motivated by existing kirigami-based applications, we introduce interdigitated cuts to polyvinylidene fluoride (PVDF) films to evaluate the effect on voltage generation and stretchability. Our results from theoretical analysis, numerical simulations, and experimental tests show that kirigami PVDF films exhibit an extended strain range while still maintaining significant voltage generation compared to films without cuts. Various cutting patterns are studied, and it is found that films with denser cuts have a larger voltage output. This kirigami design can enhance the properties of existing piezoelectric materials and help to integrate tunable PVDF generators into biomedical devices.
[Popular wisdom: its existence in the university environment].
Barbosa, Maria Alves; de Melo, Marcia Borges; Júnior, Raul Soares Silveira; Brasil, Virginia Visconde; Martins, Cleusa Alves; Bezerra, Ana Lúcia Queiroz
2004-01-01
Nowadays, myths and superstitions are present in spite of scientific and technological developments, especially when trying to solve problems that escape human understanding. This study was aimed at determining the existence of superstitions and myths in the university community, investigating their origins, influences, adoption and credibility, and correlating them with people's level of knowledge. It is a descriptive/analytical study conducted at Teaching Units in the Area of Health of the Federal University of Goiás. The technique of content analysis was utilized for data analysis. Two categories were created: Personal Attitudes related to Superstitions, and Influences and Destruction of Superstitions. It was found that there is a clash between popular and scientific knowledge, leading either to the exclusion of popular wisdom, to its 'veiled' maintenance, or even to an alliance between the two types of knowledge.
ERIC Educational Resources Information Center
Abdullah, Nurdiana; Surif, Johari
2015-01-01
This study is conducted with the purpose of identifying the alternative framework contained in students' imagination of the concept of matter at the submicroscopic level. Through a purposive sampling technique, a total of 15 students were interviewed to obtain the data. Data from document analysis are utilized to strengthen the interviews.…
NASA Technical Reports Server (NTRS)
Chin, J.; Barbero, P.
1975-01-01
The revision of an existing digital program to analyze the stability of models mounted on a two-cable mount system used in a transonic dynamics wind tunnel is presented. The program revisions and analysis of an active feedback control system to be used for controlling the free-flying models are treated.
Innovative and Cost Effective Remediation of Orbital Debris
2014-04-25
to face international opposition because it could be used offensively to disable spacecraft. Technical Analysis: Most of StreamSat's... LDR). They demonstrated droplet dispersion of less than 1 micro radian for some generators and devised an instrument for measuring the...error can be limited to less than one micro radian using existing technology and techniques. During transit, external forces will alter the path of
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
Propulsion system/flight control integration for supersonic aircraft
NASA Technical Reports Server (NTRS)
Reukauf, P. J.; Burcham, F. W., Jr.
1976-01-01
Digital integrated control systems are studied. Such systems allow minimization of undesirable interactions while maximizing performance at all flight conditions. One such program is the YF-12 cooperative control program. The existing analog air data computer, autothrottle, autopilot, and inlet control systems are converted to digital systems by using a general purpose airborne computer and interface unit. Existing control laws are programed and tested in flight. Integrated control laws, derived using accurate mathematical models of the airplane and propulsion system in conjunction with modern control techniques, are tested in flight. Analysis indicates that an integrated autothrottle autopilot gives good flight path control and that observers are used to replace failed sensors.
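The abstract notes that observers can stand in for failed sensors. The sketch below shows the generic discrete-time Luenberger observer that underlies this idea, propagating a model of the plant and correcting it with whatever measurement is still available; the system matrices and observer gain are illustrative assumptions, not the YF-12 model.

```python
# Generic discrete-time Luenberger observer reconstructing an unmeasured state.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 0.95]])   # state transition (illustrative)
B = np.array([[0.0], [0.1]])              # control input
C = np.array([[1.0, 0.0]])                # only the first state is still measured
L = np.array([[0.5], [0.3]])              # observer gain (assumed precomputed)

x = np.array([[1.0], [0.2]])              # true state (unknown to the controller)
x_hat = np.zeros((2, 1))                  # observer estimate

for _ in range(50):
    u = np.array([[0.05]])
    y = C @ x                             # the one sensor that still works
    # Observer: propagate the model and correct with the measurement residual.
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)
    x = A @ x + B @ u

print("estimation error:", np.abs(x - x_hat).ravel())
```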
Mission feasibility study of a very long baseline interferometer utilizing the space shuttle
NASA Technical Reports Server (NTRS)
Burke, B. F.
1978-01-01
An introductory overview of very long baseline interferometry (VLBI) as it exists and is used today is given and the scientific advances that have been achieved with this technique in the past decade are described. The report briefly reviews developments now in progress that will improve ground station VLBI in the next few years, and the limitations that still will exist. The advantages and the scientific return on investment that may be expected from a VLBI terminal in space are described. Practical problems that have to be faced range from system design through hardware implementation, to data recovery and analysis.
Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.
Tute, Erik; Steiner, Jochen
2018-01-01
Background: Literature describes a big potential for reuse of clinical patient data. A clinical data warehouse (CDWH) is a means for that. Objective: To support management and maintenance of processes extracting, transforming and loading (ETL) data into CDWHs, as well as to ease reuse of metadata between regular IT-management, CDWH and secondary data users, by providing a modeling approach. Methods: Expert survey and literature review to find requirements and existing modeling techniques. An ETL-modeling-technique was developed extending existing modeling techniques. Evaluation by exemplarily modeling an existing ETL-process and a second expert survey. Results: Nine experts participated in the first survey. The literature review yielded 15 included publications. Six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated. Seven experts participated in the evaluation. Conclusion: The developed approach can help in the management and maintenance of ETL-processes and could serve as an interface between regular IT-management, CDWH and secondary data users.
Expert systems tools for Hubble Space Telescope observation scheduling
NASA Technical Reports Server (NTRS)
Miller, Glenn; Rosenthal, Don; Cohen, William; Johnston, Mark
1987-01-01
The utility of expert systems techniques for the Hubble Space Telescope (HST) planning and scheduling is discussed and a plan for development of expert system tools which will augment the existing ground system is described. Additional capabilities provided by these tools will include graphics-oriented plan evaluation, long-range analysis of the observation pool, analysis of optimal scheduling time intervals, constructing sequences of spacecraft activities which minimize operational overhead, and optimization of linkages between observations. Initial prototyping of a scheduler used the Automated Reasoning Tool running on a LISP workstation.
NASA Astrophysics Data System (ADS)
Dyar, M. Darby; Giguere, Stephen; Carey, CJ; Boucher, Thomas
2016-12-01
This project examines the causes, effects, and optimization of continuum removal in laser-induced breakdown spectroscopy (LIBS) to produce the best possible prediction accuracy of elemental composition in geological samples. We compare prediction accuracy resulting from several different techniques for baseline removal, including asymmetric least squares (ALS), adaptive iteratively reweighted penalized least squares (Air-PLS), fully automatic baseline correction (FABC), continuous wavelet transformation, median filtering, polynomial fitting, the iterative thresholding Dietrich method, convex hull/rubber band techniques, and a newly-developed technique for Custom baseline removal (BLR). We assess the predictive performance of these methods using partial least-squares analysis for 13 elements of geological interest, expressed as the weight percentages of SiO2, Al2O3, TiO2, FeO, MgO, CaO, Na2O, K2O, and the parts per million concentrations of Ni, Cr, Zn, Mn, and Co. We find that previously published methods for baseline subtraction generally produce equivalent prediction accuracies for major elements. When those pre-existing methods are used, automated optimization of their adjustable parameters is always necessary to wring the best predictive accuracy out of a data set; ideally, it should be done for each individual variable. The new technique of Custom BLR produces significant improvements in prediction accuracy over existing methods across varying geological data sets, instruments, and varying analytical conditions. These results also demonstrate the dual objectives of the continuum removal problem: removing a smooth underlying signal to fit individual peaks (univariate analysis) versus using feature selection to select only those channels that contribute to best prediction accuracy for multivariate analyses. Overall, the current practice of using generalized, one-method-fits-all-spectra baseline removal results in poorer predictive performance for all methods. The extra steps needed to optimize baseline removal for each predicted variable and empower multivariate techniques with the best possible input data for optimal prediction accuracy are shown to be well worth the slight increase in necessary computations and complexity.
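For reference, the sketch below implements asymmetric least squares (ALS), one of the baseline-removal methods compared above, in its widely used penalized-least-squares form. The smoothness (lam) and asymmetry (p) values are exactly the kind of adjustable parameters the study argues should be optimized per predicted variable rather than fixed globally; the synthetic spectrum is an assumption.

```python
# Asymmetric least squares (ALS) continuum/baseline removal, a hedged sketch.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    n = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, -1, -2], shape=(n, n - 2))
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.spdiags(w, 0, n, n)
        Z = (W + lam * D @ D.T).tocsc()
        z = spsolve(Z, w * y)                    # smooth curve under the spectrum
        w = p * (y > z) + (1 - p) * (y < z)      # down-weight points above the fit
    return z

# Usage on a synthetic LIBS-like spectrum: a peak on a slowly varying continuum.
x = np.linspace(0, 1, 1000)
spectrum = 2 + 3 * x + np.exp(-0.5 * ((x - 0.4) / 0.005) ** 2)
corrected = spectrum - als_baseline(spectrum)
```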
The scale invariant generator technique for quantifying anisotropic scale invariance
NASA Astrophysics Data System (ADS)
Lewis, G. M.; Lovejoy, S.; Schertzer, D.; Pecknold, S.
1999-11-01
Scale invariance is rapidly becoming a new paradigm for geophysics. However, little attention has been paid to the anisotropy that is invariably present in geophysical fields in the form of differential stratification and rotation, texture and morphology. In order to account for scaling anisotropy, the formalism of generalized scale invariance (GSI) was developed. Until now there has existed only a single fairly ad hoc GSI analysis technique valid for studying differential rotation. In this paper, we use a two-dimensional representation of the linear approximation to generalized scale invariance, to obtain a much improved technique for quantifying anisotropic scale invariance called the scale invariant generator technique (SIG). The accuracy of the technique is tested using anisotropic multifractal simulations and error estimates are provided for the geophysically relevant range of parameters. It is found that the technique yields reasonable estimates for simulations with a diversity of anisotropic and statistical characteristics. The scale invariant generator technique can profitably be applied to the scale invariant study of vertical/horizontal and space/time cross-sections of geophysical fields as well as to the study of the texture/morphology of fields.
The influence of surface finishing methods on touch-sensitive reactions
NASA Astrophysics Data System (ADS)
Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.
2017-02-01
This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of updated, non-traditional materials and finishing techniques are appearing. The existing information-oriented society enhances the visual aesthetics of new jewelry forms, decoration techniques (depth and surface) and the synthesis of different materials, which, all in all, reveal a bias towards the positive effects of visual design. Today, the jewelry industry includes not only traditional techniques but also such improved techniques as computer-assisted design, 3D prototyping and other alternatives that produce an updated level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface) and a comparative analysis of different approaches to surface finishing. Evaluation of the appearance or effect of jewelry is based on proposed criteria, and providing an advanced visual-aesthetics basis is predicated on touch-sensitive responses.
Label-Free Imaging and Biochemical Characterization of Bovine Sperm Cells
Ferrara, Maria Antonietta; Di Caprio, Giuseppe; Managò, Stefano; De Angelis, Annalisa; Sirleto, Luigi; Coppola, Giuseppe; De Luca, Anna Chiara
2015-01-01
A full label-free morphological and biochemical characterization is desirable to select spermatozoa during preparation for artificial insemination. In order to study these fundamental parameters, we take advantage of two attractive techniques: digital holography (DH) and Raman spectroscopy (RS). DH presents new opportunities for studying the morphological aspects of cells and tissues non-invasively, quantitatively and without the need for staining or tagging, while RS is a very specific technique allowing the biochemical analysis of cellular components with a spatial resolution in the sub-micrometer range. In this paper, morphological and biochemical bovine sperm cell alterations were studied using these techniques. In addition, a complementary DH and RS study was performed to identify X- and Y-chromosome-bearing sperm cells. We demonstrate that the two techniques together are a powerful and highly efficient tool for elucidating some important criteria for sperm morphological selection and sex identification, overcoming many of the limitations associated with existing protocols. PMID:25836358
The application of welat latino for creating paes in solo wedding bride
NASA Astrophysics Data System (ADS)
Ihsani, Ade Novi Nurul; Krisnawati, Maria; Prasetyaningtyas, Wulansari; Anggraeni, Puput; Bela, Herlina Tria; Zunaedah, Putri Wahyu
2018-03-01
The purposes of this research were: 1) to find out the process of creating the innovative welat, and 2) to find out how to use the innovative welat for Solo wedding bride paes creation. The method used in the research was research and development (R & D). The sampling technique was purposive sampling, using 13 people as models. The data collection techniques were observation and documentation. The data analysis technique was descriptive analysis. The results of the study showed that 1) the welat design was changed twice during validation, with each product passing through several stages of designing, forming, determining the material and printing; and 2) the use of the welat determined the dot distance between the cengkorongan of both forms according to the existing mold. In conclusion, the innovative welat can produce paes in accordance with the standard and shorten the process.
Counterflow Dielectrophoresis for Trypanosome Enrichment and Detection in Blood
NASA Astrophysics Data System (ADS)
Menachery, Anoop; Kremer, Clemens; Wong, Pui E.; Carlsson, Allan; Neale, Steven L.; Barrett, Michael P.; Cooper, Jonathan M.
2012-10-01
Human African trypanosomiasis or sleeping sickness is a deadly disease endemic in sub-Saharan Africa, caused by single-celled protozoan parasites. Although it has been targeted for elimination by 2020, this will only be realized if diagnosis can be improved to enable identification and treatment of afflicted patients. Existing techniques of detection are restricted by their limited field-applicability, sensitivity and capacity for automation. Microfluidic-based technologies offer the potential for highly sensitive automated devices that could achieve detection at the lowest levels of parasitemia and consequently help in the elimination programme. In this work we implement an electrokinetic technique for the separation of trypanosomes from both mouse and human blood. This technique utilises differences in polarisability between the blood cells and trypanosomes to achieve separation through opposed bi-directional movement (cell counterflow). We combine this enrichment technique with an automated image analysis detection algorithm, negating the need for a human operator.
Civil infrastructure monitoring for IVHS using optical fiber sensors
NASA Astrophysics Data System (ADS)
de Vries, Marten J.; Arya, Vivek; Grinder, C. R.; Murphy, Kent A.; Claus, Richard O.
1995-01-01
Early deployment of Intelligent Vehicle Highway Systems would necessitate the internal instrumentation of infrastructure for emergency preparedness. Existing quantitative analysis and visual analysis techniques are time consuming, cost prohibitive, and are often unreliable. Fiber optic sensors are rapidly replacing conventional instrumentation because of their small size, light weight, immunity to electromagnetic interference, and extremely high information carrying capability. In this paper research on novel optical fiber sensing techniques for health monitoring of civil infrastructure such as highways and bridges is reported. Design, fabrication, and implementation of fiber optic sensor configurations used for measurements of strain are discussed. Results from field tests conducted to demonstrate the effectiveness of fiber sensors at determining quantitative strain vector components near crack locations in bridges are presented. Emerging applications of fiber sensors for vehicle flow, vehicle speed, and weigh-in-motion measurements are also discussed.
Khusainov, Rinat; Azzi, Djamel; Achumba, Ifeyinwa E.; Bersch, Sebastian D.
2013-01-01
Automated methods of real-time, unobtrusive, human ambulation, activity, and wellness monitoring and data analysis using various algorithmic techniques have been subjects of intense research. The general aim is to devise effective means of addressing the demands of assisted living, rehabilitation, and clinical observation and assessment through sensor-based monitoring. The research studies have resulted in a large amount of literature. This paper presents a holistic articulation of the research studies and offers comprehensive insights along four main axes: distribution of existing studies; monitoring device framework and sensor types; data collection, processing and analysis; and applications, limitations and challenges. The aim is to present a systematic and most complete study of literature in the area in order to identify research gaps and prioritize future research directions. PMID:24072027
Image analysis software for following progression of peripheral neuropathy
NASA Astrophysics Data System (ADS)
Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy
2009-02-01
A relationship has been reported by several research groups [1 - 4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, which include diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out with a database of subjects with varying levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.
Requeno, José Ignacio; Colom, José Manuel
2014-12-01
Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.
Robust passivity analysis for discrete-time recurrent neural networks with mixed delays
NASA Astrophysics Data System (ADS)
Huang, Chuan-Kuei; Shu, Yu-Jeng; Chang, Koan-Yuh; Shou, Ho-Nien; Lu, Chien-Yu
2015-02-01
This article considers the robust passivity analysis for a class of discrete-time recurrent neural networks (DRNNs) with mixed time-delays and uncertain parameters. The mixed time-delays consist of both discrete time-varying and distributed time-delays in a given range, and the uncertain parameters are norm-bounded. The activation functions are assumed to be globally Lipschitz continuous. Based on a new bounding technique and an appropriate type of Lyapunov functional, a sufficient condition is investigated to guarantee the existence of the desired robust passivity condition for the DRNNs, which can be derived in terms of a family of linear matrix inequalities (LMIs). Some free-weighting matrices are introduced to reduce the conservatism of the criterion by using the bounding technique. A numerical example is given to illustrate the effectiveness and applicability.
Variable beam dose rate and DMLC IMRT to moving body anatomy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papiez, Lech; Abolfath, Ramin M.
2008-11-15
Derivation of formulas relating leaf speeds and beam dose rates for delivering planned intensity profiles to static and moving targets in dynamic multileaf collimator (DMLC) intensity modulated radiation therapy (IMRT) is presented. The analysis of equations determining algorithms for DMLC IMRT delivery under a variable beam dose rate reveals a multitude of possible delivery strategies for a given intensity map and for any given target motion pattern. From among all equivalent delivery strategies for DMLC IMRT treatments, specific subclasses of strategies can be selected to provide deliveries that are particularly suitable for clinical applications, provided existing delivery devices are used. Special attention is devoted to the subclass of beam dose rate variable DMLC delivery strategies to moving body anatomy that generalize existing techniques of such deliveries in Varian DMLC irradiation methodology to static body anatomy. A few examples of deliveries from this subclass of DMLC IMRT irradiations are investigated to illustrate the principle and show practical benefits of the proposed techniques.
Distributed support vector machine in master-slave mode.
Chen, Qingguo; Cao, Feilong
2018-05-01
It is well known that the support vector machine (SVM) is an effective learning algorithm. The alternating direction method of multipliers (ADMM) algorithm has emerged as a powerful technique for solving distributed optimisation models. This paper proposes a distributed SVM algorithm in a master-slave mode (MS-DSVM), which integrates a distributed SVM and ADMM acting in a master-slave configuration where the master node and slave nodes are connected, meaning the results can be broadcasted. The distributed SVM is regarded as a regularised optimisation problem and modelled as a series of convex optimisation sub-problems that are solved by ADMM. Additionally, the over-relaxation technique is utilised to accelerate the convergence rate of the proposed MS-DSVM. Our theoretical analysis demonstrates that the proposed MS-DSVM has linear convergence, meaning it possesses the fastest convergence rate among existing standard distributed ADMM algorithms. Numerical examples demonstrate that the convergence and accuracy of the proposed MS-DSVM are superior to those of existing methods under the ADMM framework. Copyright © 2018 Elsevier Ltd. All rights reserved.
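To make the consensus structure of such an ADMM-based distributed learner concrete, the following is a minimal sketch of consensus ADMM in a master-slave layout. It assumes a ridge-regularised least-squares classifier in place of the paper's hinge-loss SVM sub-problems, and all node counts, penalty and over-relaxation parameters are illustrative, not the MS-DSVM settings.

```python
# Minimal sketch of consensus ADMM in a master-slave layout (assumption: a
# ridge-regularised least-squares classifier stands in for hinge-loss SVM
# sub-problems; node counts and parameters are illustrative).
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_per_node, dim = 4, 50, 5
w_true = rng.normal(size=dim)

# Each "slave" node holds a private shard of the training data.
A = [rng.normal(size=(n_per_node, dim)) for _ in range(n_nodes)]
b = [np.sign(a @ w_true + 0.1 * rng.normal(size=n_per_node)) for a in A]

rho, lam, alpha = 1.0, 0.1, 1.6               # penalty, regulariser, over-relaxation
x = [np.zeros(dim) for _ in range(n_nodes)]   # local variables (slaves)
u = [np.zeros(dim) for _ in range(n_nodes)]   # scaled dual variables
z = np.zeros(dim)                             # consensus variable (master)

for it in range(100):
    # Slave step: each node solves its local regularised least-squares problem.
    for i in range(n_nodes):
        lhs = A[i].T @ A[i] + rho * np.eye(dim)
        rhs = A[i].T @ b[i] + rho * (z - u[i])
        x[i] = np.linalg.solve(lhs, rhs)
    # Master step: gather over-relaxed local estimates and broadcast a new z.
    x_hat = [alpha * x[i] + (1 - alpha) * z for i in range(n_nodes)]
    z = rho * sum(x_hat[i] + u[i] for i in range(n_nodes)) / (lam + n_nodes * rho)
    # Dual update on each slave.
    for i in range(n_nodes):
        u[i] = u[i] + x_hat[i] - z

acc = np.mean(np.sign(np.vstack(A) @ z) == np.concatenate(b))
print(f"consensus training accuracy: {acc:.3f}")
```

The over-relaxation step (alpha above 1) mirrors the acceleration idea mentioned in the abstract; removing it recovers plain consensus ADMM.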
Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2014-02-01
Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of the bruise healing process. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying bruise healing processes is possible. Introduction of objective fitting enables an objective comparison between the simulated and experimental PPTR signals. In this manner, we avoid reconstruction of laser-induced depth profiles and thus the inherent loss of information in that process. This approach enables us to determine the value of hemoglobin mass diffusivity, which remains controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
An introduction to chaotic and random time series analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1989-01-01
The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedding techniques.
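The second result is the basis of delay-coordinate (embedding) reconstruction. Below is a minimal sketch of that idea: a state-space trajectory is rebuilt from a single observable. The logistic-map observable and the delay and dimension values are illustrative assumptions, not taken from the report.

```python
# Minimal sketch of delay-coordinate embedding: reconstructing a state-space
# trajectory from a single observable time series (illustrative parameters).
import numpy as np

def delay_embed(x, dim, tau):
    """Stack delayed copies of x into vectors [x(t), x(t+tau), ..., x(t+(dim-1)tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Chaotic observable: iterate the logistic map x_{n+1} = 4 x_n (1 - x_n).
x = np.empty(2000)
x[0] = 0.2
for n in range(1, len(x)):
    x[n] = 4.0 * x[n - 1] * (1.0 - x[n - 1])

emb = delay_embed(x, dim=3, tau=1)
print(emb.shape)          # (1998, 3): points of the reconstructed trajectory
```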
Progress in The Semantic Analysis of Scientific Code
NASA Technical Reports Server (NTRS)
Stewart, Mark
2000-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
Perceptual distortion analysis of color image VQ-based coding
NASA Astrophysics Data System (ADS)
Charrier, Christophe; Knoblauch, Kenneth; Cherifi, Hocine
1997-04-01
It is generally accepted that an RGB color image can be easily encoded by using a gray-scale compression technique on each of the three color planes. Such an approach, however, fails to take into account correlations existing between color planes and perceptual factors. We evaluated several linear and non-linear color spaces, some introduced by the CIE, compressed with the vector quantization technique for minimum perceptual distortion. To study these distortions, we measured contrast and luminance of the video framebuffer to precisely control color. We then obtained psychophysical judgements to measure how well these methods work to minimize perceptual distortion in a variety of color spaces.
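For readers unfamiliar with vector quantization of color images, the sketch below clusters pixel 3-vectors with a k-means codebook. The use of scikit-learn, the synthetic test image and the codebook size are assumptions for illustration only; no perceptual distortion measure or CIE color-space conversion is included.

```python
# Minimal sketch of VQ coding of a colour image: pixels are treated as 3-vectors
# in a chosen colour space and quantised with a k-means codebook (scikit-learn
# and the synthetic test image are assumptions; no perceptual metric included).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3)).astype(float)  # stand-in RGB image

pixels = img.reshape(-1, 3)
codebook_size = 16
kmeans = KMeans(n_clusters=codebook_size, n_init=4, random_state=0).fit(pixels)

indices = kmeans.predict(pixels)               # what would be transmitted/stored
decoded = kmeans.cluster_centers_[indices].reshape(img.shape)

mse = np.mean((img - decoded) ** 2)
print(f"codebook size {codebook_size}, per-pixel MSE {mse:.1f}")
```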
Advances in numerical and applied mathematics
NASA Technical Reports Server (NTRS)
South, J. C., Jr. (Editor); Hussaini, M. Y. (Editor)
1986-01-01
This collection of papers covers some recent developments in numerical analysis and computational fluid dynamics. Some of these studies are of a fundamental nature. They address basic issues such as intermediate boundary conditions for approximate factorization schemes, existence and uniqueness of steady states for time dependent problems, and pitfalls of implicit time stepping. The other studies deal with modern numerical methods such as total variation diminishing schemes, higher order variants of vortex and particle methods, spectral multidomain techniques, and front tracking techniques. There is also a paper on adaptive grids. The fluid dynamics papers treat the classical problems of incompressible flows in helically coiled pipes, vortex breakdown, and transonic flows.
A Survey of Ballistic Transfers to the Lunar Surface
NASA Technical Reports Server (NTRS)
Anderson, Rodney L.; Parker, Jeffrey S.
2011-01-01
In this study techniques are developed which allow an analysis of a range of different types of transfer trajectories from the Earth to the lunar surface. Trajectories ranging from those obtained using the invariant manifolds of unstable orbits to those derived from collision orbits are analyzed. These techniques allow the computation of trajectories encompassing low-energy trajectories as well as more direct transfers. The range of possible trajectory options is summarized, and a broad range of trajectories that exist as a result of the Sun's influence are computed and analyzed. The results are then classified by type, and trades between different measures of cost are discussed.
Study of inelastic e-Cd and e-Zn collisions
NASA Astrophysics Data System (ADS)
Piwinski, Mariusz; Klosowski, Lukasz; Dziczek, Darek; Chwirot, Stanislaw
2016-09-01
Electron-photon coincidence experiments are well known for providing more detailed information about electron-atom collisions than any other technique. The Electron Impact Coherence Parameters (EICP) values obtained in such studies deliver the most complete characterization of the inelastic collision and allow for a verification of proposed theoretical models. We present the results of Stokes and EICP parameters characterising electronic excitation of the lowest singlet P-state of cadmium and zinc atoms for various collision energies. The experiments were performed using the electron-photon coincidence technique in the coherence analysis version. The obtained data are presented and compared with existing CCC and RDWA theoretical predictions.
Comparison of public peak detection algorithms for MALDI mass spectrometry data analysis.
Yang, Chao; He, Zengyou; Yu, Weichuan
2009-01-06
In mass spectrometry (MS) based proteomic data analysis, peak detection is an essential step for subsequent analysis. Recently, there has been significant progress in the development of various peak detection algorithms. However, neither a comprehensive survey nor an experimental comparison of these algorithms is yet available. The main objective of this paper is to provide such a survey and to compare the performance of single spectrum based peak detection methods. In general, we can decompose a peak detection procedure into three consecutive parts: smoothing, baseline correction and peak finding. We first categorize existing peak detection algorithms according to the techniques used in different phases. Such a categorization reveals the differences and similarities among existing peak detection algorithms. Then, we choose five typical peak detection algorithms to conduct a comprehensive experimental study using both simulation data and real MALDI MS data. The results of comparison show that the continuous wavelet-based algorithm provides the best average performance.
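The three-stage decomposition described above (smoothing, baseline correction, peak finding) can be illustrated with a short sketch on a synthetic spectrum. The SciPy routines, the rolling-percentile baseline and every parameter value are assumptions; real MALDI data and the wavelet-based algorithm compared in the paper are not reproduced here.

```python
# Minimal sketch of the three-stage peak detection pipeline the survey describes:
# smoothing, baseline correction, peak finding (SciPy routines and all parameter
# values are assumptions; data are synthetic, not real MALDI spectra).
import numpy as np
from scipy.signal import savgol_filter, find_peaks

rng = np.random.default_rng(1)
mz = np.linspace(1000, 2000, 4000)
spectrum = (np.exp(-0.002 * (mz - 1000))            # sloping baseline
            + 0.8 * np.exp(-0.5 * ((mz - 1300) / 2) ** 2)
            + 0.5 * np.exp(-0.5 * ((mz - 1650) / 2) ** 2)
            + 0.05 * rng.normal(size=mz.size))      # noise

smoothed = savgol_filter(spectrum, window_length=21, polyorder=3)   # 1) smoothing
baseline = np.percentile(
    np.lib.stride_tricks.sliding_window_view(smoothed, 401), 10, axis=1)
baseline = np.pad(baseline, (200, 200), mode="edge")                # 2) baseline
corrected = smoothed - baseline
peaks, _ = find_peaks(corrected, height=0.2, distance=50)           # 3) peak finding
print("detected peak m/z values:", mz[peaks])
```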
A strategy for reducing turnaround time in design optimization using a distributed computer system
NASA Technical Reports Server (NTRS)
Young, Katherine C.; Padula, Sharon L.; Rogers, James L.
1988-01-01
There is a need to explore methods for reducing lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.
Fibre Optic Sensors for Selected Wastewater Characteristics
Chong, Su Sin; Abdul Aziz, A. R.; Harun, Sulaiman W.
2013-01-01
Demand for online and real-time measurement techniques to meet environmental regulation and treatment compliance is increasing. However, the conventional techniques, which involve scheduled sampling and chemical analysis, can be expensive and time consuming. Therefore cheaper and faster alternatives to conventional methods are required for monitoring wastewater characteristics. This paper reviews existing conventional techniques and optical and fibre optic sensors for determining selected wastewater characteristics, namely colour, Chemical Oxygen Demand (COD) and Biological Oxygen Demand (BOD). The review confirms that with appropriate configuration, calibration and fibre features, the parameters can be determined with accuracy comparable to conventional methods. With more research in this area, the potential for using FOS for online and real-time measurement of more wastewater parameters for various types of industrial effluent is promising. PMID:23881131
Novel applications of X-ray photoelectron spectroscopy on unsupported nanoparticles
NASA Astrophysics Data System (ADS)
Kostko, Oleg; Xu, Bo; Jacobs, Michael I.; Ahmed, Musahid
X-ray photoelectron spectroscopy (XPS) is a powerful technique for chemical analysis of surfaces. We will present novel results of XPS on unsupported, gas-phase nanoparticles using a velocity-map imaging (VMI) spectrometer. This technique allows for probes of both the surfaces of nanoparticles via XPS and their interiors via near edge X-ray absorption fine structure (NEXAFS) spectroscopy. A recent application of this technique has confirmed that arginine's guanidinium group exists in a protonated state even in strongly basic solution. Moreover, core-level photoelectron spectroscopy can provide information on the effective attenuation length (EAL) of low kinetic energy electrons. Reported values of this quantity are contradictory, yet it is important for determining the probing depth of XPS and in photolithography. A new method for determining EALs will be presented.
A fast efficient implicit scheme for the gasdynamic equations using a matrix reduction technique
NASA Technical Reports Server (NTRS)
Barth, T. J.; Steger, J. L.
1985-01-01
An efficient implicit finite-difference algorithm for the gasdynamic equations utilizing matrix reduction techniques is presented. A significant reduction in arithmetic operations is achieved without loss of the stability characteristics or generality found in the Beam and Warming approximate factorization algorithm. Steady-state solutions to the conservative Euler equations in generalized coordinates are obtained for transonic flows and used to show that the method offers computational advantages over the conventional Beam and Warming scheme. Existing Beam and Warming codes can be retrofitted with minimal effort. The theoretical extension of the matrix reduction technique to the full Navier-Stokes equations in Cartesian coordinates is presented in detail. Linear stability, using a Fourier stability analysis, is demonstrated and discussed for the one-dimensional Euler equations.
Suppression of ambipolar current in tunnel FETs using drain-pocket: Proposal and analysis
NASA Astrophysics Data System (ADS)
Garg, Shelly; Saurabh, Sneh
2018-01-01
In this paper, we investigate the impact of a drain-pocket (DP) adjacent to the drain region in Tunnel Field-Effect Transistors (TFETs) to effectively suppress the ambipolar current. Using calibrated two-dimensional device simulation, we examine the impact of the DP in a Double Gate TFET (DGTFET). We demonstrate the superiority of the DP technique over existing techniques in controlling the ambipolar current. In particular, the addition of a DP to a TFET is able to fully suppress the ambipolar current even when the TFET is biased at high negative gate voltages and the drain doping is kept as high as the source doping. Moreover, adding a DP is complementary to the well-known technique of employing a source-pocket (SP) in a TFET, since both require a similar doping type and doping concentration.
Objective analysis of tidal fields in the Atlantic and Indian Oceans
NASA Technical Reports Server (NTRS)
Sanchez, B. V.; Rao, D. B.; Steenrod, S. D.
1986-01-01
An objective analysis technique has been developed to extrapolate tidal amplitudes and phases over entire ocean basins using existing gauge data and the altimetric measurements which are now beginning to be provided by satellite oceanography. The technique was previously tested in the Lake Superior basin. The method has now been developed and applied in the Atlantic-Indian ocean basins using a 6 deg x 6 deg grid to test its essential features. The functions used in the interpolation are the eigenfunctions of the velocity potential (Proudman functions), which are computed numerically from a knowledge of the basin's bottom topography, the horizontal plan form and the necessary boundary conditions. These functions are characteristic of the particular basin. The gravitational normal modes of the basin are computed as part of the investigation; they are used to obtain the theoretical forced solutions for the tidal constituents, which in turn provide the simulated data for testing the method and serve as a guide in choosing the most energetic modes for the objective analysis. The results of the objective analysis of the M2 and K1 tidal constituents indicate the possibility of recovering the tidal signal with a degree of accuracy well within the error bounds of present day satellite techniques.
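As a small illustration of constituent recovery, the sketch below fits M2 and K1 amplitudes and phases to a single gauge record by least squares. This is a generic harmonic analysis under assumed standard constituent periods and synthetic data; it is not the Proudman-function objective analysis described above.

```python
# Minimal sketch of harmonic least-squares recovery of M2 and K1 amplitudes and
# phases from a single gauge record (generic harmonic analysis for illustration,
# not the Proudman-function objective analysis; data are synthetic).
import numpy as np

M2 = 2 * np.pi / 12.4206012     # rad/hour, principal lunar semidiurnal
K1 = 2 * np.pi / 23.9344697     # rad/hour, lunisolar diurnal

t = np.arange(0.0, 30 * 24, 1.0)                     # 30 days, hourly samples
rng = np.random.default_rng(2)
eta = (0.9 * np.cos(M2 * t - 0.7) + 0.3 * np.cos(K1 * t - 1.9)
       + 0.05 * rng.normal(size=t.size))             # simulated tide gauge record

# Design matrix with a cosine/sine pair for each constituent.
G = np.column_stack([np.cos(M2 * t), np.sin(M2 * t),
                     np.cos(K1 * t), np.sin(K1 * t)])
coef, *_ = np.linalg.lstsq(G, eta, rcond=None)

for name, (a, b) in zip(("M2", "K1"), coef.reshape(2, 2)):
    amp, phase = np.hypot(a, b), np.arctan2(b, a)
    print(f"{name}: amplitude {amp:.3f}, phase {phase:.3f} rad")
```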
Extracting neuronal functional network dynamics via adaptive Granger causality analysis.
Sheikhattar, Alireza; Miran, Sina; Liu, Ji; Fritz, Jonathan B; Shamma, Shihab A; Kanold, Patrick O; Babadi, Behtash
2018-04-24
Quantifying the functional relations between the nodes in a network based on local observations is a key challenge in studying complex systems. Most existing time series analysis techniques for this purpose provide static estimates of the network properties, pertain to stationary Gaussian data, or do not take into account the ubiquitous sparsity in the underlying functional networks. When applied to spike recordings from neuronal ensembles undergoing rapid task-dependent dynamics, they thus hinder a precise statistical characterization of the dynamic neuronal functional networks underlying adaptive behavior. We develop a dynamic estimation and inference paradigm for extracting functional neuronal network dynamics in the sense of Granger, by integrating techniques from adaptive filtering, compressed sensing, point process theory, and high-dimensional statistics. We demonstrate the utility of our proposed paradigm through theoretical analysis, algorithm development, and application to synthetic and real data. Application of our techniques to two-photon Ca2+ imaging experiments from the mouse auditory cortex reveals unique features of the functional neuronal network structures underlying spontaneous activity at unprecedented spatiotemporal resolution. Our analysis of simultaneous recordings from the ferret auditory and prefrontal cortical areas suggests evidence for the role of rapid top-down and bottom-up functional dynamics across these areas involved in robust attentive behavior.
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Kilambi, Ragini; Singh, Anand Narayan
2018-03-25
Pancreaticojejunostomy (PJ) is the most widely used reconstruction technique after pancreaticoduodenectomy. Despite several randomized trials, the ideal technique of pancreaticojejunostomy remains debatable. We planned a meta-analysis of randomized trials comparing the two most common techniques of PJ (duct-to-mucosa and dunking) to identify the best available evidence in the current literature. We searched the Pubmed/Medline, Web of Science, Science Citation Index, Google Scholar and Cochrane Central Register of Controlled Trials electronic databases up to October 2017 for all English language randomized trials comparing the two approaches. Statistical analysis was performed using Review Manager (RevMan) Version 5.3 (Copenhagen: The Nordic Cochrane Center, The Cochrane Collaboration, 2014), and results were expressed as odds ratios for dichotomous and mean differences for continuous variables. A P-value ≤ 0.05 was considered significant. Trial sequential analysis was performed using TSA version 0.9.5.5 (Copenhagen: The Copenhagen Trial Unit, Center for Clinical Intervention Research, 2016). A total of 8 trials were included, with a total of 1043 patients (duct-to-mucosa: 518; dunking: 525). There was no significant difference between the two groups in terms of overall as well as clinically relevant POPF rate. Similarly, both groups were comparable for the secondary outcomes. Trial sequential analysis revealed that the required information size had been crossed without achieving a clinically significant difference for overall POPF; and though the required information size had not been achieved for CR-POPF, the current data have already crossed the futility line for CR-POPF with a 10% risk difference, 80% power and 5% α error. This meta-analysis found no significant difference between the two techniques in terms of overall and CR-POPF rates. Further, the existing evidence is sufficient to conclude a lack of difference, and further trials are unlikely to result in any change in the outcome. (CRD42017074886). © 2018 Wiley Periodicals, Inc.
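For readers unfamiliar with how such pooled odds ratios are computed, the sketch below performs inverse-variance (fixed-effect) pooling of log odds ratios. The 2x2 counts are invented purely for illustration and are not the trial data from this review, and the review's random-effects and trial sequential analyses are not reproduced.

```python
# Minimal sketch of inverse-variance pooling of odds ratios on the log scale,
# the kind of computation behind a POPF meta-analysis (the 2x2 counts below are
# invented for illustration and are NOT the trial data from the review).
import numpy as np

# (events_A, total_A, events_B, total_B) per hypothetical trial.
trials = [(12, 60, 15, 62), (8, 45, 11, 47), (20, 110, 24, 105)]

log_or, weights = [], []
for ea, na, eb, nb in trials:
    a, b = ea, na - ea          # group A: events / non-events
    c, d = eb, nb - eb          # group B: events / non-events
    lor = np.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d          # Woolf variance of log OR
    log_or.append(lor)
    weights.append(1 / var)

log_or, weights = np.array(log_or), np.array(weights)
pooled = np.sum(weights * log_or) / np.sum(weights)
se = np.sqrt(1 / np.sum(weights))
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
print(f"pooled OR {np.exp(pooled):.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```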
Grace, M; Fletcher, L; Powers, S K; Hughes, M; Coombes, J
1996-12-01
Homogenization of tissue for analysis of bioenergetic enzyme activities is a common practice in studies examining metabolic properties of skeletal muscle adaptation to disease, aging, inactivity or exercise. While numerous homogenization techniques are in use today, limited information exists concerning the efficacy of specific homogenization protocols. Therefore, the purpose of this study was to compare the efficacy of four commonly used approaches to homogenizing skeletal muscle for analysis of bioenergetic enzyme activity. The maximal enzyme activity (Vmax) of citrate synthase (CS) and lactate dehydrogenase (LDH) were measured from homogenous muscle samples (N = 48 per homogenization technique) and used as indicators to determine which protocol had the highest efficacy. The homogenization techniques were: (1) glass-on-glass pestle; (2) a combination of a mechanical blender and a teflon pestle (Potter-Elvehjem); (3) a combination of the mechanical blender and a biological detergent; and (4) the combined use of a mechanical blender and a sonicator. The glass-on-glass pestle homogenization protocol produced significantly higher (P < 0.05) enzyme activities compared to all other protocols for both enzymes. Of the four protocols examined, the data demonstrate that the glass-on-glass pestle homogenization protocol is the technique of choice for studying bioenergetic enzyme activity in skeletal muscle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.
2016-01-14
Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, a direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].
Novel Method for Incorporating Model Uncertainties into Gravitational Wave Parameter Estimates
NASA Astrophysics Data System (ADS)
Moore, Christopher J.; Gair, Jonathan R.
2014-12-01
Posterior distributions on parameters computed from experimental data using Bayesian techniques are only as accurate as the models used to construct them. In many applications, these models are incomplete, which both reduces the prospects of detection and leads to a systematic error in the parameter estimates. In the analysis of data from gravitational wave detectors, for example, accurate waveform templates can be computed using numerical methods, but the prohibitive cost of these simulations means this can only be done for a small handful of parameters. In this Letter, a novel method to fold model uncertainties into data analysis is proposed; the waveform uncertainty is analytically marginalized over using a prior distribution constructed by Gaussian process regression, which interpolates the waveform difference from a small training set of accurate templates. The method is well motivated, easy to implement, and no more computationally expensive than standard techniques. The new method is shown to perform extremely well when applied to a toy problem. While we use the application to gravitational wave data analysis to motivate and illustrate the technique, it can be applied in any context where model uncertainties exist.
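The interpolation step can be sketched as follows: a Gaussian process over a one-dimensional parameter predicts the difference between a cheap approximate model and a few "accurate" templates. The scikit-learn regressor, the toy waveform families and the kernel choice are assumptions for illustration, not the paper's actual waveform models or marginalization.

```python
# Minimal sketch of the interpolation step: Gaussian process regression over a
# one-dimensional parameter predicts the difference between an approximate
# waveform model and a handful of "accurate" templates (scikit-learn and the toy
# waveform families are assumptions, not the paper's actual models).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

t = np.linspace(0.0, 1.0, 200)

def approx_waveform(lam):            # cheap model available everywhere
    return np.sin(2 * np.pi * (3 + lam) * t)

def accurate_waveform(lam):          # expensive model, known only at a few points
    return np.sin(2 * np.pi * (3 + lam) * t + 0.1 * lam * t ** 2)

train_lam = np.linspace(0.0, 2.0, 6)                 # small training set
train_diff = np.array([accurate_waveform(l) - approx_waveform(l)
                       for l in train_lam])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-8)
gp.fit(train_lam.reshape(-1, 1), train_diff)         # one GP output per time sample

diff_mean, diff_std = gp.predict(np.array([[1.3]]), return_std=True)
corrected = approx_waveform(1.3) + diff_mean[0]
print("max residual after GP correction:",
      np.max(np.abs(corrected - accurate_waveform(1.3))))
```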
Fechter, Dominik; Storch, Ilse
2014-01-01
Due to legislative protection, many species, including large carnivores, are currently recolonizing Europe. To address the impending human-wildlife conflicts in advance, predictive habitat models can be used to determine potentially suitable habitat and areas likely to be recolonized. As field data are often limited, quantitative rule based models or the extrapolation of results from other studies are often the techniques of choice. Using the wolf (Canis lupus) in Germany as a model for habitat generalists, we developed a habitat model based on the location and extent of twelve existing wolf home ranges in Eastern Germany, current knowledge on wolf biology, different habitat modeling techniques and various input data to analyze ten different input parameter sets and address the following questions: (1) How do a priori assumptions and different input data or habitat modeling techniques affect the abundance and distribution of potentially suitable wolf habitat and the number of wolf packs in Germany? (2) In a synthesis across input parameter sets, what areas are predicted to be most suitable? (3) Are existing wolf pack home ranges in Eastern Germany consistent with current knowledge on wolf biology and habitat relationships? Our results indicate that depending on which assumptions on habitat relationships are applied in the model and which modeling techniques are chosen, the amount of potentially suitable habitat estimated varies greatly. Depending on a priori assumptions, Germany could accommodate between 154 and 1769 wolf packs. The locations of the existing wolf pack home ranges in Eastern Germany indicate that wolves are able to adapt to areas densely populated by humans, but are limited to areas with low road densities. Our analysis suggests that predictive habitat maps in general, should be interpreted with caution and illustrates the risk for habitat modelers to concentrate on only one selection of habitat factors or modeling technique. PMID:25029506
The 4DILAN Project (4TH Dimension in Landscape and Artifacts Analyses)
NASA Astrophysics Data System (ADS)
Chiabrando, F.; Naretto, M.; Sammartano, G.; Sambuelli, L.; Spanò, A.; Teppati Losè, L.
2017-05-01
The project is part of the wider application and subsequent spread of innovative digital technologies involving robotic systems. Modern society needs knowledge and investigation of the environment and of the related built landscape; therefore it increasingly requires new types of information. The goal can be achieved through the innovative integration of methods to set new analysis strategies for the knowledge of the built heritage and cultural landscape. The experimental cooperation between different disciplines and the related tools and techniques, which this work suggests for the analysis of the architectural heritage and the historical territory, are the following: - 3D metric survey techniques with active and passive sensors, the latter operating in both terrestrial mode and from an aerial point of view; in some circumstances, beyond the use of terrestrial LiDAR, even the newest mobile mapping systems using SLAM technology (simultaneous localization and mapping) have been tested. - Techniques of non-destructive investigation, such as geophysical analysis of the subsoil and built structures, in particular GPR (Ground Penetrating Radar) techniques. - Historic and stratigraphic surveys carried out primarily through the study and interpretation of documentary sources, cartography and historical iconography, closely related to the existing data or latent material. Experience in applying these investigation techniques to built spaces and man-made environments has been gained with the aim of improving the ability to analyse transformations and layers that have occurred over time and are no longer directly readable or interpretable from the built evidence.
Kumar, Nitesh; Kulkarni, Kaustubh; Behera, Laxmidhar; Verma, Vivek
2017-08-01
Maghemite (γ-Fe2O3) nanoparticles for therapeutic applications are prepared from mild steel, but the existing synthesis technique is very cumbersome. The entire process takes around 100 days with multiple steps which lack proper understanding. In the current work, maghemite nanoparticles of cuboidal and spheroidal morphologies were prepared from mild steel chips by a novel, cost-effective oil reduction technique for magnetically guided intravascular drug delivery. The technique developed in this work yields isometric sized γ-Fe2O3 nanoparticles in 6 h with higher saturation magnetization as compared to the existing similar solid state synthesis route. Mass and heat flow kinetics during the heating and quenching steps were studied with the help of finite element simulations. Qualitative and quantitative analysis of the γ-Fe2O3 phase is performed with the help of X-ray diffraction, transmission electron microscopy and X-ray photoelectron spectroscopy. The mechanism for the α-Fe2O3 (haematite) to γ-Fe2O3 (maghemite) phase evolution during the synthesis process is also investigated. The raw materials included mild steel chips, one of the most abundant engineering materials. These particles can be used as ideal nanocarriers for targeted drug delivery through the vascular network.
Non-Conventional Techniques for the Study of Phase Transitions in NiTi-Based Alloys
NASA Astrophysics Data System (ADS)
Nespoli, Adelaide; Villa, Elena; Passaretti, Francesca; Albertini, Franca; Cabassi, Riccardo; Pasquale, Massimo; Sasso, Carlo Paolo; Coïsson, Marco
2014-07-01
Differential scanning calorimetry and electrical resistance measurements are the two most common techniques for the study of the phase transition path and temperatures of shape memory alloys (SMA) in the stress-free condition. Besides, it is well known that internal friction measurements are also useful for this purpose. There are indeed some further techniques which are seldom used for the basic characterization of SMA transitions: dilatometric analysis, magnetic measurements, and Seebeck coefficient study. In this work, we discuss the suitability of these techniques for the study of NiTi-based phase transitions. Measurements were conducted on several Ni50-xTi50Cux samples ranging from 3 to 10 at.% in Cu content, fully annealed at 850 °C for 1 h in vacuum and quenched in water at room temperature. Results show that all these techniques are sensitive to the phase transition, and they provide significant information about the existence of intermediate phases.
Lorido, Laura; Estévez, Mario; Ventanas, Sonia
2014-01-01
Although dynamic sensory techniques such as time-intensity (TI) have been applied to certain meat products, existing knowledge regarding the temporal sensory perception of muscle foods is still limited. The objective of the present study was to apply TI to the flavour and texture perception of three different Iberian meat products: liver pâté, dry-cured sausages ("salchichon") and dry-cured loin. Moreover, the advantages of using dynamic versus static sensory techniques were explored by subjecting the same products to a quantitative descriptive analysis (QDA). TI was a suitable technique to assess the impact of composition and structure of the three meat products on flavour and texture perception from a dynamic perspective. TI parameters extracted from the TI-curves and related to temporal perception enabled the detection of clear differences in sensory temporal perception between the meat products and provided additional insight on sensory perception compared to the conventional static sensory technique (QDA). © 2013.
Fourier-Mellin moment-based intertwining map for image encryption
NASA Astrophysics Data System (ADS)
Kaur, Manjit; Kumar, Vijay
2018-03-01
In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and intertwining logistic map is proposed. Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity of an input image. Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on input image using secret keys. The performance of proposed image encryption technique has been evaluated on five well-known benchmark images and also compared with seven well-known existing encryption techniques. The experimental results reveal that the proposed technique outperforms others in terms of entropy, correlation analysis, a unified average changing intensity and the number of changing pixel rate. The simulation results reveal that the proposed technique provides high level of security and robustness against various types of attacks.
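To illustrate the general permutation-diffusion pattern that chaotic-map ciphers of this kind follow, the sketch below drives both stages with a plain logistic map. The plain (non-intertwining) map, the key values and the random test image are simplifying assumptions; the Fourier-Mellin moment key generation and NSGA-II parameter optimization of the proposed technique are not reproduced.

```python
# Minimal sketch of chaos-driven permutation plus diffusion on an image, using a
# plain logistic map as a stand-in for the intertwining map (key values, map
# choice and image are assumptions; no Fourier-Mellin moments or NSGA-II here).
import numpy as np

def logistic_stream(x0, r, n, burn=200):
    x, out = x0, np.empty(n)
    for i in range(n + burn):
        x = r * x * (1.0 - x)
        if i >= burn:
            out[i - burn] = x
    return out

rng = np.random.default_rng(3)
img = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
n = img.size

# Permutation: the sort order of one chaotic stream scrambles pixel positions.
perm = np.argsort(logistic_stream(0.3141, 3.9999, n))
# Diffusion: a second stream is quantised to bytes and XOR-ed with the pixels.
key = (logistic_stream(0.2718, 3.9876, n) * 256).astype(np.uint8)

cipher = (img.flatten()[perm] ^ key).reshape(img.shape)

# Decryption inverts the XOR and then the permutation.
inv = np.empty(n, dtype=int)
inv[perm] = np.arange(n)
plain = (cipher.flatten() ^ key)[inv].reshape(img.shape)
assert np.array_equal(plain, img)
print("round-trip decryption matches the original image")
```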
A Regev-Type Fully Homomorphic Encryption Scheme Using Modulus Switching
Chen, Zhigang; Wang, Jian; Song, Xinxia
2014-01-01
A critical challenge in a fully homomorphic encryption (FHE) scheme is to manage noise. The modulus switching technique is currently the most efficient noise management technique. When using the modulus switching technique to design and implement a FHE scheme, how to choose concrete parameters is an important step, but to the best of our knowledge, this step has drawn very little attention in the existing FHE research literature. The contributions of this paper are twofold. On one hand, we propose a function for the lower bound of the dimension value used in the switching technique, depending on the specific LWE security levels. On the other hand, as a case study, we modify the Brakerski FHE scheme (in Crypto 2012) by using the modulus switching technique. We recommend concrete parameter values for our proposed scheme and provide a security analysis. Our result shows that the modified FHE scheme is more efficient than the original Brakerski scheme at the same security level. PMID:25093212
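The core mechanics of modulus switching can be shown numerically: a ciphertext is rescaled from modulus q down to a smaller q' by rounding, and the same secret still decrypts it. The sketch below uses toy LWE-style parameters with no real security and is not the concrete parameter selection discussed in the paper.

```python
# Toy numeric sketch of modulus switching on an LWE-style ciphertext: scale the
# ciphertext from modulus q down to q' by rounding, and check that the same
# secret still decrypts it (all parameters are toy values with no real security;
# this is not the paper's concrete parameter selection).
import numpy as np

rng = np.random.default_rng(4)
d, q, q2 = 16, 2**30, 2**15          # dimension, original and switched moduli
s = rng.integers(0, 2, size=d)       # small (binary) secret key

def encrypt(m, q):
    a = rng.integers(0, q, size=d)
    e = int(rng.integers(-10, 11))                    # small noise
    b = (int(a @ s) + e + m * (q // 2)) % q
    return a, b

def decrypt(a, b, q):
    v = (b - int(a @ s)) % q
    return 1 if q // 4 < v < 3 * q // 4 else 0

def mod_switch(a, b, q, q2):
    a2 = np.rint(a * (q2 / q)).astype(np.int64) % q2
    b2 = int(round(b * (q2 / q))) % q2
    return a2, b2

for m in (0, 1):
    a, b = encrypt(m, q)
    a2, b2 = mod_switch(a, b, q, q2)
    print(m, decrypt(a, b, q), decrypt(a2, b2, q2))   # all three values agree
```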
AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, D.; Alfonsi, A.; Talbot, P.
2016-10-01
The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem being addressed, but also a multi-scale problem (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large amount of computationally-expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution being evaluated to address the computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (microseconds instead of hours/days).
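The surrogate idea reduces, in its simplest form, to fitting a cheap regression model to a handful of expensive code runs and querying the regression instead. The analytic test function and polynomial response surface below are stand-ins chosen for illustration; they are not RISMC thermal-hydraulic codes or the specific reduced order models evaluated in the report.

```python
# Minimal sketch of the surrogate-model idea: train a cheap regression model on
# a few expensive "simulation" runs, then query the surrogate instead of the
# code (the analytic test function and polynomial surrogate are stand-ins for
# actual thermal-hydraulic simulations).
import numpy as np

def expensive_simulation(x):
    """Stand-in for a long-running physics code: response vs. a single input."""
    return 600.0 + 45.0 * x + 8.0 * x**2

train_x = np.linspace(0.0, 2.0, 6)              # only 6 affordable code runs
train_y = expensive_simulation(train_x)

coeffs = np.polyfit(train_x, train_y, deg=2)    # quadratic response surface
surrogate = np.poly1d(coeffs)

query = np.linspace(0.0, 2.0, 1000)             # thousands of cheap evaluations
error = np.max(np.abs(surrogate(query) - expensive_simulation(query)))
print(f"max surrogate error over the domain: {error:.2e}")
```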
Neural net diagnostics for VLSI test
NASA Technical Reports Server (NTRS)
Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.
1990-01-01
This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.
Cebi, Nur; Yilmaz, Mustafa Tahsin; Sagdic, Osman
2017-08-15
Sibutramine may be illicitly included in herbal slimming foods and supplements marketed as "100% natural" to enhance weight loss. Considering public health and legal regulations, there is an urgent need for effective, rapid and reliable techniques to detect sibutramine in dietetic herbal foods, teas and dietary supplements. This research comprehensively explored, for the first time, detection of sibutramine in green tea, green coffee and mixed herbal tea using the ATR-FTIR spectroscopic technique combined with chemometrics. Hierarchical cluster analysis and principal component analysis (PCA) techniques were employed in the spectral range 2746-2656 cm⁻¹ for classification and discrimination through Euclidian distance and Ward's algorithm. Unadulterated and adulterated samples were classified and discriminated with respect to their sibutramine contents with perfect accuracy without any false prediction. The results suggest that the existence of the active substance could be successfully determined at levels in the range of 0.375-12 mg in a total of 1.75 g of green tea, green coffee and mixed herbal tea by using the FTIR-ATR technique combined with chemometrics. Copyright © 2017 Elsevier Ltd. All rights reserved.
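The chemometric workflow described (restrict spectra to a wavenumber window, apply PCA, then cluster with Euclidean distance and Ward linkage) can be sketched as below. The synthetic spectra, the adulterant peak position and the library choices are assumptions standing in for the actual tea and coffee measurements.

```python
# Minimal sketch of the chemometric workflow: restrict ATR-FTIR spectra to a
# wavenumber window, run PCA, then Ward/Euclidean hierarchical clustering
# (synthetic spectra stand in for the tea and coffee measurements).
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
wavenumbers = np.linspace(2800, 2600, 300)
window = (wavenumbers <= 2746) & (wavenumbers >= 2656)   # spectral range used

def spectrum(adulterated):
    base = np.exp(-0.5 * ((wavenumbers - 2700) / 30) ** 2)
    peak = 0.4 * np.exp(-0.5 * ((wavenumbers - 2690) / 4) ** 2) if adulterated else 0.0
    return base + peak + 0.01 * rng.normal(size=wavenumbers.size)

X = np.array([spectrum(i >= 10) for i in range(20)])[:, window]

scores = PCA(n_components=2).fit_transform(X)
Z = linkage(scores, method="ward", metric="euclidean")
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster assignments:", labels)      # unadulterated vs. adulterated groups
```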
NASA Technical Reports Server (NTRS)
Richards, W. Lance
1996-01-01
Significant strain-gage errors may exist in measurements acquired in transient-temperature environments if conventional correction methods are applied. As heating or cooling rates increase, temperature gradients between the strain-gage sensor and substrate surface increase proportionally. These temperature gradients introduce strain-measurement errors that are currently neglected in both conventional strain-correction theory and practice. Therefore, the conventional correction theory has been modified to account for these errors. A new experimental method has been developed to correct strain-gage measurements acquired in environments experiencing significant temperature transients. The new correction technique has been demonstrated through a series of tests in which strain measurements were acquired for temperature-rise rates ranging from 1 to greater than 100 degrees F/sec. Strain-gage data from these tests have been corrected with both the new and conventional methods and then compared with an analysis. Results show that, for temperature-rise rates greater than 10 degrees F/sec, the strain measurements corrected with the conventional technique produced strain errors that deviated from analysis by as much as 45 percent, whereas results corrected with the new technique were in good agreement with analytical results.
Unsupervised EEG analysis for automated epileptic seizure detection
NASA Astrophysics Data System (ADS)
Birjandtalab, Javad; Pouyan, Maziyar Baran; Nourani, Mehrdad
2016-07-01
Epilepsy is a neurological disorder which can, if not controlled, potentially cause unexpected death. It is extremely important to have accurate automatic pattern recognition and data mining techniques to detect the onset of seizures and inform care-givers to help the patients. EEG signals are the preferred biosignals for diagnosis of epileptic patients. Most of the existing pattern recognition techniques used in EEG analysis leverage the notion of supervised machine learning algorithms. Since seizure data are heavily under-represented, such techniques are not always practical, particularly when the labeled data are not sufficiently available or when disease progression is rapid and the corresponding EEG footprint pattern will not be robust. Furthermore, EEG pattern change is highly individual dependent and requires experienced specialists to annotate the seizure and non-seizure events. In this work, we present an unsupervised technique to discriminate seizures and non-seizure events. We employ the power spectral density of EEG signals in different frequency bands as informative features to accurately cluster seizure and non-seizure events. The experimental results obtained so far indicate more than 90% accuracy in clustering seizure and non-seizure events without any prior knowledge of the patient's history.
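A minimal sketch of that unsupervised pipeline is given below: band-limited Welch power spectral density features are computed per EEG segment and a two-cluster k-means separates seizure-like from background segments. The synthetic signals, band edges and segment length are illustrative assumptions, not the paper's dataset or exact feature set.

```python
# Minimal sketch of the unsupervised pipeline: band-limited Welch PSD features
# per EEG segment, then two-cluster k-means to separate seizure-like from
# background segments (synthetic signals and band edges are illustrative).
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

fs = 256.0
bands = [(0.5, 4), (4, 8), (8, 13), (13, 30), (30, 70)]   # delta..gamma
rng = np.random.default_rng(6)

def segment(seizure):
    t = np.arange(0, 4, 1 / fs)                 # 4-second window
    x = rng.normal(size=t.size)                 # background activity
    if seizure:
        x += 3.0 * np.sin(2 * np.pi * 5.0 * t)  # strong rhythmic 5 Hz component
    return x

def band_powers(x):
    f, pxx = welch(x, fs=fs, nperseg=256)
    return [pxx[(f >= lo) & (f < hi)].sum() for lo, hi in bands]

X = np.array([band_powers(segment(i % 5 == 0)) for i in range(50)])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(np.log(X))
print("cluster sizes:", np.bincount(labels))    # ~10 seizure-like vs. ~40 others
```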
Fowler, Dawnovise N; Faulkner, Monica
2011-12-01
In this article, meta-analytic techniques are used to examine existing intervention studies (n = 11) to determine their effects on substance abuse among female samples of intimate partner abuse (IPA) survivors. This research serves as a starting point for greater attention in research and practice to the implementation of evidence-based, integrated services to address co-occurring substance abuse and IPA victimization among women as major intersecting public health problems. The results show greater effects in three main areas. First, greater effect sizes exist in studies where larger numbers of women experienced current IPA. Second, studies with a lower mean age also showed greater effect sizes than studies with a higher mean age. Lastly, studies with smaller sample sizes have greater effects. This research helps to facilitate cohesion in the knowledge base on this topic, and the findings of this meta-analysis, in particular, contribute needed information to gaps in the literature on the level of promise of existing interventions to impact substance abuse in this underserved population. Published by Elsevier Inc.
Ning, N; Wen, Y; Li, Y; Li, J
2013-11-01
Nonsteroidal anti-inflammatory drugs (NSAIDs) are commonly used to manage the pain and inflammation. NSAIDs can cause serious side effects, including vision problems. However, the underlying mechanisms are still unclear. Therefore, we aimed to investigate the effect of meclofenamic acid (MFA) on retinal pigment epithelium (RPE). In our study, we applied image analysis and whole-cell patch clamp recording to directly measure the effect of MFA on the gap junctional coupling between RPE cells. Analysis of Lucifer yellow (LY) transfer revealed that the gap junction communication existed between RPE cells. Functional experiments using the whole-cell configuration of the patch clamp technique showed that a gap junction conductance also existed between this kind of cells. Importantly, MFA largely inhibited the gap junction conductance and induced the uncoupling of RPE cells. Other NSAIDs, like aspirin and flufenamic acid (FFA), had the same effect. The gap junction functionally existed in RPE cells, which can be blocked by MFA. These findings may explain, at least partially, the vision problems with certain clinically used NSAIDs.
Analytical concepts for health management systems of liquid rocket engines
NASA Technical Reports Server (NTRS)
Williams, Richard; Tulpule, Sharayu; Hawman, Michael
1990-01-01
Substantial improvement in health management systems performance can be realized by implementing advanced analytical methods of processing existing liquid rocket engine sensor data. In this paper, such techniques ranging from time series analysis to multisensor pattern recognition to expert systems to fault isolation models are examined and contrasted. The performance of several of these methods is evaluated using data from test firings of the Space Shuttle main engines.
Development of a Theory-Driven Injury Prevention Communication Strategy for U.S. Army Personnel
2016-04-01
for Soldiers, though medical personnel were also a priority audience group for communication products. Opportunities exist to expand on these...interventions that take on an ecological focus will use tailored and targeted messaging for individuals and groups and more broad techniques (e.g...program's communication plan. The steps are referred to as analysis, strategic design, development and pretesting, implementation and monitoring
Stability analysis of a Vlasov-Wave system describing particles interacting with their environment
NASA Astrophysics Data System (ADS)
De Bièvre, Stephan; Goudon, Thierry; Vavasseur, Arthur
2018-06-01
We study a kinetic equation of the Vlasov-Wave type, which arises in the description of the behavior of a large number of particles interacting weakly with an environment, composed of an infinite collection of local vibrational degrees of freedom, modeled by wave equations. We use variational techniques to establish the existence of large families of stationary states for this system, and analyze their stability.
[Hygienic and ergonomic analysis of the technology for sinking main and subsidiary mine shafts].
Meniaĭlo, N I; Tyshlek, E G; Gritsenko, V S; Shemiakin, G M
1989-01-01
The labour conditions in mine shafts do not correspond to the existing ergonomic and hygienic norms. Drilling and blasting techniques are the most hazardous with respect to the severity and duration of the factors involved. Normalization of working conditions should be based on the development of innovative technologies that provide for only periodic presence of the workers in the mine shaft area during the work shift.
Exploring synchrotron radiation capabilities: The ALS-Intel CRADA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gozzo, F.; Cossy-Favre, A; Trippleet, B.
1997-04-01
Synchrotron radiation spectroscopy and spectromicroscopy were applied, at the Advanced Light Source, to the analysis of materials and problems of interest to the commercial semiconductor industry. The authors discuss some of the results obtained at the ALS using existing capabilities, in particular the small spot ultra-ESCA instrument on beamline 7.0 and the AMS (Applied Material Science) endstation on beamline 9.3.2. The continuing trend towards smaller feature size and increased performance for semiconductor components has driven the semiconductor industry to invest in the development of sophisticated and complex instrumentation for the characterization of microstructures. Among the crucial milestones established by the Semiconductor Industry Association are the needs for high quality, defect free and extremely clean silicon wafers, very thin gate oxides, lithographies near 0.1 micron and advanced material interconnect structures. The requirements of future generations cannot be met with current industrial technologies. The purpose of the ALS-Intel CRADA (Cooperative Research And Development Agreement) is to explore, compare and improve the utility of synchrotron-based techniques for practical analysis of substrates of interest to semiconductor chip manufacturing. The first phase of the CRADA project consisted of exploring existing ALS capabilities and techniques on some problems of interest. Some of the preliminary results obtained on Intel samples are discussed here.
Hyperspectral range imaging for transportation systems evaluation
NASA Astrophysics Data System (ADS)
Bridgelall, Raj; Rafert, J. B.; Atwood, Don; Tolliver, Denver D.
2016-04-01
Transportation agencies expend significant resources to inspect critical infrastructure such as roadways, railways, and pipelines. Regular inspections identify important defects and generate data to forecast maintenance needs. However, cost and practical limitations prevent the scaling of current inspection methods beyond relatively small portions of the network. Consequently, existing approaches fail to discover many high-risk defect formations. Remote sensing techniques offer the potential for more rapid and extensive non-destructive evaluations of the multimodal transportation infrastructure. However, optical occlusions and limitations in the spatial resolution of typical airborne and space-borne platforms limit their applicability. This research proposes hyperspectral image classification to isolate transportation infrastructure targets for high-resolution photogrammetric analysis. A plenoptic swarm of unmanned aircraft systems will capture images with centimeter-scale spatial resolution, large swaths, and polarization diversity. The light field solution will incorporate structure-from-motion techniques to reconstruct three-dimensional details of the isolated targets from sequences of two-dimensional images. A comparative analysis of existing low-power wireless communications standards suggests an application dependent tradeoff in selecting the best-suited link to coordinate swarming operations. This study further produced a taxonomy of specific roadway and railway defects, distress symptoms, and other anomalies that the proposed plenoptic swarm sensing system would identify and characterize to estimate risk levels.
Communication: Electron ionization of DNA bases.
Rahman, M A; Krishnakumar, E
2016-04-28
No reliable experimental data exist for the partial and total electron ionization cross sections for DNA bases, which are crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all the four bases and are compared with the existing theoretical calculations and the only set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for partial and total ion cross sections for these molecules. The results of the fragmentation analysis of adenine support the theory of its formation in space.
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Many gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique with 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, i.e. BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to select statistical techniques for selecting informative genes from high dimensional expression data for breeding and system biology studies. Published by Elsevier B.V.
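To illustrate the relevance-minus-redundancy idea behind such selection, the sketch below performs a greedy mRMR-style search: relevance is scored by an F-statistic against the class label and redundancy by mean absolute correlation with already-selected genes. This is a generic illustration on simulated data, not the bootstrap Boot-MRMR procedure implemented in the R package.

```python
# Minimal sketch of greedy maximum-relevance minimum-redundancy gene selection:
# relevance scored by an F-statistic against the class label, redundancy by mean
# absolute correlation with already-selected genes (a generic mRMR-style
# illustration, not the bootstrap Boot-MRMR procedure of the R package).
import numpy as np
from sklearn.feature_selection import f_classif

rng = np.random.default_rng(7)
n_samples, n_genes = 60, 200
y = rng.integers(0, 2, size=n_samples)                 # two subject classes
X = rng.normal(size=(n_samples, n_genes))
X[:, :5] += y[:, None] * 1.5                           # 5 truly informative genes

relevance, _ = f_classif(X, y)
corr = np.abs(np.corrcoef(X, rowvar=False))

selected = [int(np.argmax(relevance))]
while len(selected) < 10:
    candidates = [g for g in range(n_genes) if g not in selected]
    redundancy = corr[candidates][:, selected].mean(axis=1)
    score = relevance[candidates] - redundancy          # relevance minus redundancy
    selected.append(candidates[int(np.argmax(score))])

print("selected genes:", sorted(selected))              # should include genes 0-4
```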
New atom probe approaches to studying segregation in nanocrystalline materials.
Samudrala, S K; Felfer, P J; Araullo-Peters, V J; Cao, Y; Liao, X Z; Cairney, J M
2013-09-01
Atom probe is a technique that is highly suited to the study of nanocrystalline materials. It can provide accurate atomic-scale information about the composition of grain boundaries in three dimensions. In this paper we have analysed the microstructure of a nanocrystalline super-duplex stainless steel prepared by high pressure torsion (HPT). Not all of the grain boundaries in this alloy display obvious segregation, making visualisation of the microstructure challenging. In addition, the grain boundaries present in the atom probe data acquired from this alloy have complex shapes that are curved at the scale of the dataset and the interfacial excess varies considerably over the boundaries, making the accurate characterisation of the distribution of solute challenging using existing analysis techniques. In this paper we present two new data treatment methods that allow the visualisation of boundaries with little or no segregation, the delineation of boundaries for further analysis and the quantitative analysis of Gibbsian interfacial excess at boundaries, including the capability of excess mapping. Copyright © 2013 Elsevier B.V. All rights reserved.
Using cluster analysis for medical resource decision making.
Dilts, D; Khamalah, J; Plotkin, A
1995-01-01
Escalating costs of health care delivery have in the recent past often made the health care industry investigate, adapt, and apply those management techniques relating to budgeting, resource control, and forecasting that have long been used in the manufacturing sector. A strategy that has contributed much in this direction is the definition and classification of a hospital's output into "products" or groups of patients that impose similar resource or cost demands on the hospital. Existing classification schemes have frequently employed cluster analysis in generating these groupings. Unfortunately, the myriad articles and books on clustering and classification contain few formalized selection methodologies for choosing a technique for solving a particular problem, hence they often leave the novice investigator at a loss. This paper reviews the literature on clustering, particularly as it has been applied in the medical resource-utilization domain, addresses the critical choices facing an investigator in the medical field using cluster analysis, and offers suggestions (using the example of clustering low-vision patients) for how such choices can be made.
Analysis-Preserving Video Microscopy Compression via Correlation and Mathematical Morphology
Shao, Chong; Zhong, Alfred; Cribb, Jeremy; Osborne, Lukas D.; O’Brien, E. Timothy; Superfine, Richard; Mayer-Patel, Ketan; Taylor, Russell M.
2015-01-01
The large amount of video data produced by multi-channel, high-resolution microscopy systems drives the need for a new high-performance domain-specific video compression technique. We describe a novel compression method for video microscopy data based on Pearson's correlation and mathematical morphology. The method makes use of the point-spread function (PSF) in the microscopy video acquisition phase. We compare our method to other lossless compression methods and to lossy JPEG, JPEG2000 and H.264 compression for various kinds of video microscopy data, including fluorescence video and brightfield video. We find that for certain data sets, the new method compresses much better than lossless compression with no impact on analysis results. It achieved a best compressed size of 0.77% of the original size, 25× smaller than the best lossless technique (which yields 20% for the same video). The compressed size scales with the video's scientific data content. Further testing showed that existing lossy algorithms greatly impacted data analysis at similar compression sizes. PMID:26435032
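As a rough illustration of how a correlation-plus-morphology keep-mask might be built (this is not the authors' PSF-aware pipeline; the function name, window size, and thresholds are assumptions), consider:

```python
import numpy as np
from scipy.ndimage import binary_dilation, uniform_filter

def signal_mask(frame, background, win=7, corr_thresh=0.5, grow=3):
    """Mask pixels whose local neighbourhood correlates poorly with a static
    background (likely foreground signal), then grow the mask with
    morphological dilation so nearby context is preserved at full fidelity."""
    f = frame.astype(float)
    b = background.astype(float)
    # local means and (co)variances over a win x win window
    mf, mb = uniform_filter(f, win), uniform_filter(b, win)
    cov = uniform_filter(f * b, win) - mf * mb
    vf = uniform_filter(f * f, win) - mf ** 2
    vb = uniform_filter(b * b, win) - mb ** 2
    corr = cov / np.sqrt(np.maximum(vf * vb, 1e-12))   # local Pearson correlation
    mask = corr < corr_thresh                          # low correlation -> keep
    return binary_dilation(mask, iterations=grow)
```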
NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity.
Al-Awami, Ali K; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter; Hadwiger, Markus
2014-12-01
We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.
Fuel-injector/air-swirl characterization
NASA Technical Reports Server (NTRS)
Mcvey, J. B.; Kennedy, J. B.; Bennett, J. C.
1985-01-01
The objectives of this program are to establish an experimental data base documenting the behavior of gas turbine engine fuel injector sprays as the spray interacts with the swirling gas flow existing in the combustor dome, and to conduct an assessment of the validity of current analytical techniques for predicting fuel spray behavior. Emphasis is placed on the acquisition of data using injector/swirler components which closely resemble components currently in use in advanced aircraft gas turbine engines, conducting tests under conditions that closely simulate or closely approximate those developed in actual combustors, and conducting a well-controlled experimental effort which will comprise a combination of low-risk experiments and experiments requiring the use of state-of-the-art diagnostic instrumentation. Analysis of the data is to be conducted using an existing, TEACH-type code which employs a stochastic analysis of the motion of the dispersed phase in the turbulent continuum flow field.
Reiner, Bruce I
2017-10-01
Conventional peer review practice is compromised by a number of well-documented biases, which in turn limit standard of care analysis, which is fundamental to the determination of medical malpractice. In addition to these intrinsic biases, other deficiencies exist in current peer review, including a lack of standardization, objectivity, and automation, and its retrospective practice. An alternative model to address these deficiencies would be one which is completely blinded to the peer reviewer, requires independent reporting from both parties, utilizes automated data mining techniques for neutral and objective report analysis, and provides data reconciliation for resolution of finding-specific report differences. If properly implemented, this peer review model could result in the creation of a standardized, referenceable peer review database which could further assist in customizable education, technology refinement, and implementation of real-time context- and user-specific decision support.
Sebastian, Brenda; Nelms, Jerrod
Over the past two decades, growing numbers of clinicians have been utilizing emotional freedom techniques (EFT) in the treatment of posttraumatic stress disorder (PTSD), anxiety, and depression. Randomized controlled trials (RCTs) have shown encouraging outcomes for all three conditions. The aim of this study was to assess the efficacy of EFT in treating PTSD by conducting a meta-analysis of existing RCTs. A systematic review of databases was undertaken to identify RCTs investigating EFT in the treatment of PTSD. The RCTs were evaluated for quality using evidence-based standards published by the American Psychological Association Division 12 Task Force on Empirically Validated Therapies. Those meeting the criteria were assessed using a meta-analysis that synthesized the data to determine effect sizes. While uncontrolled outcome studies were excluded, they were examined for clinical implications of treatment that can extend knowledge of this condition. Seven randomized controlled trials were found to meet the criteria and were included in the meta-analysis. A large treatment effect was found, with a weighted Cohen's d = 2.96 (95% CI: 1.96-3.97, P < .001) for the studies that compared EFT to usual care or a waitlist. No treatment effect differences were found in studies comparing EFT to other evidence-based therapies such as eye movement desensitization and reprocessing (EMDR; 1 study) and cognitive behavior therapy (CBT; 1 study). The analysis of existing studies showed that a series of 4-10 EFT sessions is an efficacious treatment for PTSD with a variety of populations. The studies examined reported no adverse effects from EFT interventions and showed that it can be used both on a self-help basis and as a primary evidence-based treatment for PTSD. Copyright © 2017 Elsevier Inc. All rights reserved.
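For readers unfamiliar with how such a weighted effect size is pooled, a minimal fixed-effect inverse-variance sketch follows. It is generic meta-analysis arithmetic, not the authors' specific software or weighting scheme, and the function name is illustrative.

```python
import numpy as np

def pooled_effect(d, se):
    """Fixed-effect inverse-variance pooling of per-study Cohen's d values.

    d  : array of study effect sizes
    se : array of their standard errors
    Returns the weighted mean d and its 95% confidence interval.
    """
    d, se = np.asarray(d, float), np.asarray(se, float)
    w = 1.0 / se ** 2                      # inverse-variance weights
    d_bar = np.sum(w * d) / np.sum(w)
    se_bar = np.sqrt(1.0 / np.sum(w))
    return d_bar, (d_bar - 1.96 * se_bar, d_bar + 1.96 * se_bar)
```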
NASA Technical Reports Server (NTRS)
Tawfik, Hazem
1991-01-01
A relatively simple, inexpensive, and generic technique that could be used in both laboratories and some operation site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). In addition, this report gives a detailed explanation of the set up procedure, data collection, and analysis using this new technique that was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate Industrial Robot, PUMA 500. The data were statistically analyzed to provide an insight into the performance of the systems and components of the robot. Also, the same technique was used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of the currently existing robots such as the ASEA, high speed controller, Automated Radiator Inspection Device (ARID) etc. Also, recommendations were made to develop and establish other calibration techniques that will be more suitable for site calibration environment and robot certification.
ECG-derived respiration based on iterated Hilbert transform and Hilbert vibration decomposition.
Sharma, Hemant; Sharma, K K
2018-06-01
Monitoring of the respiration using the electrocardiogram (ECG) is desirable for the simultaneous study of cardiac activity and respiration in terms of comfort, mobility, and cost of the healthcare system. This paper proposes a new approach for deriving the respiration from single-lead ECG based on the iterated Hilbert transform (IHT) and the Hilbert vibration decomposition (HVD). The ECG signal is first decomposed into multicomponent sinusoidal signals using the IHT technique. Afterward, the lower-order amplitude components obtained from the IHT are filtered using the HVD to extract the respiration information. Experiments are performed on the Fantasia and Apnea-ECG datasets. The performance of the proposed ECG-derived respiration (EDR) approach is compared with existing techniques including principal component analysis (PCA), R-peak amplitudes (RPA), respiratory sinus arrhythmia (RSA), slopes of the QRS complex, and the R-wave angle. The proposed technique showed the highest median correlation values (first and third quartile) for the Fantasia and Apnea-ECG datasets: 0.699 (0.55, 0.82) and 0.57 (0.40, 0.73), respectively. The proposed algorithm also provided the lowest values of the mean absolute error and the average percentage error computed from the EDR and reference (recorded) respiration signals for the Fantasia and Apnea-ECG datasets: 1.27 and 9.3%, and 1.35 and 10.2%, respectively. In the experiments performed over different age groups of the Fantasia dataset, the proposed algorithm provided effective results in the younger population and outperformed the existing techniques in the case of elderly subjects. The proposed EDR technique has advantages over existing techniques in terms of better agreement in the respiratory rates and, specifically, it removes the extra step of detecting fiducial points in the ECG for the estimation of respiration, which makes the process effective and less complex. The above performance results obtained from two different datasets validate that the proposed approach can be used for monitoring of the respiration using single-lead ECG.
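A highly simplified sketch of the envelope-demodulation idea behind ECG-derived respiration is shown below. It uses a single Hilbert envelope plus band-pass filtering rather than the paper's iterated Hilbert transform and HVD stages; the function name and band limits are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def edr_from_envelope(ecg, fs, resp_band=(0.1, 0.5)):
    """Crude ECG-derived respiration: take the analytic-signal amplitude
    (Hilbert envelope) of the ECG and band-pass it to typical breathing
    frequencies. fs is the sampling rate in Hz."""
    envelope = np.abs(hilbert(ecg - np.mean(ecg)))
    nyq = fs / 2.0
    b, a = butter(2, [resp_band[0] / nyq, resp_band[1] / nyq], btype="band")
    return filtfilt(b, a, envelope)
```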
Leveraging natural dynamical structures to explore multi-body systems
NASA Astrophysics Data System (ADS)
Bosanac, Natasha
Multi-body systems have become the target of an increasing number of mission concepts and observations, supplying further information about the composition, origin and dynamical environment of bodies within the solar system and beyond. In many of these scenarios, identification and characterization of the particular solutions that exist in a circular restricted three-body model is valuable. This insight into the underlying natural dynamical structures is achieved via the application of dynamical systems techniques. One application of such analysis is trajectory design for CubeSats, which are intended to explore cislunar space and other planetary systems. These increasingly complex mission objectives necessitate innovative trajectory design strategies for spacecraft within our solar system, as well as the capability for rapid and well-informed redesign. Accordingly, a trajectory design framework is constructed using dynamical systems techniques and demonstrated for the Lunar IceCube mission. An additional application explored in this investigation involves the motion of an exoplanet near a binary star system. Due to the strong gravitational field near a binary star, physicists have previously leveraged these systems as testbeds for examining the validity of gravitational and relativistic theories. In this investigation, a preliminary analysis into the effect of an additional three-body interaction on the dynamical environment near a large mass ratio binary system is conducted. As demonstrated through both of these sample applications, identification and characterization of the natural particular solutions that exist within a multi-body system supports a well-informed and guided analysis.
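For reference, the circular restricted three-body equations of motion that underlie such dynamical-systems analyses can be integrated with a few lines of Python; the initial state and mass ratio below are illustrative values, not mission data.

```python
import numpy as np
from scipy.integrate import solve_ivp

def cr3bp_eom(t, s, mu):
    """Circular restricted three-body equations of motion in the rotating,
    nondimensional frame. s = [x, y, z, vx, vy, vz]; mu is the mass ratio."""
    x, y, z, vx, vy, vz = s
    r1 = np.sqrt((x + mu) ** 2 + y ** 2 + z ** 2)        # distance to larger primary
    r2 = np.sqrt((x - 1 + mu) ** 2 + y ** 2 + z ** 2)    # distance to smaller primary
    ax = 2 * vy + x - (1 - mu) * (x + mu) / r1 ** 3 - mu * (x - 1 + mu) / r2 ** 3
    ay = -2 * vx + y - (1 - mu) * y / r1 ** 3 - mu * y / r2 ** 3
    az = -(1 - mu) * z / r1 ** 3 - mu * z / r2 ** 3
    return [vx, vy, vz, ax, ay, az]

# Example: propagate an illustrative initial state with an Earth-Moon-like mass ratio.
sol = solve_ivp(cr3bp_eom, (0.0, 10.0), [0.8, 0.0, 0.0, 0.0, 0.2, 0.0],
                args=(0.01215,), rtol=1e-10, atol=1e-12)
```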
Tarantino, Cristina; Adamo, Maria; Lucas, Richard; Blonda, Palma
2016-03-15
Focusing on a Mediterranean Natura 2000 site in Italy, the effectiveness of the cross correlation analysis (CCA) technique for quantifying change in the area of semi-natural grasslands at different spatial resolutions (grain) was evaluated. In a fine scale analysis (2 m), inputs to the CCA were a) a semi-natural grasslands layer extracted from an existing validated land cover/land use (LC/LU) map (1:5000, time T1) and b) a more recent single-date very high resolution (VHR) WorldView-2 image (time T2), with T2 > T1. The changes identified through the CCA were compared against those detected by applying a traditional post-classification comparison (PCC) technique to the same reference T1 map and an updated T2 map obtained by a knowledge-driven classification of four multi-seasonal WorldView-2 input images. Specific changes observed were those associated with agricultural intensification and fires. The study concluded that prior knowledge (spectral class signatures, awareness of local agricultural practices and pressures) was needed for the selection of the most appropriate image (in terms of seasonality) to be acquired at T2. CCA was also applied to the comparison of the existing T1 map with recent high resolution (HR) Landsat 8 OLI images. The areas of change detected at VHR and HR were broadly similar, with larger error values in the HR change images.
Estimating psycho-physiological state of a human by speech analysis
NASA Astrophysics Data System (ADS)
Ronzhin, A. L.
2005-05-01
Adverse effects of intoxication, fatigue and boredom could degrade the performance of highly trained operators of complex technical systems with potentially catastrophic consequences. Existing physiological fitness-for-duty tests are time consuming, costly, invasive, and highly unpopular. Known non-physiological tests constitute a secondary task and interfere with the busy workload of the tested operator. Various attempts to assess the current status of the operator by processing of "normal operational data" often lead to excessive amounts of computation, poorly justified metrics, and ambiguity of results. At the same time, speech analysis presents a natural, non-invasive approach based upon well-established, efficient data processing. In addition, it supports both behavioral and physiological biometrics. This paper presents an approach facilitating a robust speech analysis/understanding process in spite of natural speech variability and background noise. Automatic speech recognition is suggested as a technique for the detection of changes in the psycho-physiological state of a human that typically manifest themselves in changes in the characteristics of the voice tract and the semantic-syntactic connectivity of conversation. Preliminary tests have confirmed that a statistically significant correlation between the error rate of automatic speech recognition and the extent of alcohol intoxication does exist. In addition, the obtained data allowed exploring some interesting correlations and establishing some quantitative models. It is proposed to utilize this approach as part of a fitness-for-duty test and to compare its efficiency with analyses of iris, face geometry, thermography and other popular non-invasive biometric techniques.
Kleftogiannis, Dimitrios; Korfiati, Aigli; Theofilatos, Konstantinos; Likothanassis, Spiros; Tsakalidis, Athanasios; Mavroudi, Seferina
2013-06-01
Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time consuming experiments and high cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are described in the present survey. Our work is differentiated from existing review papers by updating the list of methodologies and emphasizing the computational issues that arise from miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim and scope are to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers even if they work on just a single step. Copyright © 2013 Elsevier Inc. All rights reserved.
Analysis and Design of a Fiber-optic Probe for DNA Sensors Final Report CRADA No. TSB-1147-95
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molau, Nicole; Vail, Curtis
In 1995, a challenge in the field of genetics was the acquisition of efficient DNA sequencing techniques for reading the 3 billion base-pairs that comprise the human genome. AccuPhotonics, Inc. proposed to develop and manufacture a state-of-the-art near-field scanning optical microscopy (NSOM) fiber-optic probe that was expected to increase probe efficiency by two orders of magnitude over the existing state-of-the-art and to improve resolution to 10 Å. The detailed design calculation and optimization of the electrical properties of the fiber-optic probe tip geometry would be performed at LLNL, using existing finite-difference time-domain (FDTD) electromagnetic (EM) codes.
Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation
NASA Astrophysics Data System (ADS)
Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe
2018-04-01
In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.
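For orientation, one common choice of weighted ℓ1 norm used in this style of computer-assisted proof is sketched below; the exact weights and bounds used in the paper may differ.

```latex
% One common choice of weighted \ell^1 norm on space-time Fourier coefficients
% a = (a_{k_1,k_2}), with geometric weights enforcing exponential decay:
\|a\|_{\nu} \;=\; \sum_{(k_1,k_2)\in\mathbb{Z}^2} |a_{k_1,k_2}|\;\nu^{|k_1|}\,\nu^{|k_2|},
\qquad \nu > 1 .
% The radii polynomial approach then produces an explicit radius r > 0 such that a
% Newton-like operator is a contraction on the ball of radius r around the numerical
% approximation, proving existence (and local uniqueness) of the true periodic orbit.
```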
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinkamp, J.A.; Ingram, M.; Hansen, K.M.
1976-03-01
This report summarizes results of preliminary experiments to demonstrate the feasibility of using automated flow-systems analysis in detecting early changes of respiratory epithelium exposed to physical and chemical agents associated with the by-products of nonnuclear energy production. The Syrian hamster was selected as the experimental test animal to begin investigation of the effects of toxic agents to cells of the respiratory tract. Since initiation of the program approximately six months ago, the goals have been acquisition of adequate numbers of exfoliated cells from the lung; adaptation of cytological techniques developed on human exfoliated gynecological samples to hamster lung epithelium for obtaining single-cell suspensions; utilization of existing cell staining methods to measure DNA content in lung cells; and analysis of DNA content and cell size. As the flow-system cell analysis technology is adapted to the measurement of exfoliated lung cells, rapid and quantitative determination of early changes in the physical and biochemical cellular properties will be attempted as a function of exposure to the toxic agents. (auth)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.
2014-08-15
Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created, and currently existing, analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe the internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.
NASA Astrophysics Data System (ADS)
Miquel, Benjamin
The dynamic or seismic behavior of hydraulic structures is, as for conventional structures, essential to assuring the protection of human lives. These types of analyses also aim at limiting structural damage caused by an earthquake to prevent rupture or collapse of the structure. The particularity of these hydraulic structures is that the internal displacements are caused not only by the earthquake, but also by the hydrodynamic loads resulting from fluid-structure interaction. This thesis reviews the existing complex and simplified methods to perform such dynamic analyses for hydraulic structures. For the complex existing methods, attention is placed on the difficulties arising from their use. In particular, this work examines the use of transmitting boundary conditions to simulate the semi-infinity of reservoirs. A procedure has been developed to estimate the error that these boundary conditions can introduce in finite element dynamic analysis. Depending on their formulation and location, we showed that they can considerably affect the response of such fluid-structure systems. For practical engineering applications, simplified procedures are still needed to evaluate the dynamic behavior of structures in contact with water. A review of the existing simplified procedures showed that these methods are based on numerous simplifications that can affect the prediction of the dynamic behavior of such systems. One of the main objectives of this thesis has been to develop new simplified methods that are more accurate than those existing. First, a new spectral analysis method has been proposed. Expressions for the fundamental frequency of fluid-structure systems, the key parameter of spectral analysis, have been developed. We show that this new technique can easily be implemented in a spreadsheet or program, and that its calculation time is nearly instantaneous. When compared to more complex analytical or numerical methods, this new procedure yields excellent predictions of the dynamic behavior of fluid-structure systems. Spectral analyses ignore the transient and oscillatory nature of vibrations. When such dynamic analyses show that some areas of the studied structure undergo excessive stresses, time history analyses allow a better estimate of the extent of these zones as well as of the timing of these excessive stresses. Furthermore, the existing spectral analysis methods for fluid-structure systems account only for the static effect of higher modes. Though this can generally be sufficient for dams, for flexible structures the dynamic effect of these modes should be accounted for. New methods have been developed for fluid-structure systems to account for these observations as well as the flexibility of foundations. A first method was developed to study structures in contact with one or two finite or infinite water domains. This new technique includes the flexibility of structures and foundations as well as the dynamic effect of higher vibration modes and variations of the levels of the water domains. This method was then extended to study beam structures in contact with fluids. These new developments have also allowed extending existing analytical formulations of the dynamic properties of a dry beam to a new formulation that includes the effect of fluid-structure interaction. The method yields a very good estimate of the dynamic behavior of beam-fluid systems or beam-like structures in contact with fluid.
Finally, a Modified Accelerogram Method (MAM) has been developed to modify the design earthquake into a new accelerogram that directly accounts for the effect of fluid-structure interaction. This new accelerogram can therefore be applied directly to the dry structure (i.e. without water) in order to calculate the dynamic response of the fluid-structure system. This original technique can include numerous parameters that influence the dynamic response of such systems and allows the fluid-structure interaction to be treated analytically while keeping the advantages of finite element modeling.
NASA Astrophysics Data System (ADS)
Zhang, Ke-Jia; Kwek, Leong-Chuan; Ma, Chun-Guang; Zhang, Long; Sun, Hong-Wei
2018-02-01
Quantum sealed-bid auction (QSA) has been widely studied in quantum cryptography. For a successful auction, post-confirmation is regarded as an important mechanism allowing every bidder to verify the identity of the winner after the auctioneer has announced the result. However, since the auctioneer may be dishonest and collude with malicious bidders in practice, some potential loopholes could exist. In this paper, we point out two types of collusion attacks for a particular post-confirmation technique with EPR pairs. It is not difficult to see that there exists no unconditionally secure post-confirmation mechanism in the existing QSA model if the dishonest participants have the ability to control multiparticle entanglement. In view of this, we note that a secure implementation could exist if the participants are assumed to be semi-quantum, i.e., they can only control single photons. Finally, two potential methods to design a post-confirmation mechanism are presented in this restricted scenario.
Myneni, Sahiti; Cobb, Nathan; Cohen, Trevor
2016-02-02
Research studies involving health-related online communities have focused on examining network structure to understand mechanisms underlying behavior change. Content analysis of the messages exchanged in these communities has been limited to the "social support" perspective. However, existing behavior change theories suggest that message content plays a prominent role reflecting several sociocognitive factors that affect an individual's efforts to make a lifestyle change. An understanding of these factors is imperative to identify and harness the mechanisms of behavior change in the Health 2.0 era. The objective of this work is two-fold: (1) to harness digital communication data to capture essential meaning of communication and factors affecting a desired behavior change, and (2) to understand the applicability of existing behavior change theories to characterize peer-to-peer communication in online platforms. In this paper, we describe grounded theory-based qualitative analysis of digital communication in QuitNet, an online community promoting smoking cessation. A database of 16,492 de-identified public messages from 1456 users from March 1-April 30, 2007, was used in our study. We analyzed 795 messages using grounded theory techniques to ensure thematic saturation. This analysis enabled identification of key concepts contained in the messages exchanged by QuitNet members, allowing us to understand the sociobehavioral intricacies underlying an individual's efforts to cease smoking in a group setting. We further ascertained the relevance of the identified themes to theoretical constructs in existing behavior change theories (eg, Health Belief Model) and theoretically linked techniques of behavior change taxonomy. We identified 43 different concepts, which were then grouped under 12 themes based on analysis of 795 messages. Examples of concepts include "sleepiness," "pledge," "patch," "spouse," and "slip." Examples of themes include "traditions," "social support," "obstacles," "relapse," and "cravings." Results indicate that themes consisting of member-generated strategies such as "virtual bonfires" and "pledges" were related to the highest number of theoretical constructs from the existing behavior change theories. In addition, results indicate that the member-generated communication content supports sociocognitive constructs from more than one behavior change model, unlike the majority of the existing theory-driven interventions. With the onset of mobile phones and ubiquitous Internet connectivity, online social network data reflect the intricacies of human health behavior as experienced by health consumers in real time. This study offers methodological insights for qualitative investigations that examine the various kinds of behavioral constructs prevalent in the messages exchanged among users of online communities. Theoretically, this study establishes the manifestation of existing behavior change theories in QuitNet-like online health communities. Pragmatically, it sets the stage for real-time, data-driven sociobehavioral interventions promoting healthy lifestyle modifications by allowing us to understand the emergent user needs to sustain a desired behavior change.
Excavation-caused extra deformation of existing masonry residence in soft soil region
NASA Astrophysics Data System (ADS)
Tang, Y.; Franceschelli, S.
2017-04-01
Growing need for construction of infrastructures and buildings in the fast urbanization process creates challenges of interaction between buildings under construction and adjacent existing buildings. This paper presents the mitigation of the contradiction between the two parties involved in this interaction using civil engineering techniques. Through in-depth analysis of the results of monitoring surveys, with enhanced accuracy and reliability of the surveys, a better understanding of the behavior of deformable buildings is achieved. In combination with the original construction documents, the two parties agree that both of them are responsible for building damages, and attention is focused on a better understanding for the rehabilitation of the existing buildings. Two case studies are used to demonstrate and describe the importance of a better understanding of the behavior of existing buildings and their rehabilitation. The objective of this study is to gain insight into the mechanisms of soil-structure interaction for buildings adjacent to deep excavations, which can result in damage to existing masonry residences, and to take optimized measures that keep deep excavations safe and economical while adjacent buildings retain good serviceability in urban areas with soft soil conditions.
Light water reactor lower head failure analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rempe, J.L.; Chavez, S.A.; Thinnes, G.L.
1993-10-01
This document presents the results from a US Nuclear Regulatory Commission-sponsored research program to investigate the mode and timing of vessel lower head failure. Major objectives of the analysis were to identify plausible failure mechanisms and to develop a method for determining which failure mode would occur first in different light water reactor designs and accident conditions. Failure mechanisms, such as tube ejection, tube rupture, global vessel failure, and localized vessel creep rupture, were studied. Newly developed models and existing models were applied to predict which failure mechanism would occur first in various severe accident scenarios. So that a broader range of conditions could be considered simultaneously, calculations relied heavily on models with closed-form or simplified numerical solution techniques. Finite element techniques were employed for analytical model verification and for examining more detailed phenomena. High-temperature creep and tensile data were obtained for predicting vessel and penetration structural response.
Fundamental limits of reconstruction-based superresolution algorithms under local translation.
Lin, Zhouchen; Shum, Heung-Yeung
2004-01-01
Superresolution is a technique that can produce images of a higher resolution than that of the originally captured ones. Nevertheless, improvement in resolution using such a technique is very limited in practice. This makes it significant to study the problem: "Do fundamental limits exist for superresolution?" In this paper, we focus on a major class of superresolution algorithms, called the reconstruction-based algorithms, which compute high-resolution images by simulating the image formation process. Assuming local translation among low-resolution images, this paper is the first attempt to determine the explicit limits of reconstruction-based algorithms, under both real and synthetic conditions. Based on the perturbation theory of linear systems, we obtain the superresolution limits from the conditioning analysis of the coefficient matrix. Moreover, we determine the number of low-resolution images that are sufficient to achieve the limit. Both real and synthetic experiments are carried out to verify our analysis.
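A toy example of the kind of conditioning analysis described here is sketched below: a 1-D box-blur-and-decimate observation matrix is assembled for several shifted low-resolution observations and its condition number is examined. This is an illustration only, not the paper's exact imaging model; sizes and shifts are assumptions.

```python
import numpy as np

def sr_system_matrix(n_hi, decimation, shifts):
    """Stack simple 1-D blur+downsample operators for several shifted
    low-resolution observations and return the full system matrix.
    Edge rows are truncated at the signal boundary."""
    rows = []
    for s in shifts:
        for i in range(n_hi // decimation):
            row = np.zeros(n_hi)
            start = i * decimation + s
            row[start:start + decimation] = 1.0 / decimation   # box blur over one LR pixel
            rows.append(row)
    return np.array(rows)

A = sr_system_matrix(n_hi=64, decimation=4, shifts=[0, 1, 2, 3])
print("condition number:", np.linalg.cond(A))   # grows rapidly with the magnification factor
```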
NASA Astrophysics Data System (ADS)
Yang, Rui; Li, Xiangyang; Zhang, Tong
2014-10-01
This paper uses two physics-derived techniques, the minimum spanning tree and the hierarchical tree, to investigate the networks formed by CITIC (China International Trust and Investment Corporation) industry indices in three periods from 2006 to 2013. The study demonstrates that obvious industry clustering effects exist in the networks, and Durable Consumer Goods, Industrial Products, Information Technology, Frequently Consumption and Financial Industry are the core nodes in the networks. We also use the rolling window technique to investigate the dynamic evolution of the networks' stability, by calculating the mean correlations and mean distances, as well as the variance of correlations and the distances of these indices. China's stock market is still immature and subject to administrative interventions. Therefore, through this analysis, regulators can focus on monitoring the core nodes to ensure the overall stability of the entire market, while investors can enhance their portfolio allocations or investment decision-making.
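A minimal sketch of the correlation-to-distance-to-MST construction commonly used for such index networks follows; the Mantegna-style distance mapping is a standard choice and an assumption here, not necessarily the authors' exact metric.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def index_mst(returns):
    """Build a minimum spanning tree of a set of index return series.

    returns has shape (observations, indices). Correlations are mapped to
    distances via d_ij = sqrt(2 * (1 - rho_ij)) before extracting the tree."""
    rho = np.corrcoef(returns, rowvar=False)       # pairwise correlation matrix
    dist = np.sqrt(2.0 * (1.0 - rho))              # correlation -> metric distance
    np.fill_diagonal(dist, 0.0)
    mst = minimum_spanning_tree(dist)              # sparse matrix of tree edges
    return rho, dist, mst.toarray()
```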
Wang, Wansheng; Chen, Long; Zhou, Jie
2015-01-01
A postprocessing technique for mixed finite element methods for the Cahn-Hilliard equation is developed and analyzed. Once the mixed finite element approximations have been computed at a fixed time on the coarser mesh, the approximations are postprocessed by solving two decoupled Poisson equations in an enriched finite element space (either on a finer grid or a higher-order space), for which many fast Poisson solvers can be applied. The nonlinear iteration is only applied to a much smaller problem, and the computational cost using Newton and direct solvers is negligible compared with the cost of the linear problem. The analysis presented here shows that this technique preserves the optimal rate of convergence for both the concentration and the chemical potential approximations. The corresponding error estimates obtained in our paper, especially the negative norm error estimates, are non-trivial and differ from existing results in the literature. PMID:27110063
A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis
NASA Astrophysics Data System (ADS)
Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui
2015-07-01
Auscultation of heart sound (HS) signals has served for centuries as an important primary approach to diagnosing cardiovascular diseases (CVDs). Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet, most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck problem, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences at the heart valves. Using the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS and 5 various abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
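A rough sketch of a DWT-plus-Shannon-envelope computation is given below; the sub-band kept and the smoothing window are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np
import pywt

def shannon_envelope(hs, fs, wavelet="db6", level=4, smooth_ms=20):
    """Shannon-energy envelope of a heart-sound signal after a simple
    wavelet-based band selection (here, only the approximation band is kept)."""
    coeffs = pywt.wavedec(hs, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]   # crude band choice
    band = pywt.waverec(coeffs, wavelet)[: len(hs)]
    x = band / (np.max(np.abs(band)) + 1e-12)          # normalise to [-1, 1]
    energy = -x ** 2 * np.log(x ** 2 + 1e-12)          # Shannon energy
    win = max(1, int(fs * smooth_ms / 1000))           # moving-average smoothing window
    return np.convolve(energy, np.ones(win) / win, mode="same")
```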
Guided SAR image despeckling with probabilistic non local weights
NASA Astrophysics Data System (ADS)
Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny
2017-12-01
SAR images are generally corrupted by granular disturbances called speckle, which makes visual analysis and detail extraction a difficult task. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with dynamically derived values based on the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, significant improvement is achieved in terms of performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.
UMA/GAN network architecture analysis
NASA Astrophysics Data System (ADS)
Yang, Liang; Li, Wensheng; Deng, Chunjian; Lv, Yi
2009-07-01
This paper critically analyzes the architecture of UMA, one of the Fixed Mobile Convergence (FMC) solutions, which is also included by the Third Generation Partnership Project (3GPP). In the UMA/GAN network architecture, the UMA Network Controller (UNC) is the key equipment which connects the cellular core network with the mobile station (MS). A UMA network can be easily integrated into existing cellular networks without influencing the mobile core network, and can provide high-quality mobile services with preferentially priced indoor voice and data usage. This helps to improve the subscriber's experience. On the other hand, the UMA/GAN architecture helps to integrate other radio techniques, such as WiFi, Bluetooth, and WiMAX, into the cellular network. This offers traditional mobile operators an opportunity to integrate WiMAX technology into the cellular network. At the end of this article, we also give an analysis of the potential influence on the cellular core networks exerted by the UMA network.
Tiong, T Joyce; Chandesa, Tissa; Yap, Yeow Hong
2017-05-01
One common method to determine the existence of cavitational activity in power ultrasonics systems is by capturing images of sonoluminescence (SL) or sonochemiluminescence (SCL) in a dark environment. Conventionally, the light emitted from SL or SCL was detected based on the number of photons. Though this method is effective, it cannot identify the sonochemical zones of an ultrasonic system. SL/SCL images, on the other hand, enable identification of 'active' sonochemical zones. However, these images often provide just qualitative data, as harvesting light intensity data from the images is tedious and requires high resolution images. In this work, we propose a new image analysis technique using pseudo-colouring of images to quantify the SCL zones based on the intensities of the SCL images, followed by comparison of the active SCL zones with COMSOL-simulated acoustic pressure zones. Copyright © 2016 Elsevier B.V. All rights reserved.
Nanomanipulation-coupled nanospray mass spectrometry as an approach for single cell analysis
NASA Astrophysics Data System (ADS)
Phelps, Mandy; Hamilton, Jason; Verbeck, Guido F.
2014-12-01
Electrospray mass spectrometry is now a widely used technique for observing cell content of various biological tissues. However, electrospray techniques (liquid chromatography and direct infusion) often involve lysing a group of cells and extracting the biomolecules of interest, rather than a sensitive, individual cell method to observe local chemistry. Presented here is an approach of combining a nanomanipulator workstation with nanospray mass spectrometry, which allows for extraction of a single cell, followed by rapid mass analysis that can provide a detailed metabolic profile. Triacylglycerol content was profiled with this tool coupled to mass spectrometry to investigate heterogeneity between healthy and tumorous tissues as well as lipid droplet containing adipocytes in vitro as proof of concept. This selective approach provides cellular resolution and complements existing bioanalytical techniques with minimal invasion to samples. In addition, the coupling of nanomanipulation and mass spectrometry holds the potential to be used in a great number of applications for individual organelles, diseased tissues, and in vitro cell cultures for observing heterogeneity even amongst cells and organelles of the same tissue.
Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems
NASA Astrophysics Data System (ADS)
Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn
The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail due to incomplete specification of goal service requirements or due to the fact that the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback will help guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.
NASA Astrophysics Data System (ADS)
Laib dit Leksir, Y.; Mansour, M.; Moussaoui, A.
2018-03-01
Analysis and processing of databases obtained from infrared thermal inspections made on electrical installations require the development of new tools to obtain more information than visual inspections provide. Consequently, methods based on the capture of thermal images show great potential and are increasingly employed in this field. However, there is a need for the development of effective techniques to analyse these databases in order to extract significant information relating to the state of the infrastructures. This paper presents a technique explaining how this approach can be implemented and proposes a system that can help to detect faults in thermal images of electrical installations. The proposed method classifies and identifies the region of interest (ROI). The identification is conducted using a support vector machine (SVM) algorithm. The aim here is to capture the faults that exist in electrical equipment during an inspection of some machines using an A40 FLIR camera. After that, binarization techniques are employed to select the region of interest. Finally, a comparative analysis of the misclassification errors obtained with the proposed method, Fuzzy c-means, and Otsu is also presented.
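A minimal sketch of the Otsu-binarization-plus-SVM idea follows; the feature choices, variable names, and labels are hypothetical and not the authors' exact pipeline.

```python
import numpy as np
from skimage.filters import threshold_otsu
from sklearn.svm import SVC

def hot_region_features(thermal_img):
    """Segment the hottest region with Otsu's threshold and return simple
    intensity/size features for classification."""
    t = threshold_otsu(thermal_img)
    roi = thermal_img > t
    pixels = thermal_img[roi]
    return [pixels.mean(), pixels.max(), roi.mean()]   # mean/max temperature, hot-area fraction

def train_fault_classifier(images, labels):
    """Train an RBF SVM on per-image features; labels: 1 = fault, 0 = normal
    (hypothetical training data)."""
    X = np.array([hot_region_features(img) for img in images])
    return SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, labels)
```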
Advances in carbonate exploration and reservoir analysis
Garland, J.; Neilson, J.; Laubach, S.E.; Whidden, Katherine J.
2012-01-01
The development of innovative techniques and concepts, and the emergence of new plays in carbonate rocks are creating a resurgence of oil and gas discoveries worldwide. The maturity of a basin and the application of exploration concepts have a fundamental influence on exploration strategies. Exploration success often occurs in underexplored basins by applying existing established geological concepts. This approach is commonly undertaken when new basins ‘open up’ owing to previous political upheavals. The strategy of using new techniques in a proven mature area is particularly appropriate when dealing with unconventional resources (heavy oil, bitumen, stranded gas), while the application of new play concepts (such as lacustrine carbonates) to new areas (i.e. ultra-deep South Atlantic basins) epitomizes frontier exploration. Many low-matrix-porosity hydrocarbon reservoirs are productive because permeability is controlled by fractures and faults. Understanding basic fracture properties is critical in reducing geological risk and therefore reducing well costs and increasing well recovery. The advent of resource plays in carbonate rocks, and the long-standing recognition of naturally fractured carbonate reservoirs means that new fracture and fault analysis and prediction techniques and concepts are essential.
Decoding human swallowing via electroencephalography: a state-of-the-art review
Jestrović, Iva; Coyle, James L.
2015-01-01
Swallowing and swallowing disorders have garnered continuing interest over the past several decades. Electroencephalography (EEG) is an inexpensive and non-invasive procedure with very high temporal resolution which enables analysis of short and fast swallowing events, as well as an analysis of the organizational and behavioral aspects of cortical motor preparation, swallowing execution and swallowing regulation. EEG is a powerful technique which can be used alone or in combination with other techniques for monitoring swallowing, detection of swallowing motor imagery for diagnostic or biofeedback purposes, or to modulate and measure the effects of swallowing rehabilitation. This paper provides a review of the existing literature which has deployed EEG in the investigation of oropharyngeal swallowing, smell, taste and texture related to swallowing, cortical pre-motor activation in swallowing, and swallowing motor imagery detection. Furthermore, this paper provides a brief review of the different modalities of brain imaging techniques used to study swallowing brain activities, as well as the EEG components of interest for studies on swallowing and on swallowing motor imagery. Lastly, this paper provides directions for future swallowing investigations using EEG. PMID:26372528
A special protection scheme utilizing trajectory sensitivity analysis in power transmission
NASA Astrophysics Data System (ADS)
Suriyamongkol, Dan
In recent years, new measurement techniques have provided opportunities to improve the North American Power System observability, control and protection. This dissertation discusses the formulation and design of a special protection scheme based on a novel utilization of trajectory sensitivity techniques with inputs consisting of system state variables and parameters. Trajectory sensitivity analysis (TSA) has been used in previous publications as a method for power system security and stability assessment, and the mathematical formulation of TSA lends itself well to some of the time domain power system simulation techniques. Existing special protection schemes often have limited sets of goals and control actions. The proposed scheme aims to maintain stability while using as many control actions as possible. The approach here will use the TSA in a novel way by using the sensitivities of system state variables with respect to state parameter variations to determine the state parameter controls required to achieve the desired state variable movements. The initial application will operate based on the assumption that the modeled power system has full system observability, and practical considerations will be discussed.
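The first-order relation that such trajectory-sensitivity schemes rely on can be written compactly; the notation below is illustrative rather than the dissertation's own.

```latex
% A small change \Delta p in controllable parameters perturbs the state trajectory
% approximately through the sensitivity matrix:
\Delta x(t) \;\approx\; S(t)\,\Delta p,
\qquad
S(t) \;=\; \left.\frac{\partial x(t;p)}{\partial p}\right|_{p_0}.
% A control action can then be chosen by a least-squares inversion for the parameter
% changes that produce a desired state movement \Delta x_{\mathrm{des}}:
\Delta p^{\ast} \;=\; \operatorname*{arg\,min}_{\Delta p}\;
\bigl\| S(t_f)\,\Delta p \;-\; \Delta x_{\mathrm{des}} \bigr\|_2^{2}.
```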
Parametric Model Based On Imputations Techniques for Partly Interval Censored Data
NASA Astrophysics Data System (ADS)
Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah
2017-12-01
The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis. In this setting, the outcome variable of interest is the time until an event occurs, and the time to failure of a specific experimental unit may be censored: right, left, interval, or Partly Interval Censored (PIC). In this paper, analysis of this model was conducted based on a parametric Cox model via PIC data. Moreover, several imputation techniques were used: midpoint, left and right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, namely the Turnbull and Cox models, on clinical trial data (breast cancer data), which showed the validity of the proposed model. Results on the data set indicated that the parametric Cox model was superior in terms of estimation of survival functions, likelihood ratio tests, and their P-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median imputations showed better results with respect to the estimation of the survival function.
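A small sketch of the interval-censoring imputations listed above is given below; the mean and median variants, which typically impute from sample summaries rather than a single interval, are omitted, and the function and parameter names are illustrative.

```python
import numpy as np

def impute_interval(left, right, method="midpoint", rng=None):
    """Turn an interval-censored observation (left, right) into a single
    imputed event time. For right-censored data, right may be np.inf and
    the left endpoint is returned unchanged."""
    if np.isinf(right):
        return left
    if method == "midpoint":
        return 0.5 * (left + right)
    if method == "left":
        return left
    if method == "right":
        return right
    if method == "random":
        rng = rng or np.random.default_rng()
        return rng.uniform(left, right)
    raise ValueError(f"unknown method: {method}")
```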
Ibrahim, El-Sayed H; Stojanovska, Jadranka; Hassanein, Azza; Duvernoy, Claire; Croisille, Pierre; Pop-Busui, Rodica; Swanson, Scott D
2018-05-16
Cardiac MRI tagging is a valuable technique for evaluating regional heart function. Currently, there are a number of different techniques for analyzing the tagged images. Specifically, k-space-based analysis techniques have been shown to be much faster than image-based techniques, with harmonic-phase (HARP) and sine-wave modeling (SinMod) being two well-known techniques of the former group that are frequently used in clinical studies. In this study, we compared HARP and SinMod and studied inter-observer variability between the two techniques for evaluating myocardial strain and apical-to-base torsion in a numerical phantom, nine healthy controls, and thirty diabetic patients. Based on the ground-truth numerical phantom measurements (strain = -20% and rotation angle = -4.4°), HARP and SinMod resulted in overestimation (in absolute value terms) of strain by 1% and 5% (strain values), and of rotation angle by 0.4° and 2.0°, respectively. For the in-vivo results, global strain and torsion ranges were -10.6 to -35.3% and 1.8-12.7°/cm in patients, and -17.8 to -32.7% and 1.8-12.3°/cm in volunteers. On average, SinMod overestimated strain measurements by 5.7% and 5.9% (strain values) in the patients and volunteers, respectively, compared to HARP, and overestimated torsion measurements by 2.9°/cm and 2.5°/cm in the patients and volunteers, respectively, compared to HARP. Location-wise, the ranges for basal, mid-ventricular, and apical strain in patients (volunteers) were -8.4 to -31.5% (-11.6 to -33.3%), -6.3 to -37.2% (-17.8 to -33.3%), and -5.2 to -38.4% (-20.0 to -33.2%), respectively. SinMod overestimated strain in the basal, mid-ventricular, and apical slices by 4.7% (5.7%), 5.9% (5.5%), and 8.9% (6.8%), respectively, compared to HARP in the patients (volunteers). Nevertheless, there was good correlation between the HARP and SinMod measurements. Finally, there were no significant strain or torsion measurement differences between patients and volunteers. There was good inter-observer agreement, as all measurement differences lay within the Bland-Altman ± 2 standard-deviation (SD) difference limits. In conclusion, despite the consistency of the results by either HARP or SinMod and the acceptable agreement of the generated strain and torsion patterns by both techniques, SinMod systematically overestimated the measurements compared to HARP. Under current operating conditions, the measurements from HARP and SinMod cannot be used interchangeably. Copyright © 2017. Published by Elsevier Inc.
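Since inter-observer agreement here is judged with Bland-Altman limits, a minimal sketch of that computation is included for reference; it is generic and not the authors' analysis code.

```python
import numpy as np

def bland_altman_limits(a, b):
    """Mean difference (bias) and ±2 SD limits of agreement between two raters."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 2 * sd, bias + 2 * sd)
```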
High-Temperature Strain Sensing for Aerospace Applications
NASA Technical Reports Server (NTRS)
Piazza, Anthony; Richards, Lance W.; Hudson, Larry D.
2008-01-01
Thermal protection systems (TPS) and hot structures are utilizing advanced materials that operate at temperatures that exceed current abilities to measure structural performance. Robust strain sensors that operate accurately and reliably beyond 1800 F are needed but do not exist. These shortcomings hinder the ability to validate analysis and modeling techniques and to optimize structural designs. This presentation examines high-temperature strain sensing for aerospace applications and, more specifically, seeks to provide strain data for validating finite element models and thermal-structural analyses. Efforts have been made to develop sensor attachment techniques for relevant structural materials at the small test specimen level and to perform laboratory tests to characterize sensors and generate corrections to apply to indicated strains. Areas highlighted in this presentation include sensors, sensor attachment techniques, laboratory evaluation/characterization of strain measurements, and sensor use in large-scale structures.
Wilson, Kris; Webster, Scott P; Iredale, John P; Zheng, Xiaozhong; Homer, Natalie Z; Pham, Nhan T; Auer, Manfred; Mole, Damian J
2017-12-15
The assessment of drug-target engagement for determining the efficacy of a compound inside cells remains challenging, particularly for difficult target proteins. Existing techniques are more suited to soluble protein targets. Difficult target proteins include those with challenging in vitro solubility, stability or purification properties that preclude target isolation. Here, we report a novel technique that measures intracellular compound-target complex formation, as well as cellular permeability, specificity and cytotoxicity-the toxicity-affinity-permeability-selectivity (TAPS) technique. The TAPS assay is exemplified here using human kynurenine 3-monooxygenase (KMO), a challenging intracellular membrane protein target of significant current interest. TAPS confirmed target binding of known KMO inhibitors inside cells. We conclude that the TAPS assay can be used to facilitate intracellular hit validation on most, if not all intracellular drug targets.
NASA Astrophysics Data System (ADS)
Wilson, Kris; Webster, Scott P.; Iredale, John P.; Zheng, Xiaozhong; Homer, Natalie Z.; Pham, Nhan T.; Auer, Manfred; Mole, Damian J.
2018-01-01
The assessment of drug-target engagement for determining the efficacy of a compound inside cells remains challenging, particularly for difficult target proteins. Existing techniques are more suited to soluble protein targets. Difficult target proteins include those with challenging in vitro solubility, stability or purification properties that preclude target isolation. Here, we report a novel technique that measures intracellular compound-target complex formation, as well as cellular permeability, specificity and cytotoxicity-the toxicity-affinity-permeability-selectivity (TAPS) technique. The TAPS assay is exemplified here using human kynurenine 3-monooxygenase (KMO), a challenging intracellular membrane protein target of significant current interest. TAPS confirmed target binding of known KMO inhibitors inside cells. We conclude that the TAPS assay can be used to facilitate intracellular hit validation on most, if not all intracellular drug targets.
NASA Technical Reports Server (NTRS)
Henderson, M. L.
1979-01-01
The benefits to high lift system maximum lift and, alternatively, to high lift system complexity, of applying analytic design and analysis techniques to the design of high lift sections for flight conditions were determined, and two high lift sections were designed to flight conditions. The influence of the high lift section on the sizing and economics of a specific energy efficient transport (EET) was clarified using a computerized sizing technique and an existing advanced airplane design data base. The impact of the best design resulting from the design application studies on EET sizing and economics was evaluated. Flap technology trade studies, climb and descent studies, and augmented stability studies are included along with a description of the baseline high lift system geometry, a calculation of lift and pitching moment when separation is present, and an inverse boundary layer technique for pressure distribution synthesis and optimization.
LANDSAT-4 band 6 data evaluation
NASA Technical Reports Server (NTRS)
1983-01-01
The radiometric integrity of the LANDSAT-D thematic mapper (TM) thermal infrared channel (band 6) data was evaluated to develop improved radiometric preprocessing calibration techniques for removal of atmospheric effects. The primary data analysis effort was spent evaluating the line-to-line and detector-to-detector variation in the thermal infrared data. The data studied were from the core area of Lake Ontario, where very stable temperatures were expected. The detectors and the scan direction were taken as separate parameters and an analysis of variance was conducted. The data indicate that significant variability exists both between detectors and between scan directions.
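As a sketch of the kind of two-factor analysis of variance described above, with detector number and scan direction as the separate parameters, the following uses a synthetic stand-in for the lake-core radiances (the column names and data are illustrative assumptions, not the study's actual values):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Synthetic stand-in for band 6 samples over the stable lake core:
# one row per sample, tagged with its detector number and scan direction.
rng = np.random.default_rng(0)
rows = []
for det in range(1, 5):                                  # band 6 detectors
    for scan in ("forward", "reverse"):
        offset = 0.4 * det + (0.6 if scan == "reverse" else 0.0)
        vals = 95.0 + offset + 0.5 * rng.standard_normal(50)
        rows += [{"radiance": v, "detector": det, "scan_dir": scan} for v in vals]
df = pd.DataFrame(rows)

model = smf.ols("radiance ~ C(detector) + C(scan_dir)", data=df).fit()
print(anova_lm(model, typ=2))   # F-tests for detector and scan-direction effects
```

Significant F-statistics for either factor correspond to the between-detector and between-scan-direction variability reported above.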
Implementing a Reliability Centered Maintenance Program at NASA's Kennedy Space Center
NASA Technical Reports Server (NTRS)
Tuttle, Raymond E.; Pete, Robert R.
1998-01-01
Maintenance practices have long focused on time based "preventive maintenance" techniques. Components were changed out and parts replaced based on how long they had been in place instead of what condition they were in. A reliability centered maintenance (RCM) program seeks to offer equal or greater reliability at decreased cost by ensuring that only applicable, effective maintenance is performed and by largely replacing time based maintenance with condition based maintenance. A significant portion of this program involved introducing non-intrusive technologies, such as vibration analysis, oil analysis and I/R cameras, to an existing labor force and management team.
Evolutionary computing for the design search and optimization of space vehicle power subsystems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook
2004-01-01
Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost, and performance and then automatically size power elements based on anticipated performance of the subsystem rather than on worst-case estimates.
Nitrogen Oxide Emission, Economic Growth and Urbanization in China: a Spatial Econometric Analysis
NASA Astrophysics Data System (ADS)
Zhou, Zhimin; Zhou, Yanli; Ge, Xiangyu
2018-01-01
This research studies the nexus between nitrogen oxide emissions and economic development/urbanization. Under the environmental Kuznets curve (EKC) hypothesis, we apply spatial panel data analysis techniques within the STIRPAT framework and thus systematically estimate the impacts of income and urbanization on nitrogen oxide emissions. The empirical findings suggest that spatial dependence in the distribution of nitrogen oxide emissions exists at the provincial level, and that an inverse N-shaped EKC describes both the income-nitrogen oxide and urbanization-nitrogen oxide nexuses. In addition, targeted policy recommendations are offered to reduce nitrogen oxide emissions in the future.
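As a hedged sketch of the kind of spatial STIRPAT specification implied above (the paper's exact covariates, spatial weights, and functional form are not reproduced here), a cubic income term combined with a spatial lag can be written as:

```latex
\ln E_{it} \;=\; \rho \sum_{j} w_{ij}\,\ln E_{jt}
 \;+\; \beta_1 \ln y_{it} \;+\; \beta_2 (\ln y_{it})^2 \;+\; \beta_3 (\ln y_{it})^3
 \;+\; \beta_4 \ln U_{it} \;+\; \mu_i \;+\; \lambda_t \;+\; \varepsilon_{it}
```

where E_it is provincial nitrogen oxide emission, y_it per-capita income, U_it the urbanization rate, w_ij the spatial weights, and mu_i, lambda_t province and time effects; an inverse N-shaped income-emission relation corresponds to the sign pattern beta_1 < 0, beta_2 > 0, beta_3 < 0.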
NASA Technical Reports Server (NTRS)
Dash, S.; Delguidice, P.
1972-01-01
A second order numerical method employing reference plane characteristics has been developed for the calculation of geometrically complex three dimensional nozzle-exhaust flow fields, heretofore uncalculable by existing methods. The nozzles may have irregular cross sections with swept throats and may be stacked in modules using the vehicle undersurface for additional expansion. The nozzles may have highly nonuniform entrance conditions, the medium considered being an equilibrium hydrogen-air mixture. The program calculates and carries along the underexpansion shock and contact as discrete discontinuity surfaces, for a nonuniform vehicle external flow.
Policy analysis for prenatal genetic diagnosis.
Thompson, M; Milunsky, A
1979-01-01
Consideration of the analytic difficulties faced in estimating the benefits and costs of prenatal genetic diagnosis, coupled with a brief review of existing benefit-cost studies, leads to the conclusion that public subsidy of prenatal testing can yield benefits substantially in excess of costs. The practical obstacles to such programs include the attitudes of prospective parents, a lack of knowledge, monetary barriers, inadequately organized medical resources, and the political issue of abortion. Policy analysis can now nevertheless formulate principles and guide immediate actions to improve present utilization of prenatal testing and to facilitate possible future expansion of these diagnostic techniques.
Zero leakage separable and semipermanent ducting joints
NASA Technical Reports Server (NTRS)
Mischel, H. T.
1973-01-01
A study program has been conducted to explore new methods of achieving zero leakage, separable and semipermanent, ducting joints for space flight vehicles. The study consisted of a search of the literature on existing zero leakage methods, the generation of concepts for new methods of achieving the desired zero leakage criteria, and the development of a detailed analysis and design of a selected concept. Other techniques of leak detection were explored with a view toward improving this area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turbak, Susan C.; Reichle, Donna R.; Shriner, Carole R.
1981-01-01
The purpose of this report is to provide summary information for use by potential developers and regulators of small-scale hydroelectric projects (defined as existing dams that can be retrofitted to a total site capacity of ≤30 MW), where turbine-related mortality of fish is a potential issue affecting site-specific development. Mitigation techniques for turbine-related mortality are not covered in this report.
Trinity Bay Study: Dye tracing experiments
NASA Technical Reports Server (NTRS)
Ward, G. H., Jr.
1972-01-01
An analysis of the heat balance and temperature distribution within Trinity Bay near Galveston, Texas is presented. The effects of tidal currents, wind driven circulations, and large volume inflows are examined. Emphasis is placed on the effects of turbulent diffusion and local shears in currents. The technique of dye tracing to determine the parameters characterizing dispersion is described. Aerial photographs and maps are provided to show the flow conditions existing at different times and seasons.
Application of Non-destructive Methods of Stress-strain State at Hazardous Production Facilities
NASA Astrophysics Data System (ADS)
Shram, V.; Kravtsova, Ye; Selsky, A.; Bezborodov, Yu; Lysyannikova, N.; Lysyannikov, A.
2016-06-01
The paper deals with the sources of accidents in distillation columns, on the basis of which the most dangerous defects are identified. An analysis of the currently existing methods of non-destructive testing of the stress-strain state is performed. It is proposed to apply strain and acoustic emission techniques to continuously monitor hazardous objects, which helps prevent accidents as well as reduce the amount of work required.
Youssef, Yassar; Lee, Gyusung; Godinez, Carlos; Sutton, Erica; Klein, Rosemary V; George, Ivan M; Seagull, F Jacob; Park, Adrian
2011-07-01
This study compares surgical techniques and surgeon's standing position during laparoscopic cholecystectomy (LC), investigating each with respect to surgeons' learning, performance, and ergonomics. Little homogeneity exists in LC performance and training. Variations in standing position (side-standing technique vs. between-standing technique) and hand technique (one-handed vs. two-handed) exist. Thirty-two LC procedures performed on a virtual reality simulator were video-recorded and analyzed. Each subject performed four different procedures: one-handed/side-standing, one-handed/between-standing, two-handed/side-standing, and two-handed/between-standing. Physical ergonomics were evaluated using Rapid Upper Limb Assessment (RULA). Mental workload assessment was acquired with the National Aeronautics and Space Administration-Task Load Index (NASA-TLX). Virtual reality (VR) simulator-generated performance evaluation and a subjective survey were analyzed. RULA scores were consistently lower (indicating better ergonomics) for the between-standing technique and higher (indicating worse ergonomics) for the side-standing technique, regardless of whether one- or two-handed. Anatomical scores overall showed side-standing to have a detrimental effect on the upper arms and trunk. The NASA-TLX showed significant association between the side-standing position and high physical demand, effort, and frustration (p<0.05). The two-handed technique in the side-standing position required more effort than the one-handed (p<0.05). No difference in operative time or complication rate was demonstrated among the four procedures. The two-handed/between-standing method was chosen as the best procedure to teach and standardize. Laparoscopic cholecystectomy poses a risk of physical injury to the surgeon. As LC is currently commonly performed in the United States, the left side-standing position may lead to increased physical demand and effort, resulting in ergonomically unsound conditions for the surgeon. Though further investigations should be conducted, adopting the between-standing position deserves serious consideration as it may be the best short-term ergonomic alternative.
Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten
2018-01-01
Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, the prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques, such as the parametric and non-parametric approaches to PLS multi-group analysis, only allow assessment of differences between parameters estimated for different subpopulations, the study at hand introduces a technique that also allows one to assess whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we refer to a reduced version of the well-established technology acceptance model.
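A minimal sketch of the general bootstrap idea behind testing whether two parameter estimates from the same sample differ: resample the raw data, re-estimate both parameters on each resample, and inspect the percentile confidence interval of their difference. Here `estimate_paths` is a hypothetical stand-in for a PLS-SEM estimation routine (it is not part of any particular package), and `data` is assumed to be a NumPy array with one row per observation:

```python
import numpy as np

def bootstrap_param_difference(data, estimate_paths, n_boot=5000, seed=0):
    """Percentile bootstrap CI for the difference of two parameter estimates
    obtained from the same sample. `estimate_paths(sample)` must return (b1, b2)."""
    rng = np.random.default_rng(seed)
    n = len(data)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        sample = data[rng.integers(0, n, n)]   # resample rows with replacement
        b1, b2 = estimate_paths(sample)
        diffs[b] = b1 - b2
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    return lo, hi        # the difference is deemed significant if 0 lies outside the CI
```

The same resampling scheme works for any estimator that returns both coefficients from a single fit, which is what distinguishes this setting from multi-group comparisons across subpopulations.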
Design of surface-water data networks for regional information
Moss, Marshall E.; Gilroy, E.J.; Tasker, Gary D.; Karlinger, M.R.
1982-01-01
This report describes a technique, Network Analysis of Regional Information (NARI), and the existing computer procedures that have been developed for the specification of the regional information-cost relation for several statistical parameters of streamflow. The measure of information used is the true standard error of estimate of a regional logarithmic regression. The cost is a function of the number of stations at which hydrologic data are collected and the number of years for which the data are collected. The technique can be used to obtain either (1) a minimum cost network that will attain a prespecified accuracy and reliability or (2) a network that maximizes information given a set of budgetary and time constraints.
NASA Technical Reports Server (NTRS)
Muszynska, Agnes; Bently, Donald E.
1991-01-01
Perturbation techniques used for identification of rotating system dynamic characteristics are described. A comparison between two periodic frequency-swept perturbation methods applied in identification of fluid forces of rotating machines is presented. The description of the fluid force model identified by inputting circular periodic frequency-swept force is given. This model is based on the existence and strength of the circumferential flow, most often generated by the shaft rotation. The application of the fluid force model in rotor dynamic analysis is presented. It is shown that the rotor stability is an entire rotating system property. Some areas for further research are discussed.
NASA Astrophysics Data System (ADS)
Arai, Hiroyuki; Miyagawa, Isao; Koike, Hideki; Haseyama, Miki
We propose a novel technique for estimating the number of people in a video sequence; it has the advantages of being stable even in crowded situations and needing no ground-truth data. By quantitatively analyzing the geometrical relationships between image pixels and their intersection volumes in the real world, the method allows a foreground image to directly indicate the number of people. Because foreground detection remains possible even in crowded situations, the proposed method can be applied in such situations. Moreover, it estimates the number of people in an a priori manner, so, unlike existing feature-based estimation techniques, it needs no ground-truth data. Experiments show the validity of the proposed method.
Security and Correctness Analysis on Privacy-Preserving k-Means Clustering Schemes
NASA Astrophysics Data System (ADS)
Su, Chunhua; Bao, Feng; Zhou, Jianying; Takagi, Tsuyoshi; Sakurai, Kouichi
Due to the fast development of the Internet and related IT technologies, it has become increasingly easy to access large amounts of data. k-means clustering is a powerful and frequently used technique in data mining, and many research papers about privacy-preserving k-means clustering have been published. In this paper, we analyze existing privacy-preserving k-means clustering schemes based on cryptographic techniques. We show that those schemes can cause privacy breaches and cannot output correct results, owing to faults in their protocol construction. Furthermore, we analyze our own proposal as an option for mitigating these problems, although it still leaks intermediate information during the computation.
A relative performance analysis of atmospheric Laser Doppler Velocimeter methods.
NASA Technical Reports Server (NTRS)
Farmer, W. M.; Hornkohl, J. O.; Brayton, D. B.
1971-01-01
Evaluation of the effectiveness of atmospheric applications of a Laser Doppler Velocimeter (LDV) at a wavelength of about 0.5 micrometer in conjunction with dual scatter LDV illuminating techniques, or at a wavelength of 10.6 micrometer with local oscillator LDV illuminating techniques. Equations and examples are given to provide a quantitative basis for LDV system selection and performance criteria in atmospheric research. The comparative study shows that specific ranges and conditions exist where performance of one of the methods is superior to that of the other. It is also pointed out that great care must be exercised in choosing system parameters that optimize a particular LDV designed for atmospheric applications.
ATMOS Spacelab 1 science investigation
NASA Technical Reports Server (NTRS)
Park, J. H.; Smith, M. A. H.; Twitty, J. T.; Russell, J. M., III
1979-01-01
Existing infrared spectra from high speed interferometer balloon flights were analyzed and experimental analysis techniques applicable to similar data from the ATMOS experiment (Spacelab 3) were investigated. Specific techniques under investigation included line-by-line simulation of the spectra to aid in the identification of absorbing gases, simultaneous retrieval of pressure and temperature profiles using carefully chosen pairs of CO2 absorption lines, and the use of these pressures and temperatures in the retrieval of gas concentration profiles for many absorbing species. A search for new absorption features was also carried out, and special attention was given to identification of absorbing gases in spectral bandpass regions to be measured by the halogen occultation experiment.
Urban environmental health applications of remote sensing, summary report
NASA Technical Reports Server (NTRS)
Rush, M.; Goldstein, J.; Hsi, B. P.; Olsen, C. B.
1975-01-01
Health and its association with the physical environment were studied based on the hypothesis that there is a relationship between the man-made physical environment and the health status of a population. The statistical technique of regression analysis was employed to show the degree of association and which aspects of the physical environment accounted for the greater variation in health status. Mortality, venereal disease, tuberculosis, hepatitis, meningitis, shigella/salmonella, hypertension, and cardiac arrest/myocardial infarction were examined. The statistical techniques were used to measure association and variation, not necessarily cause and effect. The conclusions drawn show that the association still exists in the decade of the 1970s and that it can be successfully monitored with remote sensing methodology.
The Role of a Physical Analysis Laboratory in a 300 mm IC Development and Manufacturing Centre
NASA Astrophysics Data System (ADS)
Kwakman, L. F. Tz.; Bicais-Lepinay, N.; Courtas, S.; Delille, D.; Juhel, M.; Trouiller, C.; Wyon, C.; de la Bardonnie, M.; Lorut, F.; Ross, R.
2005-09-01
To remain competitive, IC manufacturers have to accelerate the development of the most advanced (CMOS) technology and to deliver high yielding products with best cycle times and at competitive pricing. As technology complexity increases, so does the need for physical characterization support; however, many of the existing techniques are no longer adequate to effectively support the 65-45 nm technology node developments. New and improved techniques are definitely needed to better characterize the often marginal processes, but these should not significantly impact fabrication costs or cycle time. Hence, characterization and metrology challenges in state-of-the-art IC manufacturing are both of technical and economical nature. TEM microscopy is needed for high quality, high volume analytical support, but several physical and practical hurdles have to be taken. The success rate of FIB-SEM based failure analysis drops as defects often are too small to be detected and fault isolation becomes more difficult in the nano-scale device structures. To remain effective and efficient, SEM and OBIRCH techniques have to be improved or complemented with other more effective methods. Chemical analysis of novel materials and critical interfaces requires improvements in the field of, e.g., SIMS and ToF-SIMS. Techniques that previously were only used sporadically, like EBSD and XRD, have become a 'must' to properly support backend process development. On the bright side, thanks to major technical advances, techniques that previously were practiced at laboratory level only can now be used effectively for at-line fab metrology: Voltage Contrast based defectivity control, XPS based gate dielectric metrology and XRD based control of copper metallization processes are practical examples. In this paper, capabilities and shortcomings of several techniques and corresponding equipment are presented with practical illustrations of use in our Crolles facilities.
EXPERIMENTAL MODELLING OF AORTIC ANEURYSMS
Doyle, Barry J; Corbett, Timothy J; Cloonan, Aidan J; O’Donnell, Michael R; Walsh, Michael T; Vorp, David A; McGloughlin, Timothy M
2009-01-01
A range of silicone rubbers were created based on existing commercially available materials. These silicones were designed to be visually different from one another and have distinct material properties, in particular, ultimate tensile strengths and tear strengths. In total, eleven silicone rubbers were manufactured, with the materials designed to have a range of increasing tensile strengths from approximately 2 to 4 MPa, and increasing tear strengths from approximately 0.45 to 0.7 N/mm. The variations in silicones were detected using a standard colour analysis technique. Calibration curves were then created relating colour intensity to individual material properties. All eleven materials were characterised and a first-order Ogden strain energy function applied. Material coefficients were determined and examined for effectiveness. Six idealised abdominal aortic aneurysm models were also created using the two base materials of the study, with a further model created using a new mixing technique to create a rubber model with randomly assigned material properties. These models were then examined using videoextensometry and compared to numerical results. Colour analysis revealed a statistically significant linear relationship (p<0.0009) with both tensile strength and tear strength, allowing material strength to be determined using a non-destructive experimental technique. The effectiveness of this technique was assessed by comparing predicted material properties to experimentally measured values, with good agreement in the results. Videoextensometry and numerical modelling revealed minor percentage differences, with all results achieving significance (p<0.0009). This study has successfully designed and developed a range of silicone rubbers that have unique colour intensities and material strengths. Strengths can be readily determined using a non-destructive analysis technique with proven effectiveness. These silicones may further aid towards an improved understanding of the biomechanical behaviour of aneurysms using experimental techniques. PMID:19595622
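A short sketch of the calibration-curve idea described above: fit a linear relation between measured colour intensity and tensile strength on the characterised silicones, then use it to predict the strength of a new sample non-destructively. The intensity and strength values below are invented for illustration, not the study's data:

```python
import numpy as np

# Illustrative calibration data: mean colour intensity (a.u.) vs measured UTS (MPa)
intensity = np.array([35., 52., 68., 84., 100., 117., 133., 150., 166., 183., 200.])
uts_mpa   = np.array([2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.2, 3.4, 3.6, 3.8, 4.0])

slope, intercept = np.polyfit(intensity, uts_mpa, 1)     # linear calibration curve
r = np.corrcoef(intensity, uts_mpa)[0, 1]                # strength of the linear relation

def predict_uts(new_intensity):
    """Estimate tensile strength of an unseen silicone from its colour intensity."""
    return slope * new_intensity + intercept

print(f"r = {r:.3f}; predicted UTS at intensity 120: {predict_uts(120.0):.2f} MPa")
```

An analogous curve against tear strength would give the second calibration relation reported in the study.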
Teaching audience analysis to the technical student
NASA Technical Reports Server (NTRS)
Debs, M. B.; Brillhart, L. V.
1981-01-01
Teaching audience analysis, as practiced in a technical writing course for engineering students, is discussed. Audience analysis is described as the task of defining the audience for a particular piece of writing and determining those characteristics of the audience which constrain the writer and affect reception of the message. A mature technical writing style that shows the tension produced when a text is written to be read and understood is considered in terms of audience analysis. Techniques include: (1) conveying to students the concept that a reader with certain expectations exists, (2) team teaching to preserve the context of a given technical discipline, and (3) assigning a technical report that addresses a variety of readers, thus establishing the complexity of audience-oriented writing.
Optimization technique for problems with an inequality constraint
NASA Technical Reports Server (NTRS)
Russell, K. J.
1972-01-01
The general technique uses a modified version of an existing technique termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used with problems involving a constraint.
Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.
Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn
2017-01-01
The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
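The decomposition referred to above can be written, in standard causal-mediation notation (a sketch of the general idea, not the trial's exact estimands), as:

```latex
TE \;=\; \underbrace{NIE}_{\text{indirect effect via pathological complete response}}
 \;+\; \underbrace{NDE}_{\text{remaining direct effect}},
\qquad
\text{proportion mediated} \;=\; \frac{NIE}{TE} \;\approx\; 4.2\%
```

so the reported 4.2% is the share of the total treatment effect on five-year disease-free survival that is transmitted through pathological complete response.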
Mollison, Daisy; Sellar, Robin; Bastin, Mark; Mollison, Denis; Chandran, Siddharthan; Wardlaw, Joanna; Connick, Peter
2017-01-01
Moderate correlation exists between the imaging quantification of brain white matter lesions and cognitive performance in people with multiple sclerosis (MS). This may reflect the greater importance of other features, including subvisible pathology, or methodological limitations of the primary literature. To summarise the cognitive clinico-radiological paradox and explore the potential methodological factors that could influence the assessment of this relationship. Systematic review and meta-analysis of primary research relating cognitive function to white matter lesion burden. Fifty papers met eligibility criteria for review, and meta-analysis of overall results was possible in thirty-two (2050 participants). Aggregate correlation between cognition and T2 lesion burden was r = -0.30 (95% confidence interval: -0.34, -0.26). Wide methodological variability was seen, particularly related to key factors in the cognitive data capture and image analysis techniques. Resolving the persistent clinico-radiological paradox will likely require simultaneous evaluation of multiple components of the complex pathology using optimum measurement techniques for both cognitive and MRI feature quantification. We recommend a consensus initiative to support common standards for image analysis in MS, enabling benchmarking while also supporting ongoing innovation.
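A minimal sketch of how an aggregate correlation of the kind reported above (r = -0.30 with a 95% confidence interval) is typically obtained from study-level correlations, using fixed-effect pooling on Fisher's z scale; the study values below are invented placeholders, and a random-effects model would add a between-study variance term:

```python
import numpy as np

def pool_correlations(r, n):
    """Fixed-effect meta-analysis of correlations via Fisher's z transform."""
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)                      # Fisher z per study
    w = n - 3.0                            # inverse-variance weights
    z_bar = np.sum(w * z) / np.sum(w)      # pooled z
    se = 1.0 / np.sqrt(np.sum(w))
    ci_z = (z_bar - 1.96 * se, z_bar + 1.96 * se)
    return np.tanh(z_bar), (np.tanh(ci_z[0]), np.tanh(ci_z[1]))

r_pooled, ci = pool_correlations([-0.25, -0.35, -0.28, -0.33], [60, 85, 40, 120])
print(f"pooled r = {r_pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```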
Luo, P; Morrison, I; Dudkiewicz, A; Tiede, K; Boyes, E; O'Toole, P; Park, S; Boxall, A B
2013-04-01
Imaging and characterization of engineered nanoparticles (ENPs) in water, soils, sediment and food matrices is very important for research into the risks of ENPs to consumers and the environment. However, these analyses pose a significant challenge as most existing techniques require some form of sample manipulation prior to imaging and characterization, which can result in changes in the ENPs in a sample and in the introduction of analytical artefacts. This study therefore explored the application of a newly designed instrument, the atmospheric scanning electron microscope (ASEM), which allows the direct characterization of ENPs in liquid matrices and which therefore overcomes some of the limitations associated with existing imaging methods. ASEM was used to characterize the size distribution of a range of ENPs in a selection of environmental and food matrices, including supernatant of natural sediment, test medium used in ecotoxicology studies, bovine serum albumin and tomato soup under atmospheric conditions. The obtained imaging results were compared to results obtained using conventional imaging by transmission electron microscope (TEM) and SEM as well as to size distribution data derived from nanoparticle tracking analysis (NTA). ASEM analysis was found to be a technique complementary to existing methods that is able to visualize ENPs in complex liquid matrices and to provide ENP size information without extensive sample preparation. ASEM can detect ENPs in liquids down to 30 nm and to a level of 1 mg L⁻¹ (9×10⁸ particles mL⁻¹, 50 nm Au ENPs). The results indicate ASEM is a highly complementary method to existing approaches for analyzing ENPs in complex media and that its use will allow researchers to study ENP behavior in situ, something that is currently extremely challenging to do. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
Communication: Electron ionization of DNA bases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, M. A.; Krishnakumar, E., E-mail: ekkumar@tifr.res.in
2016-04-28
No reliable experimental data exist for the partial and total electron ionization cross sections for DNA bases, which are very crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all the four bases and are compared with the existing theoretical calculations and the only set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for partial and total ion cross sections for these molecules. The results of the fragmentation analysis of adenine support the theory of its formation in space.
Nonlinear, non-stationary image processing technique for eddy current NDE
NASA Astrophysics Data System (ADS)
Yang, Guang; Dib, Gerges; Kim, Jaejoon; Zhang, Lu; Xin, Junjun; Udpa, Lalita
2012-05-01
Automatic analysis of eddy current (EC) data has facilitated the analysis of large volumes of data generated in the inspection of steam generator tubes in nuclear power plants. The traditional procedure for analysis of EC data includes data calibration, pre-processing, region of interest (ROI) detection, feature extraction and classification. Accurate ROI detection has been enhanced by pre-processing, which involves reducing noise and other undesirable components as well as enhancing defect indications in the raw measurement. This paper presents the Hilbert-Huang Transform (HHT) for feature extraction and a support vector machine (SVM) for classification. The performance is shown to be significantly better than that of the existing rule-based classification approach used in industry.
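A hedged sketch of that feature-extraction-plus-classification pipeline. A full HHT requires empirical mode decomposition (EMD); here the EMD step is represented by a user-supplied callable, and a deliberately trivial stand-in (treating the whole trace as one intrinsic mode function) plus synthetic signals are used only so the sketch runs end to end; none of this reflects the paper's actual EC data or implementation:

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def hht_features(signal, fs, emd):
    """HHT-style features per intrinsic mode function (IMF): mean and std of the
    instantaneous amplitude and mean instantaneous frequency. `emd(signal)` must
    return a list of IMFs; a real implementation would perform EMD here."""
    feats = []
    for imf in emd(signal):
        analytic = hilbert(imf)
        amp = np.abs(analytic)
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) * fs / (2.0 * np.pi)
        feats += [amp.mean(), amp.std(), inst_freq.mean()]
    return np.array(feats)

trivial_emd = lambda s: [s]        # degenerate stand-in: one IMF = the raw trace

rng = np.random.default_rng(0)
fs, t = 1000.0, np.arange(0.0, 0.5, 1e-3)
X, y = [], []
for k in range(60):                # synthetic "defect" vs "no-defect" traces
    defect = k % 2
    s = np.sin(2 * np.pi * (50 + 30 * defect) * t) + 0.3 * rng.standard_normal(t.size)
    X.append(hht_features(s, fs, trivial_emd))
    y.append(defect)

clf = SVC(kernel="rbf", gamma="scale")
print(cross_val_score(clf, np.vstack(X), np.array(y), cv=5).mean())
```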
Sánchez-Sánchez, M Luz; Belda-Lois, Juan-Manuel; Mena-Del Horno, Silvia; Viosca-Herrero, Enrique; Igual-Camacho, Celedonia; Gisbert-Morant, Beatriz
2018-05-05
A major goal in stroke rehabilitation is the establishment of more effective physical therapy techniques to recover postural stability. Functional Principal Component Analysis provides greater insight into recovery trends. However, when missing values exist, obtaining functional data presents some difficulties. The purpose of this study was to reveal an alternative technique for obtaining the Functional Principal Components without requiring the conversion to functional data beforehand and to investigate this methodology to determine the effect of specific physical therapy techniques in balance recovery trends in elderly subjects with hemiplegia post-stroke. A randomized controlled pilot trial was developed. Thirty inpatients post-stroke were included. Control and target groups were treated with the same conventional physical therapy protocol based on functional criteria, but specific techniques were added to the target group depending on the subjects' functional level. Postural stability during standing was quantified by posturography. The assessments were performed once a month from the moment the participants were able to stand up to six months post-stroke. The target group showed a significant improvement in postural control recovery trend six months after stroke that was not present in the control group. Some of the assessed parameters revealed significant differences between treatment groups (P < 0.05). The proposed methodology allows Functional Principal Component Analysis to be performed when data is scarce. Moreover, it allowed the dynamics of recovery of two different treatment groups to be determined, showing that the techniques added in the target group increased postural stability compared to the base protocol. Copyright © 2018 Elsevier Ltd. All rights reserved.
Herbst, Eric A F; Holloway, Graham P
2015-02-15
Mitochondrial function in the brain is traditionally assessed through analysing respiration in isolated mitochondria, a technique that possesses significant tissue and time requirements while also disrupting the cooperative mitochondrial reticulum. We permeabilized brain tissue in situ to permit analysis of mitochondrial respiration with the native mitochondrial morphology intact, removing the need for isolation time and minimizing tissue requirements to ∼2 mg wet weight. The permeabilized brain technique was validated against the traditional method of isolated mitochondria and was then further applied to assess regional variation in the mouse brain with ischaemia-reperfusion injuries. A transgenic mouse model overexpressing catalase within mitochondria was applied to show the contribution of mitochondrial reactive oxygen species to ischaemia-reperfusion injuries in different brain regions. This technique enhances the accessibility of addressing physiological questions in small brain regions and in applying transgenic mouse models to assess mechanisms regulating mitochondrial function in health and disease. Mitochondria function as the core energy providers in the brain and symptoms of neurodegenerative diseases are often attributed to their dysregulation. Assessing mitochondrial function is classically performed in isolated mitochondria; however, this process requires significant isolation time, demand for abundant tissue and disruption of the cooperative mitochondrial reticulum, all of which reduce reliability when attempting to assess in vivo mitochondrial bioenergetics. Here we introduce a method that advances the assessment of mitochondrial respiration in the brain by permeabilizing existing brain tissue to grant direct access to the mitochondrial reticulum in situ. The permeabilized brain preparation allows for instant analysis of mitochondrial function with unaltered mitochondrial morphology using significantly small sample sizes (∼2 mg), which permits the analysis of mitochondrial function in multiple subregions within a single mouse brain. Here this technique was applied to assess regional variation in brain mitochondrial function with acute ischaemia-reperfusion injuries and to determine the role of reactive oxygen species in exacerbating dysfunction through the application of a transgenic mouse model overexpressing catalase within mitochondria. Through creating accessibility to small regions for the investigation of mitochondrial function, the permeabilized brain preparation enhances the capacity for examining regional differences in mitochondrial regulation within the brain, as the majority of genetic models used for unique approaches exist in the mouse model. © 2014 The Authors. The Journal of Physiology © 2014 The Physiological Society.
Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John N.
1997-01-01
A multidisciplinary design optimization procedure which couples formal multiobjective-based techniques and complex analysis procedures (such as computational fluid dynamics (CFD) codes) has been developed. The procedure has been demonstrated on a specific high speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained multiple objective functions problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are introduced during the transformation process for each objective function. This enhanced procedure will provide the designer the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a computational fluid dynamics (CFD) code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance as well as sonic boom as the design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique has been used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
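As a sketch of the Kreisselmeier-Steinhauser aggregation referred to above (the weighting and draw-down parameter shown are generic, not the paper's specific enhancement), the weighted objectives and constraints f_k(x) are folded into a single smooth envelope function that can be minimized with an unconstrained method such as BFGS:

```latex
F_{KS}(\mathbf{x}) \;=\; f_{\max}(\mathbf{x}) \;+\;
\frac{1}{\rho}\,\ln\!\left[\sum_{k=1}^{m}
\exp\Bigl(\rho\bigl(w_k f_k(\mathbf{x}) - f_{\max}(\mathbf{x})\bigr)\Bigr)\right],
\qquad
f_{\max}(\mathbf{x}) \;=\; \max_{k}\, w_k f_k(\mathbf{x})
```

Subtracting f_max inside the exponential keeps the expression numerically well behaved, the weights w_k let the designer emphasize particular objectives, and larger values of the draw-down factor rho push F_KS toward the worst-case weighted objective.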
MIBPB: a software package for electrostatic analysis.
Chen, Duan; Chen, Zhan; Chen, Changjun; Geng, Weihua; Wei, Guo-Wei
2011-03-01
The Poisson-Boltzmann equation (PBE) is an established model for the electrostatic analysis of biomolecules. The development of advanced computational techniques for the solution of the PBE has been an important topic in the past two decades. This article presents a matched interface and boundary (MIB)-based PBE software package, the MIBPB solver, for electrostatic analysis. The MIBPB has the unique feature that it is the first interface technique-based PBE solver that rigorously enforces the solution and flux continuity conditions at the dielectric interface between the biomolecule and the solvent. For protein molecular surfaces, which may possess troublesome geometrical singularities, the MIB scheme makes the MIBPB by far the only existing PBE solver that is able to deliver second-order convergence, that is, the accuracy increases four times when the mesh size is halved. The MIBPB method is also equipped with a Dirichlet-to-Neumann mapping technique that builds a Green's function approach to analytically resolve the singular charge distribution in biomolecules in order to obtain reliable solutions at meshes as coarse as 1 Å, whereas it usually takes other traditional PB solvers 0.25 Å to reach a similar level of reliability. This work further accelerates the rate of convergence of linear equation systems resulting from the MIBPB by using Krylov subspace (KS) techniques. Condition numbers of the MIBPB matrices are significantly reduced by using appropriate KS solver and preconditioner combinations. Both linear and nonlinear PBE solvers in the MIBPB package are tested by protein-solvent solvation energy calculations and analysis of salt effects on protein-protein binding energies, respectively. Copyright © 2010 Wiley Periodicals, Inc.
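A minimal illustration, outside the MIBPB code itself, of the kind of Krylov-subspace solver and preconditioner pairing described above, using SciPy's sparse GMRES with an incomplete-LU preconditioner on a generic sparse system (a stand-in, not the actual MIBPB matrices):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Generic sparse, ill-conditioned system standing in for a discretized PBE operator
n = 2000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc") * (n + 1) ** 2
b = np.ones(n)

# Incomplete-LU factorization wrapped as a preconditioner for GMRES
ilu = spla.spilu(A, drop_tol=1e-4)
M = spla.LinearOperator(A.shape, matvec=ilu.solve)

x, info = spla.gmres(A, b, M=M, restart=50, maxiter=500)
residual = np.linalg.norm(A @ x - b)
print("converged" if info == 0 else f"GMRES info = {info}", "residual =", residual)
```

The preconditioner effectively reduces the condition number seen by the Krylov iteration, which is the mechanism behind the reported speed-up.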
Exploring relation types for literature-based discovery.
Preiss, Judita; Stevenson, Mark; Gaizauskas, Robert
2015-09-01
Literature-based discovery (LBD) aims to identify "hidden knowledge" in the medical literature by: (1) analyzing documents to identify pairs of explicitly related concepts (terms), then (2) hypothesizing novel relations between pairs of unrelated concepts that are implicitly related via a shared concept to which both are explicitly related. Many LBD approaches use simple techniques to identify semantically weak relations between concepts, for example, document co-occurrence. These generate huge numbers of hypotheses, difficult for humans to assess. More complex techniques rely on linguistic analysis, for example, shallow parsing, to identify semantically stronger relations. Such approaches generate fewer hypotheses, but may miss hidden knowledge. The authors investigate this trade-off in detail, comparing techniques for identifying related concepts to discover which are most suitable for LBD. A generic LBD system that can utilize a range of relation types was developed. Experiments were carried out comparing a number of techniques for identifying relations. Two approaches were used for evaluation: replication of existing discoveries and the "time slicing" approach (1). Previous LBD discoveries could be replicated using relations based either on document co-occurrence or linguistic analysis. Using relations based on linguistic analysis generated many fewer hypotheses, but a significantly greater proportion of them were candidates for hidden knowledge. The use of linguistic analysis-based relations improves accuracy of LBD without overly damaging coverage. LBD systems often generate huge numbers of hypotheses, which are infeasible to manually review. Improving their accuracy has the potential to make these systems significantly more usable. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
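A toy sketch of the open-discovery step described in (1)-(2) above, using only document co-occurrence (the weakest relation type discussed): intermediate concepts B that co-occur with a start concept A propose candidate targets C that never co-occur with A directly. The concept sets below are invented for illustration:

```python
from collections import defaultdict
from itertools import combinations

# Each "document" is modelled as the set of concepts found in its abstract.
documents = [
    {"fish oil", "blood viscosity"},
    {"blood viscosity", "raynaud disease"},
    {"fish oil", "platelet aggregation"},
    {"platelet aggregation", "raynaud disease"},
    {"raynaud disease", "vasoconstriction"},
]

cooccurs = defaultdict(set)
for doc in documents:
    for a, b in combinations(sorted(doc), 2):
        cooccurs[a].add(b)
        cooccurs[b].add(a)

def open_discovery(a):
    """Candidate hidden links A -> C via shared B terms (co-occurrence ABC model)."""
    b_terms = cooccurs[a]
    candidates = defaultdict(set)
    for b in b_terms:
        for c in cooccurs[b]:
            if c != a and c not in b_terms:      # C is not explicitly linked to A
                candidates[c].add(b)
    return dict(candidates)

print(open_discovery("fish oil"))
# {'raynaud disease': {'blood viscosity', 'platelet aggregation'}}
```

Replacing the co-occurrence test with relations extracted by linguistic analysis shrinks the candidate set, which is the trade-off the study quantifies.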
MIBPB: A software package for electrostatic analysis
Chen, Duan; Chen, Zhan; Chen, Changjun; Geng, Weihua; Wei, Guo-Wei
2010-01-01
The Poisson-Boltzmann equation (PBE) is an established model for the electrostatic analysis of biomolecules. The development of advanced computational techniques for the solution of the PBE has been an important topic in the past two decades. This paper presents a matched interface and boundary (MIB) based PBE software package, the MIBPB solver, for electrostatic analysis. The MIBPB has the unique feature that it is the first interface technique based PBE solver that rigorously enforces the solution and flux continuity conditions at the dielectric interface between the biomolecule and the solvent. For protein molecular surfaces which may possess troublesome geometrical singularities, the MIB scheme makes the MIBPB by far the only existing PBE solver that is able to deliver second order convergence, i.e., the accuracy increases four times when the mesh size is halved. The MIBPB method is also equipped with a Dirichlet-to-Neumann mapping (DNM) technique that builds a Green's function approach to analytically resolve the singular charge distribution in biomolecules in order to obtain reliable solutions at meshes as coarse as 1 Å, while it usually takes other traditional PB solvers 0.25 Å to reach a similar level of reliability. The present work further accelerates the rate of convergence of linear equation systems resulting from the MIBPB by utilizing the Krylov subspace (KS) techniques. Condition numbers of the MIBPB matrices are significantly reduced by using appropriate Krylov subspace solver and preconditioner combinations. Both linear and nonlinear PBE solvers in the MIBPB package are tested by protein-solvent solvation energy calculations and analysis of salt effects on protein-protein binding energies, respectively. PMID:20845420
Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea
2015-09-01
The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs is required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable in certain cases. A solution that is being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and by employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
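A hedged sketch of the surrogate-modeling idea described above (not RAVEN or RELAP-7 code): fit a fast emulator to a modest number of expensive simulation runs, then query the emulator many times to estimate a probability of exceeding a limit. `expensive_simulation` is a hypothetical stand-in for the thermal-hydraulic code, and all numbers are illustrative:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulation(x):
    """Hypothetical stand-in for a long-running simulation: a smooth response
    (e.g. a peak temperature, arbitrary units) of two scenario parameters."""
    return 1000.0 + 150.0 * np.sin(3.0 * x[0]) + 80.0 * x[1] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(0.0, 1.0, size=(30, 2))        # 30 "expensive" runs
y_train = np.array([expensive_simulation(x) for x in X_train])

surrogate = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
surrogate.fit(X_train, y_train)

X_query = rng.uniform(0.0, 1.0, size=(50000, 2))     # cheap surrogate evaluations
mean, std = surrogate.predict(X_query, return_std=True)
print("estimated P(response > 1150):", np.mean(mean > 1150.0))
```

The surrogate answers each query in microseconds, so sampling-based risk metrics can be computed without rerunning the simulation code itself.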
First decadal lunar results from the Moon and Earth Radiation Budget Experiment.
Matthews, Grant
2018-03-01
A need to gain more confidence in computer model predictions of coming climate change has resulted in greater analysis of the quality of orbital Earth radiation budget (ERB) measurements being used today to constrain, validate, and hence improve such simulations. These studies conclude from time series analysis that for around a quarter of a century, no existing satellite ERB climate data record is of a sufficient standard to partition changes to the Earth from those of un-tracked and changing artificial instrumentation effects. This led to the creation of the Moon and Earth Radiation Budget Experiment (MERBE), which instead takes existing decades old climate data to a higher calibration standard using thousands of scans of Earth's Moon. The Terra and Aqua satellite ERB climate records have been completely regenerated using signal-processing improvements, combined with a substantial increase in precision from more comprehensive in-flight spectral characterization techniques. This study now builds on previous Optical Society of America work by describing new Moon measurements derived using accurate analytical mapping of telescope spatial response. That then allows a factor of three reduction in measurement noise along with an order of magnitude increase in the number of retrieved independent lunar results. Given decadal length device longevity and the use of solar and thermal lunar radiance models to normalize the improved ERB results to the International System of Units traceable radiance scale of the "MERBE Watt," the same established environmental time series analysis techniques are applied to MERBE data. They evaluate it to perhaps be of sufficient quality to immediately begin narrowing the largest of climate prediction uncertainties. It also shows that if such Terra/Aqua ERB devices can operate into the 2020s, it could become possible to halve these same uncertainties decades sooner than would be possible with existing or even planned new observing systems.
McNabb, Matthew; Cao, Yu; Devlin, Thomas; Baxter, Blaise; Thornton, Albert
2012-01-01
Mechanical Embolus Removal in Cerebral Ischemia (MERCI) has been supported by medical trials as an improved method of treating ischemic stroke past the safe window of time for administering clot-busting drugs, and was released for medical use in 2004. Analyzing real-world data collected from MERCI clinical trials is key to providing insights into the effectiveness of MERCI. Most of the existing data analysis on MERCI results has thus far employed conventional statistical analysis techniques. To the best of our knowledge, advanced data analytics and data mining techniques have not yet been systematically applied. To address this issue, in this thesis we conduct a comprehensive study on employing state-of-the-art machine learning algorithms to generate prediction criteria for the outcome of MERCI patients. Specifically, we investigate the issue of how to choose the most significant attributes of a data set with limited instance examples. We propose a few search algorithms to identify the significant attributes, followed by a thorough performance analysis for each algorithm. Finally, we apply our proposed approach to the real-world, de-identified patient data provided by Erlanger Southeast Regional Stroke Center, Chattanooga, TN. Our experimental results have demonstrated that our proposed approach performs well.
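A minimal sketch of the kind of greedy attribute-search idea investigated above: forward selection of attributes scored by the cross-validated accuracy of a simple classifier. The thesis's actual search algorithms and the clinical attributes are not reproduced here; the data below are synthetic:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def forward_select(X, y, max_features=5, cv=5):
    """Greedy forward selection: repeatedly add the attribute that most improves
    cross-validated accuracy; stop when no attribute helps."""
    remaining, chosen, best_score = list(range(X.shape[1])), [], 0.0
    while remaining and len(chosen) < max_features:
        scored = []
        for j in remaining:
            clf = LogisticRegression(max_iter=1000)
            acc = cross_val_score(clf, X[:, chosen + [j]], y, cv=cv).mean()
            scored.append((acc, j))
        score, j_best = max(scored)
        if score <= best_score:
            break
        best_score, chosen = score, chosen + [j_best]
        remaining.remove(j_best)
    return chosen, best_score

# Synthetic stand-in: rows = patients, columns = candidate attributes
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 12))
y = (X[:, 2] - 0.8 * X[:, 7] + 0.5 * rng.standard_normal(120) > 0).astype(int)
print(forward_select(X, y))
```

With few instance examples, the cross-validation inside the loop is what guards against selecting attributes that only fit noise.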
NASA Technical Reports Server (NTRS)
Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard
1991-01-01
Current research in high-efficiency, high-performance traveling wave tubes (TWT's) has led to the development of novel thermal/mechanical computer models for use with helical slow-wave structures. A three-dimensional, finite element computer model and analytical technique were used to study the structural integrity and thermal operation of a high-efficiency, diamond-rod, K-band TWT designed for use in advanced space communications systems. This analysis focused on the slow-wave circuit in the radiofrequency section of the TWT, where an inherent localized heating problem existed and where failures were observed during an earlier cold compression, or 'coining', fabrication technique that shows great potential for future TWT development efforts. For this analysis, a three-dimensional, finite element model was used along with MARC, a commercially available finite element code, to simulate the fabrication of a diamond-rod TWT. This analysis was conducted by using component and material specifications consistent with actual TWT fabrication and was verified against empirical data. The analysis is nonlinear owing to material plasticity introduced by the forming process and also to geometric nonlinearities presented by the component assembly configuration. The computer model was developed by using the high efficiency, K-band TWT design but is general enough to permit similar analyses to be performed on a wide variety of TWT designs and styles. The results of the TWT operating condition and structural failure mode analysis, as well as a comparison of analytical results to test data, are presented.
NASA Technical Reports Server (NTRS)
Shalkhauser, Kurt A.; Bartos, Karen F.; Fite, E. B.; Sharp, G. R.
1992-01-01
Current research in high-efficiency, high-performance traveling wave tubes (TWT's) has led to the development of novel thermal/mechanical computer models for use with helical slow-wave structures. A three-dimensional, finite element computer model and analytical technique were used to study the structural integrity and thermal operation of a high-efficiency, diamond-rod, K-band TWT designed for use in advanced space communications systems. This analysis focused on the slow-wave circuit in the radiofrequency section of the TWT, where an inherent localized heating problem existed and where failures were observed during an earlier cold compression, or 'coining', fabrication technique that shows great potential for future TWT development efforts. For this analysis, a three-dimensional, finite element model was used along with MARC, a commercially available finite element code, to simulate the fabrication of a diamond-rod TWT. This analysis was conducted by using component and material specifications consistent with actual TWT fabrication and was verified against empirical data. The analysis is nonlinear owing to material plasticity introduced by the forming process and also to geometric nonlinearities presented by the component assembly configuration. The computer model was developed by using the high efficiency, K-band TWT design but is general enough to permit similar analyses to be performed on a wide variety of TWT designs and styles. The results of the TWT operating condition and structural failure mode analysis, as well as a comparison of analytical results to test data, are presented.
This technical guidance document is designed to aid in the selection, design, installation and operation of indoor radon reduction techniques using soil depressurization in existing houses. Its emphasis is on active soil depressurization; i.e., on systems that use a fan to depre...
Dahlberg, Jerry; Tkacik, Peter T; Mullany, Brigid; Fleischhauer, Eric; Shahinian, Hossein; Azimi, Farzad; Navare, Jayesh; Owen, Spencer; Bisel, Tucker; Martin, Tony; Sholar, Jodie; Keanini, Russell G
2017-12-04
An analog, macroscopic method for studying molecular-scale hydrodynamic processes in dense gases and liquids is described. The technique applies a standard fluid dynamic diagnostic, particle image velocimetry (PIV), to measure: i) velocities of individual particles (grains), extant on short, grain-collision time-scales, ii) velocities of systems of particles, on both short collision-time- and long, continuum-flow-time-scales, iii) collective hydrodynamic modes known to exist in dense molecular fluids, and iv) short- and long-time-scale velocity autocorrelation functions, central to understanding particle-scale dynamics in strongly interacting, dense fluid systems. The basic system is composed of an imaging system, a light source, vibrational sensors, a vibrational system with a known medium, and PIV and analysis software. Required experimental measurements and an outline of the theoretical tools needed when using the analog technique to study molecular-scale hydrodynamic processes are highlighted.
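A short sketch of how a velocity autocorrelation function of the kind listed in (iv) can be computed from PIV-tracked grain velocities. The velocity series below is synthetic; in practice it would come from the PIV analysis of successive image frames:

```python
import numpy as np

def velocity_autocorrelation(v, max_lag):
    """Normalized VACF  C(tau) = <v(t).v(t+tau)> / <v(t).v(t)>
    for an (n_frames, n_particles, 2) array of in-plane velocities."""
    v = np.asarray(v, float)
    denom = np.mean(np.sum(v * v, axis=-1))
    c = np.empty(max_lag)
    for tau in range(max_lag):
        dots = np.sum(v[: v.shape[0] - tau] * v[tau:], axis=-1)   # v(t).v(t+tau)
        c[tau] = np.mean(dots) / denom
    return c

# Synthetic stand-in: 500 frames, 200 grains, exponentially decorrelating velocities
rng = np.random.default_rng(2)
v = np.empty((500, 200, 2))
v[0] = rng.standard_normal((200, 2))
for t in range(1, 500):
    v[t] = 0.9 * v[t - 1] + np.sqrt(1.0 - 0.9**2) * rng.standard_normal((200, 2))

vacf = velocity_autocorrelation(v, max_lag=50)
print(vacf[:5])   # decays roughly like 0.9**tau for this synthetic series
```

Short-lag values of C(tau) probe collision-scale dynamics, while the long-lag tail carries the continuum, collective behaviour the method is designed to expose.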
Context-Aware Adaptive Hybrid Semantic Relatedness in Biomedical Science
NASA Astrophysics Data System (ADS)
Emadzadeh, Ehsan
Text mining of biomedical literature and clinical notes is a very active field of research in biomedical science. Semantic analysis is one of the core modules for different Natural Language Processing (NLP) solutions. Methods for calculating the semantic relatedness of two concepts can be very useful in solutions to problems such as relationship extraction, ontology creation, and question answering [1--6]. Several techniques exist for calculating the semantic relatedness of two concepts, utilizing different knowledge sources and corpora. So far, researchers have attempted to find the best hybrid method for each domain by manually combining semantic relatedness techniques and data sources. In this work, attempts were made to eliminate the need for manually combining semantic relatedness methods for each new context or resource by proposing an automated method that attempts to find the combination of semantic relatedness techniques and resources achieving the best semantic relatedness score in every context. This may help the research community find the best hybrid method for each context, given the available algorithms and resources.
Ruggeri, Andrea Gennaro; Cappelletti, Martina; Fazzolari, Benedetta; Marotta, Nicola; Delfini, Roberto
2016-04-01
Traditionally, the surgical removal of tuberculum sellae meningioma (TSM) and olfactory groove meningioma (OGM) requires transcranial approaches and microsurgical techniques, but in the last decade endoscopic expanded endonasal approaches have been introduced: transcribriform for OGMs and transtuberculum-transplanum for TSM. A comparative analysis of the literature concerning the two types of surgical treatment of OGMs and TSM is, however, difficult. We conducted a literature search using the PubMed database to compare data for endoscopic and microsurgical techniques in the literature. We also conducted a retrospective analysis of selected cases from our series presenting favorable characteristics for an endoscopic approach, based on the criteria of operability of these lesions as generally accepted in the literature, and we compared the results obtained in these patients with those in the endoscopic literature. We believe that, when the sample is made more homogeneous, the difference between the microsurgical technique and the endoscopic technique is no longer so striking. A greater radical removal rate, a reduced incidence of cerebrospinal fluid fistula and, especially, the possibility of removing lesions of any size are advantages of transcranial surgery; a higher percentage of improvement in visual outcome and a lower risk of a worsening of a pre-existing deficit or onset of a new deficit are advantages of the endoscopic technique. At present, the microsurgical technique is still the gold standard for the removal of anterior cranial fossa meningiomas of all sizes, and the endoscopic technique remains a second option in certain cases. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Maule, J.; Wainwright, N.; Steele, A.; Gunter, D.; Flores, G.; Effinger, M.; Danibm, N.; Wells, M.; Williams, S.; Morris, H.
2008-01-01
Microorganisms within the space stations Salyut and Mir and the International Space Station (ISS) have traditionally been monitored with culture-based techniques. These techniques involve growing environmental samples (cabin water, air or surfaces) on agar-type media for several days, followed by visualization of the resulting colonies, and return of samples to Earth for ground-based analysis. This approach has provided a wealth of useful data and enhanced our understanding of the microbial ecology within space stations. However, the approach is also limited by the following: i) More than 95% of microorganisms in the environment cannot grow on conventional growth media; ii) Significant time lags occur between onboard sampling and colony visualization (3-5 days) and ground-based analysis (as long as several months); iii) Colonies are often difficult to visualize due to condensation within contact slide media plates; and iv) Techniques involve growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and β-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. This technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device. This handheld device and sampling system is known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). A poster will be presented that describes a comparative study between LOCAD-PTS analysis and existing culture-based methods onboard the ISS, together with an exploratory survey of surface endotoxin throughout the ISS. It is concluded that, while a general correlation between LOCAD-PTS and traditional culture-based methods should not necessarily be expected, a combinatorial approach can be adopted in which both sets of data are used together to generate a more complete picture of the microbial ecology on the ISS.
NASA Astrophysics Data System (ADS)
Martel, Anne L.
2004-04-01
In order to extract quantitative information from dynamic contrast-enhanced MR images (DCE-MRI) it is usually necessary to identify an arterial input function. This is not a trivial problem if there are no major vessels present in the field of view. Most existing techniques rely on operator intervention or use various curve parameters to identify suitable pixels, but these are often specific to the anatomical region or the acquisition method used. They also require the signal from several pixels to be averaged in order to improve the signal-to-noise ratio; however, this introduces errors due to partial volume effects. We have described previously how factor analysis can be used to automatically separate arterial and venous components from DCE-MRI studies of the brain. Although that method works well for single-slice images through the brain when the blood brain barrier is intact, it runs into problems for multi-slice images with more complex dynamics. This paper will describe a factor analysis method that is more robust in such situations and is relatively insensitive to the number of physiological components present in the data set. The technique is very similar to that used to identify spectral end-members from multispectral remote sensing images.
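The paper's specific factor analysis algorithm is not reproduced here, but the general idea of separating arterial and venous components from per-pixel signal-time curves can be illustrated with a generic non-negative matrix factorization. The sketch below applies scikit-learn's NMF to synthetic curves; the curve shapes, component count and the use of NMF rather than the authors' method are all assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

# X: matrix of DCE-MRI signal-time curves, one row per pixel, one column per time point.
# Synthetic stand-in: mixtures of an "arterial" (early, sharp) and a "venous" (late, broad) curve.
t = np.linspace(0, 60, 40)
arterial = np.exp(-((t - 10) ** 2) / 20)
venous = np.exp(-((t - 25) ** 2) / 120)
rng = np.random.default_rng(0)
weights = rng.uniform(0, 1, size=(2000, 2))
X = weights @ np.vstack([arterial, venous]) + rng.normal(0, 0.01, size=(2000, 40))
X = np.clip(X, 0, None)            # factor curves and loadings are assumed non-negative

model = NMF(n_components=2, init="nndsvda", max_iter=1000)
loadings = model.fit_transform(X)  # per-pixel contribution of each physiological factor
factors = model.components_        # estimated arterial / venous time curves
print(factors.shape, loadings.shape)
```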
NASA Technical Reports Server (NTRS)
Oswald, Hayden; Molthan, Andrew L.
2011-01-01
Satellite remote sensing has gained widespread use in the field of operational meteorology. Although raw satellite imagery is useful, several techniques exist which can convey multiple types of data in a more efficient way. One of these techniques is multispectral compositing. The NASA Short-term Prediction Research and Transition (SPoRT) Center has developed two multispectral satellite imagery products which utilize data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Terra and Aqua satellites, based upon products currently generated and used by the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT). The nighttime microphysics product allows users to identify clouds occurring at different altitudes, but emphasizes fog and low cloud detection. This product improves upon current spectral difference and single-channel infrared techniques. Each of the current products has its own set of advantages for nocturnal fog detection, but each also has limiting drawbacks which can hamper the analysis process. The multispectral product combines each current product with a third channel difference. Since the final image is enhanced with color, it simplifies the fog identification process. Analysis has shown that the nighttime microphysics imagery product represents a substantial improvement over conventional fog detection techniques, as well as providing a preview of future satellite capabilities to forecasters.
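A nighttime-microphysics-style composite is built by assigning channel differences and a single infrared channel to the red, green and blue display planes. The sketch below follows the channel assignments and value ranges commonly cited for the EUMETSAT recipe; those ranges, and the synthetic brightness temperatures, should be treated as assumptions rather than the exact SPoRT implementation.

```python
import numpy as np

def scale(band, lo, hi):
    """Linearly rescale a brightness-temperature field into [0, 1] for display."""
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

def nighttime_microphysics_rgb(bt_039, bt_108, bt_120):
    """Build a nighttime-microphysics-style composite from three IR brightness
    temperatures (Kelvin). Channel assignments and ranges are assumed, not quoted."""
    r = scale(bt_120 - bt_108, -4.0, 2.0)   # thin cirrus vs. opaque cloud
    g = scale(bt_108 - bt_039, 0.0, 10.0)   # fog / low water cloud signal at night
    b = scale(bt_108, 243.0, 293.0)         # surface / cloud-top temperature
    return np.dstack([r, g, b])

# Toy 2x2 scene
rgb = nighttime_microphysics_rgb(np.full((2, 2), 270.0),
                                 np.full((2, 2), 275.0),
                                 np.full((2, 2), 273.0))
print(rgb.shape)
```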
Logistic regression for risk factor modelling in stuttering research.
Reed, Phil; Wu, Yaqiong
2013-06-01
To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
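For readers who want to see the mechanics, the sketch below fits a logistic regression to hypothetical persistence/recovery data using statsmodels; the predictors, coefficients and data are invented for illustration and are not taken from the article.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: each row is a child, outcome 1 = stuttering persisted, 0 = recovered.
rng = np.random.default_rng(0)
n = 200
age_at_onset = rng.normal(3.5, 1.0, n)          # years
family_history = rng.binomial(1, 0.3, n)        # 1 = positive family history
sex = rng.binomial(1, 0.5, n)                   # 1 = male
logit = -2.0 + 0.4 * age_at_onset + 1.0 * family_history + 0.5 * sex
persisted = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age_at_onset, family_history, sex]))
model = sm.Logit(persisted, X).fit(disp=False)
print(model.summary(xname=["intercept", "age_at_onset", "family_history", "sex"]))
print("Odds ratios:", np.exp(model.params[1:]))   # risk factor effect sizes
```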
Spreadsheet WATERSHED modeling for nonpoint-source pollution management in a Wisconsin basin
Walker, J.F.; Pickard, S.A.; Sonzogni, W.C.
1989-01-01
Although several sophisticated nonpoint pollution models exist, few are available that are easy to use, cover a variety of conditions, and integrate a wide range of information to allow managers and planners to assess different control strategies. Here, a straightforward pollutant input accounting approach is presented in the form of an existing model (WATERSHED) that has been adapted to run on modern electronic spreadsheets. As an application, WATERSHED is used to assess options to improve the quality of highly eutrophic Delavan Lake in Wisconsin. WATERSHED is flexible in that several techniques, such as the Universal Soil Loss Equation or unit-area loadings, can be used to estimate nonpoint-source inputs. Once the model parameters are determined (and calibrated, if possible), the spreadsheet features can be used to conduct a sensitivity analysis of management options. In the case of Delavan Lake, it was concluded that, although some nonpoint controls were cost-effective, the overall reduction in phosphorus would be insufficient to measurably improve water quality.
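The unit-area (export coefficient) loading approach mentioned above is simple enough to sketch directly; the land-use categories and coefficients below are illustrative values only, not the Delavan Lake inputs.

```python
# Minimal sketch of the unit-area loading calculation a spreadsheet model such as
# WATERSHED can use to estimate nonpoint phosphorus inputs by land use.
land_use = {
    #  name:        (area_ha, kg P per ha per year)  -- illustrative values
    "row_crops":    (5200.0, 1.0),
    "pasture":      (3100.0, 0.4),
    "urban":        (900.0,  1.2),
    "forest":       (2400.0, 0.1),
}

loads = {name: area * coeff for name, (area, coeff) in land_use.items()}
total = sum(loads.values())
for name, load in loads.items():
    print(f"{name:10s} {load:8.1f} kg P/yr ({100 * load / total:4.1f}%)")
print(f"{'TOTAL':10s} {total:8.1f} kg P/yr")
```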
White, Claire E; Provis, John L; Proffen, Thomas; Riley, Daniel P; van Deventer, Jannie S J
2010-04-07
Understanding the atomic structure of complex metastable (including glassy) materials is of great importance in research and industry; however, such materials resist solution by most standard techniques. Here, a novel technique combining thermodynamics and local structure is presented to solve the structure of the metastable aluminosilicate material metakaolin (calcined kaolinite) without the use of chemical constraints. The structure is elucidated by iterating between least-squares real-space refinement using neutron pair distribution function data, and geometry optimisation using density functional modelling. The resulting structural representation is both energetically feasible and in excellent agreement with experimental data. This accurate structural representation of metakaolin provides new insight into the local environment of the aluminium atoms, with evidence of the existence of tri-coordinated aluminium. The availability of this detailed, chemically feasible atomic description, obtained without artificially imposing constraints during the refinement process, opens the opportunity to tailor chemical and mechanical processes involving metakaolin and other complex metastable materials at the atomic level to obtain optimal performance at the macro-scale.
Variable Selection in the Presence of Missing Data: Imputation-based Methods.
Zhao, Yize; Long, Qi
2017-01-01
Variable selection plays an essential role in regression analysis as it identifies important variables that are associated with outcomes and is known to improve the predictive accuracy of resulting models. Variable selection methods have been widely investigated for fully observed data. However, in the presence of missing data, methods for variable selection need to be carefully designed to account for missing data mechanisms and the statistical techniques used for handling missing data. Since imputation is arguably the most popular method for handling missing data due to its ease of use, statistical methods for variable selection that are combined with imputation are of particular interest. These methods, valid under the assumptions of missing at random (MAR) and missing completely at random (MCAR), largely fall into three general strategies. The first strategy applies existing variable selection methods to each imputed dataset and then combines variable selection results across all imputed datasets. The second strategy applies existing variable selection methods to stacked imputed datasets. The third strategy combines resampling techniques such as the bootstrap with imputation. Despite recent advances, this area remains under-developed and offers fertile ground for further research.
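A minimal sketch of the first strategy (select within each imputed dataset, then combine) is shown below, using a lasso as the per-dataset selector and a simple selection-frequency rule for combining; the selector, the threshold and the synthetic data are assumptions, not the specific methods reviewed in the article.

```python
import numpy as np
from sklearn.linear_model import LassoCV

def select_across_imputations(imputed_datasets, y, threshold=0.5):
    """Run a lasso on each imputed dataset and keep the variables selected in at
    least `threshold` of the imputations.

    imputed_datasets: list of (n, p) arrays, each a completed copy of the data.
    """
    p = imputed_datasets[0].shape[1]
    counts = np.zeros(p)
    for X in imputed_datasets:
        coef = LassoCV(cv=5).fit(X, y).coef_
        counts += (np.abs(coef) > 1e-8)
    return np.where(counts / len(imputed_datasets) >= threshold)[0]

# Toy example: three "imputed" copies of data where only the first two predictors matter
rng = np.random.default_rng(0)
X_true = rng.normal(size=(150, 10))
y = 2 * X_true[:, 0] - 1.5 * X_true[:, 1] + rng.normal(0, 1, 150)
imputations = [X_true + rng.normal(0, 0.05, X_true.shape) for _ in range(3)]
print("Selected variables:", select_across_imputations(imputations, y))
```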
Liley, James; Wallace, Chris
2015-02-01
Genome-wide association studies (GWAS) have been successful in identifying single nucleotide polymorphisms (SNPs) associated with many traits and diseases. However, at existing sample sizes, these variants explain only part of the estimated heritability. Leverage of GWAS results from related phenotypes may improve detection without the need for larger datasets. The Bayesian conditional false discovery rate (cFDR) constitutes an upper bound on the expected false discovery rate (FDR) across a set of SNPs whose p values for two diseases are both less than two disease-specific thresholds. Calculation of the cFDR requires only summary statistics and has several advantages over traditional GWAS analysis. However, existing methods require distinct control samples between studies. Here, we extend the technique to allow for some or all controls to be shared, increasing applicability. Several different SNP sets can be defined with the same cFDR value, and we show that the expected FDR across the union of these sets may exceed the expected FDR in any single set. We describe a procedure to establish an upper bound for the expected FDR among the union of such sets of SNPs. We apply our technique to pairwise analysis of p values from ten autoimmune diseases with variable sharing of controls, enabling discovery of 59 SNP-disease associations which do not reach GWAS significance after genomic control in individual datasets. Most of the SNPs we highlight have previously been confirmed using replication studies or larger GWAS, a useful validation of our technique; we report eight SNP-disease associations across five diseases not previously reported. Our technique extends and strengthens the previous algorithm, and establishes robust limits on the expected FDR. This approach can improve SNP detection in GWAS, and give insight into shared aetiology between phenotypically related conditions.
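The basic empirical cFDR estimator (before the shared-control extension developed in the paper) can be written down in a few lines. The sketch below follows the standard empirical formulation and is a deliberately simplified illustration, not the authors' full method.

```python
import numpy as np

def empirical_cfdr(p1, p2):
    """Empirical conditional FDR estimate for each SNP: an upper bound on
    Pr(SNP null for trait 1 | p1 <= observed, p2 <= observed), estimated as
    p1 * #{p2_j <= p2_i} / #{p1_j <= p1_i and p2_j <= p2_i}.
    No adjustment for shared controls is included here."""
    p1, p2 = np.asarray(p1), np.asarray(p2)
    cfdr = np.empty_like(p1)
    for i in range(len(p1)):
        denom = np.sum((p1 <= p1[i]) & (p2 <= p2[i]))
        cfdr[i] = p1[i] * np.sum(p2 <= p2[i]) / max(denom, 1)
    return np.minimum(cfdr, 1.0)

# Toy example: 10,000 SNPs, a handful associated with both traits
rng = np.random.default_rng(0)
p1, p2 = rng.uniform(size=10000), rng.uniform(size=10000)
p1[:20] = rng.uniform(0, 1e-5, 20)   # shared signals
p2[:20] = rng.uniform(0, 1e-4, 20)
print(np.sort(empirical_cfdr(p1, p2))[:5])
```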
Regular Topologies for Gigabit Wide-Area Networks. Volume 1
NASA Technical Reports Server (NTRS)
Shacham, Nachum; Denny, Barbara A.; Lee, Diane S.; Khan, Irfan H.; Lee, Danny Y. C.; McKenney, Paul
1994-01-01
In general terms, this project aimed at the analysis and design of techniques for very high-speed networking. The formal objectives of the project were to: (1) Identify switch and network technologies for wide-area networks that interconnect a large number of users and can provide individual data paths at gigabit/s rates; (2) Quantitatively evaluate and compare existing and proposed architectures and protocols, identify their strength and growth potentials, and ascertain the compatibility of competing technologies; and (3) Propose new approaches to existing architectures and protocols, and identify opportunities for research to overcome deficiencies and enhance performance. The project was organized into two parts: 1. The design, analysis, and specification of techniques and protocols for very-high-speed network environments. In this part, SRI has focused on several key high-speed networking areas, including Forward Error Control (FEC) for high-speed networks in which data distortion is the result of packet loss, and the distribution of broadband, real-time traffic in multiple user sessions. 2. Congestion Avoidance Testbed Experiment (CATE). This part of the project was done within the framework of the DARTnet experimental T1 national network. The aim of the work was to advance the state of the art in benchmarking DARTnet's performance and traffic control by developing support tools for network experimentation, by designing benchmarks that allow various algorithms to be meaningfully compared, and by investigating new queueing techniques that better satisfy the needs of best-effort and reserved-resource traffic. This document is the final technical report describing the results obtained by SRI under this project. The report consists of three volumes: Volume 1 contains a technical description of the network techniques developed by SRI in the areas of FEC and multicast of real-time traffic. Volume 2 describes the work performed under CATE. Volume 3 contains the source code of all software developed under CATE.
NASA Astrophysics Data System (ADS)
Abdelguerfi, Mahdi; Wynne, Chris; Cooper, Edgar; Ladner, Roy V.; Shaw, Kevin B.
1997-08-01
Three-dimensional terrain representation plays an important role in a number of terrain database applications. Hierarchical triangulated irregular networks (TINs) provide a variable-resolution terrain representation that is based on a nested triangulation of the terrain. This paper compares and analyzes existing hierarchical triangulation techniques. The comparative analysis takes into account how aesthetically appealing and accurate the resulting terrain representation is. Parameters such as adjacency, slivers, and streaks are used to provide a measure of how aesthetically appealing the terrain representation is. Slivers occur when the triangulation produces thin, sliver-like triangles. Streaks appear when too many triangulations are performed at a given vertex. Simple mathematical expressions are derived for these parameters, thereby providing a fairer and more easily duplicated comparison. In addition to meeting the adjacency requirement, an aesthetically pleasing hierarchical TIN generation algorithm is expected to reduce both slivers and streaks while maintaining accuracy. A comparative analysis of a number of existing approaches shows that a variant of a method originally proposed by Scarlatos exhibits better overall performance.
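The paper derives its own expressions for these parameters. As a rough illustration of the kind of measure involved, the sketch below computes a simple sliver ratio (longest edge over smallest height) and a per-vertex triangle count as a streak indicator; both formulations are assumptions chosen for clarity, not the expressions derived in the paper.

```python
import numpy as np

def sliver_measure(a, b, c):
    """One plausible sliver measure for a 2D triangle with vertices a, b, c:
    the ratio of the longest edge to the triangle's smallest height.
    Values near 1-2 indicate well-shaped triangles; large values indicate slivers."""
    a, b, c = map(np.asarray, (a, b, c))
    edges = [np.linalg.norm(b - a), np.linalg.norm(c - b), np.linalg.norm(a - c)]
    area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))
    smallest_height = 2.0 * area / max(edges)
    return max(edges) / smallest_height

def streak_counts(triangles, n_vertices):
    """Streak indicator: number of triangles incident on each vertex.
    `triangles` is an (m, 3) array of vertex indices."""
    counts = np.zeros(n_vertices, dtype=int)
    for tri in triangles:
        counts[list(tri)] += 1
    return counts

print(sliver_measure([0, 0], [1, 0], [0.5, 0.9]))    # well shaped: small ratio
print(sliver_measure([0, 0], [1, 0], [0.5, 0.01]))   # sliver: large ratio
```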
Carvalho, Margarida Lima; Costa Silva, Guilherme José da; Melo, Silvana; Ashikaga, Fernando Yuldi; Shimabukuro-Dias, Cristiane Kioko; Scacchetti, Priscilla Cardim; Devidé, Renato; Foresti, Fausto; Oliveira, Claudio
2018-01-31
The combination of cytogenetic and molecular data with those traditionally obtained in areas like systematics and taxonomy has created interesting perspectives for the analysis of natural populations under different aspects. In this context, this study aimed to evaluate the genetic differentiation among populations of the genus Hemiodontichthys Bleeker, 1862, through combined genetic techniques, and included the analysis of populations sampled in the Araguaia River, Guamá River, Madeira River and two populations from the Purus River. Hemiodontichthys samples from the two localities in the Purus River were also karyotyped in order to address the degree of chromosomal variation between populations. Through GMYC analysis of the COI tree, the patterns of genetic variation among local populations proved to be higher than those found among distinct species from other genera of the subfamily Loricariinae, suggesting the probable existence of four cryptic species in this genus. The possible existence of a species complex in the genus is corroborated by the different cytogenetic patterns between Hemiodontichthys sp. 1 and sp. 2, highlighting the need for a thorough taxonomic review of the group.
Lightweight and Statistical Techniques for Petascale Debugging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extend previous work on the Stack Trace Analysis Tool (STAT), that has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office Of Science application developers relied either on primitive manual debugging techniques based on printf or they use tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale and substantial effort and computation cycles are wasted in either reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduced the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that efficiently work on the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces, application-specific classification parameters, such as global variables, statistical data acquisition techniques and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories, new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability, and Dyninst binary analysis and instrumentation toolkits.
3D thermography imaging standardization technique for inflammation diagnosis
NASA Astrophysics Data System (ADS)
Ju, Xiangyang; Nebel, Jean-Christophe; Siebert, J. Paul
2005-01-01
We develop a 3D thermography imaging standardization technique to allow quantitative data analysis. Medical digital infrared thermal imaging is a very sensitive and reliable means of graphically mapping and displaying skin surface temperature. It allows doctors to visualise in colour and quantify temperature changes in the skin surface. The spectrum of colours indicates both hot and cold responses which may co-exist if the pain associated with an inflammatory focus excites an increase in sympathetic activity. However, because thermography provides only qualitative diagnostic information, it has not gained acceptance in the medical and veterinary communities as a necessary or effective tool in inflammation and tumor detection. Here, our technique is based on the combination of a visual 3D imaging technique and a thermal imaging technique, which maps the 2D thermography images onto a 3D anatomical model. We then rectify the 3D thermogram into a view-independent thermogram and conform it to a standard shape template. The combination of these imaging facilities allows the generation of combined 3D and thermal data from which thermal signatures can be quantified.
Hunter, N J R; Wilson, C J L; Luzin, V
2017-02-01
Three techniques are used to measure crystallographic preferred orientations (CPO) in a naturally deformed quartz mylonite: transmitted light cross-polarized microscopy using an automated fabric analyser, electron backscatter diffraction (EBSD) and neutron diffraction. Pole figure densities attributable to crystal-plastic deformation are variably recognizable across the techniques, particularly between fabric analyser and diffraction instruments. Although fabric analyser techniques offer rapid acquisition with minimal sample preparation, difficulties may exist when gathering orientation data parallel with the incident beam. Overall, we have found that EBSD and fabric analyser techniques are best suited for studying CPO distributions at the grain scale, where individual orientations can be linked to their source grain or nearest neighbours. Neutron diffraction serves as the best qualitative and quantitative means of estimating the bulk CPO, due to its three-dimensional data acquisition, greater sample area coverage, and larger sample size. However, a number of sampling methods can be applied to FA and EBSD data to make similar approximations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
Texture functions in image analysis: A computationally efficient solution
NASA Technical Reports Server (NTRS)
Cox, S. C.; Rose, J. F.
1983-01-01
A computationally efficient means for calculating texture measurements from digital images by use of the co-occurrence technique is presented. The calculation of the statistical descriptors of image texture and a solution that circumvents the need for calculating and storing a co-occurrence matrix are discussed. The results show that existing efficient algorithms for calculating sums, sums of squares, and cross products can be used to compute complex co-occurrence relationships directly from the digital image input.
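The abstract's point is that the co-occurrence statistics themselves can be computed from sums, sums of squares and cross products without materialising a co-occurrence matrix. The sketch below illustrates that idea for two Haralick-style measures (contrast and correlation) at a single displacement; the choice of measures and the restriction to non-negative offsets are simplifications made here, not necessarily how the original program was organised.

```python
import numpy as np

def cooccurrence_stats(img, dr, dc):
    """Compute co-occurrence-based texture statistics for displacement (dr, dc)
    directly from pixel pairs, without forming a co-occurrence matrix.
    Assumes non-negative offsets; returns (contrast, correlation)."""
    img = np.asarray(img, dtype=float)
    rows, cols = img.shape
    first = img[:rows - dr, :cols - dc].ravel()    # pixel i
    second = img[dr:, dc:].ravel()                 # its neighbour at (dr, dc)

    # sums, sums of squares and cross products are all that is needed
    mu1, mu2 = first.mean(), second.mean()
    var1, var2 = first.var(), second.var()
    cross = np.mean(first * second)

    contrast = np.mean((first - second) ** 2)
    correlation = (cross - mu1 * mu2) / np.sqrt(var1 * var2)
    return contrast, correlation

img = np.random.default_rng(0).integers(0, 256, size=(128, 128))
print(cooccurrence_stats(img, dr=0, dc=1))
```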
Analysis of space tug operating techniques (study 2.4). Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1972-01-01
The costs of tug refurbishment were studied, using existing cost estimating relationships, to establish the cost of maintaining the reusable third stage of the space transportation system. Refurbishment operations sheets which describe the actual tasks that are necessary to keep the equipment functioning properly were used along with refurbishment operations sheets which contain all of the pertinent descriptive information for each of the major vehicle areas. Tug refurbishment costs per mission are tabulated.
The use of remote sensing in solving Florida's geological and coastal engineering problems
NASA Technical Reports Server (NTRS)
Brooks, H. K.; Ruth, B. E.; Wang, Y. H.; Ferguson, R. L.
1977-01-01
LANDSAT imagery and NASA high altitude color infrared (CIR) photography were used to select suitable sites for sanitary landfill in Volusia County, Florida and to develop techniques for preventing sand deposits in the Clearwater inlet. Activities described include the acquisition of imagery, its analysis by the IMAGE 100 system, conventional photointerpretation, evaluation of existing data sources (vegetation, soil, and ground water maps), site investigations for ground truth, and preparation of displays for reports.
An evaluation of the use of ERTS-1 satellite imagery for grizzly bear habitat analysis. [Montana
NASA Technical Reports Server (NTRS)
Varney, J. R.; Craighead, J. J.; Sumner, J. S.
1974-01-01
Improved classification and mapping of grizzly habitat will permit better estimates of population density and distribution, and allow accurate evaluation of the potential effects of changes in land use, hunting regulation, and management policies on existing populations. Methods of identifying favorable habitat from ERTS-1 multispectral scanner imagery were investigated and described. This technique could reduce the time and effort required to classify large wilderness areas in the Western United States.
2010-01-01
UAV Autonomy program, which includes intelligent reasoning for autonomy, technologies to enhance see-and-avoid capabilities, and object identification ... along the ship's base recovery course (BRC). The pilot then flies toward the stern of the ship, aligning his approach path with the ship's lineup line ... quiescent point identification. CONCLUSIONS: The primary goal for conducting dynamic interface analysis is to expand existing operating envelopes and
Treble, Ronald G; Johnson, Keith E; Xiao, Li; Thompson, Thomas S
2002-07-01
An existing gas chromatograph/mass spectrometer (GC/MS) can be used to analyze gas and liquid fractions from the same system within a few minutes. The technique was applied to (a) separate and identify the gaseous components of the products of cracking an alkane, (b) measure trace levels of acetone in ethyl acetate, (c) determine the relative partial pressures over a binary mixture, and (d) identify nine unknown compounds for the purpose of disposal.
Automated Techniques for Rapid Analysis of Momentum Exchange Devices
2013-12-01
... [1, pg. 12]. It allows a vector expressed in one frame to be differentiated with respect to another frame, via the kinematic transport theorem: ᴺdv/dt = ᴬdv/dt + ᴺωᴬ × v.
A note on the effects of viscosity on the stability of a trailing-line vortex
NASA Technical Reports Server (NTRS)
Duck, Peter W.; Khorrami, Mehdi R.
1992-01-01
The linear stability of the Batchelor (1964) vortex is examined with emphasis on new viscous modes recently found numerically by Khorrami (1991). Unlike the previously reported inviscid modes of instability, these modes are destabilized by viscosity and exhibit small growth rates at large Reynolds numbers. The analysis presented here uses a combination of asymptotic and numerical techniques. The results confirm the existence of the additional modes of instability due to viscosity.
Economics of polysilicon processes
NASA Technical Reports Server (NTRS)
Yaws, C. L.; Li, K. Y.; Chou, S. M.
1986-01-01
Techniques are being developed to provide lower cost polysilicon material for solar cells. Existing technology which normally provides semiconductor industry polysilicon material is undergoing changes and also being used to provide polysilicon material for solar cells. Economics of new and existing technologies are presented for producing polysilicon. The economics are primarily based on the preliminary process design of a plant producing 1,000 metric tons/year of silicon. The polysilicon processes include: the Siemens process (hydrogen reduction of trichlorosilane); the Union Carbide process (silane decomposition); and the Hemlock Semiconductor process (hydrogen reduction of dichlorosilane). The economics include cost estimates of capital investment and product cost to produce polysilicon via each technology. Sensitivity analysis results are also presented to disclose the effect of major parameters such as utilities, labor, raw materials and capital investment.
Characterizing Oscillatory Bursts in Single-Trial EEG Data
NASA Technical Reports Server (NTRS)
Knuth, K. H.; Shah, A. S.; Lakatos, P.; Schroeder, C. E.
2004-01-01
Oscillatory bursts in numerous bands ranging from low (theta) to high frequencies (e.g., gamma) undoubtedly play an important role in cortical dynamics. Largely because of the inadequacy of existing analytic techniques, however, oscillatory bursts and their role in cortical processing remain poorly understood. To study oscillatory bursts effectively one must be able to isolate them and characterize them in the single trial. We describe a series of straightforward analysis techniques that produce useful indices of burst characteristics. First, stimulus-evoked responses are estimated using Differentially Variable Component Analysis (dVCA), and are subtracted from the single trial. The single-trial characteristics of the evoked responses are stored to identify possible correlations with burst activity. Time-frequency (T-F), or wavelet, analyses are then applied to the single-trial residuals. While T-F plots have been used in recent studies to identify and isolate bursts, we go further by fitting each burst in the T-F plot with a two-dimensional Gaussian. This provides a set of burst characteristics, such as center time, burst duration, center frequency, frequency dispersion, and amplitude, all of which contribute to the accurate characterization of the individual burst. The burst phase can also be estimated. Burst characteristics can be quantified with several standard techniques (e.g., histogramming and clustering), as well as Bayesian techniques (e.g., blocking), to allow a more parametric description of the characteristics of oscillatory bursts and the relationships of specific parameters to cortical excitability and stimulus integration.
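The 2D Gaussian fitting step can be illustrated compactly. The sketch below fits a single burst in a synthetic time-frequency patch with scipy's curve_fit and reports the burst parameters named above; the synthetic patch, starting values and parameterisation are assumptions for illustration, not the dVCA pipeline itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, t0, f0, sigma_t, sigma_f):
    """Two-dimensional Gaussian over (time, frequency): amplitude, center time,
    center frequency, duration (sigma_t) and frequency dispersion (sigma_f)."""
    t, f = coords
    return amp * np.exp(-((t - t0) ** 2) / (2 * sigma_t ** 2)
                        - ((f - f0) ** 2) / (2 * sigma_f ** 2))

# Synthetic time-frequency patch containing a single burst (arbitrary units)
t = np.linspace(0, 1, 80)           # seconds
f = np.linspace(20, 80, 60)         # Hz
T, F = np.meshgrid(t, f)
rng = np.random.default_rng(0)
patch = gauss2d((T, F), 3.0, 0.4, 45.0, 0.05, 6.0) + rng.normal(0, 0.1, T.shape)

p0 = [patch.max(), t[np.argmax(patch.max(axis=0))], f[np.argmax(patch.max(axis=1))], 0.1, 5.0]
popt, _ = curve_fit(gauss2d, (T.ravel(), F.ravel()), patch.ravel(), p0=p0)
amp, t0, f0, sigma_t, sigma_f = popt
print(f"center time {t0:.3f} s, center frequency {f0:.1f} Hz, "
      f"duration {sigma_t:.3f} s, dispersion {sigma_f:.1f} Hz, amplitude {amp:.2f}")
```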
Taminau, Jonatan; Meganck, Stijn; Lazar, Cosmin; Steenhoff, David; Coletta, Alain; Molter, Colin; Duque, Robin; de Schaetzen, Virginie; Weiss Solís, David Y; Bersini, Hugues; Nowé, Ann
2012-12-24
With an abundant amount of microarray gene expression data sets available through public repositories, new possibilities lie in combining multiple existing data sets. In this new context, analysis itself is no longer the problem, but retrieving and consistently integrating all this data before delivering it to the wide variety of existing analysis tools becomes the new bottleneck. We present the newly released inSilicoMerging R/Bioconductor package which, together with the earlier released inSilicoDb R/Bioconductor package, allows consistent retrieval, integration and analysis of publicly available microarray gene expression data sets. Inside the inSilicoMerging package a set of five visual and six quantitative validation measures are available as well. By providing (i) access to uniformly curated and preprocessed data, (ii) a collection of techniques to remove the batch effects between data sets from different sources, and (iii) several validation tools enabling the inspection of the integration process, these packages enable researchers to fully explore the potential of combining gene expression data for downstream analysis. The power of using both packages is demonstrated by programmatically retrieving and integrating gene expression studies from the InSilico DB repository [https://insilicodb.org/app/].
Underwood, Peter; Waterson, Patrick
2014-07-01
The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.
Movement coordination patterns between the foot joints during walking.
Arnold, John B; Caravaggi, Paolo; Fraysse, François; Thewlis, Dominic; Leardini, Alberto
2017-01-01
In 3D gait analysis, kinematics of the foot joints are usually reported via isolated time histories of joint rotations and no information is provided on the relationship between rotations at different joints. The aim of this study was to identify movement coordination patterns in the foot during walking by expanding an existing vector coding technique according to an established multi-segment foot and ankle model. A graphical representation is also described to summarise the coordination patterns of joint rotations across multiple patients. Three-dimensional multi-segment foot kinematics were recorded in 13 adults during walking. A modified vector coding technique was used to identify coordination patterns between foot joints involving calcaneus, midfoot, metatarsus and hallux segments. According to the type and direction of joint rotations, these were classified as in-phase (same direction), anti-phase (opposite directions), proximal or distal joint dominant. In early stance, 51 to 75% of walking trials showed proximal-phase coordination between foot joints comprising the calcaneus, midfoot and metatarsus. In-phase coordination was more prominent in late stance, reflecting synergy in the simultaneous inversion occurring at multiple foot joints. Conversely, a distal-phase coordination pattern was identified for sagittal plane motion of the ankle relative to the midtarsal joint, highlighting the critical role of arch shortening to locomotor function in push-off. This study has identified coordination patterns between movement of the calcaneus, midfoot, metatarsus and hallux by expanding an existing vector coding technique for assessing and classifying coordination patterns of foot joint rotations during walking. This approach provides a different perspective in the analysis of multi-segment foot kinematics, and may be used for the objective quantification of the alterations in foot joint coordination patterns due to lower limb pathologies or following injuries.
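The modified vector coding step, taking the coupling angle between successive changes in proximal and distal joint angles and binning it into the four coordination patterns, can be sketched in a few lines. The 45-degree sector boundaries in the classifier below follow the convention commonly used with this technique and are an assumption here, as are the toy angle traces.

```python
import numpy as np

def coupling_angles(proximal, distal):
    """Coupling angle (degrees, 0-360) between successive samples of two joint
    angle time series, as used in modified vector coding."""
    gamma = np.degrees(np.arctan2(np.diff(distal), np.diff(proximal)))
    return np.mod(gamma, 360.0)

def classify(gamma):
    """Bin coupling angles into the four coordination patterns. The sector
    boundaries are the usual convention, assumed rather than quoted."""
    bins = {"in_phase": 0, "anti_phase": 0, "proximal": 0, "distal": 0}
    for g in gamma:
        if 22.5 <= g < 67.5 or 202.5 <= g < 247.5:
            bins["in_phase"] += 1
        elif 112.5 <= g < 157.5 or 292.5 <= g < 337.5:
            bins["anti_phase"] += 1
        elif g < 22.5 or 157.5 <= g < 202.5 or g >= 337.5:
            bins["proximal"] += 1
        else:
            bins["distal"] += 1
    return bins

# Toy stance-phase traces: calcaneus and midfoot inversion angles (degrees)
t = np.linspace(0, 1, 101)
calcaneus = 5 * np.sin(np.pi * t)
midfoot = 3 * np.sin(np.pi * t + 0.1)
print(classify(coupling_angles(calcaneus, midfoot)))
```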
In Situ Monitoring of Chemical Reactions at a Solid-Water Interface by Femtosecond Acoustics.
Shen, Chih-Chiang; Weng, Meng-Yu; Sheu, Jinn-Kong; Yao, Yi-Ting; Sun, Chi-Kuang
2017-11-02
Chemical reactions at a solid-liquid interface are of fundamental importance. Interfacial chemical reactions occur not only at the very interface but also in the subsurface area, while existing monitoring techniques either provide limited spatial resolution or are applicable only to the outermost atomic layer. Here, with the aid of time-domain analysis with femtosecond acoustics, we demonstrate a subatomic-level-resolution technique to longitudinally monitor chemical reactions at solid-water interfaces, capable of in situ monitoring of even the subsurface area under atmospheric conditions. Our work was validated by monitoring the already-known anode oxidation process occurring during photoelectrochemical water splitting. Furthermore, whenever the oxide layer thickness equals an integer number of the effective atomic layer thickness, the measured acoustic echo will show higher signal-to-noise ratios with reduced speckle noise, indicating the quantum-like behavior of this coherent-phonon-based technique.
A simple low cost latent fingerprint sensor based on deflectometry and WFT analysis
NASA Astrophysics Data System (ADS)
Dhanotia, Jitendra; Chatterjee, Amit; Bhatia, Vimal; Prakash, Shashi
2018-02-01
In criminal investigations, latent fingerprints are one of the most significant forms of evidence and among the most commonly used forensic investigation tools worldwide. The existing non-contact latent fingerprint detection systems are bulky, expensive and require an environment which is shock and vibration resistant, thereby limiting their usability outside the laboratory. In this article, a compact, full-field, low-cost technique for profiling of fingerprints using deflectometry is proposed. Using inexpensive mobile phone screen based structured illumination, and a windowed Fourier transform (WFT) based phase retrieval mechanism, the 2D and 3D phase plots reconstruct the profile information of the fingerprint. The phase information is also used to confirm a match between two fingerprints in real time. Since the proposed technique is non-interferometric, the measurements are least affected by environmental perturbations. Using the proposed technique, a portable sensor capable of field deployment has been realized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prentice, H. J.; Proud, W. G.
2006-07-28
A technique has been developed to determine experimentally the three-dimensional displacement field on the rear surface of a dynamically deforming plate. The technique combines speckle analysis with stereoscopy, using a modified angular-lens method: this incorporates split-frame photography and a simple method by which the effective lens separation can be adjusted and calibrated in situ. Whilst several analytical models exist to predict deformation in extended or semi-infinite targets, the non-trivial nature of the wave interactions complicates the generation and development of analytical models for targets of finite depth. By interrogating specimens experimentally to acquire three-dimensional strain data points, both analytical and numerical model predictions can be verified more rigorously. The technique is applied to the quasi-static deformation of a rubber sheet and dynamically to Mild Steel sheets of various thicknesses.
Magneto-optical imaging technique for hostile environments: The ghost imaging approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meda, A.; Caprile, A.; Avella, A.
2015-06-29
In this paper, we develop an approach to magneto optical imaging (MOI), applying a ghost imaging (GI) protocol to perform Faraday microscopy. MOI is of the utmost importance for the investigation of magnetic properties of material samples, through Weiss domains shape, dimension and dynamics analysis. Nevertheless, in some extreme conditions such as cryogenic temperatures or high magnetic field applications, there exists a lack of domain images due to the difficulty in creating an efficient imaging system in such environments. Here, we present an innovative MOI technique that separates the imaging optical path from the one illuminating the object. The technique is based on thermal light GI and exploits correlations between light beams to retrieve the image of magnetic domains. As a proof of principle, the proposed technique is applied to the Faraday magneto-optical observation of the remanence domain structure of an yttrium iron garnet sample.
Hicks, Amy; Fairhurst, Caroline; Torgerson, David J
2018-03-01
To perform a worked example of an approach that can be used to identify and remove potentially biased trials from meta-analyses via the analysis of baseline variables. True randomisation produces treatment groups that differ only by chance; therefore, a meta-analysis of a baseline measurement should produce no overall difference and zero heterogeneity. A meta-analysis from the British Medical Journal, known to contain significant heterogeneity and imbalance in baseline age, was chosen. Meta-analyses of baseline variables were performed and trials systematically removed, starting with those with the largest t-statistic, until the I² measure of heterogeneity became 0%; the outcome meta-analysis was then repeated with only the remaining trials as a sensitivity check. We argue that heterogeneity in a meta-analysis of baseline variables should not exist, and therefore removing trials which contribute to heterogeneity from a meta-analysis will produce a more valid result. In our example none of the overall outcomes changed when studies contributing to heterogeneity were removed. We recommend routine use of this technique, using age and a second baseline variable predictive of outcome for the particular study chosen, to help eliminate potential bias in meta-analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
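A compact sketch of the trial-removal loop is given below: pool the baseline differences with a fixed-effect model, compute Cochran's Q and I², and drop the trial with the largest baseline t-statistic (effect over its standard error) until I² reaches 0%. The fixed-effect pooling and the toy numbers are assumptions made for illustration, not the article's worked data.

```python
import numpy as np

def fixed_effect_meta(effects, variances):
    """Fixed-effect meta-analysis returning the pooled estimate, Cochran's Q and I^2 (%)."""
    w = 1.0 / np.asarray(variances)
    pooled = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - pooled) ** 2)
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, q, i2

def trim_to_zero_heterogeneity(effects, variances):
    """Remove trials one at a time (largest baseline t-statistic first) until the
    I^2 for the baseline variable reaches 0%. Returns indices of retained trials."""
    idx = list(range(len(effects)))
    while True:
        e = np.array([effects[i] for i in idx])
        v = np.array([variances[i] for i in idx])
        _, _, i2 = fixed_effect_meta(e, v)
        if i2 == 0.0 or len(idx) <= 2:
            return idx
        t_stats = np.abs(e) / np.sqrt(v)
        idx.pop(int(np.argmax(t_stats)))

# Toy example: mean baseline age differences (treatment minus control) and their variances
effects = np.array([0.1, -0.2, 0.05, 2.5, -0.1])   # the fourth trial looks imbalanced
variances = np.array([0.04, 0.05, 0.03, 0.04, 0.06])
print(trim_to_zero_heterogeneity(effects, variances))
```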
NASA Technical Reports Server (NTRS)
Noor, A. K.
1983-01-01
Advances in continuum modeling, progress in reduction methods, and analysis and modeling needs for large space structures are covered with specific attention given to repetitive lattice trusses. As far as continuum modeling is concerned, an effective and verified analysis capability exists for linear thermoelastic stress, bifurcation buckling, and free vibration problems of repetitive lattices. However, application of continuum modeling to nonlinear analysis needs more development. Reduction methods are very effective for bifurcation buckling and static (steady-state) nonlinear analysis. However, more work is needed to realize their full potential for nonlinear dynamic and time-dependent problems. As far as analysis and modeling needs are concerned, three areas are identified: loads determination, modeling and nonclassical behavior characteristics, and computational algorithms. The impact of new advances in computer hardware, software, integrated analysis, CAD/CAM systems, and materials technology is also discussed.
A Systematic Review of Techniques and Sources of Big Data in the Healthcare Sector.
Alonso, Susel Góngora; de la Torre Díez, Isabel; Rodrigues, Joel J P C; Hamrioui, Sofiane; López-Coronado, Miguel
2017-10-14
The main objective of this paper is to present a review of existing research on Big Data sources and techniques in the health sector and to identify which of these techniques are the most used in the prediction of chronic diseases. Academic databases and systems such as IEEE Xplore, Scopus, PubMed and Science Direct were searched, considering publication dates from 2006 until the present time. Several search criteria were established, such as 'techniques' OR 'sources' AND 'Big Data' AND 'medicine' OR 'health', 'techniques' AND 'Big Data' AND 'chronic diseases', etc., and papers were selected that were considered of interest regarding the description of the techniques and sources of Big Data in healthcare. A total of 110 articles on techniques and sources of Big Data in health were found, of which only 32 were identified as relevant work. Many of the articles describe the Big Data platforms, sources and databases used, and identify the techniques most used in the prediction of chronic diseases. From the review of the analyzed research articles, it can be seen that the sources and techniques of Big Data used in the health sector represent a relevant factor in terms of effectiveness, since they allow the application of predictive analysis techniques to tasks such as identifying patients at risk of readmission, preventing hospital-acquired or chronic disease infections, and obtaining high-quality predictive models.
Novel hand-held device for exhaled nitric oxide-analysis in research and clinical applications.
Hemmingsson, Tryggve; Linnarsson, Dag; Gambert, Rudolf
2004-12-01
Changes in expired nitric oxide (NO) occur in airway inflammation and have proved to be important in the monitoring of inflammatory disease processes such as asthma. We set out to develop a novel hand-held NO-analyzer with a performance comparable to the present more costly and complex chemiluminescence instruments. The new device is based on a specially designed electrochemical sensor, where we have developed a novel sampling and analysis technology, compensating for the relatively slow response properties of the electrochemical sensor technique. A Lowest Detection Limit in NO-analysis from reference gas tests of less than 3 ppb and a response time of 15 seconds together with an average precision in human breath measurements of 1.4 ppb were obtained. We also show an agreement with the existing 'gold standard' FENO measurement technique, within 0.5 ppb in a group of 19 subjects together with a high linearity and accuracy compared to reference gases. The new analyzer enables affordable monitoring of inflammatory airway diseases in research and routine clinical practice.
VLBI Analysis with the Multi-Technique Software GEOSAT
NASA Technical Reports Server (NTRS)
Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir
2010-01-01
GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.
Query2Question: Translating Visualization Interaction into Natural Language.
Nafari, Maryam; Weaver, Chris
2015-06-01
Richly interactive visualization tools are increasingly popular for data exploration and analysis in a wide variety of domains. Existing systems and techniques for recording provenance of interaction focus either on comprehensive automated recording of low-level interaction events or on idiosyncratic manual transcription of high-level analysis activities. In this paper, we present the architecture and translation design of a query-to-question (Q2Q) system that automatically records user interactions and presents them semantically using natural language (written English). Q2Q takes advantage of domain knowledge and uses natural language generation (NLG) techniques to translate and transcribe a progression of interactive visualization states into a visual log of styled text that complements and effectively extends the functionality of visualization tools. We present Q2Q as a means to support a cross-examination process in which questions rather than interactions are the focus of analytic reasoning and action. We describe the architecture and implementation of the Q2Q system, discuss key design factors and variations that effect question generation, and present several visualizations that incorporate Q2Q for analysis in a variety of knowledge domains.
Achievements and perspectives of top-down proteomics.
Armirotti, Andrea; Damonte, Gianluca
2010-10-01
Over the last years, top-down (TD) MS has gained remarkable ground in proteomics, rapidly crossing the boundary between a promising approach and a solid, established technique. Several research groups worldwide have implemented TD analysis in their routine work on proteomics, deriving structural information on proteins with a level of accuracy that is impossible to achieve with classical bottom-up approaches. Complete maps of PTMs and assessment of single amino acid polymorphisms are only a few of the results that can be obtained with this technique. Despite some existing technical and economic limitations, TD analysis is at present the most powerful instrument for MS-based proteomics and its implementation in routine workflows is a rapidly approaching turning point in proteomics. In this review article, the state of the art of the TD approach is described along with its major advantages and drawbacks, and the most recent trends in TD analysis are discussed. References for all the covered topics are reported in the text, with the aim of supporting both newcomers and mass spectrometrists already introduced to TD proteomics.
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
NASA Technical Reports Server (NTRS)
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of the Weibull methods using samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and the numbers of failures in the samples.
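One Monte Carlo replication of this setup can be sketched directly: draw Weibull failure times, censor them with uniformly distributed censoring times, fit the shape and scale by maximum likelihood, and test a hypothesised parameter with a likelihood ratio statistic compared against a chi-squared value with one degree of freedom. The parameter values, sample size and optimiser below are assumptions for illustration, not the SSME study inputs.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def neg_log_lik(params, times, observed):
    """Negative log-likelihood for Weibull(shape, scale) data with right censoring."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    z = times / scale
    loglik = np.where(observed,
                      np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape,
                      -z ** shape)   # censored observations contribute the log-survival term
    return -loglik.sum()

def fit_weibull(times, observed):
    res = minimize(neg_log_lik, x0=[1.0, np.mean(times)], args=(times, observed),
                   method="Nelder-Mead")
    return res.x, res.fun

# One replication: few failures, heavy random censoring from a uniform model
rng = np.random.default_rng(0)
true_shape, true_scale, n = 2.0, 1000.0, 15
failure_times = true_scale * rng.weibull(true_shape, n)
censor_times = rng.uniform(0, 1500.0, n)
times = np.minimum(failure_times, censor_times)
observed = failure_times <= censor_times

(shape_hat, scale_hat), nll_hat = fit_weibull(times, observed)
print(f"estimated shape {shape_hat:.2f}, scale {scale_hat:.0f}, "
      f"failures observed: {observed.sum()} of {n}")

# Likelihood-ratio check of a hypothesised shape value (chi-squared, 1 df)
shape0 = 1.0
res0 = minimize(lambda s: neg_log_lik([shape0, s[0]], times, observed),
                x0=[np.mean(times)], method="Nelder-Mead")
lr = 2 * (res0.fun - nll_hat)
print("reject shape=1?", lr > chi2.ppf(0.95, df=1))
```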
Analysis of Site Position Time Series Derived From Space Geodetic Solutions
NASA Astrophysics Data System (ADS)
Angermann, D.; Meisel, B.; Kruegel, M.; Tesmer, V.; Miller, R.; Drewes, H.
2003-12-01
This presentation deals with the analysis of station coordinate time series obtained from VLBI, SLR, GPS and DORIS solutions. We also present time series for the origin and scale derived from these solutions and discuss their contribution to the realization of the terrestrial reference frame. For these investigations we used SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS and DORIS time series were obtained from weekly station coordinates solutions provided by the IGS, and from the joint DORIS analysis center (IGN-JPL). We analysed the time series with respect to various aspects, such as non-linear motions, periodic signals and systematic differences (biases). A major focus is on a comparison of the results at co-location sites in order to identify technique- and/or solution related problems. This may also help to separate and quantify possible effects, and to understand the origin of still existing discrepancies. Technique-related systematic effects (biases) should be reduced to the highest possible extent, before using the space geodetic solutions for a geophysical interpretation of seasonal signals in site position time series.
True Ortho Generation of Urban Area Using High Resolution Aerial Photos
NASA Astrophysics Data System (ADS)
Hu, Yong; Stanley, David; Xin, Yubin
2016-06-01
The pros and cons of existing methods for true ortho generation are analyzed based on a critical literature review of its two major processing stages: visibility analysis and occlusion compensation. These methods process frame and pushbroom images using different algorithms for visibility analysis because the z-buffer (or similar) techniques need perspective centers. For occlusion compensation, the pixel-based approach tends to produce excessive seamlines in the ortho-rectified images because it rates quality on a pixel-by-pixel basis. In this paper, we propose innovative solutions to tackle these problems. For visibility analysis, an elevation buffer technique is introduced that employs plain elevations instead of the distances from perspective centers used by the z-buffer, and has the advantage of sensor independence. A segment-oriented strategy is developed to evaluate a plain cost measure per segment for occlusion compensation instead of the tedious quality rating per pixel. The cost measure directly evaluates the imaging geometry characteristics in ground space, and is also sensor independent. Experimental results are demonstrated using aerial photos acquired by an UltraCam camera.
Seismic Analysis Capability in NASTRAN
NASA Technical Reports Server (NTRS)
Butler, T. G.; Strang, R. F.
1984-01-01
Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shock to buildings is the type of excitation which usually comes to mind when one hears the word seismic, but the technique also applies to a broad class of acceleration excitations which are applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis. They are: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing RIGID Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options is invoked by setting parameter values in the bulk data.
Conceptual designs for in situ analysis of Mars soil
NASA Technical Reports Server (NTRS)
Mckay, C. P.; Zent, A. P.; Hartman, H.
1991-01-01
A goal of this research is to develop conceptual designs for instrumentation to perform in situ measurements of the Martian soil in order to determine the existence and nature of any reactive chemicals. Our approach involves assessment and critical review of the Viking biology results which indicated the presence of a soil oxidant, an investigation of the possible application of standard soil science techniques to the analysis of Martian soil, and a preliminary consideration of non-standard methods that may be necessary for use in the highly oxidizing Martian soil. Based on our preliminary analysis, we have developed strawman concepts for standard soil analysis on Mars, including pH, suitable for use on a Mars rover mission. In addition, we have devised a method for the determination of the possible strong oxidants on Mars.
Privacy-preserving heterogeneous health data sharing.
Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila
2013-05-01
Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis.
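The "generalize first, then add noise" pattern can be sketched in a few lines. The fragment below is only an illustration under assumptions, not the published algorithm: it uses a fixed, hand-written taxonomy rather than the paper's probabilistic generalization step, and spends the entire privacy budget on Laplace-noised counts of the generalized groups.

```python
# Sketch of "generalize, then add Laplace noise" for a small record set (illustrative only).
import random
from collections import Counter

def generalize(record, taxonomy):
    """Map each attribute value to its parent category in the (hypothetical) taxonomy."""
    return tuple(taxonomy.get(v, v) for v in record)

def dp_release(records, taxonomy, epsilon):
    counts = Counter(generalize(r, taxonomy) for r in records)
    # Each record affects exactly one group count, so the histogram has sensitivity 1
    # and Laplace noise of scale 1/epsilon suffices. A complete mechanism would also
    # need to handle groups whose true count is zero.
    return {g: c + random.expovariate(epsilon) - random.expovariate(epsilon)
            for g, c in counts.items()}   # difference of Exp(eps) draws ~ Laplace(1/eps)

if __name__ == "__main__":
    taxonomy = {"flu": "respiratory", "asthma": "respiratory",
                "30-39": "adult", "40-49": "adult"}
    data = [("flu", "30-39"), ("asthma", "40-49"), ("flu", "40-49")]
    print(dp_release(data, taxonomy, epsilon=1.0))
```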
Privacy-preserving heterogeneous health data sharing
Mohammed, Noman; Jiang, Xiaoqian; Chen, Rui; Fung, Benjamin C M; Ohno-Machado, Lucila
2013-01-01
Objective Privacy-preserving data publishing addresses the problem of disclosing sensitive data when mining for useful information. Among existing privacy models, ε-differential privacy provides one of the strongest privacy guarantees and makes no assumptions about an adversary's background knowledge. All existing solutions that ensure ε-differential privacy handle the problem of disclosing relational and set-valued data in a privacy-preserving manner separately. In this paper, we propose an algorithm that considers both relational and set-valued data in differentially private disclosure of healthcare data. Methods The proposed approach makes a simple yet fundamental switch in differentially private algorithm design: instead of listing all possible records (ie, a contingency table) for noise addition, records are generalized before noise addition. The algorithm first generalizes the raw data in a probabilistic way, and then adds noise to guarantee ε-differential privacy. Results We showed that the disclosed data could be used effectively to build a decision tree induction classifier. Experimental results demonstrated that the proposed algorithm is scalable and performs better than existing solutions for classification analysis. Limitation The resulting utility may degrade when the output domain size is very large, making it potentially inappropriate to generate synthetic data for large health databases. Conclusions Unlike existing techniques, the proposed algorithm allows the disclosure of health data containing both relational and set-valued data in a differentially private manner, and can retain essential information for discriminative analysis. PMID:23242630
Surface Aesthetics and Analysis.
Çakır, Barış; Öreroğlu, Ali Rıza; Daniel, Rollin K
2016-01-01
Surface aesthetics of an attractive nose result from certain lines, shadows, and highlights with specific proportions and breakpoints. Analysis emphasizes geometric polygons as aesthetic subunits. Evaluation of the complete nasal surface aesthetics is achieved using geometric polygons to define the existing deformity and aesthetic goals. The relationship between the dome triangles, interdomal triangle, facet polygons, and infralobular polygon are integrated to form the "diamond shape" light reflection on the nasal tip. The principles of geometric polygons allow the surgeon to analyze the deformities of the nose, define an operative plan to achieve specific goals, and select the appropriate operative technique. Copyright © 2016 Elsevier Inc. All rights reserved.
ESSAA: Embedded system safety analysis assistant
NASA Technical Reports Server (NTRS)
Wallace, Peter; Holzer, Joseph; Guarro, Sergio; Hyatt, Larry
1987-01-01
The Embedded System Safety Analysis Assistant (ESSAA) is a knowledge-based tool that can assist in identifying disaster scenarios in which embedded software issues hazardous control commands to the surrounding hardware. ESSAA is intended to work from outputs back to inputs, as a complement to simulation and verification methods. Rather than treating the software in isolation, it examines the context in which the software is to be deployed. Given a specified disastrous outcome, ESSAA works from a qualitative, abstract model of the complete system to infer sets of environmental conditions and/or failures that could cause that outcome. The scenarios can then be examined in depth for plausibility using existing techniques.
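As a generic illustration of this outputs-to-inputs style of reasoning (not ESSAA's actual knowledge base or inference engine), the sketch below enumerates the condition sets that could produce a specified outcome in a small, hypothetical AND/OR model.

```python
# Backward enumeration of condition sets over a hypothetical AND/OR model (illustrative only).
def cut_sets(event, model):
    """model maps an event to ('AND'|'OR', [children]); events absent from the model are leaves."""
    if event not in model:
        return [{event}]
    op, children = model[event]
    child_sets = [cut_sets(c, model) for c in children]
    if op == "OR":                                 # any one child suffices
        return [s for sets in child_sets for s in sets]
    combined = [set()]                             # AND: every child must hold
    for sets in child_sets:
        combined = [a | b for a in combined for b in sets]
    return combined

# Hypothetical model: a misfire requires both a stuck valve command and a failed interlock.
model = {"disaster": ("OR", ["misfire"]),
         "misfire": ("AND", ["stuck_valve_cmd", "interlock_failed"])}
print(cut_sets("disaster", model))
```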
High-level user interfaces for transfer function design with semantics.
Salama, Christof Rezk; Keller, Maik; Kohlmann, Peter
2006-01-01
Many sophisticated techniques for the visualization of volumetric data such as medical data have been published. While existing techniques are mature from a technical point of view, managing the complexity of visual parameters is still difficult for non-expert users. To this end, this paper presents new ideas to facilitate the specification of optical properties for direct volume rendering. We introduce an additional level of abstraction for parametric models of transfer functions. The proposed framework allows visualization experts to design high-level transfer function models which can intuitively be used by non-expert users. The results are user interfaces which provide semantic information for specialized visualization problems. The proposed method is based on principal component analysis as well as on concepts borrowed from computer animation.
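One way such an abstraction layer can work, sketched here purely as an assumption-laden illustration rather than the paper's implementation, is to apply principal component analysis to a set of expert-designed opacity curves and expose the dominant mode of variation as a single semantic slider.

```python
# PCA-based "semantic slider" over example opacity transfer functions (illustrative only).
import numpy as np

def build_semantic_slider(example_tfs):
    """example_tfs: (n_examples, n_bins) array of expert-designed opacity curves."""
    mean_tf = example_tfs.mean(axis=0)
    centered = example_tfs - mean_tf
    # SVD gives the principal directions of variation among the example curves.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    first_pc = vt[0]

    def slider(value):
        """value in [-1, 1] moves along the dominant mode of variation."""
        scale = np.abs(centered @ first_pc).max()
        return np.clip(mean_tf + value * scale * first_pc, 0.0, 1.0)
    return slider

# Hypothetical usage: three hand-tuned opacity curves over 256 intensity bins.
examples = np.random.rand(3, 256)
opacity_curve = build_semantic_slider(examples)(0.5)
```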
Highly precise Re-Os dating for molybdenite using alkaline fusion and NTIMS.
Markey, R; Stein, H; Morgan, J
1998-03-01
The technique described in this paper represents the modification and combination of two previously existing methods, alkaline fusion and negative thermal ion mass spectrometry (NTIMS). We have used this technique to analyze repeatedly a homogeneous molybdenite powder used as a reference standard in our laboratory. Analyses were made over a period of 18 months, using four different calibrations of two different spike solutions. The age of this standard reproduces at a level of +/-0.13%. Each individual age analysis carries an uncertainty of about 0.4% that includes the uncertainty in the decay constant for (187)Re. This new level of resolution has allowed us to recognize real differences in ages for two grain-size populations of molybdenite from some Archean samples.
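For context, the model age in Re-Os molybdenite dating follows the standard decay relation below; the equation and decay-constant value are quoted from common practice, not from this abstract.

```latex
% Standard Re-Os model-age relation (background, not taken from the abstract):
t = \frac{1}{\lambda_{^{187}\mathrm{Re}}}
    \ln\!\left(1 + \frac{^{187}\mathrm{Os}^{*}}{^{187}\mathrm{Re}}\right),
\qquad
\lambda_{^{187}\mathrm{Re}} \approx 1.666 \times 10^{-11}\ \mathrm{yr}^{-1}
```

The dependence of the age on the decay constant is why the roughly 0.4% uncertainty quoted for each individual analysis above includes the decay-constant uncertainty.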
Highly precise Re-Os dating for molybdenite using alkaline fusion and NTIMS
Markey, R.; Stein, H.; Morgan, J.
1998-01-01
The technique described in this paper represents the modification and combination of two previously existing methods, alkaline fusion and negative thermal ion mass spectrometry (NTIMS). We have used this technique to analyze repeatedly a homogeneous molybdenite powder used as a reference standard in our laboratory. Analyses were made over a period of 18 months, using four different calibrations of two different spike solutions. The age of this standard reproduces at a level of ±0.13%. Each individual age analysis carries an uncertainty of about 0.4% that includes the uncertainty in the decay constant for (187)Re. This new level of resolution has allowed us to recognize real differences in ages for two grain-size populations of molybdenite from some Archean samples.
Transfer-arm evaporator cell for rapid loading and deposition of organic thin films.
Greiner, M T; Helander, M G; Wang, Z B; Lu, Z H
2009-12-01
Described herein is a transfer-arm evaporator cell (TAE-cell), which allows for rapid loading of materials into vacuum for low-temperature sublimation deposition of thin films. This design can be incorporated with an existing analysis system for convenient in situ thin film characterization. The evaporator is especially well suited for photoemission characterization of organic semiconductor interfaces. Photoemission is one of the most important techniques for characterizing such interfaces; however, it generally requires in situ sample preparation. The ease with which materials can be loaded and evaporated with this design increases the throughput of in situ photoemission characterization and broadens the research scope of the technique. Here, we describe the design, operation, and performance of the TAE-cell.
Desensitized Optimal Filtering and Sensor Fusion Toolkit
NASA Technical Reports Server (NTRS)
Karlgaard, Christopher D.
2015-01-01
Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as a Monte Carlo analysis capability, is included to enable statistical performance evaluations.
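For orientation, a conventional linear Kalman filter step is sketched below; this is background only, not the toolkit's code. The desensitized variant the abstract refers to additionally penalizes the sensitivity of the state estimate to uncertain model parameters when the gain is chosen, which is not reproduced here. The matrices are illustrative placeholders.

```python
# One predict/update step of a standard linear Kalman filter (background sketch only).
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    # Predict with the (assumed) dynamics model F and process noise Q.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with measurement z; K is the conventional minimum-variance gain.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```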
Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets
Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge
2014-01-01
SUMMARY In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. Most existing integrative analyses assume the homogeneity model, which postulates that different datasets share the same set of markers, and several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in patient selection criteria, profiling techniques, and many other aspects; such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival; this model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach, which has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. Simulation studies show that it outperforms the existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
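The MCP building block can be written compactly. The sketch below shows the penalty and its closed-form univariate coordinate-descent update under the usual unit-variance assumption; it is an illustration only, and the paper's full sparse group MCP, which applies the penalty at both the group and within-group levels, is not reproduced here.

```python
# MCP penalty and its univariate ("firm thresholding") update (illustrative sketch).
import numpy as np

def mcp_penalty(t, lam, gamma):
    """MCP: lam*|t| - t^2/(2*gamma) for |t| <= gamma*lam, else gamma*lam^2/2."""
    a = np.abs(t)
    return np.where(a <= gamma * lam, lam * a - a**2 / (2 * gamma), gamma * lam**2 / 2)

def mcp_threshold(z, lam, gamma):
    """Closed-form solution of the univariate MCP-penalized least-squares problem,
    assuming unit-variance covariates and gamma > 1 (firm thresholding)."""
    a = np.abs(z)
    soft = np.sign(z) * np.maximum(a - lam, 0.0)
    return np.where(a <= gamma * lam, soft / (1 - 1 / gamma), z)
```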
Downie, H F; Adu, M O; Schmidt, S; Otten, W; Dupuy, L X; White, P J; Valentine, T A
2015-07-01
The morphology of roots and root systems influences the efficiency with which plants acquire nutrients and water, anchor themselves, and provide stability to the surrounding soil. Plant genotype and the biotic and abiotic environment significantly influence root morphology, growth, and ultimately crop yield. The challenge for researchers interested in phenotyping root systems is therefore not just to measure roots and link their phenotype to the plant genotype, but also to understand how the growth of roots is influenced by their environment. This review discusses progress in quantifying root system parameters (e.g. size, shape, and dynamics) using imaging and image analysis technologies, and also discusses their potential for providing a better understanding of root:soil interactions. Significant progress has been made in image acquisition techniques; however, trade-offs exist between sample throughput, sample size, image resolution, and information gained. All of these factors affect downstream image analysis processes. While there have been significant advances in computational power, limitations still exist in the statistical processes involved in image analysis. Utilizing and combining different imaging systems, integrating measurements and image analysis where possible, and amalgamating data will allow researchers to gain a better understanding of root:soil interactions. © 2014 John Wiley & Sons Ltd.
Inflight and Preflight Detection of Pitot Tube Anomalies
NASA Technical Reports Server (NTRS)
Mitchell, Darrell W.
2014-01-01
The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates a sensor's response time to the dynamic component (noise) found in that same sensor's signal. The technique uses the existing electrical signals of Pitot tube sensors, produced by the measured process in flight or by induced signals before flight, to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants; its application to the detection of aircraft anomalies is innovative. Instead of assessing sensor health only at steady-state conditions, the technology will quickly inform the pilot when an airspeed indication becomes faulty under any flight condition, as well as during preflight preparation.
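The general flavor of noise analysis can be illustrated as follows; this sketch is an assumption about the broad approach, not AMS Corp.'s method. The noise spectrum of a pressure signal is fitted with a first-order lag model, whose time constant serves as a proxy for the sensor's response time.

```python
# Response-time proxy from sensor noise: fit a first-order lag to the noise PSD (sketch).
import numpy as np
from scipy.signal import welch
from scipy.optimize import curve_fit

def estimate_response_time(signal, fs):
    f, psd = welch(signal - np.mean(signal), fs=fs, nperseg=1024)
    f, psd = f[1:], psd[1:]                       # drop the zero-frequency bin

    def first_order(freq, gain, tau):             # PSD of white noise through a first-order lag
        return gain / (1.0 + (2 * np.pi * freq * tau) ** 2)

    (gain, tau), _ = curve_fit(first_order, f, psd, p0=[psd[0], 0.1], maxfev=10000)
    return tau                                    # seconds; a larger tau indicates a slower sensor

# Hypothetical check: noise from a blocked (sluggish) Pitot line should yield a larger tau
# than noise from a clear line sampled under the same conditions.
```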
Audio signal analysis for tool wear monitoring in sheet metal stamping
NASA Astrophysics Data System (ADS)
Ubhayaratne, Indivarie; Pereira, Michael P.; Xiang, Yong; Rolfe, Bernard F.
2017-02-01
Stamping tool wear can significantly degrade product quality, and hence online tool condition monitoring is a timely need in many manufacturing industries. Even though a large amount of research has been conducted employing different sensor signals, there is still an unmet demand for a low-cost, easy-to-set-up condition monitoring system. Audio signal analysis is a simple method that has the potential to meet this demand, but it has not previously been used for stamping process monitoring. Hence, this paper studies the existence and significance of the correlation between emitted sound signals and the wear state of sheet metal stamping tools. The corrupting sources generated by the tooling of the stamping press and surrounding machinery have higher amplitudes than the sound emitted by the stamping operation itself. Therefore, a newly developed semi-blind signal extraction technique was employed as a pre-processing step to mitigate the contribution of these corrupting sources. The spectral analysis results of the raw and extracted signals demonstrate a significant qualitative relationship between wear progression and the emitted sound signature. This study lays the basis for employing low-cost audio signal analysis in the development of a real-time industrial tool condition monitoring system.
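The spectral comparison step can be sketched simply; the fragment below is illustrative only and omits the semi-blind signal extraction used as pre-processing in the paper. Band energies of press audio recorded with a new tool and with a worn tool are contrasted to expose wear-related shifts in the emitted sound signature.

```python
# Band-energy comparison of new-tool vs worn-tool press audio (illustrative sketch only).
import numpy as np
from scipy.signal import welch

def band_energies(audio, fs, edges=(0, 1000, 2000, 4000, 8000)):
    """Sum the power spectral density within fixed frequency bands (Hz)."""
    f, psd = welch(audio, fs=fs, nperseg=4096)
    return np.array([psd[(f >= lo) & (f < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])

def wear_shift(audio_new, audio_worn, fs):
    """Relative change in band energy between new-tool and worn-tool recordings."""
    e_new, e_worn = band_energies(audio_new, fs), band_energies(audio_worn, fs)
    return (e_worn - e_new) / (e_new + 1e-12)
```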
Cobb, Nathan; Cohen, Trevor
2016-01-01
Background Research studies involving health-related online communities have focused on examining network structure to understand mechanisms underlying behavior change. Content analysis of the messages exchanged in these communities has been limited to the “social support” perspective. However, existing behavior change theories suggest that message content plays a prominent role reflecting several sociocognitive factors that affect an individual’s efforts to make a lifestyle change. An understanding of these factors is imperative to identify and harness the mechanisms of behavior change in the Health 2.0 era. Objective The objective of this work is two-fold: (1) to harness digital communication data to capture essential meaning of communication and factors affecting a desired behavior change, and (2) to understand the applicability of existing behavior change theories to characterize peer-to-peer communication in online platforms. Methods In this paper, we describe grounded theory–based qualitative analysis of digital communication in QuitNet, an online community promoting smoking cessation. A database of 16,492 de-identified public messages from 1456 users from March 1-April 30, 2007, was used in our study. We analyzed 795 messages using grounded theory techniques to ensure thematic saturation. This analysis enabled identification of key concepts contained in the messages exchanged by QuitNet members, allowing us to understand the sociobehavioral intricacies underlying an individual’s efforts to cease smoking in a group setting. We further ascertained the relevance of the identified themes to theoretical constructs in existing behavior change theories (eg, Health Belief Model) and theoretically linked techniques of behavior change taxonomy. Results We identified 43 different concepts, which were then grouped under 12 themes based on analysis of 795 messages. Examples of concepts include “sleepiness,” “pledge,” “patch,” “spouse,” and “slip.” Examples of themes include “traditions,” “social support,” “obstacles,” “relapse,” and “cravings.” Results indicate that themes consisting of member-generated strategies such as “virtual bonfires” and “pledges” were related to the highest number of theoretical constructs from the existing behavior change theories. In addition, results indicate that the member-generated communication content supports sociocognitive constructs from more than one behavior change model, unlike the majority of the existing theory-driven interventions. Conclusions With the onset of mobile phones and ubiquitous Internet connectivity, online social network data reflect the intricacies of human health behavior as experienced by health consumers in real time. This study offers methodological insights for qualitative investigations that examine the various kinds of behavioral constructs prevalent in the messages exchanged among users of online communities. Theoretically, this study establishes the manifestation of existing behavior change theories in QuitNet-like online health communities. Pragmatically, it sets the stage for real-time, data-driven sociobehavioral interventions promoting healthy lifestyle modifications by allowing us to understand the emergent user needs to sustain a desired behavior change. PMID:26839162
Scalable graphene production: perspectives and challenges of plasma applications
NASA Astrophysics Data System (ADS)
Levchenko, Igor; Ostrikov, Kostya (Ken); Zheng, Jie; Li, Xingguo; Keidar, Michael; B. K. Teo, Kenneth
2016-05-01
Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling the existing graphene production techniques to the industrial level without compromising graphene's properties is a current challenge. This article focuses on the scalability, equipment, and technological perspectives and challenges of the plasma-based techniques, which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable to scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among the processes examined, the highest yield, 1 g h⁻¹ m⁻², was reached with the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to those of the neutral-gas-based methods. Selected plasma-based techniques show lower energy consumption than thermal CVD processes, and the ability to produce graphene flakes of various sizes, reaching hundreds of square millimetres, with thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes, could potentially lead to better scalability, flexibility, and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.
Scalable graphene production: perspectives and challenges of plasma applications.
Levchenko, Igor; Ostrikov, Kostya Ken; Zheng, Jie; Li, Xingguo; Keidar, Michael; B K Teo, Kenneth
2016-05-19
Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling the existing graphene production techniques to the industrial level without compromising graphene's properties is a current challenge. This article focuses on the scalability, equipment, and technological perspectives and challenges of the plasma-based techniques, which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable to scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among the processes examined, the highest yield, 1 g h⁻¹ m⁻², was reached with the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to those of the neutral-gas-based methods. Selected plasma-based techniques show lower energy consumption than thermal CVD processes, and the ability to produce graphene flakes of various sizes, reaching hundreds of square millimetres, with thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes, could potentially lead to better scalability, flexibility, and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.