Sample records for analysis technique based

  1. Application of Petri net based analysis techniques to signal transduction pathways.

    PubMed

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-11-02

    Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODE-based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the system's behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to systematically build a discrete model that provably reflects the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as a case study. We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets that are active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into the smallest biologically meaningful functional units. The ...
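
    For readers unfamiliar with the invariant machinery used here: a t-invariant is a non-negative integer firing-count vector x satisfying C·x = 0, where C is the place-by-transition incidence matrix. The sketch below brute-forces the minimal t-invariants of a hypothetical two-transition toy net; it only illustrates the definition and is far smaller than the pheromone-pathway model analysed in the paper.

    ```python
    # Toy Petri net: places P1, P2; transitions T1: P1 -> P2, T2: P2 -> P1.
    # A t-invariant is a non-negative integer vector x with C @ x = 0.
    import numpy as np
    from itertools import product

    C = np.array([[-1,  1],    # row P1: consumed by T1, produced by T2
                  [ 1, -1]])   # row P2: produced by T1, consumed by T2

    def t_invariants(C, max_firings=3):
        """Brute-force minimal t-invariants; only viable for tiny nets."""
        found = []
        for x in product(range(max_firings + 1), repeat=C.shape[1]):
            x = np.array(x)
            if x.any() and not (C @ x).any():
                # discard vectors dominated by an already-found invariant
                if not any((x >= f).all() for f in found):
                    found.append(x)
        return found

    print(t_invariants(C))  # [array([1, 1])]: firing T1 and T2 once each
                            # reproduces the marking -- a self-contained cycle
    ```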

  2. Application of Petri net based analysis techniques to signal transduction pathways

    PubMed Central

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-01-01

    Background Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODE-based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the system's behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. Methods We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to systematically build a discrete model that provably reflects the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as a case study. Results We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets that are active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into the smallest biologically ...

  3. Wheeze sound analysis using computer-based techniques: a systematic review.

    PubMed

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high-pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and 3) analysis using combinations of features and on subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.

  4. Plasma spectroscopy analysis technique based on optimization algorithms and spectral synthesis for arc-welding quality assurance.

    PubMed

    Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M

    2007-02-19

    A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
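
    As a rough illustration of the idea (not the authors' implementation), one can generate a synthetic spectrum as a sum of Gaussian peaks and fit it to the observed spectrum with a standard least-squares optimizer; the peak positions, widths and amplitudes below are made up.

    ```python
    # Sketch: fit a synthetic Gaussian-peak spectrum to an "observed" one.
    import numpy as np
    from scipy.optimize import least_squares

    wl = np.linspace(500.0, 520.0, 400)          # wavelength axis, nm

    def synth(p, wl):
        """Synthetic spectrum: baseline + two Gaussian emission peaks."""
        base, a1, c1, w1, a2, c2, w2 = p
        return (base
                + a1 * np.exp(-0.5 * ((wl - c1) / w1) ** 2)
                + a2 * np.exp(-0.5 * ((wl - c2) / w2) ** 2))

    true = [0.1, 1.0, 505.0, 0.3, 0.6, 512.0, 0.4]   # hypothetical line set
    rng = np.random.default_rng(0)
    observed = synth(true, wl) + 0.01 * rng.standard_normal(wl.size)

    guess = [0.0, 0.8, 504.5, 0.5, 0.5, 512.5, 0.5]
    fit = least_squares(lambda p: synth(p, wl) - observed, guess)
    print(np.round(fit.x, 3))   # optimization recovers the peak parameters
    ```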

  5. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Spatial non-uniformity means that different locations in an image are of different importance in terms of perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign different importance to each location in the image. But still none of these objective metrics utilizes an analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed to reconstruct the ROI in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis and that new criteria are needed.
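
    For reference, the SSIM index named above follows from a closed-form comparison of means, variances and covariance. The sketch below is a single-window simplification (production SSIM slides a local Gaussian window over the image); the test images are synthetic.

    ```python
    # Sketch: single-window SSIM from its standard definition.
    import numpy as np

    def global_ssim(x, y, L=255.0):
        x, y = x.astype(float), y.astype(float)
        C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2     # stabilizing constants
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cxy = ((x - mx) * (y - my)).mean()
        return ((2 * mx * my + C1) * (2 * cxy + C2) /
                ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2)))

    rng = np.random.default_rng(1)
    ref = rng.integers(0, 256, (64, 64))
    noisy = np.clip(ref + rng.normal(0, 10, ref.shape), 0, 255)
    print(global_ssim(ref, ref))    # 1.0 for identical images
    print(global_ssim(ref, noisy))  # < 1.0 under distortion
    ```

    An ROI-aware variant would weight a local SSIM map by a visual-attention mask, which is essentially the ingredient the paper finds missing from current metrics.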

  6. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    NASA Astrophysics Data System (ADS)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), for analyzing groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by poor zones of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by good and very good potential zones, respectively. The validation of outcomes displayed that the area under the curve (AUC) of the SI and DST techniques is 81.23% and 79.41%, respectively, which shows that the SI method has a slightly better performance than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, it can be concluded that these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
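
    The statistical index used here is commonly defined as the natural log of the ratio of spring density within a factor class to the spring density over the whole area. A minimal sketch for one conditioning factor, with invented class areas and spring counts (not the Broujerd data):

    ```python
    # Sketch: statistical index (SI) per class of one conditioning factor,
    # SI = ln( density of springs in class / density over the whole area ).
    import numpy as np

    classes  = ["0-5 deg", "5-15 deg", ">15 deg"]   # e.g. slope classes
    area_km2 = np.array([400.0, 350.0, 250.0])      # hypothetical class areas
    springs  = np.array([60, 280, 156])             # hypothetical spring counts

    overall_density = springs.sum() / area_km2.sum()
    si = np.log((springs / area_km2) / overall_density)

    for c, w in zip(classes, si):
        print(f"{c:>8}: SI = {w:+.3f}")   # positive = more prospective than average
    ```

    Summing the SI values of all factor classes at each map cell yields the groundwater potential map that is then validated with the AUC.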

  7. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
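
    Of the techniques listed, principal component analysis is the most compact to demonstrate. A minimal sketch on a synthetic 4-band image cube (not real remotely sensed data):

    ```python
    # Sketch: principal component transform of a multiband image cube.
    import numpy as np

    rng = np.random.default_rng(2)
    cube = rng.normal(size=(4, 100, 100))        # bands x rows x cols
    cube[1] += 0.8 * cube[0]                     # inject inter-band correlation

    X = cube.reshape(4, -1).T                    # pixels x bands
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = evals.argsort()[::-1]                # descending variance
    pcs = (Xc @ evecs[:, order]).T.reshape(cube.shape)

    print(np.round(evals[order] / evals.sum(), 3))  # variance per component
    ```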

  8. Sex-based differences in lifting technique under increasing load conditions: A principal component analysis.

    PubMed

    Sheppard, P S; Stevenson, J M; Graham, R B

    2016-05-01

    The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopted a similar lifting technique. In addition, as load increased male and female participants changed their lifting technique in a similar manner.
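
    The 90% trace criterion mentioned above simply keeps the leading components until 90% of the total variance is captured. A minimal sketch on synthetic waveform data standing in for one joint angle over 25 trials:

    ```python
    # Sketch: the 90% trace criterion for retaining principal components of
    # time-normalized joint-angle waveforms (synthetic stand-in data).
    import numpy as np

    rng = np.random.default_rng(3)
    shape = np.sin(np.linspace(0, np.pi, 101))          # common waveform shape
    W = shape * rng.normal(30, 5, (25, 1)) + rng.normal(0, 2, (25, 101))

    Wc = W - W.mean(axis=0)
    _, s, Vt = np.linalg.svd(Wc, full_matrices=False)
    var = s ** 2 / np.sum(s ** 2)
    k = int(np.searchsorted(np.cumsum(var), 0.90)) + 1  # components to keep
    print(f"retain {k} PC(s) to capture 90% of the trace")

    scores = Wc @ Vt[:k].T   # PC scores: the inputs to the ANOVA in the study
    ```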

  9. Comparative Analysis of Various Single-tone Frequency Estimation Techniques in High-order Instantaneous Moments Based Phase Estimation Method

    NASA Astrophysics Data System (ADS)

    Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod

    2010-04-01

    For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of the phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination, for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM-operator-based method when using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
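
    To give the flavor of the comparison, here are two elementary single-tone frequency estimators applied to a noisy complex exponential; these are simple stand-ins, not the specific refined estimators benchmarked in the paper.

    ```python
    # Sketch: zero-padded FFT peak picking vs. the mean phase-difference
    # estimator on a noisy single-tone signal.
    import numpy as np

    N, f_true = 256, 0.1234          # normalized frequency, cycles/sample
    rng = np.random.default_rng(4)
    n = np.arange(N)
    z = (np.exp(2j * np.pi * f_true * n)
         + 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N)))

    f_fft = np.argmax(np.abs(np.fft.fft(z, 8 * N))) / (8 * N)      # FFT peak
    f_pd = np.angle(np.sum(z[1:] * np.conj(z[:-1]))) / (2 * np.pi) # phase diff

    print(f"true {f_true:.4f}   fft {f_fft:.4f}   phase-diff {f_pd:.4f}")
    ```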

  10. A novel mesh processing based technique for 3D plant analysis

    PubMed Central

    2012-01-01

    Background In recent years, imaging-based, automated, non-invasive, and non-destructive high-throughput plant phenotyping platforms have become popular tools for plant biology, underpinning the field of plant phenomics. Such platforms acquire and record large amounts of raw data that must be accurately and robustly calibrated, reconstructed, and analysed, requiring the development of sophisticated image understanding and quantification algorithms. The raw data can be processed in different ways, and the past few years have seen the emergence of two main approaches: 2D image processing and 3D mesh processing algorithms. Direct image quantification methods (usually 2D) dominate the current literature due to their comparative simplicity. However, 3D mesh analysis provides tremendous potential to accurately estimate specific morphological features cross-sectionally and monitor them over time. Results In this paper, we present a novel 3D mesh based technique developed for temporal high-throughput plant phenomics and perform initial tests for the analysis of Gossypium hirsutum vegetative growth. Based on plant meshes previously reconstructed from multi-view images, the methodology involves several stages, including morphological mesh segmentation, phenotypic parameter estimation, and tracking of plant organs over time. The initial study focuses on presenting and validating the accuracy of the methodology on dicotyledons such as cotton, but we believe the approach will be more broadly applicable. This study involved applying our technique to a set of six Gossypium hirsutum (cotton) plants studied over four time-points. Manual measurements, performed for each plant at every time-point, were used to assess the accuracy of our pipeline and quantify the error on the morphological parameters estimated. Conclusion By directly comparing our automated mesh based quantitative data with manual measurements of individual stem height, leaf width and leaf length, we obtained the mean ...

  11. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
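
    The time-domain and moment-based descriptors mentioned above are straightforward to compute. A minimal sketch on a synthetic vibration record containing a weak periodic impact train (invented numbers, not SSME data):

    ```python
    # Sketch: waveform moments plus a spectral estimate for fault trending.
    import numpy as np
    from scipy import signal

    fs = 10_000                                       # sample rate, Hz
    rng = np.random.default_rng(5)
    x = rng.standard_normal(fs)                       # 1 s broadband background
    x[::103] += 4.0                                   # impacts at ~97 Hz

    rms = np.sqrt(np.mean(x ** 2))
    kurt = np.mean((x - x.mean()) ** 4) / x.var() ** 2  # > 3 flags impulsiveness
    crest = np.max(np.abs(x)) / rms

    f, pxx = signal.welch(x, fs=fs, nperseg=2048)     # spectral signature
    print(f"RMS {rms:.2f}  kurtosis {kurt:.1f}  crest factor {crest:.1f}")
    ```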

  12. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  13. Fourier Spectroscopy: A Simple Analysis Technique

    ERIC Educational Resources Information Center

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
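
    The same point-by-point integration is easy to reproduce numerically: evaluate B(nu) = sum_k [I(x_k) - <I>] cos(2*pi*nu*x_k) dx directly on the interferogram samples. The sketch below synthesizes a two-line interferogram with arbitrary numbers and recovers both lines.

    ```python
    # Sketch: direct cosine-transform integration of an interferogram.
    import numpy as np

    dx = 0.0005                                 # path-difference step, cm
    x = np.arange(-2.0, 2.0, dx)                # mirror travel, cm
    I = (1.0 * np.cos(2 * np.pi * 600.0 * x)    # line at 600 cm^-1
         + 0.5 * np.cos(2 * np.pi * 640.0 * x)) # weaker line at 640 cm^-1

    nus = np.arange(550.0, 700.0, 1.0)          # wavenumbers to probe, cm^-1
    B = np.array([np.sum((I - I.mean()) * np.cos(2 * np.pi * nu * x)) * dx
                  for nu in nus])

    for nu in (600.0, 640.0, 660.0):
        print(f"B({nu:.0f} cm^-1) = {B[nus == nu][0]:+.3f}")  # ~2.0, ~1.0, ~0
    ```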

  14. Ultrafast Method for the Analysis of Fluorescence Lifetime Imaging Microscopy Data Based on the Laguerre Expansion Technique

    PubMed Central

    Jo, Javier A.; Fang, Qiyin; Marcu, Laura

    2007-01-01

    We report a new deconvolution method for fluorescence lifetime imaging microscopy (FLIM) based on the Laguerre expansion technique. The performance of this method was tested on synthetic and real FLIM images. The following interesting properties of this technique were demonstrated. 1) The fluorescence intensity decay can be estimated simultaneously for all pixels, without a priori assumption of the decay functional form. 2) The computation speed is extremely fast, performing at least two orders of magnitude faster than current algorithms. 3) The estimated maps of Laguerre expansion coefficients provide a new domain for representing FLIM information. 4) The number of images required for the analysis is relatively small, allowing reduction of the acquisition time. These findings indicate that the developed Laguerre expansion technique for FLIM analysis represents a robust and extremely fast deconvolution method that enables practical applications of FLIM in medicine, biology, biochemistry, and chemistry. PMID:19444338
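
    A minimal sketch of the expansion idea for a single decay trace: the measurement is modeled as the IRF convolved with a sum of discrete Laguerre functions, so the coefficients follow from ordinary linear least squares with no assumed decay form. The recursion below is the standard discrete-Laguerre construction; the data, IRF and Laguerre parameter alpha are all hypothetical.

    ```python
    # Sketch: Laguerre-expansion deconvolution of one fluorescence decay.
    import numpy as np

    def laguerre_basis(n_samples, order, alpha=0.8):
        """Discrete Laguerre functions via the standard recursion."""
        b = np.zeros((order, n_samples))
        n = np.arange(n_samples)
        b[0] = np.sqrt(1 - alpha) * alpha ** (n / 2.0)
        for j in range(1, order):
            for k in range(n_samples):
                prev = b[j, k - 1] if k else 0.0
                prev2 = b[j - 1, k - 1] if k else 0.0
                b[j, k] = np.sqrt(alpha) * (prev + b[j - 1, k]) - prev2
        return b

    N, tau = 200, 25.0
    t = np.arange(N)
    irf = np.exp(-0.5 * ((t - 20) / 3.0) ** 2)          # instrument response
    y = np.convolve(irf, np.exp(-t / tau))[:N]          # measured decay
    y += 0.01 * y.max() * np.random.default_rng(6).standard_normal(N)

    B = laguerre_basis(N, order=5)
    V = np.array([np.convolve(irf, bj)[:N] for bj in B]).T  # model matrix
    c, *_ = np.linalg.lstsq(V, y, rcond=None)           # expansion coefficients
    decay_hat = B.T @ c                                 # recovered decay shape
    print("coeffs:", np.round(c, 3))
    ```

    Because the fit is a single linear solve per pixel, the same coefficient maps can be computed for every pixel of a FLIM image at once, which is the source of the speed-up the paper reports.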

  15. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  16. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  17. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementations. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
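
    The electron-temperature estimate at the heart of such monitoring can be illustrated with the classic Boltzmann two-line ratio for lines of the same species. The spectroscopic constants below are illustrative placeholders, not vetted data from the paper.

    ```python
    # Sketch: two-line Boltzmann estimate of the plasma electron temperature,
    # from I ~ (A g / lam) exp(-E / (k Te)) for each emission line.
    import numpy as np

    k_eV = 8.617e-5        # Boltzmann constant, eV/K

    # per line: intensity, wavelength (nm), A (1/s), g, upper-level E (eV)
    I1, lam1, A1, g1, E1 = 1.00, 696.5, 6.4e6, 3, 13.33
    I2, lam2, A2, g2, E2 = 0.18, 714.7, 6.3e5, 5, 13.28

    Te = (E2 - E1) / (k_eV * np.log((I1 * A2 * g2 * lam1) /
                                    (I2 * A1 * g1 * lam2)))
    print(f"electron temperature ~ {Te:.0f} K")   # a few thousand K for an arc
    ```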

  18. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-14

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  19. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new technique based on hierarchical Bayesian modeling is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.

  20. Singular value decomposition based feature extraction technique for physiological signal analysis.

    PubMed

    Chang, Cheng-Ding; Wang, Chien-Chih; Jiang, Bernard C

    2012-06-01

    Multiscale entropy (MSE) is one of the popular techniques used to calculate and describe the complexity of physiological signals. Many studies use this approach to detect changes in the physiological conditions of the human body. However, MSE results are easily affected by noise and trends, leading to incorrect estimation of MSE values. In this paper, singular value decomposition (SVD) is adopted in place of MSE to extract the features of physiological signals, and a support vector machine (SVM) is used to classify the different physiological states. A test data set from the PhysioNet website was used, and the classification results showed that using SVD to extract features of the physiological signal could attain a classification accuracy rate of 89.157%, which is higher than that using the MSE value (71.084%). The results show the proposed analysis procedure is effective and appropriate for distinguishing different physiological states. This promising result could be used as a reference for doctors in the diagnosis of congestive heart failure (CHF).
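
    A common way to turn SVD into signal features is to embed each signal into a Hankel (trajectory) matrix and keep its leading singular values. The sketch below does this with synthetic two-class signals standing in for the PhysioNet recordings; window length and feature count are arbitrary choices.

    ```python
    # Sketch: Hankel-matrix singular values as features, classified by an SVM.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    def sv_features(x, window=20, n_sv=5):
        H = np.lib.stride_tricks.sliding_window_view(x, window)
        return np.linalg.svd(H, compute_uv=False)[:n_sv]

    rng = np.random.default_rng(7)
    t = np.arange(300)
    X, y = [], []
    for label in (0, 1):
        for _ in range(40):
            sig = np.sin(2 * np.pi * 0.05 * t + rng.uniform(0, 2 * np.pi))
            if label:  # second "state" carries an extra rhythm -> richer rank
                sig = 0.7 * sig + 0.7 * np.sin(2 * np.pi * 0.13 * t
                                               + rng.uniform(0, 2 * np.pi))
            sig += 0.3 * rng.standard_normal(t.size)
            X.append(sv_features(sig))
            y.append(label)

    Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y),
                                          test_size=0.25, random_state=0)
    print(f"accuracy: {SVC(kernel='rbf').fit(Xtr, ytr).score(Xte, yte):.2f}")
    ```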

  21. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. This analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system to variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, the relative-stability criteria based on the concept of singular values were explored.
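
    The key quantities are the minimum singular value of the return-difference matrix I + L(jw) (a multiloop stability margin) and its gradient with respect to an element of interest. A minimal sketch on a hypothetical 2x2 loop, with the gradient taken by finite differences rather than the analytic singular-value gradients of the report:

    ```python
    # Sketch: min singular value of I + L(jw) and its sensitivity to a gain.
    import numpy as np

    def loop(s, k):
        """Hypothetical 2x2 loop transfer matrix with adjustable gain k."""
        return np.array([[k / (s + 1.0), 0.2 / (s + 2.0)],
                         [0.1 / (s + 1.5), 1.0 / (s + 0.5)]])

    def min_sv(k, w):
        return np.linalg.svd(np.eye(2) + loop(1j * w, k),
                             compute_uv=False).min()

    ws = np.logspace(-1, 2, 200)
    k0, dk = 2.0, 1e-6
    sv = np.array([min_sv(k0, w) for w in ws])
    w_crit = ws[sv.argmin()]                          # frequency of least margin
    grad = (min_sv(k0 + dk, w_crit) - min_sv(k0, w_crit)) / dk
    print(f"min sigma {sv.min():.3f} at {w_crit:.2f} rad/s, "
          f"d(sigma)/dk = {grad:+.3f}")
    ```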

  22. Aerodynamic measurement techniques. [laser-based diagnostic techniques]

    NASA Technical Reports Server (NTRS)

    Hunter, W. W., Jr.

    1976-01-01

    Laser characteristics of intensity, monochromaticity, spatial coherence, and temporal coherence were used to advance laser-based diagnostic techniques for aerodynamics-related research. Two broad categories, visualization and optical measurements, were considered, and three techniques received significant attention: holography, laser velocimetry, and Raman scattering. Examples of quantitative laser velocimeter and Raman scattering measurements of velocity, temperature, and density indicated the potential of these nonintrusive techniques.

  23. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    NASA Astrophysics Data System (ADS)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served as an important primary approach to diagnosing cardiovascular diseases (CVDs) for centuries. Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck problem, this paper innovatively proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences of heart valves. Adopting the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five various abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
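
    The Shannon energy envelope named above is a short computation. The sketch below applies it to a synthetic heart-sound-like signal (invented bursts, DWT denoising omitted for brevity); morphological features such as murmur extent would then be read off this envelope.

    ```python
    # Sketch: Shannon energy envelope of a heart-sound-like signal.
    import numpy as np

    fs = 2000
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(8)
    hs = 0.02 * rng.standard_normal(t.size)
    for onset in (0.10, 0.45):                        # crude S1/S2 bursts
        idx = (t > onset) & (t < onset + 0.06)
        hs[idx] += np.sin(2 * np.pi * 60 * t[idx]) * np.hanning(idx.sum())

    x = hs / np.abs(hs).max()                         # normalize to [-1, 1]
    se = -x ** 2 * np.log(x ** 2 + 1e-12)             # Shannon energy
    win = int(0.02 * fs)                              # 20 ms smoothing window
    envelope = np.convolve(se, np.ones(win) / win, mode="same")
    print(f"envelope peaks near t = {t[envelope.argmax()]:.2f} s")
    ```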

  24. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
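
    The three model families named in the standard are one-liners to fit. A minimal sketch on a synthetic monthly series (invented data):

    ```python
    # Sketch: linear, quadratic, and exponential trend fits.
    import numpy as np
    from scipy.optimize import curve_fit

    t = np.arange(24, dtype=float)                   # months
    rng = np.random.default_rng(9)
    y = 5.0 * np.exp(0.08 * t) + rng.normal(0, 1.0, t.size)

    lin = np.polyfit(t, y, 1)                        # y ~ a t + b
    quad = np.polyfit(t, y, 2)                       # y ~ a t^2 + b t + c
    (ea, eb), _ = curve_fit(lambda t, a, b: a * np.exp(b * t), t, y,
                            p0=(1.0, 0.05))          # y ~ a exp(b t)

    for name, pred in (("linear", np.polyval(lin, t)),
                       ("quadratic", np.polyval(quad, t)),
                       ("exponential", ea * np.exp(eb * t))):
        print(f"{name:>11}: SSE = {np.sum((y - pred) ** 2):7.1f}")
    ```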

  25. Application of pattern recognition techniques to crime analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted.

  26. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  27. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

    Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality controllability and prediction of the GMAW process. On-line welding quality controllability and prediction have several disadvantages such as high cost, low efficiency, complication and being greatly affected by the environment. An enhanced, efficient evaluation technique for evaluating welding faults based on the Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as the Euclidean distance because the covariance matrix used for calculating MD takes into account correlations in the data and scaling. The values of MD obtained from the welding current and arc voltage are assumed to follow a normal distribution. The normal distribution has two parameters: the mean µ and standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD located in the range from zero to µ + 3σ are regarded as "good". Two experiments, which involved changing the flow of shielding gas and smearing paint on the surface of the substrate, were conducted in order to verify the sensitivity of the proposed evaluation technique and the feasibility of using the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality controllability and prediction, which is of great importance for designing novel equipment for weld quality detection.
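
    The decision rule is compact enough to sketch end-to-end. Below, made-up current/voltage samples define the baseline distribution; a new sample is flagged "good" when its MD lies within [0, µ + 3σ] of the baseline MDs. The numbers are illustrative, not WQT data.

    ```python
    # Sketch: Mahalanobis-distance screening with the mu + 3*sigma rule.
    import numpy as np

    rng = np.random.default_rng(10)
    baseline = rng.multivariate_normal([220.0, 24.0],            # A, V
                                       [[4.0, 0.8], [0.8, 0.25]], size=500)

    center = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

    def md(x):
        d = x - center
        return float(np.sqrt(d @ cov_inv @ d))

    mds = np.array([md(s) for s in baseline])
    threshold = mds.mean() + 3 * mds.std()            # mu + 3*sigma rule

    for sample in (np.array([221.0, 24.1]), np.array([232.0, 21.0])):
        verdict = "good" if md(sample) <= threshold else "suspect"
        print(f"{sample} -> MD = {md(sample):5.2f}  ({verdict})")
    ```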

  28. Polyspectral signal analysis techniques for condition based maintenance of helicopter drive-train system

    NASA Astrophysics Data System (ADS)

    Hassan Mohammed, Mohammed Ahmed

    For efficient maintenance of a diverse fleet of air- and rotorcraft, effective condition based maintenance (CBM) must be established based on the monitored vibration signals of rotating components. In this dissertation, we present theory and applications of polyspectral signal processing techniques for condition monitoring of critical components in the AH-64D helicopter tail rotor drive train system. Currently available vibration-monitoring tools are mostly built around auto- and cross-power spectral analysis, which have limited performance in detecting frequency correlations higher than second order. Studying higher order correlations and their Fourier transforms, higher order spectra, provides more information about the vibration signals, which helps in building more accurate diagnostic models of the mechanical system. Based on higher order spectral analysis, different signal processing techniques are developed to assess the health conditions of different critical rotating components in the AH-64D helicopter drive-train. Based on the cross-bispectrum, a quadratic nonlinear transfer function is presented to model second-order nonlinearity in a drive-shaft running between the two hanger bearings. Then, the quadratic-nonlinearity coupling coefficient between frequency harmonics of the rotating shaft is used as a condition metric to study different seeded shaft faults compared to the baseline case, namely: shaft misalignment, shaft imbalance, and combination of shaft misalignment and imbalance. The proposed quadratic-nonlinearity metric shows better capabilities in distinguishing the four studied shaft settings than the conventional linear coupling based on the cross-power spectrum. We also develop a new concept of the Quadratic-Nonlinearity Power-Index spectrum, QNLPI(f), that can be used in signal detection and classification, based on the bicoherence spectrum. The proposed QNLPI(f) is derived as a projection of the three-dimensional bicoherence spectrum into a two-dimensional spectrum that ...
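
    The bicoherence underlying these metrics can be estimated by segment averaging. The sketch below builds a signal with genuine quadratic phase coupling at f1 + f2 (phases chosen per segment so the coupling survives averaging) and shows the bicoherence responding only at the coupled bin pair; frequencies are chosen to fall exactly on FFT bins.

    ```python
    # Sketch: segment-averaged squared bicoherence with quadratic phase coupling.
    import numpy as np

    fs, nseg, nfft = 1000, 64, 256
    f1, f2 = 62.5, 93.75                    # on-bin frequencies for fs/nfft
    rng = np.random.default_rng(11)
    n = np.arange(nfft) / fs

    segs = []
    for _ in range(nseg):
        p1, p2 = rng.uniform(0, 2 * np.pi, 2)
        segs.append(np.cos(2 * np.pi * f1 * n + p1)
                    + np.cos(2 * np.pi * f2 * n + p2)
                    + 0.5 * np.cos(2 * np.pi * (f1 + f2) * n + p1 + p2)  # coupled
                    + 0.5 * rng.standard_normal(nfft))
    X = np.fft.rfft(np.array(segs) * np.hanning(nfft), axis=1)

    def bicoherence2(X, i, j):
        """Squared bicoherence at bin pair (i, j), averaged over segments."""
        num = np.abs(np.mean(X[:, i] * X[:, j] * np.conj(X[:, i + j]))) ** 2
        den = (np.mean(np.abs(X[:, i] * X[:, j]) ** 2)
               * np.mean(np.abs(X[:, i + j]) ** 2))
        return num / den

    b1, b2 = int(f1 * nfft / fs), int(f2 * nfft / fs)
    print(f"coupled pair   b2 = {bicoherence2(X, b1, b2):.2f}")      # near 1
    print(f"uncoupled pair b2 = {bicoherence2(X, b1, b2 + 3):.2f}")  # near 0
    ```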

  29. A New Femtosecond Laser-Based Three-Dimensional Tomography Technique

    NASA Astrophysics Data System (ADS)

    Echlin, McLean P.

    2011-12-01

    Tomographic imaging has dramatically changed science, most notably in the fields of medicine and biology, by producing 3D views of structures which are too complex to understand in any other way. Current tomographic techniques require extensive time both for post-processing and data collection. Femtosecond laser based tomographic techniques have been developed both in standard atmosphere (the femtosecond laser-based serial sectioning technique, FSLSS) and in vacuum (the Tri-Beam System) for the fast collection (10^5 µm^3/s) of mm^3-sized 3D datasets. Both techniques use femtosecond laser pulses to selectively remove layer-by-layer areas of material with low collateral damage and a negligible heat-affected zone. To the author's knowledge, femtosecond lasers had never before been used for serial sectioning, and these techniques were entirely and uniquely developed by the author and his collaborators at the University of Michigan and the University of California Santa Barbara. The FSLSS was applied to measure the 3D distribution of TiN particles in a 4330 steel. Single-pulse ablation morphologies and rates were measured and collected from the literature. Simultaneous two-phase ablation of TiN and the steel matrix was shown to occur at fluences of 0.9-2 J/cm^2. Laser scanning protocols were developed minimizing surface roughness to 0.1-0.4 µm for laser-based sectioning. The FSLSS technique was used to section and 3D reconstruct titanium nitride (TiN) containing 4330 steel. Statistical analysis of 3D TiN particle sizes, distribution parameters, and particle density was performed. A methodology was developed to use the 3D datasets to produce statistical volume elements (SVEs) for toughness modeling. Six FSLSS TiN datasets were sub-sampled into 48 SVEs for statistical analysis and toughness modeling using the Rice-Tracey and Garrison-Moody models. A two-parameter Weibull analysis was performed and the variability in the toughness data agreed well with Ruggieri et al. bulk toughness measurements. The Tri ...

  30. Diffraction analysis of customized illumination technique

    NASA Astrophysics Data System (ADS)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received deep attention from lithographers. A new approach for illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized shape of illumination. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.

  31. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (the pUC19 plasmid from E. coli). DN...

  32. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    PubMed

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
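
    For context, the direct measurement this paper improves on is the classical section-ellipse relation: a circular fiber of diameter d cut by a plane leaves an ellipse with minor axis d and major axis d / cos(theta), so the inclination is theta = arccos(minor / major). A minimal sketch with invented axis measurements:

    ```python
    # Sketch: fiber inclination from the axes of its elliptical cross-section.
    import numpy as np

    # hypothetical (major, minor) axis pairs measured from one section, mm
    axes = np.array([[0.21, 0.20],
                     [0.29, 0.20],
                     [0.40, 0.20],
                     [0.82, 0.20]])

    theta = np.degrees(np.arccos(axes[:, 1] / axes[:, 0]))
    for (a, b), th in zip(axes, theta):
        print(f"major {a:.2f} mm, minor {b:.2f} mm -> inclination ~ {th:4.1f} deg")
    ```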

  33. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials

    PubMed Central

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-01

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839

  34. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA. Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using the coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  35. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  36. Developing techniques for cause-responsibility analysis of occupational accidents.

    PubMed

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents, determine social responsibility and the role of groups involved in work-related accidents. This study develops an occupational accident causes tree, an occupational accident responsibility tree, and an occupational accident component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques, and for testing them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for determining the responsible groups and their shares of responsibility. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for the determination of a detailed list of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  37. Model building techniques for analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  38. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  39. Magnetic separation techniques in sample preparation for biological analysis: a review.

    PubMed

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In the past decades, with the advantages of superparamagnetic properties, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements in magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells and bioactive compounds, and for the immobilization of enzymes, were described. Finally, the existing problems and possible future trends of magnetic separation techniques for biological analysis were proposed.

  40. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh

    2013-11-30

    Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.
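
    A minimal sketch of the spectral idea on a hypothetical 6-bus system (not the realistic grids used in the paper): edges are weighted by admittance magnitude, and the sign pattern of the Laplacian's Fiedler vector splits the graph across its weakest electrical tie.

    ```python
    # Sketch: spectral bisection of an admittance-weighted bus graph.
    import numpy as np

    # symmetric weight matrix: |admittance| between buses; two tightly
    # coupled groups joined by one weak tie line
    W = np.zeros((6, 6))
    for i, j, y in [(0, 1, 5.0), (1, 2, 4.0), (0, 2, 3.0),   # group A
                    (3, 4, 5.0), (4, 5, 4.0), (3, 5, 3.0),   # group B
                    (2, 3, 0.2)]:                            # weak tie
        W[i, j] = W[j, i] = y

    L = np.diag(W.sum(axis=1)) - W            # graph Laplacian
    evals, evecs = np.linalg.eigh(L)
    fiedler = evecs[:, 1]                     # 2nd-smallest eigenvalue's vector
    clusters = (fiedler > 0).astype(int)
    print("bus clusters:", clusters)          # buses 0-2 vs 3-5
    ```

    For k > 2 clusters one would instead run k-means on the first k Laplacian eigenvectors; electrical-distance weights are one way to inject the grid structure the paper argues for.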

  41. Performance analysis of clustering techniques over microarray data: A case study

    NASA Astrophysics Data System (ADS)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. But the grading approach depends on the characteristics of the dataset as well as on the validity indices, so a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also established by the Nemenyi post-hoc hypothesis test.
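
    The rank-based comparison behind such a grading can be sketched with a Friedman test followed by the Nemenyi critical difference. The validity-index scores below are invented; q_0.05 = 2.728 is the studentized-range constant for k = 5 techniques (Demšar's table).

    ```python
    # Sketch: Friedman test + Nemenyi critical difference over ranked scores.
    import numpy as np
    from scipy.stats import friedmanchisquare, rankdata

    # rows: 5 datasets; cols: HSC, k-means, PAM, VQ, AGNES (hypothetical)
    scores = np.array([[0.71, 0.62, 0.65, 0.60, 0.58],
                       [0.68, 0.64, 0.61, 0.59, 0.55],
                       [0.75, 0.66, 0.69, 0.63, 0.61],
                       [0.70, 0.61, 0.66, 0.62, 0.60],
                       [0.73, 0.65, 0.64, 0.61, 0.57]])

    stat, p = friedmanchisquare(*scores.T)            # do techniques differ?
    ranks = np.mean([rankdata(-row) for row in scores], axis=0)  # 1 = best

    N, k = scores.shape
    cd = 2.728 * np.sqrt(k * (k + 1) / (6.0 * N))     # Nemenyi critical difference
    print(f"Friedman p = {p:.4f}")
    print("mean ranks:", np.round(ranks, 2), f"  CD = {cd:.2f}")
    # technique pairs whose mean ranks differ by more than CD are
    # significantly different at alpha = 0.05
    ```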

  42. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
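
    The mechanics of the PCG iteration itself are easy to sketch. Below, the preconditioner M is simply the banded part of a toy SPD matrix, standing in for the "orthotropic part" of the global stiffness matrix described above (which zeroes the anisotropic coupling terms); this is not the panel model from the paper.

    ```python
    # Sketch: PCG where each step solves the easier system M z = r.
    import numpy as np

    def pcg(K, f, M, tol=1e-10, max_iter=200):
        u = np.zeros_like(f)
        r = f - K @ u
        z = np.linalg.solve(M, r)
        p = z.copy()
        for _ in range(max_iter):
            Kp = K @ p
            alpha = (r @ z) / (p @ Kp)
            u = u + alpha * p
            r_new = r - alpha * Kp
            if np.linalg.norm(r_new) < tol:
                break
            z_new = np.linalg.solve(M, r_new)
            beta = (r_new @ z_new) / (r @ z)
            p = z_new + beta * p
            r, z = r_new, z_new
        return u

    rng = np.random.default_rng(12)
    A = rng.normal(size=(8, 8))
    K = A @ A.T + 8 * np.eye(8)                  # SPD "anisotropic" stiffness
    M = K.copy()
    M[np.abs(np.subtract.outer(range(8), range(8))) > 1] = 0.0  # banded part
    f = rng.normal(size=8)
    print(np.allclose(pcg(K, f, M), np.linalg.solve(K, f)))     # True
    ```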

  3. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    DOE PAGES

    Miller, Mary A.; Tangyunyong, Paiboon; Edward I. Cole, Jr.

    2016-01-12

    In this study, laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  4. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    NASA Astrophysics Data System (ADS)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways to measure drift relies on instrumentation, e.g. an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even Europe, is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to an ionosonde, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed by using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real-situation electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically, with a period as low as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.

  5. A PLL-based resampling technique for vibration analysis in variable-speed wind turbines with PMSG: A bearing fault case

    NASA Astrophysics Data System (ADS)

    Pezzani, Carlos M.; Bossio, José M.; Castellino, Ariel M.; Bossio, Guillermo R.; De Angelo, Cristian H.

    2017-02-01

    Condition monitoring in permanent magnet synchronous machines has gained interest due to their increasing use in applications such as electric traction and power generation. Particularly in wind power generation, non-invasive condition monitoring techniques are of great importance. Usually, in such applications the access to the generator is complex and costly, while unexpected breakdowns result in high repair costs. This paper presents a technique which allows using vibration analysis for bearing fault detection in permanent magnet synchronous generators used in wind turbines. Given that in wind power applications the generator rotational speed may vary during normal operation, it is necessary to use special sampling techniques to apply spectral analysis of mechanical vibrations. In this work, a resampling technique based on order tracking without measuring the rotor position is proposed. To synchronize sampling with rotor position, an estimation of the rotor position obtained from the angle of the voltage vector is proposed. This angle is obtained from a phase-locked loop synchronized with the generator voltages. The proposed strategy is validated by laboratory experimental results obtained from a permanent magnet synchronous generator. Results with single point defects in the outer race of a bearing under variable speed and load conditions are presented.
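
    As a hedged sketch of the order-tracking resampling idea only (the paper's PLL-based angle estimator is not reproduced; the angle here is taken as given), the code below resamples a vibration signal at uniform rotor-angle increments so that shaft orders land on fixed spectral lines despite speed variation. All signals and parameters are synthetic.

    ```python
    # Illustrative order-tracking resampling: given an estimated rotor angle
    # theta(t) (the paper derives it from a PLL locked to the generator
    # voltages), resample the vibration at uniform *angle* increments.
    import numpy as np

    fs = 10_000.0
    t = np.arange(0, 2.0, 1 / fs)
    speed_hz = 10 + 5 * t                          # rotor speed ramps 10 -> 20 Hz
    theta = 2 * np.pi * np.cumsum(speed_hz) / fs   # unwrapped rotor angle estimate
    vib = np.sin(6 * theta)                        # a 6th-order vibration component

    # Resample at equally spaced angles: vib_ang[k] = vib(theta = k * dtheta).
    samples_per_rev = 128
    theta_uniform = np.arange(theta[0], theta[-1], 2 * np.pi / samples_per_rev)
    vib_ang = np.interp(theta_uniform, theta, vib)

    # In the angle domain the component sits exactly at order 6.
    spec = np.abs(np.fft.rfft(vib_ang))
    orders = np.fft.rfftfreq(len(vib_ang), d=1 / samples_per_rev)
    print(orders[np.argmax(spec[1:]) + 1])         # ~6.0
    ```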

  6. Applications of Advanced, Waveform Based AE Techniques for Testing Composite Materials

    NASA Technical Reports Server (NTRS)

    Prosser, William H.

    1996-01-01

    Advanced, waveform based acoustic emission (AE) techniques have been previously used to evaluate damage progression in laboratory tests of composite coupons. In these tests, broad band, high fidelity acoustic sensors were used to detect signals which were then digitized and stored for analysis. Analysis techniques were based on plate mode wave propagation characteristics. This approach, more recently referred to as Modal AE, provides an enhanced capability to discriminate and eliminate noise signals from those generated by damage mechanisms. This technique also allows much more precise source location than conventional, threshold crossing arrival time determination techniques. To apply Modal AE concepts to the interpretation of AE on larger composite structures, the effects of wave propagation over larger distances and through structural complexities must be well characterized and understood. In this research, measurements were made of the attenuation of the extensional and flexural plate mode components of broad band simulated AE signals in large composite panels. As these materials have applications in a cryogenic environment, the effects of cryogenic insulation on the attenuation of plate mode AE signals were also documented.

  7. A microhistological technique for analysis of food habits of mycophagous rodents.

    Treesearch

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  8. Electromechanical actuators affected by multiple failures: Prognostic method based on spectral analysis techniques

    NASA Astrophysics Data System (ADS)

    Belmonte, D.; Vedova, M. D. L. Dalla; Ferro, C.; Maggiore, P.

    2017-06-01

    The proposal of prognostic algorithms able to identify precursors of incipient failures of primary flight command electromechanical actuators (EMA) is beneficial for the anticipation of the incoming failure: an early and correct interpretation of the failure degradation pattern can, in fact, trigger an early alert of the maintenance crew, who can properly schedule the servomechanism replacement. An innovative prognostic model-based approach, able to recognize EMA progressive degradations before their anomalous behavior becomes critical, is proposed: the Fault Detection and Identification (FDI) of the considered incipient failures is performed by analyzing proper system operational parameters, able to put in evidence the corresponding degradation path, by means of a numerical algorithm based on spectral analysis techniques. Subsequently, these operational parameters are correlated with the actual EMA health condition by means of failure maps created by a reference monitoring model-based algorithm. In this work, the proposed method has been tested on an EMA affected by combined progressive failures: in particular, partial stator single-phase turn-to-turn short-circuit and rotor static eccentricity are considered. In order to evaluate the prognostic method, a numerical test bench has been conceived. Results show that the method exhibits adequate robustness and a high degree of confidence in the ability to identify an incipient malfunction early, minimizing the risk of false alarms or unannounced failures.

  9. Application of the windowed-Fourier-transform-based fringe analysis technique for investigating temperature and concentration fields in fluids.

    PubMed

    Mohanan, Sharika; Srivastava, Atul

    2014-04-10

    The present work is concerned with the development and application of a novel fringe analysis technique based on the principles of the windowed Fourier transform (WFT) for the determination of temperature and concentration fields from interferometric images for a range of heat and mass transfer applications. Based on the extent of the noise level associated with the experimental data, the technique has been coupled with two different phase unwrapping methods: the Itoh algorithm and the quality-guided phase unwrapping technique for phase extraction. In order to generate the experimental data, a range of experiments have been carried out, which include cooling of a vertical flat plate in free convection conditions, combustion of mono-propellant flames, and growth of organic as well as inorganic crystals from their aqueous solutions. The flat plate and combustion experiments are modeled as heat transfer applications wherein the interest is to determine the whole-field temperature distribution. Aqueous-solution-based crystal growth experiments are performed to simulate the mass transfer phenomena, where the interest is to determine the two-dimensional solute concentration field around the growing crystal. A Mach-Zehnder interferometer has been employed to record the path-integrated quantity of interest (temperature and/or concentration) in the form of interferometric images in the experiments. The potential of the WFT method has also been demonstrated on numerically simulated phase data for varying noise levels, and the accuracy in phase extraction has been quantified in terms of root mean square errors. Three levels of noise, i.e., 0%, 10%, and 20%, have been considered. Results of the present study show that the WFT technique allows an accurate extraction of phase values that can subsequently be converted into two-dimensional temperature and/or concentration distribution fields. Moreover, since WFT is a local processing technique, speckle patterns and the inherent

  10. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  11. Wood lens design philosophy based on a binary additive manufacturing technique

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  12. Structural and crystal orientation analysis of Al-Si coating on Ni-based superalloy by means of EBSD technique

    NASA Astrophysics Data System (ADS)

    Muslimin, A. N.; Sugiarti, E.; Aritonang, T.; Purawiardi, R. I.; Desiati, R. D.

    2018-03-01

    Ni-based superalloy is widely used for high performance components in power generation turbines due to its excellent mechanical properties. However, Ni-based superalloy has low oxidation resistance. Therefore, surface coating is required to improve oxidation resistance at high temperatures. Al-Si as a coating material was successfully co-deposited on a Ni-based substrate by the pack cementation method at 900 °C for about 4 hours. The oxidation test was carried out at a high temperature of 1000 °C for 100 hours. Microstructural characterization and analysis of crystal orientation were performed using Field Emission Scanning Electron Microscopy (FE-SEM) and the Electron Back Scatter Diffraction (EBSD) technique, respectively. The results showed that the coating layer was homogeneous, had a thickness of 53 μm, and consisted of β-NiAl with a cubic structure and Ni2Al3 with a hexagonal structure. A TGO layer developed after oxidation, with a thickness of about 5 μm, consisting of α-Al2O3 and spinel NiCr2O4. The phase composition maps and crystal orientations acquired by the EBSD technique are also discussed for both the TGO and coating layers.

  13. BATMAN: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e., identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
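
    As a hedged 1-D toy of the merge criterion described above (not the BaTMAn code itself, and with an assumed consistency threshold), two spatial bins are merged when their signals agree within their combined errors:

    ```python
    # Toy sketch of "merge while statistically consistent": neighbouring bins
    # are joined when |m1 - m2| <= k * sqrt(e1^2 + e2^2). Threshold k = 2 is an
    # assumption for illustration.
    import numpy as np

    def consistent(m1, e1, m2, e2, k=2.0):
        return abs(m1 - m2) <= k * np.hypot(e1, e2)

    signal = np.array([1.0, 1.1, 0.9, 5.0, 5.2, 4.9])
    error = np.full_like(signal, 0.2)

    merged = [[0]]
    for i in range(1, len(signal)):
        cur = merged[-1]
        m1 = signal[cur].mean()
        e1 = error[cur].mean() / np.sqrt(len(cur))   # error of the segment mean
        if consistent(m1, e1, signal[i], error[i]):
            cur.append(i)        # grow the current segment
        else:
            merged.append([i])   # start a new segment at the discontinuity
    print(merged)                # [[0, 1, 2], [3, 4, 5]]
    ```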

  14. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analysis are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences in the three techniques lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data are presented.

  15. A novel fast and flexible technique of radical kinetic behaviour investigation based on pallet for plasma evaluation structure and numerical analysis

    NASA Astrophysics Data System (ADS)

    Malinowski, Arkadiusz; Takeuchi, Takuya; Chen, Shang; Suzuki, Toshiya; Ishikawa, Kenji; Sekine, Makoto; Hori, Masaru; Lukasiak, Lidia; Jakubowski, Andrzej

    2013-07-01

    This paper describes a new, fast, and case-independent technique for sticking coefficient (SC) estimation based on the pallet for plasma evaluation (PAPE) structure and numerical analysis. Our approach does not require a complicated structure, apparatus, or time-consuming measurements, but offers high reliability of data and high flexibility. Thermal analysis is also possible. This technique has been successfully applied to the estimation of the very low SC of hydrogen radicals on chemically amplified ArF 193 nm photoresist (the main goal of this study). The upper bound of our technique has been determined by investigating the SC of fluorine radicals on polysilicon (at elevated temperature). Sources of estimation error and ways of reducing it have also been discussed. Results of this study give insight into the process kinetics; not only are they helpful for better process understanding, but they may additionally serve as parameters in the development of a phenomenological model for predictive modelling of etching for ultimate CMOS topography simulation.

  16. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on a singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and the stability analysis of the estimator is given. Theoretical analysis and simulation results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer spectra. The derivative spectra of the beer and marzipan samples are used to build calibration models using partial least squares (PLS) modeling. The results show that PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
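
    The proposed SPSE itself is not reproducible from the abstract; as a sketch of the Savitzky-Golay baseline it is compared against (using SciPy, with synthetic data standing in for spectra), the first derivative of a noisy signal can be estimated as follows:

    ```python
    # Savitzky-Golay derivative estimation, the comparison baseline named above.
    import numpy as np
    from scipy.signal import savgol_filter

    x = np.linspace(0, 2 * np.pi, 500)
    dx = x[1] - x[0]
    spectrum = np.sin(x) + 0.02 * np.random.default_rng(0).standard_normal(x.size)

    # First derivative from noisy data: 31-point window, local cubic fit.
    deriv = savgol_filter(spectrum, window_length=31, polyorder=3,
                          deriv=1, delta=dx)

    rmse = np.sqrt(np.mean((deriv - np.cos(x)) ** 2))  # true derivative is cos(x)
    print(f"RMSE vs true derivative: {rmse:.4f}")
    ```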

  17. Comparative Analysis of RF Emission Based Fingerprinting Techniques for ZigBee Device Classification

    DTIC Science & Technology

    quantify the differences in various RF fingerprinting techniques via comparative analysis of MDA/ML classification results. The findings herein demonstrate...correct classification rates followed by COR-DNA and then RF-DNA in most test cases and especially in low Eb/N0 ranges, where ZigBee is designed to operate.

  18. The Role of Liquid Based Cytology and Ancillary Techniques in the Peritoneal Washing Analysis: Our Institutional Experience.

    PubMed

    Rossi, Esther; Bizzarro, Tommaso; Martini, Maurizio; Longatto-Filho, Adhemar; Schmitt, Fernando; Fagotti, Anna; Scambia, Giovanni; Zannoni, Gian Franco

    2017-01-01

    The cytological analysis of peritoneal effusions serves as a diagnostic and prognostic aid for either primary or metastatic diseases. Among the different cytological preparations, liquid based cytology (LBC) represents a feasible and reliable method, ensuring also the application of ancillary techniques (i.e. immunocytochemistry (ICC) and molecular testing). We recorded 10348 LBC peritoneal effusions between January 2000 and December 2014. They were classified as non-diagnostic (ND), negative for malignancy (NM), atypical-suspicious for malignancy (SM) and positive for malignancy (PM). The cytological diagnoses included 218 ND, 9035 NM, 213 SM and 882 PM. A total of 8048 (7228 NM, 115 SM, 705 PM) cases with histological follow-up were included. Our NM included 21 malignant and 7207 benign histological diagnoses. Our 820 SMs+PMs were diagnosed as 107 unknown malignancies (30 SM and 77 PM), 691 metastatic lesions (81 SM and 610 PM), 9 lymphomas (2 SM and 7 PM), 9 mesotheliomas (1 SM and 8 PM), 4 sarcomas (1 SM and 3 PM). Primary gynecological cancers contributed 64% of the cases. We documented 97.4% sensitivity, 99.9% specificity, 98% diagnostic accuracy, 99.7% negative predictive value (NPV) and 99.7% positive predictive value (PPV). Furthermore, the morphological diagnoses were supported by either 173 conclusive ICC results or 50 molecular analyses. Specifically, the molecular testing was performed for EGFR and KRAS mutational analysis based on previous or contemporary diagnoses of Non Small Cell Lung Cancer (NSCLC) and colon carcinomas. We identified 10 EGFR mutations in NSCLC and 7 KRAS mutations on LBC stored material. Peritoneal cytology is an adjunctive tool in the surgical management of tumors, mostly gynecological cancers. LBC maximizes the application of ancillary techniques such as ICC and molecular analysis, with feasible diagnostic and predictive yields also in controversial cases.

  19. The Role of Liquid Based Cytology and Ancillary Techniques in the Peritoneal Washing Analysis: Our Institutional Experience

    PubMed Central

    Rossi, Esther; Bizzarro, Tommaso; Martini, Maurizio; Longatto-Filho, Adhemar; Schmitt, Fernando; Fagotti, Anna; Scambia, Giovanni; Zannoni, Gian Franco

    2017-01-01

    Background The cytological analysis of peritoneal effusions serves as a diagnostic and prognostic aid for either primary or metastatic diseases. Among the different cytological preparations, liquid based cytology (LBC) represents a feasible and reliable method, ensuring also the application of ancillary techniques (i.e. immunocytochemistry (ICC) and molecular testing). Methods We recorded 10348 LBC peritoneal effusions between January 2000 and December 2014. They were classified as non-diagnostic (ND), negative for malignancy (NM), atypical-suspicious for malignancy (SM) and positive for malignancy (PM). Results The cytological diagnoses included 218 ND, 9035 NM, 213 SM and 882 PM. A total of 8048 (7228 NM, 115 SM, 705 PM) cases with histological follow-up were included. Our NM included 21 malignant and 7207 benign histological diagnoses. Our 820 SMs+PMs were diagnosed as 107 unknown malignancies (30 SM and 77 PM), 691 metastatic lesions (81 SM and 610 PM), 9 lymphomas (2 SM and 7 PM), 9 mesotheliomas (1 SM and 8 PM), 4 sarcomas (1 SM and 3 PM). Primary gynecological cancers contributed 64% of the cases. We documented 97.4% sensitivity, 99.9% specificity, 98% diagnostic accuracy, 99.7% negative predictive value (NPV) and 99.7% positive predictive value (PPV). Furthermore, the morphological diagnoses were supported by either 173 conclusive ICC results or 50 molecular analyses. Specifically, the molecular testing was performed for EGFR and KRAS mutational analysis based on previous or contemporary diagnoses of Non Small Cell Lung Cancer (NSCLC) and colon carcinomas. We identified 10 EGFR mutations in NSCLC and 7 KRAS mutations on LBC stored material. Conclusions Peritoneal cytology is an adjunctive tool in the surgical management of tumors, mostly gynecological cancers. LBC maximizes the application of ancillary techniques such as ICC and molecular analysis, with feasible diagnostic and predictive yields also in controversial cases. PMID:28099523

  20. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    PubMed Central

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the amount of retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them using a retention apparatus. Readings were subjected to the nonparametric Friedman two-way analysis of variance, followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique; the least retention was seen with the conventional molding technique. PMID:27382542
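
    As a hedged sketch of the statistical pipeline named above (Friedman test followed by pairwise Wilcoxon signed-rank comparisons), the code below runs both tests on toy retention values in grams generated around the reported means; the data are synthetic, not the study's measurements.

    ```python
    # Friedman test across three paired techniques, then post-hoc pairwise
    # Wilcoxon signed-rank tests (a Bonferroni correction would divide the
    # alpha level by the number of comparisons, here 3). Toy data only.
    import numpy as np
    from scipy.stats import friedmanchisquare, wilcoxon

    rng = np.random.default_rng(0)
    n = 10                                    # ten patients, paired measurements
    conventional = rng.normal(2468, 150, n)
    anchorized = rng.normal(2913, 150, n)
    injection = rng.normal(3740, 150, n)

    stat, p = friedmanchisquare(conventional, anchorized, injection)
    print(f"Friedman: chi2={stat:.2f}, p={p:.4f}")

    for a, b, la, lb in [(conventional, anchorized, "conv", "anch"),
                         (conventional, injection, "conv", "inj"),
                         (anchorized, injection, "anch", "inj")]:
        print(la, "vs", lb, "p =", f"{wilcoxon(a, b).pvalue:.4f}")
    ```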

  1. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    NASA Astrophysics Data System (ADS)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian motion gives P^2 = a^3, where P is the orbital period in units of years and a is the semi-major axis of the orbit in units of Astronomical Units (AU). Keplerian motion works on small scales such as the size of the Solar System but not on large scales such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze Keplerian motion of systems based on stellar age to seek whether Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods, such as probabilistic, linear, and proximity based models. In probabilistic and statistical models of outliers, the parameters of closed-form probability distributions are learned in order to detect the outliers. Linear models use regression-analysis-based techniques for detecting outliers. Proximity based models use distance based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density based algorithms such as kernel density estimation. In this work, we use unsupervised learning algorithms with only the proximity based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is that the ratio of planetary mass to stellar mass is less than 0.001. In this work, we present our statistical analysis of the outliers thus detected.
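
    As a hedged illustration of the three proximity-based outlier models named above (k-nearest-neighbour distance, distance to the nearest k-means centroid, and kernel density estimation), the sketch below scores a toy 2-D dataset with one planted outlier; the data and scikit-learn usage are assumptions, not the authors' pipeline.

    ```python
    # Three proximity-based outlier scores on a toy dataset with one outlier.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.neighbors import NearestNeighbors, KernelDensity

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (200, 2)),   # bulk of the systems
                   [[8.0, 8.0]]])                # one planted outlier (index 200)

    # 1) kNN: mean distance to the 5 nearest neighbours (column 0 is self).
    d, _ = NearestNeighbors(n_neighbors=6).fit(X).kneighbors(X)
    knn_score = d[:, 1:].mean(axis=1)

    # 2) k-means: fit centroids on the bulk, score by distance to the nearest one.
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X[:200])
    km_score = np.min(np.linalg.norm(X[:, None] - km.cluster_centers_, axis=2),
                      axis=1)

    # 3) KDE: low log-density means outlier.
    kde_score = -KernelDensity(bandwidth=0.5).fit(X).score_samples(X)

    for name, s in [("kNN", knn_score), ("k-means", km_score), ("KDE", kde_score)]:
        print(name, "flags index", int(np.argmax(s)))   # all flag index 200
    ```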

  2. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
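
    The quantification step described above (probability of detection versus probability of false alarm) can be illustrated with a short sketch; the detector scores below are synthetic stand-ins, not the study's vibration data.

    ```python
    # Probability of detection (POD) vs probability of false alarm (PFA) from
    # detector scores, swept over a threshold; the area under the curve (AUC)
    # summarizes a technique in one number.
    import numpy as np

    rng = np.random.default_rng(0)
    scores_healthy = rng.normal(0.0, 1.0, 500)   # detector output, no fault
    scores_faulty = rng.normal(2.0, 1.0, 500)    # detector output, fault present

    thresholds = np.linspace(-4, 6, 101)
    pfa = [(scores_healthy > t).mean() for t in thresholds]
    pod = [(scores_faulty > t).mean() for t in thresholds]

    auc = -np.trapz(pod, pfa)   # minus sign: pfa decreases as the threshold rises
    print(f"AUC = {auc:.3f}")   # ~0.92 for these toy distributions
    ```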

  3. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis.

    PubMed

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-07-23

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidics-related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced; the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell.

  4. 2D Flood Modelling Using Advanced Terrain Analysis Techniques And A Fully Continuous DEM-Based Rainfall-Runoff Algorithm

    NASA Astrophysics Data System (ADS)

    Nardi, F.; Grimaldi, S.; Petroselli, A.

    2012-12-01

    Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built in Geographic Information Systems (GIS), provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic modelling optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series that are routed along the channel using a bidimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during hydrologic extreme events of channel runoff generation, propagation, and overland flow within the floodplain domain. This physically-based model neglects the need for synthetic design hyetograph and hydrograph estimation that constitute the main source of subjective analysis and uncertainty of standard methods for flood mapping. Selected case studies show results and performances of the proposed procedure as respect to standard event-based approaches.

  5. Frontier-based techniques in measuring hospital efficiency in Iran: a systematic review and meta-regression analysis

    PubMed Central

    2013-01-01

    Background In recent years, there has been growing interest in measuring the efficiency of hospitals in Iran and several studies have been conducted on the topic. The main objective of this paper was to review studies in the field of hospital efficiency and examine the estimated technical efficiency (TE) of Iranian hospitals. Methods Persian and English databases were searched for studies related to measuring hospital efficiency in Iran. Ordinary least squares (OLS) regression models were applied for statistical analysis. The PRISMA guidelines were followed in the search process. Results A total of 43 efficiency scores from 29 studies were retrieved and used to approach the research question. Data envelopment analysis was the principal frontier efficiency method in the estimation of efficiency scores. The pooled estimate of mean TE was 0.846 (±0.134). There was a considerable variation in the efficiency scores between the different studies performed in Iran. There were no differences in efficiency scores between data envelopment analysis (DEA) and stochastic frontier analysis (SFA) techniques. The reviewed studies are generally similar and suffer from similar methodological deficiencies, such as no adjustment for case mix and quality of care differences. The results of OLS regression revealed that studies that included more variables and more heterogeneous hospitals generally reported higher TE. Larger sample size was associated with reporting lower TE. Conclusions The features of frontier-based techniques had a profound impact on the efficiency scores among Iranian hospital studies. These studies suffer from major methodological deficiencies and were of sub-optimal quality, limiting their validity and reliability. It is suggested that improving data collection and processing in Iranian hospital databases may have a substantial impact on promoting the quality of research in this field. PMID:23945011

  6. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools for supporting drug approvals, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied in head-to-head studies. This scenario precludes conclusions from being drawn from comparisons of all intervention profiles (e.g. efficacy and safety). The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduction, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of network meta-analysis conduction, highlighting its risks and benefits for evidence-based practice, including information on statistical methods evolution, assumptions and steps for performing the analysis. PMID:28503228

  7. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternate and complementary with respect to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE concerning the determination of high-molecular-weight compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also studied are the methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  8. Flow analysis techniques for phosphorus: an overview.

    PubMed

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the instrumental detection technique used, with the aim of facilitating their study and obtaining an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  9. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture analysis based method for diagnosing Basal Cell Carcinoma (BCC) skin cancer using optical images taken from suspicious skin regions. We first extracted the Run Length Matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a Multi-Layer Perceptron (MLP) classifier to classify the images as BCC or normal cases. Experiments showed that detecting BCC cancer based on optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.

  10. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
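
    A hedged sketch of this kind of comparison follows: classification accuracy as the number of retained dimensions grows, for PCA versus a variance-ranked feature ordering standing in for the sorted-covariance method. The dataset, classifier, and the exact ranking rule are assumptions for illustration, not the paper's experiment.

    ```python
    # Compare two dimension reduction routes at several dimensionalities.
    # For brevity, PCA is fit on the full data before cross-validation.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)
    var_order = np.argsort(X.var(axis=0))[::-1]   # rank raw features by variance
    Xs = StandardScaler().fit_transform(X)        # scale for classifier and PCA
    clf = LogisticRegression(max_iter=1000)

    for k in (2, 5, 10):
        acc_pca = cross_val_score(clf, PCA(n_components=k).fit_transform(Xs), y).mean()
        acc_var = cross_val_score(clf, Xs[:, var_order[:k]], y).mean()
        print(f"dims={k:2d}  PCA={acc_pca:.3f}  variance-ranked={acc_var:.3f}")
    ```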

  11. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time-intensive in comparison to other techniques. Direct analysis in real-time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles to ascertain if the linear temperature gradient provided additional discriminatory information. All the paint samples were able to be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool to the forensic paint examiner.

  12. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidics-related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced; the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell. PMID:26213918

  13. Speckle noise reduction technique for Lidar echo signal based on self-adaptive pulse-matching independent component analysis

    NASA Astrophysics Data System (ADS)

    Xu, Fan; Wang, Jiaxing; Zhu, Daiyin; Tu, Qi

    2018-04-01

    Speckle noise has always been a particularly tricky problem in improving the ranging capability and accuracy of Lidar systems, especially in harsh environments. Currently, effective speckle de-noising techniques are extremely scarce and should be further developed. In this study, a speckle noise reduction technique has been proposed based on independent component analysis (ICA). Since normally few changes happen in the shape of the laser pulse itself, the authors employed the laser source as a reference pulse and executed the ICA decomposition to find the optimal matching position. In order to achieve the self-adaptability of the algorithm, the local Mean Square Error (MSE) has been defined as an appropriate criterion for evaluating the iteration results. The obtained experimental results demonstrated that the self-adaptive pulse-matching ICA (PM-ICA) method could effectively decrease the speckle noise and recover the useful Lidar echo signal component with high quality. In particular, the proposed method achieves a 4 dB greater improvement of signal-to-noise ratio (SNR) than a traditional homomorphic wavelet method.
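
    The ICA decomposition stage cannot be reconstructed from the abstract; as a hedged sketch of the pulse-matching idea only, the code below slides the known reference pulse along a noisy synthetic echo and scores each lag with a local MSE, whose minimum marks the matching position. Pulse shape, noise level, and the plain-MSE criterion are assumptions for illustration.

    ```python
    # Locate a buried echo by local MSE against a known reference pulse.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1024
    ref = np.exp(-0.5 * ((np.arange(64) - 32) / 6.0) ** 2)  # reference pulse

    echo = 0.3 * rng.standard_normal(n)          # speckle-like noise floor
    true_pos = 400
    echo[true_pos:true_pos + ref.size] += ref    # echo buried in the noise

    # Local MSE between the reference and every window of the echo.
    mse = np.array([np.mean((echo[k:k + ref.size] - ref) ** 2)
                    for k in range(n - ref.size)])

    print(int(np.argmin(mse)), "vs true", true_pos)  # estimated matching position
    ```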

  14. Evaluation of culture-based techniques and 454 pyrosequencing for the analysis of fungal diversity in potting media and organic fertilizers.

    PubMed

    Al-Sadi, A M; Al-Mazroui, S S; Phillips, A J L

    2015-08-01

    Potting media and organic fertilizers (OFs) are commonly used in agricultural systems. However, there is a lack of studies on the efficiency of culture-based techniques in assessing the level of fungal diversity in these products. A study was conducted to investigate the efficiency of seven culture-based techniques and pyrosequencing for characterizing fungal diversity in potting media and OFs. Fungal diversity was evaluated using serial dilution, direct plating and baiting with carrot slices, potato slices, radish seeds, cucumber seeds and cucumber cotyledons. The identity of all the isolates was confirmed on the basis of the internal transcribed spacer region of the ribosomal RNA (ITS rRNA) sequence data. The direct plating technique was found to be superior to the other culture-based techniques in the number of fungal species detected. It was also found to be simple and the least time-consuming technique. Comparing the efficiency of direct plating with 454 pyrosequencing revealed that pyrosequencing detected 12 and 15 times more fungal species from potting media and OFs, respectively. Analysis revealed that there were differences between potting media and OFs in the dominant phyla, classes, orders, families, genera and species detected. Zygomycota (52%) and Chytridiomycota (60%) were the predominant phyla in potting media and OFs, respectively. The superiority of pyrosequencing over cultural methods could be related to its ability to detect obligate fungi, slow growing fungi and fungi that exist at low population densities. The methods evaluated in this study, especially direct plating and pyrosequencing, may be used as tools to help detect and reduce movement of unwanted fungi between countries and regions. © 2015 The Society for Applied Microbiology.

  15. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General Services Administration... price analysis technique in order to establish a fair and reasonable price. DATES: Interested parties....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...

  16. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique. In this technique, fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for system modeling, the lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow an exponential distribution, i.e., have constant failure rates. Sensitivity analysis is also performed, and the impact on system mean time between failures (MTBF) of varying other reliability parameters is addressed. Based on the analysis, some influential suggestions are given to improve the system performance.

  17. Polarization-based material classification technique using passive millimeter-wave polarimetric imagery.

    PubMed

    Hu, Fei; Cheng, Yayun; Gui, Liangqi; Wu, Liang; Zhang, Xinyi; Peng, Xiaohui; Su, Jinlong

    2016-11-01

    The polarization properties of thermal millimeter-wave emission capture inherent information of objects, e.g., material composition, shape, and surface features. In this paper, a polarization-based material-classification technique using passive millimeter-wave polarimetric imagery is presented. Linear polarization ratio (LPR) is created to be a new feature discriminator that is sensitive to material type and to remove the reflected ambient radiation effect. The LPR characteristics of several common natural and artificial materials are investigated by theoretical and experimental analysis. Based on a priori information about LPR characteristics, the optimal range of incident angle and the classification criterion are discussed. Simulation and measurement results indicate that the presented classification technique is effective for distinguishing between metals and dielectrics. This technique suggests possible applications for outdoor metal target detection in open scenes.
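
    The paper's exact LPR definition is not reproduced here; as a hedged physics sketch of why a linear polarization ratio separates material classes, the code below computes horizontally and vertically polarized emissivities from Fresnel reflectivities (e = 1 - |r|^2) for a dielectric-like and a metal-like permittivity. The permittivity values, angle, and the emissivity-ratio form of LPR are illustrative assumptions.

    ```python
    # Fresnel emissivities and a linear polarization ratio for two materials.
    import numpy as np

    def fresnel_emissivities(eps, theta_deg):
        """Emissivities (e_h, e_v) of a flat surface with complex permittivity eps."""
        th = np.radians(theta_deg)
        c, s2 = np.cos(th), np.sin(th) ** 2
        root = np.sqrt(eps - s2)
        r_h = (c - root) / (c + root)              # horizontal (TE) reflection
        r_v = (eps * c - root) / (eps * c + root)  # vertical (TM) reflection
        return 1 - abs(r_h) ** 2, 1 - abs(r_v) ** 2

    theta = 50.0                                   # observation/incidence angle
    for name, eps in [("dielectric (soil-like)", 4 + 0.5j),
                      ("metal-like", 1 + 1e4j)]:
        e_h, e_v = fresnel_emissivities(eps, theta)
        print(f"{name}: e_h={e_h:.3f} e_v={e_v:.3f} LPR={e_v / e_h:.2f}")
    # A metal suppresses both emissivities (it mostly reflects the ambient),
    # while a dielectric emits strongly with e_v > e_h; both the levels and the
    # ratio give a classifier something to separate.
    ```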

  18. Examining Returned Samples in their Collection Tubes Using Synchrotron Radiation-Based Techniques

    NASA Astrophysics Data System (ADS)

    Schoonen, M. A.; Hurowitz, J. A.; Thieme, J.; Dooryhee, E.; Fogelqvist, E.; Gregerson, J.; Farley, K. A.; Sherman, S.; Hill, J.

    2018-04-01

    Synchrotron radiation-based techniques can be leveraged for triaging and analysis of returned samples before unsealing collection tubes. Proof-of-concept measurements conducted at Brookhaven National Lab's National Synchrotron Light Source-II.

  19. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    PubMed Central

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures that can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to follow the tracking of the markers and determine the evolution of the deformation state. The method can be used on any type of structure, but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129

  20. Image Analysis Technique for Material Behavior Evaluation in Civil Structures.

    PubMed

    Speranzini, Emanuela; Marsili, Roberto; Moretti, Michele; Rossi, Gianluca

    2017-07-08

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures that can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to follow the tracking of the markers and determine the evolution of the deformation state. The method can be used on any type of structure, but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing the results with those derived from traditional measuring techniques.

  1. [Analysis of syndrome discipline of generalized anxiety disorder using data mining techniques].

    PubMed

    Tang, Qi-sheng; Sun, Wen-jun; Qu, Miao; Guo, Dong-fang

    2012-09-01

    To study the use of data mining techniques in analyzing the syndrome discipline of generalized anxiety disorder (GAD). From August 1, 2009 to July 31, 2010, 705 patients with GAD in 10 hospitals of Beijing were investigated over one year. Data mining techniques, such as Bayes net and cluster analysis, were used to analyze the syndrome discipline of GAD. A total of 61 symptoms of GAD were screened out. By using Bayes net, nine syndromes of GAD were abstracted based on the symptoms. Eight syndromes were abstracted by cluster analysis. After screening for duplicate syndromes and combining the experts' experience and traditional Chinese medicine theory, six syndromes of GAD were defined. These included depressed liver qi transforming into fire, phlegm-heat harassing the heart, liver depression and spleen deficiency, heart-kidney non-interaction, dual deficiency of the heart and spleen, and kidney deficiency and liver yang hyperactivity. Based on the results, the draft of Syndrome Diagnostic Criteria for Generalized Anxiety Disorder was developed. Data mining techniques such as Bayes net and cluster analysis have certain future potential for establishing syndrome models and analyzing syndrome discipline, thus they are suitable for the research of syndrome differentiation.

  2. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    PubMed Central

    Wang, Chuji; Sahay, Peeyush

    2009-01-01

    Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high-sensitivity and high-selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc. are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503

  3. Recent Electrochemical and Optical Sensors in Flow-Based Analysis

    PubMed Central

    Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn

    2006-01-01

    Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques are covered in this overview. A brief description of the fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA), is given.

  4. AVR Microcontroller-based automated technique for analysis of DC motors

    NASA Astrophysics Data System (ADS)

    Kaur, P.; Chatterji, S.

    2014-01-01

    This paper provides essential information on the development of a 'dc motor test and analysis control card' based on the AVR-series ATMega32 microcontroller. The card can be interfaced to a PC, calculates parameters such as motor losses and efficiency, and plots characteristics for dc motors. Various tests and methods are presently available to evaluate motor parameters, but this paper discusses a single, universal, user-friendly automated set-up. It was accomplished by designing data acquisition and SCR bridge-firing hardware based on the AVR ATMega32 microcontroller. This hardware can drive phase-controlled rectifiers and acquire real-time values of motor current, voltage, temperature and speed. The analyses feasible with the designed hardware are of immense importance to dc motor manufacturers and quality-sensitive users. Through this paper, the authors aim to provide details of this AVR-based hardware, which can be used for dc motor parameter analysis as well as motor control applications.

  5. A review of recent developments in parametric based acoustic emission techniques applied to concrete structures

    NASA Astrophysics Data System (ADS)

    Vidya Sagar, R.; Raghu Prasad, B. K.

    2012-03-01

    This article presents a review of recent developments in parametric based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers, including the various methods and models developed for AE testing of concrete structures. The aim is to provide an overview of the specific features of parametric based AE techniques for concrete structures developed over the years, with emphasis on traditional parameter-based AE techniques. A significant amount of research on AE techniques applied to concrete structures has been published, and considerable attention is given to those publications here. Recent studies such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams are also discussed, and the formation of the fracture process zone and the AE energy released during fracture in concrete beam specimens are summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parametric based AE techniques applied to concrete structures may help researchers and engineers to better understand the failure mechanisms of concrete and to evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.
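
    As a sketch of the b-value analysis mentioned above: AE hit amplitudes (in dB) are converted to magnitudes A_dB/20 and a Gutenberg-Richter style line log10 N(>=A) = a - b*m is fitted to the cumulative counts; a falling b-value is commonly read as growing macro-cracking. The amplitude distribution and binning choices below are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    amplitudes_db = rng.exponential(scale=10.0, size=2000) + 40.0  # synthetic AE hits, dB

    levels = np.arange(40.0, 90.0, 2.0)
    cum_counts = np.array([(amplitudes_db >= a).sum() for a in levels])
    mask = cum_counts > 0

    # Linear fit of log10 N against magnitude A_dB/20; the slope gives -b.
    slope, intercept = np.polyfit(levels[mask] / 20.0,
                                  np.log10(cum_counts[mask]), 1)
    print(f"b-value = {-slope:.2f}")
    ```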

  6. Analysis of initial changes in the proteins of soybean root tip under flooding stress using gel-free and gel-based proteomic techniques.

    PubMed

    Yin, Xiaojian; Sakata, Katsumi; Nanjo, Yohei; Komatsu, Setsuko

    2014-06-25

    Flooding has a severe negative effect on soybean cultivation in the early stages of growth. To obtain a better understanding of the response mechanisms of soybean to flooding stress, initial changes in root tip proteins under flooding were analyzed using two proteomic techniques. Two-day-old soybeans were treated with flooding for 3, 6, 12, and 24 h. The weight of the soybeans increased during the first 3 h of flooding, but root elongation was not observed. Using gel-based and gel-free proteomic techniques, 115 proteins were identified in root tips, of which 9 were commonly detected by both methods. The 71 proteins identified by gel-free proteomics were analyzed by hierarchical clustering based on their induction levels during flooding, and the proteins were divided into 5 clusters. Additional interaction analysis revealed that ten proteins belonging to cluster I formed the center of a protein interaction network. mRNA expression analysis of these ten proteins showed that citrate lyase and heat shock protein 70 were down-regulated, whereas calreticulin was up-regulated, in the initial phase of flooding. These results suggest that flooding stress induces calcium-related signal transduction in soybean, which might play an important role in the early responses to flooding.
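
    A brief sketch of the clustering step described above, assuming the 71 x 4 matrix of induction levels (placeholder random data here); SciPy's hierarchical clustering cut into 5 clusters mirrors the reported analysis, though the authors' exact linkage and distance choices are not stated.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    profiles = rng.normal(size=(71, 4))  # 71 proteins x 4 time points (3, 6, 12, 24 h)

    tree = linkage(profiles, method="average", metric="euclidean")
    clusters = fcluster(tree, t=5, criterion="maxclust")  # labels 1..5
    print(np.bincount(clusters)[1:])  # cluster sizes
    ```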

  7. BaTMAn: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
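
    A 1D toy analogue of BaTMAn's merging rule, under the assumption that "statistically consistent" means a new element lies within a few combined standard errors of the running inverse-variance weighted mean of the segment; the real algorithm operates on multi-dimensional images and its exact acceptance test may differ.

    ```python
    import numpy as np

    def merge_consistent(signal, error, n_sigma=3.0):
        """Greedy left-to-right merging of elements whose signals agree
        within errors; returns a list of (start, end) index ranges."""
        segments = []
        start = 0
        w = 1.0 / error[0] ** 2          # inverse-variance weight of segment
        mean = signal[0]                 # running weighted mean
        for i in range(1, len(signal)):
            sigma = np.sqrt(error[i] ** 2 + 1.0 / w)
            if abs(signal[i] - mean) < n_sigma * sigma:
                wi = 1.0 / error[i] ** 2
                mean = (mean * w + signal[i] * wi) / (w + wi)
                w += wi
            else:
                segments.append((start, i))
                start, w, mean = i, 1.0 / error[i] ** 2, signal[i]
        segments.append((start, len(signal)))
        return segments

    sig = np.array([1.0, 1.1, 0.9, 5.0, 5.2])
    err = np.full(5, 0.2)
    print(merge_consistent(sig, err))  # [(0, 3), (3, 5)]
    ```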

  8. Field-based detection of biological samples for forensic analysis: Established techniques, novel tools, and future innovations.

    PubMed

    Morrison, Jack; Watts, Giles; Hobbs, Glyn; Dawnay, Nick

    2018-04-01

    Field-based forensic tests commonly provide information on the presence and identity of biological stains and can also support the identification of species. Such information can support downstream processing of forensic samples and generate rapid intelligence. These approaches have traditionally used chemical and immunological techniques to elicit the result, but some are known to suffer from a lack of specificity and sensitivity. The last 10 years have seen the development of field-based genetic profiling systems, with a specific focus on moving the mainstay of forensic genetic analysis, namely STR profiling, out of the laboratory and into the hands of the non-laboratory user. In doing so it is now possible for enforcement officers to generate a crime scene DNA profile which can then be matched to a reference or database profile. The introduction of these novel genetic platforms also allows for further development of new molecular assays aimed at answering the more traditional questions relating to body fluid identity and species detection. The current drive for field-based molecular tools is in response to the needs of the criminal justice system and enforcement agencies, and promises a step-change in how forensic evidence is processed. However, the adoption of such systems by the law enforcement community does not represent a new strategy in the way forensic science has integrated previous novel approaches, nor do they automatically represent a threat to the quality control and assurance practices that are central to the field. This review examines the historical need and subsequent research and developmental breakthroughs in field-based forensic analysis over the past two decades, with particular focus on genetic methods. Emerging technologies from a range of scientific fields that have potential applications in forensic analysis at the crime scene are identified, and issues arising from the shift from laboratory into operational field use are discussed.

  9. The combined use of order tracking techniques for enhanced Fourier analysis of order components

    NASA Astrophysics Data System (ADS)

    Wang, K. S.; Heyns, P. S.

    2011-04-01

    Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
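
    A compact sketch of computed order tracking, the second of the two techniques above: the vibration signal is resampled at uniform shaft-angle increments (the angle map is derived here from an assumed known speed profile; in practice it comes from tachometer pulses) and Fourier transformed, so spectral peaks line up at shaft orders even during a run-up.

    ```python
    import numpy as np

    fs = 10_240.0
    t = np.arange(0, 2.0, 1 / fs)
    speed_hz = 20.0 + 5.0 * t                        # run-up: 20 -> 30 Hz
    angle = 2 * np.pi * np.cumsum(speed_hz) / fs     # shaft angle, rad
    x = np.sin(3 * angle) + 0.5 * np.sin(5 * angle)  # orders 3 and 5

    # Resample at constant angle increments (64 samples per revolution).
    samples_per_rev = 64
    uniform_angle = np.arange(angle[0], angle[-1], 2 * np.pi / samples_per_rev)
    x_angle = np.interp(uniform_angle, angle, x)

    spectrum = 2 * np.abs(np.fft.rfft(x_angle)) / len(x_angle)
    orders = np.fft.rfftfreq(len(x_angle), d=1.0 / samples_per_rev)
    print(orders[spectrum.argmax()])                 # dominant order, ~3
    ```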

  10. The balance sheet technique. Volume I. The balance sheet analysis technique for preconstruction review of airports and highways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaBelle, S.J.; Smith, A.E.; Seymour, D.A.

    1977-02-01

    The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique; this type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or in looking at projects individually. Project-by-project reviews could be used to examine each project in the same way as airport projects are examined for their impact on desired regional emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process, which is not well suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to changes in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method for state or regional reviewing agencies.

  11. Noninvasive in vivo glucose sensing using an iris based technique

    NASA Astrophysics Data System (ADS)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was used to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane, and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, a custom ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations are within 20% of the reference values.
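
    A hedged sketch of the PLS calibration step, with random placeholder vectors standing in for the iris-image features and reference glucose readings; scikit-learn's PLSRegression is used as a generic stand-in for the authors' implementation, and the number of latent components is an assumption.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    X = rng.normal(size=(60, 200))        # per-image feature vectors (placeholder)
    y = rng.uniform(50, 400, size=60)     # reference glucose, mg/dL (placeholder)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    pls = PLSRegression(n_components=8).fit(X_tr, y_tr)
    y_pred = pls.predict(X_te).ravel()
    # Predictions would then be screened with Clarke Error Grid Analysis.
    print(np.sqrt(np.mean((y_pred - y_te) ** 2)))  # RMSE, mg/dL
    ```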

  12. Parametric Model Based On Imputations Techniques for Partly Interval Censored Data

    NASA Astrophysics Data System (ADS)

    Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah

    2017-12-01

    The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs; the time to failure of a specific experimental unit might be right, left, interval, or partly interval censored (PIC). In this paper, the model was analyzed with a parametric Cox model on PIC data. Several imputation techniques were used: midpoint, left and right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, such as the Turnbull and Cox models, based on clinical trial data (breast cancer data), which demonstrated the validity of the proposed model. The results indicated that the parametric Cox model was superior in terms of estimation of survival functions, likelihood ratio tests, and their P-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median imputations gave better results with respect to the estimation of the survival function.
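
    A small illustration of the imputation step, assuming each partly interval-censored observation is known only as an interval (L, R]; the "mean" variant below is one plausible reading (mean of the interval midpoints), and a median variant would be analogous. The imputed values would then feed the parametric Cox-type fit.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    L = rng.uniform(0, 5, size=10)
    R = L + rng.uniform(0.5, 2.0, size=10)        # censoring intervals (L, R]

    imputed = {
        "midpoint": (L + R) / 2,
        "left":     L,
        "right":    R,
        "random":   rng.uniform(L, R),            # uniform draw inside each interval
        # One plausible reading of "mean" imputation: mean of the midpoints.
        "mean":     np.full_like(L, ((L + R) / 2).mean()),
    }
    # Each imputed dataset is then fitted and the survival estimates compared.
    for name, values in imputed.items():
        print(name, np.round(values[:3], 2))
    ```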

  13. Prediction of drug synergy in cancer using ensemble-based machine learning techniques

    NASA Astrophysics Data System (ADS)

    Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder

    2018-04-01

    Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents and can be developed as a pre-processing tool for therapeutic success. Different drug-drug interactions can be examined through the drug synergy score, which requires efficient regression-based machine learning approaches to minimize prediction errors. Numerous machine learning techniques, such as neural networks, support vector machines, random forests, LASSO, and Elastic Nets, have been used in the past to meet this requirement. However, individually these techniques do not provide significant accuracy in the drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques were implemented on the drug synergy data. Based on the accuracy of each model, the four techniques with the highest accuracy were selected to develop an ensemble-based machine learning model: random forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), Adaptive-Network-Based Fuzzy Inference System (ANFIS) and Dynamic Evolving Neural-Fuzzy Inference System (DENFIS). Ensembling is achieved by a biased weighted aggregation of the predictions of the selected models (i.e. adding more weight to models with higher prediction scores). The proposed and existing machine learning techniques were evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error and coefficient of correlation.
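
    A minimal sketch of the biased weighted aggregation described above: each selected model's prediction is weighted in proportion to its validation score, so higher-scoring models count more. The scores and predictions are invented, and the mapping to RF/GFS.GCCL/ANFIS/DENFIS is only indicative.

    ```python
    import numpy as np

    def weighted_ensemble(predictions, scores):
        """predictions: (n_models, n_samples); scores: validation score per
        model. Returns the weighted aggregation of the predictions."""
        w = np.asarray(scores, dtype=float)
        w = w / w.sum()
        return w @ np.asarray(predictions)

    preds = np.array([[0.80, 0.40],
                      [0.90, 0.50],
                      [0.70, 0.30],
                      [0.85, 0.45]])
    scores = [0.91, 0.88, 0.84, 0.80]   # e.g., RF, GFS.GCCL, ANFIS, DENFIS
    print(weighted_ensemble(preds, scores))
    ```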

  14. Wear Detection of Drill Bit by Image-based Technique

    NASA Astrophysics Data System (ADS)

    Sukeri, Maziyah; Zulhilmi Paiz Ismadi, Mohd; Rahim Othman, Abdul; Kamaruddin, Shahrul

    2018-03-01

    Image processing for computer vision plays an essential role in tool condition monitoring in the manufacturing industries. This study proposes a dependable direct measurement method for tool wear using image-based analysis. Segmentation and thresholding techniques were used to filter the colour image and convert it to binary data. The edge detection method was then applied to characterize the edge of the drill bit. Using the cross-correlation method, the edges of the original and worn drill bits were correlated with each other. The cross-correlation graphs were able to reveal the worn edge despite small differences between the graphs. Future development will focus on quantifying the worn profile as well as enhancing the sensitivity of the technique.
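
    An illustrative version of the comparison step, assuming 1D edge profiles extracted from the binarized images; normalized cross-correlation of the new and worn profiles gives a peak whose height and lag reflect blunting and shift of the edge. The Gaussian profiles are synthetic stand-ins.

    ```python
    import numpy as np

    def normalized_xcorr(a, b):
        a = (a - a.mean()) / (a.std() * len(a))
        b = (b - b.mean()) / b.std()
        return np.correlate(a, b, mode="full")

    x = np.linspace(0, 1, 200)
    edge_new = np.exp(-((x - 0.50) / 0.05) ** 2)   # sharp edge profile
    edge_worn = np.exp(-((x - 0.52) / 0.08) ** 2)  # blunted, slightly shifted

    corr = normalized_xcorr(edge_new, edge_worn)
    # Peak value (< 1 indicates mismatch) and lag of the best alignment.
    print(corr.max(), corr.argmax() - (len(x) - 1))
    ```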

  15. Balloon-based interferometric techniques

    NASA Technical Reports Server (NTRS)

    Rees, David

    1985-01-01

    A balloon-borne triple-etalon Fabry-Perot interferometer, observing the Doppler shifts of absorption lines caused by molecular oxygen and water vapor in the far-red/near-infrared spectrum of backscattered sunlight, has been used to evaluate a passive spaceborne remote sensing technique for measuring winds in the troposphere and stratosphere. There have been two successful high-altitude balloon flights of the prototype UCL instrument from the National Scientific Balloon Facility at Palestine, TX (May 1980, Oct. 1983). The results from these flights have demonstrated that an interferometer with adequate resolution, stability and sensitivity can be built. The wind data are of comparable quality to those obtained from operational techniques (balloon and rocket sonde, cloud-top drift analysis, and gradient wind analysis of satellite radiance measurements). However, the interferometric data can provide a regular global grid over a height range from 5 to 50 km in regions of clear air. Between the middle troposphere (5 km) and the upper stratosphere (40 to 50 km), an optimized instrument can make wind measurements over the daylit hemisphere with an accuracy of about 3 to 5 m/sec (2 sigma). It is possible to obtain full height profiles between altitudes of 5 and 50 km, with 4 km height resolution and a spatial resolution of about 200 km along the orbit track. Below an altitude of about 10 km, Fraunhofer lines of solar origin are possible targets for the Doppler wind analysis. Above an altitude of 50 km, the weakness of the backscattered solar spectrum (decreasing air density) is coupled with the low absorption cross-section of all atmospheric species in the spectral region up to 800 nm (where imaging photon detectors can be used), causing the along-the-track resolution (or error) to increase beyond values useful for operational purposes. Within the region of optimum performance (5 to 50 km), however, the technique is a valuable potential complement to existing wind

  16. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. Analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high-energy-based techniques.

  17. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  18. A web-based overview, systematic review and meta-analysis of pancreatic anastomosis techniques following pancreatoduodenectomy.

    PubMed

    Daamen, Lois A; Smits, F Jasmijn; Besselink, Marc G; Busch, Olivier R; Borel Rinkes, Inne H; van Santvoort, Hjalmar C; Molenaar, I Quintus

    2018-05-14

    Many pancreatic anastomoses have been proposed to reduce the incidence of postoperative pancreatic fistula (POPF) after pancreatoduodenectomy, but a complete overview is lacking. This systematic review and meta-analysis aims to provide an online overview of all pancreatic anastomosis techniques and to evaluate the incidence of clinically relevant POPF in randomized controlled trials (RCTs). A literature search was performed up to December 2017. Included were studies giving a detailed description of the pancreatic anastomosis after open pancreatoduodenectomy and RCTs comparing techniques for the incidence of POPF (International Study Group of Pancreatic Surgery [ISGPS] Grade B/C). Meta-analyses were performed using a random-effects model. A total of 61 different anastomoses were found and summarized in 19 subgroups (www.pancreatic-anastomosis.com). In 6 RCTs, the POPF rate was 12% after pancreaticogastrostomy (n = 69/555) versus 20% after pancreaticojejunostomy (n = 106/531) (RR 0.59; 95% CI 0.35-1.01; P = 0.05). Six RCTs comparing subtypes of pancreaticojejunostomy showed a pooled POPF rate of 10% (n = 109/1057). Duct-to-mucosa and invagination pancreaticojejunostomy showed similar results: 14% (n = 39/278) versus 10% (n = 27/278), respectively (RR 1.40; 95% CI 0.47-4.15; P = 0.54). The proposed online overview can be used as an interactive platform, for uniformity in reporting anastomotic techniques and for educational purposes. The meta-analysis showed no significant difference in POPF rate between pancreatic anastomosis techniques. Copyright © 2018 International Hepato-Pancreato-Biliary Association Inc. Published by Elsevier Ltd. All rights reserved.
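
    For orientation, the arithmetic behind a single risk ratio with its 95% CI from a 2x2 table, using the pooled counts quoted above; note the crude ratio (~0.62) differs slightly from the reported random-effects pooled RR of 0.59, which weights the six trials individually rather than summing their counts.

    ```python
    import math

    def risk_ratio(e1, n1, e0, n0):
        """Risk ratio and 95% CI for event counts e1/n1 vs e0/n0."""
        rr = (e1 / n1) / (e0 / n0)
        se = math.sqrt(1 / e1 - 1 / n1 + 1 / e0 - 1 / n0)  # SE of log(RR)
        lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
        return rr, lo, hi

    # POPF: pancreaticogastrostomy 69/555 vs pancreaticojejunostomy 106/531.
    print(risk_ratio(69, 555, 106, 531))  # crude RR ~0.62 (pooled RR was 0.59)
    ```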

  19. Automated Sneak Circuit Analysis Technique

    DTIC Science & Technology

    1990-06-01

    Automated Sneak Circuit Analysis Technique, RADC, June 1990, Systems Reliability & Engineering Division, Rome Air Development Center. Only fragments of the scanned report are legible, including schematic-labeling requirements: module ports must be declared through the OrCAD/SDT module port facility, and the terminals of all in-circuit voltage sources (e.g., batteries) must be labeled using the OrCAD/SDT module port facility.

  20. A novel pulse height analysis technique for nuclear spectroscopic and imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, H. H.; Wang, C. Y.; Chou, H. P.

    2005-08-01

    The proposed pulse height analysis technique is based on the constant, linear relationship between the pulse width and pulse height generated by the front-end electronics of nuclear spectroscopic and imaging systems. The technique has been successfully implemented in the sump water radiation monitoring system of a nuclear power plant. The radiation monitoring system uses a NaI(Tl) scintillator to detect radioactive nuclides of radon daughters brought down by rain. The technique is also used in a nuclear medical imaging system that couples a position-sensitive photomultiplier tube with a scintillator. The proposed technique has greatly simplified the electronic design and made the system feasible for portable applications.

  1. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Postflight test analysis data processing techniques for shuttle approach and landing test (ALT) navigation data are defined. Postflight test processor requirements are described, along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described following the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented, and conclusions are given.

  2. Impact during equine locomotion: techniques for measurement and analysis.

    PubMed

    Burn, J F; Wilson, A; Nason, G P

    1997-05-01

    Impact is implicated in the development of several types of musculoskeletal injury in the horse. Characterisation of the impact experienced during strenuous exercise is an important first step towards understanding the mechanism of injury. Measurement and analysis of large, short-duration impacts is difficult: the measurement system must record transient peaks and high frequencies accurately, and the analysis technique must characterise the impact signal in both time and frequency. This paper presents a measurement system and analysis technique for the characterisation of large impacts. A piezo-electric accelerometer was securely mounted on the dorsal surface of the horse's hoof. Saddle-mounted charge amplifiers and a 20 m coaxial cable transferred the data to a PC-based logging system. Data were downloaded onto a UNIX workstation and analysed using a proprietary statistics package. The values of parameters calculated from the time series data were comparable to those of other authors. A wavelet decomposition showed that the frequency profile of the signal changed with time. While most spectral energy was seen at impact, a significant amount of energy was contained in the signal immediately following impact, and over 99% of this energy was contained in frequencies below 1250 Hz. The sampling rate and frequency response of a measurement system for recording impact should therefore be chosen carefully to prevent loss or corruption of data. Time-scale analysis using a wavelet decomposition is a powerful technique for characterising impact data, and contour plots provide a highly visual representation of the time and frequency localisation of power during impact.
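
    A hedged sketch of the time-scale analysis, with a synthetic decaying burst standing in for a hoof impact and PyWavelets standing in for the proprietary statistics package the authors used; per-level energies give the kind of time-frequency energy split discussed above.

    ```python
    import numpy as np
    import pywt

    fs = 5000.0
    t = np.arange(0, 0.2, 1 / fs)
    impact = np.exp(-t * 80) * np.sin(2 * np.pi * 600 * t)  # decaying burst
    signal = impact + 0.05 * np.random.default_rng(4).normal(size=t.size)

    coeffs = pywt.wavedec(signal, "db4", level=5)
    # Energy per scale: most energy should sit at impact, well below ~1250 Hz.
    energies = [float(np.sum(c ** 2)) for c in coeffs]
    print([f"{e:.3g}" for e in energies])  # [cA5, cD5, ..., cD1]
    ```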

  3. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  4. Power strain imaging based on vibro-elastography techniques

    NASA Astrophysics Data System (ADS)

    Wen, Xu; Salcudean, S. E.

    2007-03-01

    This paper describes a new ultrasound elastography technique, power strain imaging, based on vibro-elastography (VE) techniques. With this method, tissue is compressed by a vibrating actuator driven by low-pass or band-pass filtered white noise, typically in the 0-20 Hz range. Tissue displacements at different spatial locations are estimated by correlation-based approaches on the raw ultrasound radio-frequency signals and recorded as time sequences. The power spectra of these time sequences are computed by Fourier spectral analysis techniques. As the average of the power spectrum is proportional to the squared amplitude of the tissue motion, the square root of the average power over the range of excitation frequencies is used as a measure of the tissue displacement. Tissue strain is then determined by least-squares estimation of the gradient of the displacement field. The computation of the power spectra can be implemented efficiently using Welch's periodogram method with moving windows, or with accumulative windows and a forgetting factor. Compared to the transfer-function estimation originally used in VE, the computation of cross spectral densities is not needed, which saves both memory and computation time. Phantom experiments demonstrate that the proposed method produces stable and operator-independent strain images with high signal-to-noise ratio in real time. The approach has also been tested on patient data of the prostate region, and the results are encouraging.
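
    A condensed sketch of the processing chain just described, on synthetic data: Welch's periodogram per spatial location, the square root of the average band power as the displacement measure, and a moving least-squares slope as the strain estimate. The window sizes and the 8 Hz excitation are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 100.0                                   # sampling of the time sequences, Hz
    rng = np.random.default_rng(5)
    depth = np.arange(64)                        # spatial locations along depth

    # Synthetic displacement sequences: an 8 Hz excitation whose amplitude
    # decays with depth, plus measurement noise.
    excitation = np.sin(2 * np.pi * 8 * np.arange(0, 10, 1 / fs))
    seqs = (1.0 - 0.01 * depth)[:, None] * excitation[None, :]
    seqs = seqs + 0.01 * rng.normal(size=seqs.shape)

    f, pxx = welch(seqs, fs=fs, nperseg=256, axis=1)
    band = (f >= 0) & (f <= 20)                  # excitation band
    disp = np.sqrt(pxx[:, band].mean(axis=1))    # displacement measure per location

    # Strain: least-squares slope of displacement over a moving window.
    win = 9
    strain = np.array([np.polyfit(depth[i:i + win], disp[i:i + win], 1)[0]
                       for i in range(len(depth) - win + 1)])
    print(strain[:3])
    ```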

  5. Compression technique for large statistical data bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eggers, S.J.; Olken, F.; Shoshani, A.

    1981-03-01

    The compression of large statistical databases is explored, and techniques are proposed for organizing the compressed data such that the time required to access the data is logarithmic. The techniques exploit special characteristics of statistical databases, namely, variation in the space required for the natural encoding of integer attributes, a prevalence of a few repeating values or constants, and the clustering of both data of the same length and constants in long, separate series. The techniques are variations of run-length encoding, in which modified run-lengths for the series are extracted from the data stream and stored in a header, which is used to form the base level of a B-tree index into the database. The run-lengths are cumulative, and therefore the access time of the data is logarithmic in the size of the header. The details of the compression scheme and its implementation are discussed, several special cases are presented, and an analysis is given of the relative performance of the various versions.
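
    The description above maps naturally onto a small runnable sketch: cumulative run-lengths act as the header, and a binary search over them (standing in for the B-tree base level) retrieves any record in logarithmic time.

    ```python
    from bisect import bisect_right

    def rle_compress(values):
        """Run-length encode, keeping cumulative run-lengths as the header."""
        runs, data = [], []
        total = 0
        for v in values:
            total += 1
            if data and data[-1] == v:
                runs[-1] = total
            else:
                data.append(v)
                runs.append(total)   # cumulative run-lengths (the header)
        return runs, data

    def rle_lookup(runs, data, i):
        """Value of the i-th original record (0-based), via binary search."""
        return data[bisect_right(runs, i)]

    runs, data = rle_compress([7, 7, 7, 0, 0, 5, 5, 5, 5])
    assert [rle_lookup(runs, data, i) for i in range(9)] == [7, 7, 7, 0, 0, 5, 5, 5, 5]
    ```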

  6. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George

    2013-01-01

    Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve efficiencies over traditional Zernike moment calculation.

  7. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological change and increasing requirements for quality and service have forced a change in the design and application of maintenance, as well as in the way maintenance is considered within managerial strategy. There are numerous maintenance activities that must be developed in a service company, so the maintenance function as a whole often has to be outsourced. Nevertheless, delegating this work to specialized personnel does not exempt the company from responsibility; rather, it leads to the need for control of each maintenance activity. To achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology is based on expert systems: by means of rules, the expert system uses the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by section and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, where maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, within rule-based expert systems.

  8. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.

  9. A comparative analysis of conventional cytopreparatory and liquid based cytological techniques (Sure Path) in evaluation of serous effusion fluids.

    PubMed

    Dadhich, Hrishikesh; Toi, Pampa Ch; Siddaraju, Neelaiah; Sevvanthi, Kalidas

    2016-11-01

    Clinically, detection of malignant cells in serous body fluids is critical, as their presence implies upstaging of the disease. Cytology of body cavity fluids serves as an important tool when other diagnostic tests cannot be performed. In most laboratories, effusion fluid samples are currently analysed chiefly by the conventional cytopreparatory (CCP) technique. Although there are several studies comparing liquid-based cytology (LBC) with the CCP technique in the field of cervicovaginal cytology, the literature on such comparisons with respect to serous body fluid examination is sparse. One hundred samples of serous body fluids were processed by both the CCP and LBC techniques. Slides prepared by these techniques were studied using six parameters, and a comparative analysis of the advantages and disadvantages of the techniques in the detection of malignant cells was carried out with appropriate statistical tests. The samples comprised 52 pleural, 44 peritoneal and four pericardial fluids. No statistically significant difference was noted with respect to cellularity (P = 0.22), cell distribution (P = 0.39) or diagnosis of malignancy (P = 0.20). As for the remaining parameters, LBC provided a significantly clearer smear background (P < 0.0001) and shorter screening time (P < 0.0001), while the CCP technique provided significantly better staining quality (P = 0.01) and sharper cytomorphologic features (P = 0.05). Although reduced screening time and a clearer smear background are the two major advantages of LBC, the CCP technique provides better staining quality and sharper cytomorphologic features, which are more critical from the point of view of cytologic interpretation. Diagn. Cytopathol. 2016;44:874-879. © 2016 Wiley Periodicals, Inc.

  10. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals.

    PubMed

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A

    2016-03-05

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect the BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
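
    A loose illustration of the QSA idea, assuming four EEG channels packed into quaternion 4-tuples and a deliberately simple feature (the per-sample quaternion norm); the authors' actual feature extraction is richer, so this only shows the data layout and a downstream KNN classifier on random placeholder data.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(6)
    eeg = rng.normal(size=(4, 4000))          # 4 channels x samples (placeholder)
    labels = rng.integers(0, 2, size=40)      # left/right imagery per trial

    windows = eeg.reshape(4, 40, 100)         # 40 trials of 100 samples
    q = np.moveaxis(windows, 0, -1)           # (trials, samples, 4) quaternions
    norms = np.linalg.norm(q, axis=-1)        # |q| per sample
    features = np.column_stack([norms.mean(1), norms.std(1)])

    clf = KNeighborsClassifier(n_neighbors=5).fit(features, labels)
    print(clf.score(features, labels))        # in-sample score on random data
    ```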

  11. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals

    PubMed Central

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.

    2016-01-01

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect the BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029

  12. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    PubMed

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical Abstract: This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry fragmentation techniques, and automated analysis of the resulting data.

  13. Comparison of two headspace sampling techniques for the analysis of off-flavour volatiles from oat based products.

    PubMed

    Cognat, Claudine; Shepherd, Tom; Verrall, Susan R; Stewart, Derek

    2012-10-01

    Two different headspace sampling techniques were compared for analysis of aroma volatiles from freshly produced and aged plain oatcakes. Solid phase microextraction (SPME) using a Carboxen-Polydimethylsiloxane (PDMS) fibre and entrainment on Tenax TA within an adsorbent tube were used for collection of volatiles. The effects of variation in the sampling method were also considered using SPME. The data obtained using both techniques were processed by multivariate statistical analysis (PCA). Both techniques showed similar capacities to discriminate between the samples at different ages. Discrimination between fresh and rancid samples could be made on the basis of changes in the relative abundances of 14-15 of the constituents in the volatile profiles. A significant effect on the detection level of volatile compounds was observed when samples were crushed and analysed by SPME-GC-MS, in comparison to undisturbed product. The applicability and cost effectiveness of both methods were considered. Copyright © 2012 Elsevier Ltd. All rights reserved.
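
    A short sketch of the multivariate step, assuming a samples-by-compounds table of GC-MS peak areas (random placeholders here); standardized PCA scores are what would separate fresh from aged oatcake samples in the reported analysis.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    peak_areas = rng.lognormal(mean=2.0, sigma=1.0, size=(24, 40))  # samples x volatiles

    scaled = StandardScaler().fit_transform(peak_areas)
    scores = PCA(n_components=2).fit_transform(scaled)
    # Fresh vs aged samples would separate along PC1/PC2 through changes in
    # the relative abundances of the ~14-15 discriminating volatiles.
    print(scores[:3])
    ```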

  14. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
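
    A minimal control-charting sketch in the spirit of the first technique named above: limits are estimated from an assumed in-control baseline of thermocouple readings, and later points are flagged when they fall outside 3 sigma. The temperatures and limits are illustrative, not drawn from the AGR data.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    baseline = rng.normal(1000.0, 3.0, size=200)      # in-control readings, deg C
    mu, sigma = baseline.mean(), baseline.std(ddof=1)

    readings = np.concatenate([rng.normal(1000, 3, 100),
                               rng.normal(1012, 3, 20)])  # drift at the end
    out_of_control = np.abs(readings - mu) > 3 * sigma
    print(np.flatnonzero(out_of_control)[:5])         # first flagged indices
    ```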

  15. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    DTIC Science & Technology

    2001-10-25

    Dynamic Chest Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis, and have further extended the method to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for pulmonary perfusion analysis.

  16. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    DOE PAGES

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  17. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    PubMed Central

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed. PMID:25610632

  18. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions different from damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear-T2 damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% for a
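
    A compact linear-PCA stand-in for the damage index described above: a baseline model of the strain field is built, and new snapshots are scored with the Q index (squared reconstruction residual outside the model). The authors use hierarchical nonlinear PCA together with a nonlinear-T2 index; linear PCA and synthetic strain data are used here only for brevity.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(9)
    baseline = rng.normal(size=(300, 32))    # pristine strain snapshots (32 FBGs)
    pca = PCA(n_components=5).fit(baseline)

    def q_index(x):
        """Squared residual of each snapshot outside the baseline PCA model."""
        residual = x - pca.inverse_transform(pca.transform(x))
        return np.sum(residual ** 2, axis=1)

    healthy = rng.normal(size=(10, 32))
    damaged = healthy + 0.8                  # simulated strain-field shift
    print(q_index(healthy).mean(), q_index(damaged).mean())
    ```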

  19. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  20. Noncontact techniques for diesel engine diagnostics using exhaust waveform analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, D.A.; Cooke, G.J.

    1987-01-01

    RCA Corporation's continuing efforts to develop noncontact test techniques for diesel engines have led to recent advancements in deep engine diagnostics. The U.S. Army Tank-Automotive Command (TACOM) has been working with RCA on the development of new noncontact sensors and test techniques which use these sensors in conjunction with their family of Simplified Test Equipment (STE) to perform vehicle diagnostics. The STE systems are microprocessor-based maintenance tools that assist the Army mechanic in diagnosing malfunctions in both tactical and combat vehicles. The test systems support the mechanic by providing the sophisticated signal processing capabilities necessary for a wide range of diagnostic testing including exhaust waveform analysis.

  1. An effective content-based image retrieval technique for image visuals representation based on the bag-of-visual-words model.

    PubMed

    Jabeen, Safia; Mehmood, Zahid; Mahmood, Toqeer; Saba, Tanzila; Rehman, Amjad; Mahmood, Muhammad Tariq

    2018-01-01

    For the last three decades, content-based image retrieval (CBIR) has been an active research area, representing a viable solution for retrieving similar images from an image repository. In this article, we propose a novel CBIR technique based on the visual words fusion of speeded-up robust features (SURF) and fast retina keypoint (FREAK) feature descriptors. SURF is a sparse descriptor, whereas FREAK is a dense descriptor. Moreover, SURF is a scale- and rotation-invariant descriptor that performs better in terms of repeatability, distinctiveness, and robustness. It is robust to noise, detection errors, and geometric and photometric deformations, and it also performs better under low illumination within an image compared to the FREAK descriptor. In contrast, FREAK is a retina-inspired, fast descriptor that performs better for classification-based problems than the SURF descriptor. Experimental results show that the proposed technique based on the visual words fusion of SURF-FREAK descriptors combines the strengths of both descriptors and resolves the aforementioned issues. The qualitative and quantitative analysis performed on three image collections, namely Corel-1000, Corel-1500, and Caltech-256, shows that the proposed technique based on visual words fusion significantly improves CBIR performance compared to the feature fusion of both descriptors and to state-of-the-art image retrieval techniques.

  2. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  3. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  4. Facilitating children's views of therapy: an analysis of the use of play-based techniques to evaluate clinical practice.

    PubMed

    Jäger, Jessica

    2013-07-01

    This article reports on a follow-up study exploring the use of play-based evaluation methods to facilitate children's views of therapy. The development and piloting of these techniques, with 12 children in the author's own practice, was previously reported in this journal. It was argued that play-based evaluation methods reduce the power imbalance inherent in adult researcher/interviewer-child relationships and provide children with meaningful ways to share their views. In this article, follow-up research into play-based evaluations with 20 children and 7 different play therapists is drawn upon to explore in greater depth the strengths and weaknesses of these techniques. The study shows that play-based evaluation techniques are important and flexible methods for facilitating children's views of child therapy. It is argued that those play therapists who incorporate their therapeutic skills effectively, maintain flexibility and sensitively attune to the child during the evaluation session, enable the child to explore their views most fully.

  5. Comparison of denture base adaptation between CAD-CAM and conventional fabrication techniques.

    PubMed

    Goodacre, Brian J; Goodacre, Charles J; Baba, Nadim Z; Kattadiyil, Mathew T

    2016-08-01

    Currently no data comparing the denture base adaptation of CAD-CAM and conventional denture processing techniques have been reported. The purpose of this in vitro study was to compare the denture base adaptation of pack and press, pour, injection, and CAD-CAM techniques for fabricating dentures to determine which process produces the most accurate and reproducible adaptation. A definitive cast was duplicated to create 40 gypsum casts that were laser scanned before any fabrication procedures were initiated. A master denture was made using the CAD-CAM process and was then used to create a putty mold for the fabrication of 30 standardized wax festooned dentures, 10 for each of the conventional processing techniques (pack and press, pour, injection). Scan files from 10 casts were sent to Global Dental Science, LLC for fabrication of the CAD-CAM test specimens. After specimens for each of the 4 techniques had been fabricated, they were hydrated for 24 hours and the intaglio surface laser scanned. The scan file of each denture was superimposed on the scan file of the corresponding preprocessing cast using surface matching software. Measurements were made at 60 locations, providing evaluation of fit discrepancies at the following areas: apex of the denture border, 6 mm from the denture border, crest of the ridge, palate, and posterior palatal seal. Median and interquartile range were used to assess accuracy and reproducibility. Levene's test and the Kruskal-Wallis analysis of variance were used to evaluate differences between processing techniques at the 5 specified locations (α=.05). The ranking of results based on median and interquartile range determined that the accuracy and reproducibility of the CAD-CAM technique was more consistently localized around zero at 3 of the 5 locations. Therefore, the CAD-CAM technique showed the best combination of accuracy and reproducibility among the tested fabrication techniques. The pack and press technique was more accurate at

  6. Near-infrared spectral image analysis of pork marbling based on Gabor filter and wide line detector techniques.

    PubMed

    Huang, Hui; Liu, Li; Ngadi, Michael O; Gariépy, Claude; Prasher, Shiv O

    2014-01-01

    Marbling is an important quality attribute of pork. Detection of pork marbling usually involves subjective scoring, which is inefficient and costly for the processor. In this study, the ability to predict pork marbling using near-infrared (NIR) hyperspectral imaging (900-1700 nm) together with appropriate image processing techniques was studied. Near-infrared images were collected from pork after marbling evaluation according to the current standard chart from the National Pork Producers Council. Image analysis techniques (Gabor filter, wide line detector, and spectral averaging) were applied to extract texture, line, and spectral features, respectively, from NIR images of pork. Samples were grouped into calibration and validation sets. Wavelength selection was performed on the calibration set by a stepwise regression procedure. Prediction models of pork marbling scores were built using multiple linear regression based on derivatives of mean spectra and line features at key wavelengths. The results showed that the derivatives of both texture and spectral features produced good results, with correlation coefficients of validation of 0.90 and 0.86, respectively, using wavelengths of 961, 1186, and 1220 nm. The results revealed the great potential of the Gabor filter for analyzing NIR images of pork for the effective and efficient objective evaluation of pork marbling.
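
    A small sketch of Gabor-based texture features on one spectral band, using scikit-image; the band image, frequencies, and orientations are illustrative stand-ins rather than the authors' exact settings:

```python
import numpy as np
from skimage.filters import gabor

def gabor_texture_features(band_image, frequencies=(0.1, 0.2),
                           thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean Gabor magnitude response per (frequency, orientation) pair.

    band_image: a single-wavelength plane from the NIR hypercube, e.g.
    one of the 961, 1186, or 1220 nm bands named in the abstract.
    """
    feats = []
    for f in frequencies:
        for t in thetas:
            real, imag = gabor(band_image, frequency=f, theta=t)
            feats.append(np.hypot(real, imag).mean())  # texture energy
    return np.array(feats)

# Illustrative use: features from key wavelengths would then feed a
# multiple linear regression (e.g. sklearn LinearRegression) against
# the marbling scores of the calibration set.
band = np.random.rand(128, 128)     # placeholder for a real NIR band image
print(gabor_texture_features(band))
```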

  7. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  8. The detection of bulk explosives using nuclear-based techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  9. Web-Based Trainer for Electrical Circuit Analysis

    ERIC Educational Resources Information Center

    Weyten, L.; Rombouts, P.; De Maeyer, J.

    2009-01-01

    A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…

  10. Objective Assessment of Patient Inhaler User Technique Using an Audio-Based Classification Approach.

    PubMed

    Taylor, Terence E; Zigel, Yaniv; Egan, Clarice; Hughes, Fintan; Costello, Richard W; Reilly, Richard B

    2018-02-01

    Many patients make critical user technique errors when using pressurised metered dose inhalers (pMDIs) which reduce the clinical efficacy of respiratory medication. Such critical errors include poor actuation coordination (poor timing of medication release during inhalation) and inhaling too fast (peak inspiratory flow rate over 90 L/min). Here, we present a novel audio-based method that objectively assesses patient pMDI user technique. The Inhaler Compliance Assessment device was employed to record inhaler audio signals from 62 respiratory patients as they used a pMDI with an In-Check Flo-Tone device attached to the inhaler mouthpiece. Using a quadratic discriminant analysis approach, the audio-based method generated a total frame-by-frame accuracy of 88.2% in classifying sound events (actuation, inhalation and exhalation). The audio-based method estimated the peak inspiratory flow rate and volume of inhalations with an accuracy of 88.2% and 83.94% respectively. It was detected that 89% of patients made at least one critical user technique error even after tuition from an expert clinical reviewer. This method provides a more clinically accurate assessment of patient inhaler user technique than standard checklist methods.
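
    A minimal sketch of the frame-by-frame classification step with scikit-learn's quadratic discriminant analysis; the per-frame features and labels below are synthetic stand-ins for whatever acoustic features (e.g., band energies or MFCCs) the device actually extracts:

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

# Synthetic stand-in for per-frame audio features; labels 0/1/2 correspond
# to actuation, inhalation, and exhalation frames respectively.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(600, 13)) + np.repeat(np.eye(3, 13) * 3, 200, axis=0)
y_train = np.repeat([0, 1, 2], 200)

qda = QuadraticDiscriminantAnalysis()   # one Gaussian class model per sound event
qda.fit(X_train, y_train)

X_frames = rng.normal(size=(50, 13))    # feature frames from a new recording
labels = qda.predict(X_frames)          # frame-by-frame sound-event labels
```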

  11. Wilcoxon signed-rank-based technique for the pulse-shape analysis of HPGe detectors

    NASA Astrophysics Data System (ADS)

    Martín, S.; Quintana, B.; Barrientos, D.

    2016-07-01

    The characterization of the electric response of segmented-contact high-purity germanium detectors requires scanning systems capable of accurately associating each pulse with the position of the interaction that generated it. This process requires an algorithm sensitive to changes above the electronic noise in the pulse shapes produced at different positions, depending on the resolution of the Ge crystal. In this work, a pulse-shape comparison technique based on the Wilcoxon signed-rank test has been developed. It provides a method to distinguish pulses coming from different interaction points in the germanium crystal. Therefore, this technique is a necessary step for building a reliable pulse-shape database that can be used later for the determination of the position of interaction for γ-ray tracking spectrometry devices such as AGATA, GRETA or GERDA. The method was validated by comparison with a χ2 test using simulated and experimental pulses corresponding to a Broad Energy germanium detector (BEGe).
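
    The core comparison can be sketched in a few lines with SciPy; the significance threshold is illustrative, and the real scanning system would apply such a test within a full pulse-shape database workflow:

```python
import numpy as np
from scipy.stats import wilcoxon

def same_position(pulse_a, pulse_b, alpha=0.01):
    """Wilcoxon signed-rank test on the paired samples of two digitized pulses.

    If the paired differences are consistent with symmetric noise around zero
    (p >= alpha), the pulses are treated as coming from the same interaction
    position; otherwise they are distinguished.
    """
    diffs = np.asarray(pulse_a, float) - np.asarray(pulse_b, float)
    if not np.any(diffs):
        return True                     # identical traces
    stat, p = wilcoxon(diffs)
    return p >= alpha
```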

  12. A ground-based technique for millimeter wave spectroscopic observations of stratospheric trace constituents

    NASA Technical Reports Server (NTRS)

    Parrish, A.; Dezafra, R. L.; Solomon, P. M.; Barrett, J. W.

    1988-01-01

    Recent concern over possible long term stratospheric changes caused by the introduction of man-made compounds has increased the need for instrumentation that can accurately measure stratospheric minor constituents. The technique of radio spectroscopy at millimeter wavelengths was first used to observe rotational transitions of stratospheric ozone nearly two decades ago, but has not been highly developed until recently. A ground-based observing technique is reported which employs a millimeter-wave superheterodyne receiver and multichannel filter spectrometer for measurements of stratospheric constituents that have peak volume mixing ratios of less than 10^-9, more than 3 orders of magnitude less than that for ozone. The technique is used for an extensive program of observations of stratospheric chlorine monoxide and also for observations of other stratospheric trace gases such as (O-16)3, vibrationally excited (O-16)3, (O-18)2(O-16), N2O, HO2, and HCN. In the present paper, an analysis of the observing technique is given, including the method of calibration and analysis of sources of error. The technique is found to be a reliable means of observing and monitoring important stratospheric trace constituents.

  13. Dimensional changes of acrylic resin denture bases: conventional versus injection-molding technique.

    PubMed

    Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam

    2014-07-01

    Acrylic resin denture bases undergo dimensional changes during polymerization. Injection molding techniques are reported to reduce these changes and thereby improve physical properties of denture bases. The aim of this study was to compare dimensional changes of specimens processed by conventional and injection-molding techniques. SR-Ivocap Triplex Hot resin was used for conventional pressure-packed and SR-Ivocap High Impact was used for injection-molding techniques. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. After each water storage period, the acrylic specimens produced by injection exhibited less dimensional changes compared to those produced by the conventional technique. Curing shrinkage was compensated by water sorption with an increase in water storage time decreasing dimensional changes. Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used and SR-Ivocap injection procedure exhibited higher dimensional accuracy compared to conventional molding.

  14. Analysis and Validation of Contactless Time-Gated Interrogation Technique for Quartz Resonator Sensors

    PubMed Central

    Baù, Marco; Ferrari, Marco; Ferrari, Vittorio

    2017-01-01

    A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors. PMID:28574459

  15. Analysis and Validation of Contactless Time-Gated Interrogation Technique for Quartz Resonator Sensors.

    PubMed

    Baù, Marco; Ferrari, Marco; Ferrari, Vittorio

    2017-06-02

    A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors.
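
    A toy sketch of extracting the two quantities the technique reads out, resonant frequency from the spectrum of the ringdown and quality factor from its decay constant (via Q = pi * f * tau); the sampling rate and signal below are synthetic assumptions:

```python
import numpy as np
from scipy.signal import hilbert

def ringdown_f_and_q(signal, fs):
    """Estimate resonant frequency and quality factor from a transient
    decaying response, as sensed back on the primary coil."""
    n = len(signal)
    # Frequency from the FFT peak of the windowed ringdown.
    spec = np.abs(np.fft.rfft(signal * np.hanning(n)))
    f = np.fft.rfftfreq(n, 1 / fs)[np.argmax(spec)]

    # Decay constant tau from a line fit to the log of the envelope.
    env = np.abs(hilbert(signal))
    t = np.arange(n) / fs
    keep = env > env.max() * 1e-3       # stay above the noise floor
    slope, _ = np.polyfit(t[keep], np.log(env[keep]), 1)
    tau = -1.0 / slope
    return f, np.pi * f * tau           # (frequency, Q)

# Check with a synthetic 4.432 MHz ringdown of Q = 10000:
fs, f0, Q = 50e6, 4.432e6, 1e4
t = np.arange(int(2e-3 * fs)) / fs
x = np.exp(-np.pi * f0 * t / Q) * np.sin(2 * np.pi * f0 * t)
print(ringdown_f_and_q(x, fs))
```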

  16. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. Most work to date has focused not on applications of the technique but on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  17. A discrete wavelet based feature extraction and hybrid classification technique for microarray data analysis.

    PubMed

    Bennet, Jaison; Ganaprakasam, Chilambuchelvan Arul; Arputharaj, Kannan

    2014-01-01

    In the past, cancer classification by doctors and radiologists was based on morphological and clinical features and had limited diagnostic ability. The recent arrival of DNA microarray technology has enabled the concurrent monitoring of thousands of gene expressions on a single chip, which has stimulated progress in cancer classification. In this paper, we propose a hybrid approach for microarray data classification based on k-nearest neighbor (KNN), naive Bayes, and support vector machine (SVM) classifiers. Feature selection prior to classification plays a vital role, and a feature selection technique which combines the discrete wavelet transform (DWT) and a moving window technique (MWT) is used. The performance of the proposed method is compared with that of the conventional classifiers (support vector machine, nearest neighbor, and naive Bayes). Experiments have been conducted on both real and benchmark datasets, and the results indicate that the ensemble approach produces higher classification accuracy than the conventional classifiers. The approach can serve as an automated system for the classification of cancer and can be applied by doctors in real cases, which would be a boon to the medical community. This work further reduces the misclassification of cancers, which is unacceptable in cancer detection.
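
    A compact sketch of the pipeline's two stages, DWT-based feature compression followed by a KNN/naive Bayes/SVM ensemble, using PyWavelets and scikit-learn; the data are random stand-ins, the moving window technique (MWT) step is omitted, and soft voting is one plausible way to combine the three classifiers:

```python
import numpy as np
import pywt
from sklearn.ensemble import VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

def dwt_features(expression_row, wavelet="db4", level=3):
    """Compress one sample's gene-expression profile by keeping only the
    coarse approximation coefficients of a multilevel DWT."""
    coeffs = pywt.wavedec(expression_row, wavelet, level=level)
    return coeffs[0]

# Illustrative data: 40 samples x 1000 genes with binary class labels.
rng = np.random.default_rng(1)
X_raw = rng.normal(size=(40, 1000))
y = rng.integers(0, 2, size=40)
X = np.array([dwt_features(row) for row in X_raw])

ensemble = VotingClassifier(
    estimators=[("knn", KNeighborsClassifier(5)),
                ("nb", GaussianNB()),
                ("svm", SVC(probability=True))],
    voting="soft")                       # combine per-class probabilities
ensemble.fit(X, y)
```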

  18. An effective content-based image retrieval technique for image visuals representation based on the bag-of-visual-words model

    PubMed Central

    Jabeen, Safia; Mehmood, Zahid; Mahmood, Toqeer; Saba, Tanzila; Rehman, Amjad; Mahmood, Muhammad Tariq

    2018-01-01

    For the last three decades, content-based image retrieval (CBIR) has been an active research area, representing a viable solution for retrieving similar images from an image repository. In this article, we propose a novel CBIR technique based on the visual words fusion of speeded-up robust features (SURF) and fast retina keypoint (FREAK) feature descriptors. SURF is a sparse descriptor whereas FREAK is a dense descriptor. Moreover, SURF is a scale- and rotation-invariant descriptor that performs better in terms of repeatability, distinctiveness, and robustness. It is robust to noise, detection errors, and geometric and photometric deformations. It also performs better at low illumination within an image as compared to the FREAK descriptor. In contrast, FREAK is a retina-inspired speedy descriptor that performs better for classification-based problems as compared to the SURF descriptor. Experimental results show that the proposed technique based on the visual words fusion of SURF-FREAK descriptors combines the strengths of both descriptors and resolves the aforementioned issues. The qualitative and quantitative analysis performed on three image collections, namely Corel-1000, Corel-1500, and Caltech-256, shows that the proposed technique based on visual words fusion significantly improves the performance of CBIR as compared to the feature fusion of both descriptors and to state-of-the-art image retrieval techniques. PMID:29694429

  19. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    PubMed Central

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017. PMID:29850370

  20. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...

  1. A comparison of autonomous techniques for multispectral image analysis and classification

    NASA Astrophysics Data System (ADS)

    Valdiviezo-N., Juan C.; Urcid, Gonzalo; Toxqui-Quitl, Carina; Padilla-Vivanco, Alfonso

    2012-10-01

    Multispectral imaging has given rise to important applications related to the classification and identification of objects in a scene. Because multispectral instruments can be used to estimate the reflectance of materials in the scene, these techniques constitute fundamental tools for materials analysis and quality control. During the last years, a variety of algorithms have been developed to work with multispectral data, whose main purpose has been to perform the correct classification of the objects in the scene. The present study introduces a brief review of some classical techniques, as well as a novel technique, that have been used for such purposes. The use of principal component analysis and K-means clustering techniques as important classification algorithms is discussed here. Moreover, a recent method based on the min-W and max-M lattice auto-associative memories, originally proposed for endmember determination in hyperspectral imagery, is introduced as a classification method. Besides a discussion of their mathematical foundation, we emphasize their main characteristics and the results achieved for two exemplar images composed of objects similar in appearance but spectrally different. The classification results show that the first components computed from principal component analysis can be used to highlight areas with different spectral characteristics. In addition, the use of lattice auto-associative memories provides good results for materials classification even in cases where spectral similarities appear in the spectral responses.
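
    The two classical techniques discussed above can be sketched in a few lines of scikit-learn; the data cube, component count, and cluster count are illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# cube: rows x cols x bands multispectral image (random placeholder data).
rows, cols, bands = 100, 100, 8
cube = np.random.rand(rows, cols, bands)
pixels = cube.reshape(-1, bands)         # one spectrum per pixel

# PCA: the first components highlight areas with distinct spectra.
scores = PCA(n_components=3).fit_transform(pixels)

# K-means on the component scores assigns each pixel to a material class.
labels = KMeans(n_clusters=4, n_init=10).fit_predict(scores)
class_map = labels.reshape(rows, cols)   # per-pixel classification map
```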

  2. Developing a hybrid dictionary-based bio-entity recognition technique.

    PubMed

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2015-01-01

    Bio-entity extraction is a pivotal component of information extraction from biomedical literature. Dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through a shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities, such as Part of Speech (POS) expansion, stemming, and the exploitation of contextual cues, to further improve the performance. The experimental results show that the proposed technique achieves the best, or at least equivalent, performance in F-measure among the compared resources: GENIA, MeSH, UMLS, and combinations of these three. The results imply that the performance of dictionary-based extraction techniques is largely influenced by the information resources used to build the dictionary. In addition, the edit distance algorithm shows steady precision with three different dictionaries, whereas the context-only technique achieves high-end recall with three different dictionaries.

  3. Developing a hybrid dictionary-based bio-entity recognition technique

    PubMed Central

    2015-01-01

    Background Bio-entity extraction is a pivotal component of information extraction from biomedical literature. Dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. Methods This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through a shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities, such as Part of Speech (POS) expansion, stemming, and the exploitation of contextual cues, to further improve the performance. Results The experimental results show that the proposed technique achieves the best, or at least equivalent, performance in F-measure among the compared resources: GENIA, MeSH, UMLS, and combinations of these three. Conclusions The results imply that the performance of dictionary-based extraction techniques is largely influenced by the information resources used to build the dictionary. In addition, the edit distance algorithm shows steady precision with three different dictionaries, whereas the context-only technique achieves high-end recall with three different dictionaries. PMID:26043907
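
    An illustrative sketch of the dictionary-matching core, using classic Levenshtein distance rather than the paper's shortest-path variant, with a toy dictionary standing in for resources such as GENIA, MeSH, or UMLS:

```python
def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def match_entity(mention, dictionary, max_dist=2):
    """Return the dictionary entry closest to the mention, if close enough."""
    best = min(dictionary,
               key=lambda term: edit_distance(mention.lower(), term.lower()))
    return best if edit_distance(mention.lower(), best.lower()) <= max_dist else None

# Toy dictionary merged from several (hypothetical) resources.
bio_dict = ["interleukin-2", "tumor necrosis factor", "p53"]
print(match_entity("interleukin 2", bio_dict))   # -> "interleukin-2"
```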

  4. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  5. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  6. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  7. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…
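
    The record is cut off above, but as an illustrative aside, one widely used criterion for the number of factors, Horn's parallel analysis, is easy to sketch for a P-technique data matrix (one subject's occasions-by-variables time series); note that this sketch ignores the serial dependence that complicates the within-subjects case:

```python
import numpy as np

def parallel_analysis(data, n_iter=200, quantile=95, seed=0):
    """Horn's parallel analysis: retain factors whose correlation-matrix
    eigenvalues exceed the chosen quantile of eigenvalues obtained from
    random data of the same shape.  `data` is T occasions x p variables."""
    rng = np.random.default_rng(seed)
    t, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        noise = rng.normal(size=(t, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False))[::-1]
    threshold = np.percentile(rand, quantile, axis=0)
    return int(np.sum(obs > threshold))

# e.g. n_factors = parallel_analysis(subject_timeseries)
```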

  8. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yonggang

    In implementation of nuclear safeguards, many different techniques are being used to monitor the operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, and digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  9. Cognitive task analysis: Techniques applied to airborne weapons training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis plays and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  10. An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP Techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1972-01-01

    The author identified significant results. Preliminary results from the Ouachita portion of the Texoma frame of data indicate many potentials in the analysis and interpretation of ERTS data. It is believed that one of the more significant aspects of this analysis sequence has been the investigation of a technique to relate ERTS analysis and surface observation analysis. At present, a sequence involving (1) preliminary analysis based solely upon the spectral characteristics of the data, followed by (2) a surface observation mission to obtain visual information and oblique photography of particular points of interest in the test site area, appears to provide an extremely efficient technique for obtaining meaningful surface observation data. Following such a procedure permits concentration on particular points of interest in the entire ERTS frame and thereby makes the surface observation data obtained particularly significant and meaningful. The analysis of the Texoma frame has also been significant from the standpoint of demonstrating a fast turn-around analysis capability. Additionally, the analysis has shown the potential accuracy and degree of complexity of features that can be identified and mapped using ERTS data.

  11. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.

  12. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357
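
    As a small illustration of two of the domains in the proposed classification, the snippet below computes a statistical measure (standard deviation) and an informational measure (sample entropy) on a synthetic RR-interval series; the sample entropy implementation is a simplified textbook version, not any specific technique from the review:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy, an 'informational' variability measure: the negative
    log ratio of template matches of length m+1 to matches of length m,
    within tolerance r (Chebyshev distance)."""
    x = np.asarray(x, float)
    r = r_factor * x.std()

    def count_matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
        return (np.sum(d <= r) - len(templ)) / 2   # exclude self-matches

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rr = np.random.default_rng(2).normal(0.8, 0.05, 300)   # synthetic RR intervals
print("SD (statistical domain):", rr.std())
print("SampEn (informational domain):", sample_entropy(rr))
```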

  13. Investigation of safety analysis methods using computer vision techniques

    NASA Astrophysics Data System (ADS)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and postencroachment time (PET), two important safety measurements. Corresponding algorithms are presented and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1-h monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values of two different intersections are estimated for 1 day from 8:00 a.m. to 6:00 p.m.
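
    The two safety measurements reduce to simple arithmetic once trajectories are available; a minimal sketch with illustrative numbers:

```python
import numpy as np

def time_to_collision(gap_m, speed_follower, speed_leader):
    """TTC for a following conflict: distance gap divided by closing speed.
    Infinite when the follower is not closing in."""
    closing = speed_follower - speed_leader
    return gap_m / closing if closing > 0 else np.inf

def post_encroachment_time(t_first_leaves, t_second_arrives):
    """PET: time between the first road user leaving the conflict area
    and the second one arriving at it."""
    return t_second_arrives - t_first_leaves

# Illustrative values taken from tracked trajectories:
print(time_to_collision(gap_m=12.0, speed_follower=14.0, speed_leader=10.0))  # 3.0 s
print(post_encroachment_time(t_first_leaves=40.2, t_second_arrives=41.1))     # 0.9 s
```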

  14. A methodological comparison of customer service analysis techniques

    Treesearch

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  15. Comparative analysis of DNA polymorphisms and phylogenetic relationships among Syzygium cumini Skeels based on phenotypic characters and RAPD technique.

    PubMed

    Singh, Jitendra P; Singh, Ak; Bajpai, Anju; Ahmad, Iffat Zareen

    2014-01-01

    The Indian black berry (Syzygium cumini Skeels) has great nutraceutical and medicinal properties. As in other fruit crops, fruit characteristics are important attributes for differentiation and were determined for different accessions of S. cumini. The fruit weight, length, breadth, length:breadth ratio, pulp weight, pulp content, seed weight and pulp:seed ratio varied significantly among accessions. Molecular characterization was carried out using the PCR-based RAPD technique. Out of 80 RAPD primers, only 18 primers produced stable polymorphisms that were used to examine the phylogenetic relationship. A total of 207 loci were generated, of which 201 were polymorphic. The average genetic dissimilarity was 97 per cent among jamun accessions. The phylogenetic relationship was also determined by principal coordinates analysis (PCoA), which explained 46.95 per cent cumulative variance. The two-dimensional PCoA analysis showed grouping of the different accessions, which were plotted into four sub-plots representing clusters of accessions. The UPGMA (r = 0.967) and NJ (r = 0.987) dendrograms constructed from the dissimilarity matrix revealed a good degree of fit with the cophenetic correlation value. The dendrogram grouped the accessions into three main clusters according to their eco-geographical regions, which gives useful insight into their phylogenetic relationships.

  16. Model-checking techniques based on cumulative residuals.

    PubMed

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
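
    A simplified sketch of the simulation idea for a linear model: compare the observed cumulative-residual process over a covariate with realizations of a zero-mean process obtained by perturbing the residuals with standard normals. The full method also accounts for parameter-estimation effects, which this toy version omits:

```python
import numpy as np

rng = np.random.default_rng(3)

# Fit a simple linear model (stand-in for a GLM); data generated under the model.
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, n)
beta = np.polyfit(x, y, 1)
resid = y - np.polyval(beta, x)
order = np.argsort(x)

# Observed cumulative-sum process of residuals over the covariate.
W_obs = np.cumsum(resid[order]) / np.sqrt(n)

# Approximate its null distribution with multiplier realizations:
# residuals times independent standard normals, re-accumulated.
sup_null = np.array([np.abs(np.cumsum(resid[order] * rng.normal(size=n))
                            / np.sqrt(n)).max()
                     for _ in range(1000)])
p_value = np.mean(sup_null >= np.abs(W_obs).max())
print(p_value)   # large p: the trend in the residual plot is natural variation
```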

  17. Dimensional Changes of Acrylic Resin Denture Bases: Conventional Versus Injection-Molding Technique

    PubMed Central

    Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam

    2014-01-01

    Objective: Acrylic resin denture bases undergo dimensional changes during polymerization. Injection molding techniques are reported to reduce these changes and thereby improve physical properties of denture bases. The aim of this study was to compare dimensional changes of specimens processed by conventional and injection-molding techniques. Materials and Methods: SR-Ivocap Triplex Hot resin was used for conventional pressure-packed and SR-Ivocap High Impact was used for injection-molding techniques. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. Results: After each water storage period, the acrylic specimens produced by injection exhibited less dimensional changes compared to those produced by the conventional technique. Curing shrinkage was compensated by water sorption with an increase in water storage time decreasing dimensional changes. Conclusion: Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used and SR-Ivocap injection procedure exhibited higher dimensional accuracy compared to conventional molding. PMID:25584050

  18. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  19. Neutron-based nonintrusive inspection techniques

    NASA Astrophysics Data System (ADS)

    Gozani, Tsahi

    1997-02-01

    Non-intrusive inspection of large objects such as trucks, sea-going shipping containers, air cargo containers and pallets is gaining attention as a vital tool in combating terrorism, drug smuggling and other violations of international and national transportation and Customs laws. Neutrons are the preferred probing radiation when material specificity is required, which is most often the case. Great strides have been made in neutron-based inspection techniques. Fast and thermal neutrons, whether in steady state or in microsecond, or even nanosecond, pulses are being employed to interrogate, at high speeds, for explosives, drugs, chemical agents, and nuclear and many other smuggled materials. Existing neutron techniques will be compared and their current status reported.

  20. Efficient techniques for wave-based sound propagation in interactive applications

    NASA Astrophysics Data System (ADS)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. This spherical harmonic-based representation of source directivity can support analytical, data
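
    The record is truncated above; as an illustrative sketch of the directivity representation it describes, a measured gain pattern can be least-squares fitted to a linear combination of spherical harmonics with SciPy (the angles, order, and cardioid test pattern are assumptions):

```python
import numpy as np
from scipy.special import sph_harm

def fit_directivity(theta, phi, gains, max_order=3):
    """Least-squares fit of a sampled directivity pattern as a linear
    combination of spherical harmonics Y_l^m up to max_order.

    theta: azimuthal angles (0..2pi), phi: polar angles (0..pi)."""
    basis = np.column_stack([sph_harm(m, l, theta, phi)
                             for l in range(max_order + 1)
                             for m in range(-l, l + 1)])
    coeffs, *_ = np.linalg.lstsq(basis, gains.astype(complex), rcond=None)
    return coeffs   # for time-varying sources: refit or interpolate per frame

# Illustrative: 500 sampled directions of a cardioid-like source.
rng = np.random.default_rng(4)
theta = rng.uniform(0, 2 * np.pi, 500)
phi = np.arccos(rng.uniform(-1, 1, 500))
gains = 0.5 * (1 + np.cos(phi))          # cardioid about the polar axis
c = fit_directivity(theta, phi, gains)
```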

  1. Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis

    NASA Astrophysics Data System (ADS)

    Chou, Hui-Yu; Yang, Jyh-Bin

    2017-10-01

    The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence. More information used in delay analysis usually produces more accurate and fair analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results has received much attention recently. As Building Information Modeling (BIM) techniques have developed rapidly, the use of BIM and 4D simulation techniques has been proposed and implemented. Obvious benefits have been achieved, especially in identifying and solving construction consequence problems in advance of construction. This study performs an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of using BIM as a tool in developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be dealt with by BIM techniques. The research results could serve as a foundation for developing new approaches to resolving schedule delay disputes.

  2. Comparison of three‐dimensional analysis and stereological techniques for quantifying lithium‐ion battery electrode microstructures

    PubMed Central

    TAIWO, OLUWADAMILOLA O.; FINEGAN, DONAL P.; EASTWOOD, DAVID S.; FIFE, JULIE L.; BROWN, LEON D.; DARR, JAWWAD A.; LEE, PETER D.; BRETT, DANIEL J.L.

    2016-01-01

    Summary Lithium‐ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium‐ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3‐D imaging techniques, quantitative assessment of 3‐D microstructures from 2‐D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two‐dimensional (2‐D) data sets. In this study, stereological prediction and three‐dimensional (3‐D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium‐ion battery electrodes were imaged using synchrotron‐based X‐ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2‐D image sections generated from tomographic imaging, whereas direct 3‐D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2‐D image sections is bound to be associated with ambiguity and that volume‐based 3‐D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially‐dependent parameters, such as tortuosity and pore‐phase connectivity. PMID:26999804

  3. Comparison of three-dimensional analysis and stereological techniques for quantifying lithium-ion battery electrode microstructures.

    PubMed

    Taiwo, Oluwadamilola O; Finegan, Donal P; Eastwood, David S; Fife, Julie L; Brown, Leon D; Darr, Jawwad A; Lee, Peter D; Brett, Daniel J L; Shearing, Paul R

    2016-09-01

    Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. © 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
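
    A toy numerical illustration of why the comparison comes out this way: for a simple parameter such as phase volume fraction, the stereological estimate from 2-D sections (Delesse principle) matches the direct 3-D measurement, whereas connectivity- and path-dependent parameters such as tortuosity have no such 2-D shortcut:

```python
import numpy as np

# Synthetic binary "electrode" volume: 1 = solid phase, 0 = pore.
rng = np.random.default_rng(5)
volume = (rng.random((64, 64, 64)) < 0.35).astype(np.uint8)

# Direct 3-D measurement: volume fraction of the solid phase.
vf_3d = volume.mean()

# Stereological estimate (Delesse principle): the mean area fraction
# measured on 2-D sections estimates the volume fraction.
vf_2d = np.mean([volume[k].mean() for k in range(volume.shape[0])])

print(vf_3d, vf_2d)   # these agree; tortuosity and pore connectivity,
                      # by contrast, require the full 3-D volume
```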

  4. Demonstration of Wavelet Techniques in the Spectral Analysis of Bypass Transition Data

    NASA Technical Reports Server (NTRS)

    Lewalle, Jacques; Ashpis, David E.; Sohn, Ki-Hyeon

    1997-01-01

    A number of wavelet-based techniques for the analysis of experimental data are developed and illustrated. A multiscale analysis based on the Mexican hat wavelet is demonstrated as a tool for acquiring physical and quantitative information not obtainable by standard signal analysis methods. Experimental data for the analysis came from simultaneous hot-wire velocity traces in a bypass transition of the boundary layer on a heated flat plate. A pair of traces (two components of velocity) at one location was excerpted. A number of ensemble and conditional statistics related to dominant time scales for energy and momentum transport were calculated. The analysis revealed a lack of energy-dominant time scales inside turbulent spots but identified transport-dominant scales inside spots that account for the largest part of the Reynolds stress. Momentum transport was much more intermittent than were energetic fluctuations. This work is the first step in a continuing study of the spatial evolution of these scale-related statistics, the goal being to apply the multiscale analysis results to improve the modeling of transitional and turbulent industrial flows.
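
    A minimal sketch of a Mexican-hat multiscale analysis with PyWavelets; the signal is a synthetic stand-in for a hot-wire velocity trace, and the scale-resolved energy identifies dominant time scales as described above:

```python
import numpy as np
import pywt

# Synthetic stand-in for one hot-wire velocity trace during transition.
fs = 10_000.0
t = np.arange(0, 0.5, 1 / fs)
u = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.default_rng(6).normal(size=t.size)

# Continuous wavelet transform with the Mexican hat ('mexh') wavelet.
scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(u, scales, "mexh", sampling_period=1 / fs)

# Scale-resolved energy: which time scales dominate the fluctuations.
energy_per_scale = (coeffs ** 2).mean(axis=1)
dominant_scale = scales[np.argmax(energy_per_scale)]
print(dominant_scale, freqs[np.argmax(energy_per_scale)])
```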

  5. Advanced analysis techniques for uranium assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  6. Ultrabroadband Phased-Array Receivers Based on Optical Techniques

    DTIC Science & Technology

    2016-02-26

    Final report (AFRL-AFOSR-VA-TR-2016-0121; Christopher Schuetz, University of Delaware; grant FA9550-12-1...) on ultrabroadband phased-array receivers based on optical techniques: a receiver that enables capturing and converting signals across an array using photonic modulators, routing these signals to a central location using

  7. Comparison of sample preparation techniques and data analysis for the LC-MS/MS-based identification of proteins in human follicular fluid.

    PubMed

    Lehmann, Roland; Schmidt, André; Pastuschek, Jana; Müller, Mario M; Fritzsche, Andreas; Dieterle, Stefan; Greb, Robert R; Markert, Udo R; Slevogt, Hortense

    2018-06-25

    The proteomic analysis of complex body fluids by liquid chromatography tandem mass spectrometry (LC-MS/MS) requires the selection of suitable sample preparation techniques and optimal parameter settings in data analysis software packages to obtain reliable results. Proteomic analysis of follicular fluid, as a representative of a complex body fluid similar to serum or plasma, is difficult because it contains a vast amount of highly abundant proteins alongside a variety of proteins at different concentrations. However, the accessibility of this complex body fluid to LC-MS/MS analysis is an opportunity to gain insights into the status and composition of fertility-relevant proteins, including immunological factors, and to discover new diagnostic and prognostic markers for applications such as the treatment of infertility. In this study, we compared different sample preparation methods (FASP, eFASP and in-solution digestion) and three different data analysis software packages (Proteome Discoverer with SEQUEST, Mascot and MaxQuant with Andromeda), combined with semi- and full-tryptic database search options, to obtain maximum coverage of the follicular fluid proteome. We found that the most comprehensive proteome coverage is achieved by the eFASP sample preparation method using SDS in the initial denaturing step, together with SEQUEST-based semi-tryptic data analysis. In conclusion, we have developed a fractionation-free methodical workflow for in-depth LC-MS/MS-based analysis for the standardized investigation of human follicular fluid as an important representative of a complex body fluid. Taken together, we were able to identify a total of 1392 proteins in follicular fluid. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Visualization techniques for tongue analysis in traditional Chinese medicine

    NASA Astrophysics Data System (ADS)

    Pham, Binh L.; Cai, Yang

    2004-05-01

    Visual inspection of the tongue has been an important diagnostic method of Traditional Chinese Medicine (TCM). Clinical data have shown significant connections between various visceral cancers and abnormalities in the tongue and the tongue coating. Visual inspection of the tongue is simple and inexpensive, but the current practice in TCM is mainly experience-based, and the quality of the visual inspection varies between individuals. The computerized inspection method provides quantitative models to evaluate color, texture and surface features on the tongue. In this paper, we investigate visualization techniques and processes to allow interactive data analysis, with the aim of merging computerized measurements with human experts' diagnostic variables based on five-scale diagnostic conditions: Healthy (H), History of Cancers (HC), History of Polyps (HP), Polyps (P) and Colon Cancer (C).

  9. Chromatographic Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved through the development of two instrumental techniques. The efficiency of spectroscopic methods has been extraordinarily improved for the detection and determination of trace REEs in various materials. On the other hand, the determination of REEs very often depends on their preconcentration and separation, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling them with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  10. Separation techniques: Chromatography

    PubMed Central

    Coskun, Ozlem

    2016-01-01

    Chromatography is an important biophysical technique that enables the separation, identification, and purification of the components of a mixture for qualitative and quantitative analysis. Proteins can be purified based on characteristics such as size and shape, total charge, hydrophobic groups present on the surface, and binding capacity with the stationary phase. Four separation techniques based on molecular characteristics and interaction type use mechanisms of ion exchange, surface adsorption, partition, and size exclusion. Other chromatography techniques are based on the stationary bed, including column, thin layer, and paper chromatography. Column chromatography is one of the most common methods of protein purification. PMID:28058406

  11. Image-Based 3d Reconstruction and Analysis for Orthodontia

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of dental arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measurement of tooth parameters and on designing the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets, which are placed on the teeth, and a wire of given shape, clamped by these brackets, to produce the forces needed to move each tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and the difficulty of applying the standard approach to a wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment and documentation, aimed at overcoming these disadvantages, is proposed. The proposed approach provides accurate measurements of the tooth parameters needed for adequate planning, the design of correct tooth positions and the monitoring of the treatment process. The developed technique applies photogrammetric means for 3D model generation of the dental arch, determination of bracket positions and analysis of tooth movement.

  12. Genetic analysis of seasonal runoff based on automatic techniques of hydrometeorological data processing

    NASA Astrophysics Data System (ADS)

    Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey

    2017-04-01

    Detection of a river's feeding type is a complex and multifactor task. Such partitioning should be based, on the one hand, on the genesis of the feeding water and, on the other hand, on its physical path. At the same time, it should consider the relationship of the feeding type with the corresponding phase of the water regime. Because of these difficulties and the complexity of the approach, there are many variants of separating a flow hydrograph into feeding types. The most common method is extraction of the so-called base component, which in one way or another reflects the groundwater feeding of the river. In this case, the separation is most often based on the principle of local minima or on graphical separation of this component. However, neither the origin of the water nor the corresponding phase of the water regime is then considered. In this paper, the authors offer a method of complex automated analysis of the genetic components of a river's feeding together with the separation of specific phases of the water regime. The objects of the study are medium and large rivers of European Russia having a pronounced spring flood formed by meltwater, and summer-autumn and winter low-water periods that are periodically interrupted by rain or thaw flooding. The method is based on the genetic separation of the hydrograph proposed in the 1960s by B. I. Kudelin. This technique is considered for large rivers having a hydraulic connection with groundwater horizons during floods. For better detection of flood genesis, the analysis involves reanalysis data on temperature and precipitation. Separation is based on the following fundamental graphic-analytical principles: ground feeding during the passage of a flood peak tends to zero; the beginning of a flood is determined as the exceeding of a critical low-water discharge; flood periods are determined on the basis of exceeding the critical low-water discharge, and relate to thaw in the case of above-zero temperatures; during thaw and rain floods

  13. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  14. Global precipitation estimates based on a technique for combining satellite-based estimates, rain gauge analysis, and NWP model precipitation information

    NASA Technical Reports Server (NTRS)

    Huffman, George J.; Adler, Robert F.; Rudolf, Bruno; Schneider, Udo; Keehn, Peter R.

    1995-01-01

    The 'satellite-gauge model' (SGM) technique is described for combining precipitation estimates from microwave satellite data, infrared satellite data, rain gauge analyses, and numerical weather prediction models into improved estimates of global precipitation. Throughout, monthly estimates on a 2.5 degrees x 2.5 degrees lat-long grid are employed. First, a multisatellite product is developed using a combination of low-orbit microwave and geosynchronous-orbit infrared data in the latitude range 40 degrees N - 40 degrees S (the adjusted geosynchronous precipitation index) and low-orbit microwave data alone at higher latitudes. Then the rain gauge analysis is brought in, weighting each field by its inverse relative error variance to produce a nearly global, observationally based precipitation estimate. To produce a complete global estimate, the numerical model results are used to fill data voids in the combined satellite-gauge estimate. Our sequential approach to combining estimates allows a user to select the multisatellite estimate, the satellite-gauge estimate, or the full SGM estimate (observationally based estimates plus the model information). The primary limitation in the method is imperfections in the estimation of relative error for the individual fields. The SGM results for one year of data (July 1987 to June 1988) show important differences from the individual estimates, including model estimates as well as climatological estimates. In general, the SGM results are drier in the subtropics than the model and climatological results, reflecting the relatively dry microwave estimates that dominate the SGM in oceanic regions.
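
    The core combination step, weighting each field by its inverse relative error variance, can be sketched as follows; the grid shape, the error variances and the NaN handling are illustrative assumptions, not the SGM implementation:

```python
import numpy as np

def combine(fields, rel_err_var):
    """Inverse-error-variance weighted combination of fields on a common grid.

    fields      : list of 2-D arrays (e.g. multisatellite and gauge analyses)
    rel_err_var : list of 2-D arrays of relative error variance per field
    Cells where every input is missing (NaN) stay NaN.
    """
    num = np.zeros_like(fields[0])
    den = np.zeros_like(fields[0])
    for f, v in zip(fields, rel_err_var):
        w = np.where(np.isnan(f), 0.0, 1.0 / v)   # weight = 1 / error variance
        num += np.nan_to_num(f) * w
        den += w
    return num / np.where(den == 0, np.nan, den)

# Illustrative 2.5-degree grid (72 x 144): a satellite field everywhere plus a
# sparse gauge field, with the gauge analysis assumed four times less uncertain.
sat = np.random.gamma(2.0, 50.0, (72, 144))                         # mm/month
gauge = np.where(np.random.rand(72, 144) < 0.3, np.nan, sat + 10.0)
combined = combine([sat, gauge],
                   [np.full(sat.shape, 4.0), np.full(sat.shape, 1.0)])
```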

  15. Analysis techniques for residual acceleration data

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.

    1990-01-01

    Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from the data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about the dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by the use of a transformation matrix. Application of such analysis techniques to residual acceleration data provides additional information beyond what is available from a time history and increases the effectiveness of post-flight analysis of low-gravity experiments.
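
    A compact illustration of the two manipulations mentioned above, Fourier analysis of a data window and transformation into different coordinate axes; the sample rate, synthetic signal and rotation angle are invented for the sketch:

```python
import numpy as np

fs = 100.0                                  # sample rate, Hz (illustrative)
t = np.arange(0, 60, 1 / fs)
# Synthetic 3-axis residual acceleration: a slow g-jitter tone plus noise.
acc = 1e-4 * np.sin(2 * np.pi * 0.17 * t) + 1e-5 * np.random.randn(3, t.size)

# Time-domain statistics of the record as collected
peak, mean = np.abs(acc).max(), acc.mean()

# Fourier analysis: dominant frequency component of one axis in a data window
window = acc[0, :4096]
spec = np.abs(np.fft.rfft(window * np.hanning(window.size)))
freqs = np.fft.rfftfreq(window.size, 1 / fs)
dominant = freqs[spec[1:].argmax() + 1]     # skip the DC bin

# Transformation matrix: rotate into experiment-fixed axes (30 deg about z)
th = np.radians(30.0)
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])
acc_exp = R @ acc                           # 3 x N array in the new axes
```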

  16. [Application of array-based comparative genomic hybridization technique in genetic analysis of patients with spontaneous abortion].

    PubMed

    Chu, Y; Wu, D; Hou, Q F; Huo, X D; Gao, Y; Wang, T; Wang, H D; Yang, Y L; Liao, S X

    2016-08-25

    To investigate the value of the array-based comparative genomic hybridization (array-CGH) technique for chromosomal analysis of miscarried embryos, and to provide genetic counseling for couples with spontaneous abortion. A total of 382 patients who underwent miscarriage were enrolled in this study. All aborted tissues were analyzed with conventional cytogenetic karyotyping and array-CGH, respectively. All of the 382 specimens were successfully analyzed by array-CGH (100.0%, 382/382), and the detection rate of chromosomal aberrations was 46.6% (178/382). However, conventional karyotype analysis was successfully performed in only 281 cases (73.6%, 281/382), of which 113 (40.2%, 113/281) were found to have chromosomal aberrations. Of the 178 samples identified by array-CGH, 163 (91.6%, 163/178) were aneuploid and 15 (8.4%, 15/178) were segmental deletion and/or duplication cases. Four of 10 cases with small segmental deletions and duplications were validated as inherited from fathers or mothers who were carriers of submicroscopic reciprocal translocations. Of the 113 abnormal karyotypes found by conventional karyotyping, 108 (95.6%, 108/113) were aneuploid and 5 (4.4%, 5/113) had structural chromosome aberrations. Most array-CGH results were consistent with conventional karyotyping, with 3 cases of discrepancy: 2 cases of triploidy and 1 case of low-level mosaicism that were undetected by array-CGH. Compared with conventional karyotyping, there is an increased detection rate of chromosomal abnormalities when array-CGH is used to analyze the products of conception, primarily because of its success with nonviable tissues. It could be a first-line method to determine the cause of miscarriage with higher accuracy and sensitivity.

  17. Flow-based analysis using microfluidics-chemiluminescence systems.

    PubMed

    Al Lawati, Haider A J

    2013-01-01

    This review will discuss various approaches and techniques in which analysis using microfluidics-chemiluminescence systems (MF-CL) has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. A hydrodynamic pumping system is predominately used for these applications. However, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidics channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as mirror reaction, liquid core waveguide, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs). Copyright © 2012 John Wiley & Sons, Ltd.

  18. Error analysis of multi-needle Langmuir probe measurement technique.

    PubMed

    Barjatya, Aroh; Merritt, William

    2018-04-01

    Multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of QB50 CubeSat constellation. This paper takes a fundamental look into the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.
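
    A hedged sketch of the analysis step at issue here: in the m-NLP technique several needles are held at fixed bias voltages, and electron density is inferred from a linear fit of the collected current squared against bias. The bias values, currents, and the geometry factor kappa below are placeholders, not the instrument's published constants:

```python
import numpy as np

V = np.array([2.5, 4.0, 5.5, 10.0])          # needle biases, volts (illustrative)
I = np.array([1.1, 1.4, 1.6, 2.2]) * 1e-6    # collected currents, amps (illustrative)

slope, intercept = np.polyfit(V, I**2, 1)    # linear fit of I^2 versus bias
kappa = 1.0                                  # placeholder geometry/physics factor
ne = kappa * np.sqrt(slope)                  # density proportional to sqrt(slope)
```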

  19. Error analysis of multi-needle Langmuir probe measurement technique

    NASA Astrophysics Data System (ADS)

    Barjatya, Aroh; Merritt, William

    2018-04-01

    Multi-needle Langmuir probe is a fairly new instrument technique that has been flown on several recent sounding rockets and is slated to fly on a subset of QB50 CubeSat constellation. This paper takes a fundamental look into the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures could easily result in errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.

  20. Design and Evaluation of Perceptual-based Object Group Selection Techniques

    NASA Astrophysics Data System (ADS)

    Dehmeshki, Hoda

    Selecting groups of objects is a frequent task in graphical user interfaces. It is required prior to many standard operations such as deletion, movement, or modification. Conventional selection techniques are lasso, rectangle selection, and the selection and de-selection of items through the use of modifier keys. These techniques may become time-consuming and error-prone when target objects are densely distributed or when the distances between target objects are large. Perceptual-based selection techniques can considerably improve selection tasks when targets have a perceptual structure, for example when arranged along a line. Current methods to detect such groups use ad hoc grouping algorithms that are not based on results from perception science. Moreover, these techniques do not allow selecting groups with arbitrary arrangements or permit modifying a selection. This dissertation presents two domain-independent perceptual-based systems that address these issues. Based on established group detection models from perception research, the proposed systems detect perceptual groups formed by the Gestalt principles of good continuation and proximity. The new systems provide gesture-based or click-based interaction techniques for selecting groups with curvilinear or arbitrary structures as well as clusters. Moreover, the gesture-based system is adapted for the graph domain to facilitate path selection. This dissertation includes several user studies that show the proposed systems outperform conventional selection techniques when targets form salient perceptual groups and are still competitive when targets are semi-structured.

  1. Applications Of Binary Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) the human view direction is measured at TV frame rate while the subject's head remains freely movable; (2) industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot; (3) in automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  2. Paper-based analytical devices for environmental analysis.

    PubMed

    Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S

    2016-03-21

    The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; the theme of ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.

  3. Complex fluid flow and heat transfer analysis inside a calandria based reactor using CFD technique

    NASA Astrophysics Data System (ADS)

    Kulkarni, P. S.

    2017-04-01

    A series of numerical experiments has been carried out on a calandria-based reactor to optimize the design and increase the overall heat transfer efficiency using the Computational Fluid Dynamics (CFD) technique. Fluid flow and heat transfer inside the calandria are governed by many geometric and flow parameters, such as the orientation of the inlet, the inlet mass flow rate, the fuel channel configuration (in-line, staggered, etc.) and the locations of the inlet and outlet. It is well established that heat transfer is higher wherever forced convection dominates, but for geometries like a calandria it is very difficult to achieve forced-convection flow everywhere; in turn, the flow strongly depends on the direction of the inlet jet. In the present paper the initial design was optimized with respect to the inlet jet angle, and the optimized design was numerically tested for different heat-load and mass-flow conditions. To further increase the heat removal capacity of the calandria, additional numerical studies were carried out for different inlet geometries. In all the analyses the same overall geometry size and the same number of tubes were considered. The work gives good insight into the fluid flow and heat transfer inside the calandria and offers a guideline for optimizing the design and/or enhancing the capacity of the present design.

  4. Analysis techniques for background rejection at the Majorana Demonstrator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuestra, Clara; Rielage, Keith Robert; Elliott, Steven Ray

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  5. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques

    PubMed Central

    Bergquist, Magnus; Nilsson, Andreas; Hansla, André

    2017-01-01

    Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies participants (n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that the contest-based intervention technique frames a gain goal, while the norm-based intervention frames a normative goal. PMID:29218026

  6. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques.

    PubMed

    Bergquist, Magnus; Nilsson, Andreas; Hansla, André

    2017-01-01

    Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies participants (n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that the contest-based intervention technique frames a gain goal, while the norm-based intervention frames a normative goal.

  7. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.
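
    One simple way to cast pairwise choice tasks as a classification problem, in the spirit of the classification-based approach discussed above, is to train a linear classifier on attribute-difference vectors, whose weights then play the role of part-worth utilities. The data generation and attribute coding below are synthetic:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic choice-based conjoint data: each row is one pairwise choice task,
# with options coded by four binary (dummy-coded) attributes.
rng = np.random.default_rng(0)
true_partworths = np.array([1.5, -0.5, 0.8, 0.2])
A = rng.integers(0, 2, (500, 4))            # option A in each task
B = rng.integers(0, 2, (500, 4))            # option B in each task
u = (A - B) @ true_partworths               # utility difference drives choice
chose_A = (rng.random(500) < 1 / (1 + np.exp(-u))).astype(int)

# Classification view: predict the choice from the attribute-difference vector.
clf = LogisticRegression().fit(A - B, chose_A)
estimated_partworths = clf.coef_[0]         # recovers relative attribute utilities
```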

  8. Advantage of the modified Lunn-McNeil technique over Kalbfleisch-Prentice technique in competing risks

    NASA Astrophysics Data System (ADS)

    Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.

    2002-03-01

    Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas; for example, it can be used to analyze the failure times of crashed aircraft. Another survival analysis tool is competing risks, where we have more than one cause of failure acting simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples of various sizes and censoring percentages were generated and fitted using both techniques. The study was conducted by comparing the inference of the models using the root mean square error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on the power test measurements.
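
    A sketch of the Kalbfleisch-Prentice approach as described above, fitting one Cox model per failure type with the other cause treated as censored. It assumes the lifelines package; the synthetic data, covariate effects and sample size are invented:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic competing-risks data: two latent failure causes plus censoring.
rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
t1 = rng.exponential(np.exp(-0.5 * x))   # latent time to failure, cause 1
t2 = rng.exponential(np.exp(0.3 * x))    # latent time to failure, cause 2
c = rng.exponential(2.0, n)              # censoring time
time = np.minimum.reduce([t1, t2, c])
event = np.select([t1 == time, t2 == time], [1, 2], default=0)
df = pd.DataFrame({"time": time, "x": x, "event": event})

# Kalbfleisch-Prentice: a separate Cox model for each failure type, with
# failures from the other cause treated as censored observations.
fits = {}
for cause in (1, 2):
    d = df.assign(E=(df["event"] == cause).astype(int))[["time", "x", "E"]]
    fits[cause] = CoxPHFitter().fit(d, duration_col="time", event_col="E")
    # fits[cause].print_summary() shows the cause-specific hazard ratios
```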

  9. Figure analysis: A teaching technique to promote visual literacy and active learning.

    PubMed

    Wiles, Amy M

    2016-07-08

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is not a technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016.

  10. Application of dermoscopy image analysis technique in diagnosing urethral condylomata acuminata.

    PubMed

    Zhang, Yunjie; Jiang, Shuang; Lin, Hui; Guo, Xiaojuan; Zou, Xianbiao

    2018-01-01

    To study the application of the dermoscopy image analysis technique in the clinical diagnosis of urethral condylomata acuminata. In this study, 220 cases with suspected urethral condylomata acuminata were first diagnosed clinically with the naked eye and then examined using the dermoscopy image analysis technique; a comparative analysis was then made of the two diagnostic methods. Among the 220 suspected urethral condylomata acuminata, there was a higher positive rate by dermoscopy examination than by visual observation. The dermoscopy examination technique is still restricted by its inapplicability to the deep urethral orifice and skin wrinkles, and concordance between different clinicians may also vary. The dermoscopy image analysis technique features high sensitivity and quick, accurate diagnosis, and is non-invasive; we recommend its use.

  11. Application of gas chromatography to analysis of spirit-based alcoholic beverages.

    PubMed

    Wiśniewska, Paulina; Śliwińska, Magdalena; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek

    2015-01-01

    Spirit-based beverages are alcoholic drinks whose production processes depend on the type and origin of the raw materials. The composition of this complex matrix is difficult to analyze, and scientists commonly choose gas chromatography techniques for this reason. With a wide selection of extraction methods and detectors, it is possible to provide qualitative and quantitative analysis of many chemical compounds with various functional groups. This article describes different types of gas chromatography techniques, their most commonly used associated extraction techniques (e.g., LLE, SPME, SPE, SFE, and SBME) and detectors (MS, TOFMS, FID, ECD, NPD, AED, O or EPD). Additionally, brief characteristics of internationally popular spirit-based beverages and applications of gas chromatography to the analysis of selected alcoholic drinks are presented.

  12. Reliability analysis of laminated CMC components through shell subelement techniques

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.

    1992-01-01

    An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.

  13. A histogram-based technique for rapid vector extraction from PIV photographs

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.

    1991-01-01

    A new analysis technique, performed totally in the image plane, is proposed which rapidly extracts all available vectors from individual interrogation regions on PIV photographs. The technique avoids the need for using Fourier transforms with the associated computational burden. The data acquisition and analysis procedure is described, and results of a preliminary simulation study to evaluate the accuracy of the technique are presented. Recently obtained PIV photographs are analyzed.

  14. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    NASA Astrophysics Data System (ADS)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place, incorporating spatial analysis and mapping techniques together with methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin, when aiming to delineate place in terms of objects, is object-based image analysis (OBIA).

  15. Initial planetary base construction techniques and machine implementation

    NASA Technical Reports Server (NTRS)

    Crockford, William W.

    1987-01-01

    Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.

  16. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  17. Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition

    NASA Astrophysics Data System (ADS)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso

    2005-04-01

    Human movement analysis is generally performed using marker-based systems, which allow reconstructing, with high levels of accuracy, the trajectories of markers placed on specific points of the human body. Marker-based systems, however, show some drawbacks that can be overcome by the use of video systems applying markerless techniques. In this paper, a specifically designed computer vision technique for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre decomposition, and a principal component analysis (PCA) technique is used to circumscribe the region of interest. Results obtained in both synthetic and experimental tests show a significant reduction of the computational costs, with no significant reduction of the tracking accuracy.
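
    The PCA step for circumscribing a region of interest can be illustrated as follows; the silhouette mask and the 2.5-sigma extent are illustrative assumptions, not the authors' parameters:

```python
import numpy as np
from sklearn.decomposition import PCA

def roi_from_silhouette(mask, n_std=2.5):
    """Circumscribe a region of interest with PCA: fit principal axes to the
    foreground pixel coordinates and return the centre, axis directions and
    extents covering roughly n_std standard deviations along each axis."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(float)
    pca = PCA(n_components=2).fit(pts)
    extents = n_std * np.sqrt(pca.explained_variance_)
    return pca.mean_, pca.components_, extents

# Toy silhouette: an elliptical blob standing in for a tracked body segment.
yy, xx = np.mgrid[0:200, 0:200]
mask = ((xx - 100) / 60) ** 2 + ((yy - 100) / 25) ** 2 < 1
centre, axes, extents = roi_from_silhouette(mask)
```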

  18. Sensitivity analysis of hybrid thermoelastic techniques

    Treesearch

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  19. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation) to a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule-as-an-independent-variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  20. Behavior Change Techniques in Apps for Medication Adherence: A Content Analysis.

    PubMed

    Morrissey, Eimear C; Corbett, Teresa K; Walsh, Jane C; Molloy, Gerard J

    2016-05-01

    There are a vast number of smartphone applications (apps) aimed at promoting medication adherence on the market; however, the theory and evidence base in terms of applying established health behavior change techniques underpinning these apps remains unclear. This study aimed to code these apps using the Behavior Change Technique Taxonomy (v1) for the presence or absence of established behavior change techniques. The sample of apps was identified through systematic searches in both the Google Play Store and Apple App Store in February 2015. All apps that fell into the search categories were downloaded for analysis. The downloaded apps were screened with exclusion criteria, and suitable apps were reviewed and coded for behavior change techniques in March 2015. Two researchers performed coding independently. In total, 166 medication adherence apps were identified and coded. The number of behavior change techniques contained in an app ranged from zero to seven (mean=2.77). A total of 12 of a possible 96 behavior change techniques were found to be present across apps. The most commonly included behavior change techniques were "action planning" and "prompt/cues," which were included in 96% of apps, followed by "self-monitoring" (37%) and "feedback on behavior" (36%). The current extent to which established behavior change techniques are used in medication adherence apps is limited. The development of medication adherence apps may not have benefited from advances in the theory and practice of health behavior change. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  1. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  2. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.

  3. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  4. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad

    2017-04-01

    Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges, including feature redundancy, unbalanced data, and small sample sizes, have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving the predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of the prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were the optimal predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, the Synthetic Minority Over-sampling Technique (SMOTE) was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
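
    A sketch of the reported optimal configuration (PCA against feature redundancy, SMOTE against class imbalance, Random Forest for prediction) using scikit-learn and imbalanced-learn on stand-in data; the component count, hyperparameters and random feature matrix are illustrative, not the study's settings:

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Stand-in radiomics matrix: 112 patients x 400 features, with an unbalanced
# binary endpoint (e.g. recurrence) as in the study's cohort size.
rng = np.random.default_rng(0)
X = rng.normal(size=(112, 400))
y = (rng.random(112) < 0.2).astype(int)

# imblearn's Pipeline applies SMOTE only during fitting, so the cross-validation
# test folds are never oversampled.
pipe = Pipeline([
    ("pca", PCA(n_components=10)),
    ("smote", SMOTE(random_state=0)),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
auc = cross_val_score(pipe, X, y, cv=5, scoring="roc_auc").mean()
```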

  5. Loss-compensation technique using a split-spectrum approach for optical fiber air-gap intensity-based sensors

    NASA Astrophysics Data System (ADS)

    Wang, Anbo; Miller, Mark S.; Gunther, Michael F.; Murphy, Kent A.; Claus, Richard O.

    1993-03-01

    A self-referencing technique compensating for fiber losses and source fluctuations in air-gap intensity-based optical fiber sensors is described and demonstrated. A resolution of 0.007 micron has been obtained over a measurement range of 0-250 microns for an intensity-based displacement sensor using this referencing technique. The sensor is shown to have minimal sensitivity to fiber bending losses and variations in the LED input power. A theoretical model for evaluation of step-index multimode optical fiber splice is proposed. The performance of the sensor as a displacement sensor agrees well with the theoretical analysis.

  6. Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics

    ERIC Educational Resources Information Center

    Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano

    2017-01-01

    We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses…

  7. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  8. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    NASA Astrophysics Data System (ADS)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  9. A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.

    The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis is discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
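
    A toy sketch of the graph synthesis step, assuming networkx and an invented four-job record set; the edge-weight rule (shared compute nodes plus time proximity) is an illustrative stand-in for the report's ontology, not its actual weighting scheme:

```python
import itertools
import networkx as nx

# Toy job records: (job_id, user, compute nodes used, start hour), standing in
# for rows from the database of queuing and execution data.
jobs = [
    ("j1", "alice", {"n01", "n02"}, 0),
    ("j2", "bob",   {"n02", "n03"}, 1),
    ("j3", "alice", {"n07"},        1),
    ("j4", "carol", {"n01", "n03"}, 9),
]

G = nx.Graph()
for jid, user, nodes, start in jobs:
    G.add_node(jid, kind="job", user=user, start=start)

# Weighted edges expressing relationships between jobs: shared compute nodes
# and proximity in time, two of the factors named in the abstract.
for (a, ua, na, sa), (b, ub, nb, sb) in itertools.combinations(jobs, 2):
    w = len(na & nb) + (1.0 if abs(sa - sb) <= 2 else 0.0)
    if w > 0:
        G.add_edge(a, b, weight=w)

# Crude graph-based clustering: connected components of the relationship graph.
clusters = list(nx.connected_components(G))
```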

  10. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    NASA Technical Reports Server (NTRS)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.
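
    The same integrate/simplify-then-generate-FORTRAN workflow can be sketched today with SymPy; the trial functions and the stiffness-like coefficient below are invented for illustration, not the paper's laminated-plate expressions:

```python
import sympy as sp

# Toy Rayleigh-Ritz style coefficient: integrate a product of trial-function
# derivatives over the domain symbolically, simplify, then emit a FORTRAN
# statement, mirroring the MACSYMA workflow described above.
x, a, h = sp.symbols("x a h", positive=True)
phi1 = x * (a - x)              # illustrative trial functions
phi2 = x**2 * (a - x)
k12 = sp.simplify(sp.integrate(h * sp.diff(phi1, x) * sp.diff(phi2, x), (x, 0, a)))
print(sp.fcode(k12, assign_to="K12"))   # ready-to-compile FORTRAN assignment
```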

  11. Subspace techniques to remove artifacts from EEG: a quantitative analysis.

    PubMed

    Teixeira, A R; Tome, A M; Lang, E W; Martins da Silva, A

    2008-01-01

    In this work we discuss and apply projective subspace techniques to both multichannel and single-channel recordings. The single-channel approach is based on singular spectrum analysis (SSA), and the multichannel approach uses the extended infomax algorithm, which is implemented in the open-source toolbox EEGLAB. Both approaches are evaluated using artificial mixtures of a set of selected EEG signals. The latter were selected visually to contain, as the dominant activity, one of the characteristic bands of an electroencephalogram (EEG). The evaluation is performed in both the time and frequency domains by using correlation coefficients and the coherence function, respectively.
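
    A minimal single-channel SSA implementation of the embed/decompose/reconstruct cycle, applied to a synthetic EEG-like signal; the window length, retained components and test signal are illustrative, not the paper's settings:

```python
import numpy as np

def ssa_reconstruct(x, L, keep):
    """Single-channel singular spectrum analysis.

    x    : 1-D signal (e.g. one EEG channel)
    L    : embedding window length
    keep : indices of singular components to retain
    """
    n = len(x)
    K = n - L + 1
    # Embed: build the L x K trajectory matrix from lagged copies of x
    X = np.column_stack([x[i:i + L] for i in range(K)])
    # Decompose: singular value decomposition of the trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = sum(s[k] * np.outer(U[:, k], Vt[k]) for k in keep)
    # Reconstruct: diagonal averaging (Hankelization) back to a 1-D signal
    y = np.zeros(n)
    counts = np.zeros(n)
    for i in range(L):
        for j in range(K):
            y[i + j] += Xr[i, j]
            counts[i + j] += 1
    return y / counts

# Illustrative use: keep the leading subspace of a noisy alpha-band-like signal.
t = np.arange(0, 2, 1 / 250.0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.8 * np.random.randn(t.size)
clean = ssa_reconstruct(eeg, L=50, keep=[0, 1])
```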

  12. Intramuscular injection technique: an evidence-based approach.

    PubMed

    Ogston-Tuck, Sherri

    2014-09-30

    Intramuscular injections require a thorough and meticulous approach to patient assessment and injection technique. This article, the second in a series of two, reviews the evidence base to inform safer practice and to consider the evidence for nursing practice in this area. A framework for safe practice is included, identifying important points for safe technique, patient care and clinical decision making. It also highlights the ongoing debate in the selection of intramuscular injection sites, predominantly the ventrogluteal and dorsogluteal muscles.

  13. Analysis of security of optical encryption with spatially incoherent illumination technique

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Shifrina, Anna V.

    2017-03-01

    Applications of optical methods for encryption have attracted the interest of researchers for decades. The first and most popular is the double random phase encoding (DRPE) technique, and many optical encryption techniques are based on it. The main advantage of DRPE-based techniques is high security: the first random phase mask transforms the spectrum of the image to be encrypted into a white spectrum, so that encrypted images also have white spectra. The downsides are the necessity of a holographic registration scheme, required to register not only the light intensity distribution but also its phase distribution, and the speckle noise that arises under coherent illumination. These disadvantages can be eliminated by using incoherent rather than coherent illumination: phase registration no longer matters, so no holographic setup is needed, and the speckle noise is gone. This technique avoids the drawbacks inherent to coherent methods; however, since only the light intensity distribution is considered, the mean value of the image to be encrypted is always above zero, which leads to an intense zero-spatial-frequency peak in the image spectrum. Consequently, in the case of spatially incoherent illumination, neither the image spectrum nor the encryption key spectrum can be white. This might be used to attack the encryption system. Moreover, if the encryption key is very sparse, the encrypted image might contain parts of, or even the whole, unhidden original image. Therefore, in this paper an analysis of the security of optical encryption with spatially incoherent illumination is conducted as a function of encryption key size and density.
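
    The zero-frequency peak the authors exploit is easy to see in a toy model that treats spatially incoherent encryption as intensity convolution of the image with a sparse key (all sizes and densities below are arbitrary assumptions):

        import numpy as np

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))  # stand-in for the image to be encrypted
        key = (rng.random((64, 64)) < 0.01).astype(float)  # sparse random key

        # spatially incoherent encryption modeled as cyclic intensity convolution
        enc = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(key)))

        spec = np.abs(np.fft.fft2(enc))
        print(spec[0, 0] / spec.mean())  # the DC peak dominates: the spectrum is not white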

  14. Investigations on landmine detection by neutron-based techniques.

    PubMed

    Csikai, J; Dóczi, R; Király, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify antipersonnel landmines (APMs) are discussed. New results have been achieved in the fields of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: the neutron hand-held detector is suitable for observing the anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m(2) per 1.5 min; the reflection cross section of thermal neutrons made it possible to determine the equivalent thickness of different soil components; a simple method was developed for determining the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources make the identification of APMs possible; knowledge of the leakage spectra of different source neutrons is indispensable for determining the differential and integrated reaction rates and, through them, the dimension of the interrogated volume; precise determination of the C/O atom fraction requires investigation of the angular distribution of the 6.13 MeV gamma ray emitted in the (16)O(n,n'gamma) reaction. These results, in addition to the identification of landmines, make possible the improvement of non-intrusive neutron methods.

  15. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei, which decay immediately to the ground state with the emission of energetic gamma rays. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) it is nondestructive; (2) it can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) it is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  16. Analysis technique for controlling system wavefront error with active/adaptive optics

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system-level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system-level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state-space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
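
    The core linear-optics step reduces to a least-squares solve for actuator commands; a hedged numpy sketch with a made-up influence matrix (this is not SigFit's implementation):

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.standard_normal((200, 12))  # influence functions: WFE per unit actuator stroke
        w = rng.standard_normal(200)        # measured system wavefront error

        cmd, *_ = np.linalg.lstsq(A, -w, rcond=None)  # commands that best cancel w
        residual = w + A @ cmd
        print(np.linalg.norm(w), np.linalg.norm(residual))  # corrected RMS is smaller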

  17. Robust Global Image Registration Based on a Hybrid Algorithm Combining Fourier and Spatial Domain Techniques

    DTIC Science & Technology

    2012-09-01

    Robust global image registration based on a hybrid algorithm combining Fourier and spatial domain techniques. Peter N. Crabtree, Collin Seanor, … September 2012. These results demonstrate performance of a hybrid algorithm, from analysis of a set of images of an ISO 12233 [12] resolution chart captured in the

  18. Integrated Formulation of Beacon-Based Exception Analysis for Multimissions

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; James, Mark; Park, Han; Zak, Mickail

    2003-01-01

    Further work on beacon-based exception analysis for multimissions (BEAM), a method of real-time, automated diagnosis of complex electromechanical systems, has greatly expanded its capability and suitability of application. This expanded formulation, which fully integrates physical models and symbolic analysis, is described. The new formulation of BEAM expands upon previous advanced techniques for the analysis of signal data, utilizing mathematical modeling of the system physics and expert-system reasoning.

  19. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, A; Samost, A; Viswanathan, A

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze

  20. Electrical Load Profile Analysis Using Clustering Techniques

    NASA Astrophysics Data System (ADS)

    Damayanti, R.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.

    2017-03-01

    Data mining is one of the data processing techniques used to extract information from a set of stored data. Every day, electricity load consumption is recorded by the electrical company, usually at intervals of 15 or 30 minutes. This paper uses clustering, one of the data mining techniques, to analyse electrical load profiles during 2014. Three clustering methods were compared, namely K-Means (KM), Fuzzy C-Means (FCM), and K-Harmonic Means (KHM). The results show that KHM is the most appropriate method for classifying the electrical load profiles. The optimum number of clusters is determined using the Davies-Bouldin index. By grouping the load profiles, demand variation analysis and estimation of energy losses for groups of load profiles with similar patterns can be carried out. From these groups, the cluster load factor and a range of cluster loss factors can be obtained, which helps to find the range of coefficient values for estimating energy losses without performing load flow studies.
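
    A compact version of the comparison loop, with K-Means standing in for all three methods (scikit-learn has no KHM implementation) and the Davies-Bouldin index used to score candidate cluster counts on synthetic profiles:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import davies_bouldin_score

        rng = np.random.default_rng(0)
        profiles = rng.random((365, 48))  # one year of half-hourly daily load profiles

        for k in range(2, 7):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(profiles)
            print(k, davies_bouldin_score(profiles, labels))  # lower is better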

  1. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    PubMed

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
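
    As an illustration of the clustering stage only (the full pipeline also includes the classification stage described above; the stain-color heuristic below is an assumption, not the paper's rule):

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        pixels = rng.random((10000, 3))  # RGB pixels of a liver tissue region (stand-in)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
        # assume the redder cluster corresponds to stained collagen (stain-dependent)
        collagen = labels == np.argmax([pixels[labels == c][:, 0].mean() for c in (0, 1)])
        cpa = 100.0 * collagen.mean()  # collagen proportional area, in percent
        print(f"CPA = {cpa:.2f}%")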

  2. A preclustering-based ensemble learning technique for acute appendicitis diagnoses.

    PubMed

    Lee, Yen-Hsien; Hu, Paul Jen-Hwa; Cheng, Tsang-Hsiang; Huang, Te-Chia; Chuang, Wei-Yao

    2013-06-01

    Acute appendicitis is a common medical condition, whose effective, timely diagnosis can be difficult. A missed diagnosis not only puts the patient in danger but also requires additional resources for corrective treatments. An acute appendicitis diagnosis constitutes a classification problem, for which a further fundamental challenge pertains to the skewed outcome class distribution of instances in the training sample. A preclustering-based ensemble learning (PEL) technique aims to address the associated imbalanced sample learning problems and thereby support the timely, accurate diagnosis of acute appendicitis. The proposed PEL technique employs undersampling to reduce the number of majority-class instances in a training sample, uses preclustering to group similar majority-class instances into multiple groups, and selects from each group representative instances to create more balanced samples. The PEL technique thereby reduces potential information loss from random undersampling. It also takes advantage of ensemble learning to improve performance. We empirically evaluate this proposed technique with 574 clinical cases obtained from a comprehensive tertiary hospital in southern Taiwan, using several prevalent techniques and a salient scoring system as benchmarks. The comparative results show that PEL is more effective and less biased than any benchmarks. The proposed PEL technique seems more sensitive to identifying positive acute appendicitis than the commonly used Alvarado scoring system and exhibits higher specificity in identifying negative acute appendicitis. In addition, the sensitivity and specificity values of PEL appear higher than those of the investigated benchmarks that follow the resampling approach. Our analysis suggests PEL benefits from the more representative majority-class instances in the training sample. According to our overall evaluation results, PEL records the best overall performance, and its area under the curve measure reaches 0.619.
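
    A sketch of the preclustering-based undersampling idea under assumed parameters: group the majority class, then keep the instances nearest each group centroid so the reduced sample stays representative.

        import numpy as np
        from sklearn.cluster import KMeans

        def precluster_undersample(X_major, n_target, n_groups=10):
            """Return roughly n_target representative majority-class instances."""
            km = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit(X_major)
            per_group = max(1, n_target // n_groups)
            picked = []
            for c in range(n_groups):
                idx = np.where(km.labels_ == c)[0]
                d = np.linalg.norm(X_major[idx] - km.cluster_centers_[c], axis=1)
                picked.extend(idx[np.argsort(d)[:per_group]])  # nearest to centroid
            return X_major[picked]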

  3. Analysis of RDSS positioning accuracy based on RNSS wide area differential technique

    NASA Astrophysics Data System (ADS)

    Xing, Nan; Su, RanRan; Zhou, JianHua; Hu, XiaoGong; Gong, XiuQiang; Liu, Li; He, Feng; Guo, Rui; Ren, Hui; Hu, GuangMing; Zhang, Lei

    2013-10-01

    The BeiDou Navigation Satellite System (BDS) provides a Radio Navigation Service System (RNSS) as well as a Radio Determination Service System (RDSS). RDSS users obtain positioning by responding to Master Control Center (MCC) inquiries via signals transmitted through a GEO satellite transponder. The positioning result can be calculated with an elevation constraint by the MCC. The primary error sources affecting RDSS positioning accuracy are the RDSS signal transceiver delay, atmospheric transmission delay and GEO satellite position error. During GEO orbit maneuvers, poor orbit forecast accuracy significantly degrades RDSS services. A real-time 3-D orbital correction method based on the wide-area differential technique is proposed to correct the orbital error. Results from observations show that the method successfully improves positioning precision during orbital maneuvers, independent of the RDSS reference station; the improvement can reach a maximum of 50%. Accurate calibration of the RDSS signal transceiver delay and a precise digital elevation map may play a critical role in high-precision RDSS positioning services.

  4. A PCR technique based on the Hip1 interspersed repetitive sequence distinguishes cyanobacterial species and strains.

    PubMed

    Smith, J K; Parry, J D; Day, J G; Smith, R J

    1998-10-01

    The use of primers based on the Hip1 sequence as a typing technique for cyanobacteria has been investigated. The discovery of short repetitive sequence structures in bacterial DNA during the last decade has led to the development of PCR-based methods for typing, i.e., distinguishing and identifying, bacterial species and strains. An octameric palindromic sequence known as Hip1 has been shown to be present in the chromosomal DNA of many species of cyanobacteria as a highly repetitious interspersed sequence. PCR primers were constructed that extended the Hip1 sequence at the 3' end by two bases. Five of the 16 possible extended primers were tested. Each of the five primers produced a different set of products when used to prime PCR from cyanobacterial genomic DNA. Each primer produced a distinct set of products for each of the 15 cyanobacterial species tested. The ability of Hip1-based PCR to resolve taxonomic differences was assessed by analysis of independent isolates of Anabaena flos-aquae and Nostoc ellipsosporum obtained from the CCAP (Culture Collection of Algae and Protozoa, IFE, Cumbria, UK). A PCR-based RFLP analysis of products amplified from the 23S-16S rDNA intergenic region was used to characterize the isolates and to compare with the Hip1 typing data. The RFLP and Hip1 typing yielded similar results and both techniques were able to distinguish different strains. On the basis of these results it is suggested that the Hip1 PCR technique may assist in distinguishing cyanobacterial species and strains.

  5. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    PubMed Central

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  6. Comparison of the Joel-Cohen-based technique and the transverse Pfannenstiel for caesarean section for safety and effectiveness: A systematic review and meta-analysis

    PubMed Central

    Olyaeemanesh, Alireza; Bavandpour, Elahe; Mobinizadeh, Mohammadreza; Ashrafinia, Mansoor; Bavandpour, Maryam; Nouhi, Mojtaba

    2017-01-01

    Background: Caesarean section (C-section) is the most common surgery among women worldwide, and the global rate of this surgical procedure has been continuously rising. Hence, it is significantly crucial to develop and apply highly effective and safe caesarean section techniques. In this review study, we aimed at assessing the safety and effectiveness of the Joel-Cohen-based technique and comparing the results with the transverse Pfannenstiel incision for C-section. Methods: In this study, various reliable databases such as PubMed Central, COCHRANE, DARE, and Ovid MEDLINE were targeted. Reviews, systematic reviews, and randomized clinical trial studies comparing the Joel-Cohen-based technique and the transverse Pfannenstiel incision were selected based on the inclusion criteria. Selected studies were checked by 2 independent reviewers based on the inclusion criteria, and the quality of these studies was assessed. Then, their data were extracted and analyzed. Results: Five randomized clinical trial studies met the inclusion criteria. According to the existing evidence, statistical results of the Joel-Cohen-based technique showed that this technique is more effective compared to the transverse Pfannenstiel incision. Meta-analysis results of the 3 outcomes were as follows: operation time (5 trials, 764 women; WMD -9.78 minutes; 95% CI: -14.49 to -5.07; p<0.001), blood loss (3 trials, 309 women; WMD -53.23 ml; 95% CI: -90.20 to -16.26; p=0.004), and post-operative hospital stay (3 trials, 453 women; WMD -0.69 day; 95% CI: -1.4 to -0.03; p<0.001). Statistical results revealed a significant difference between the 2 techniques. Conclusion: According to the literature, despite having a number of side effects, the Joel-Cohen-based technique is generally more effective than the Pfannenstiel incision technique. In addition, it was recommended that the Joel-Cohen-based technique be used as a replacement for the Pfannenstiel incision technique according to the surgeons' preferences.

  7. Comparison of the Joel-Cohen-based technique and the transverse Pfannenstiel for caesarean section for safety and effectiveness: A systematic review and meta-analysis.

    PubMed

    Olyaeemanesh, Alireza; Bavandpour, Elahe; Mobinizadeh, Mohammadreza; Ashrafinia, Mansoor; Bavandpour, Maryam; Nouhi, Mojtaba

    2017-01-01

    Background: Caesarean section (C-section) is the most common surgery among women worldwide, and the global rate of this surgical procedure has been continuously rising. Hence, it is significantly crucial to develop and apply highly effective and safe caesarean section techniques. In this review study, we aimed at assessing the safety and effectiveness of the Joel-Cohen-based technique and comparing the results with the transverse Pfannenstiel incision for C-section. Methods: In this study, various reliable databases such as PubMed Central, COCHRANE, DARE, and Ovid MEDLINE were targeted. Reviews, systematic reviews, and randomized clinical trial studies comparing the Joel-Cohen-based technique and the transverse Pfannenstiel incision were selected based on the inclusion criteria. Selected studies were checked by 2 independent reviewers based on the inclusion criteria, and the quality of these studies was assessed. Then, their data were extracted and analyzed. Results: Five randomized clinical trial studies met the inclusion criteria. According to the existing evidence, statistical results of the Joel-Cohen-based technique showed that this technique is more effective compared to the transverse Pfannenstiel incision. Meta-analysis results of the 3 outcomes were as follows: operation time (5 trials, 764 women; WMD -9.78 minutes; 95% CI: -14.49 to -5.07; p<0.001), blood loss (3 trials, 309 women; WMD -53.23 ml; 95% CI: -90.20 to -16.26; p=0.004), and post-operative hospital stay (3 trials, 453 women; WMD -0.69 day; 95% CI: -1.4 to -0.03; p<0.001). Statistical results revealed a significant difference between the 2 techniques. Conclusion: According to the literature, despite having a number of side effects, the Joel-Cohen-based technique is generally more effective than the Pfannenstiel incision technique. In addition, it was recommended that the Joel-Cohen-based technique be used as a replacement for the Pfannenstiel incision technique according to the surgeons' preferences.
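
    For readers unfamiliar with how such pooled WMDs arise, a worked fixed-effect inverse-variance pooling on made-up trial values (not the review's data):

        import numpy as np

        wmd = np.array([-8.0, -12.0, -9.5])  # per-trial mean differences (hypothetical)
        se = np.array([3.0, 2.5, 4.0])       # per-trial standard errors (hypothetical)

        w = 1.0 / se**2                      # inverse-variance weights
        pooled = np.sum(w * wmd) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
        print(pooled, ci)                    # pooled WMD and its 95% CI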

  8. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
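
    A minimal sketch of extracting a normalized contrast evolution from flash-thermography video (region choices and array shapes are illustrative assumptions, not the paper's calibration procedure):

        import numpy as np

        rng = np.random.default_rng(0)
        video = rng.random((100, 64, 64))  # frames x height x width IR video (stand-in)

        defect = video[:, 30:34, 30:34].mean(axis=(1, 2))  # region over the anomaly
        ref = video[:, :10, :10].mean(axis=(1, 2))         # defect-free reference region
        contrast = (defect - ref) / ref                    # normalized contrast evolution
        t_peak = contrast.argmax()                         # a simple measurement feature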

  9. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.

    2001-09-01

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning of exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which is unable to detect novel exploits, and anomaly detection, which detects too many events, including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.

  10. Highly efficient peptide separations in proteomics. Part 2: bi- and multidimensional liquid-based separation techniques.

    PubMed

    Sandra, Koen; Moshir, Mahan; D'hondt, Filip; Tuytten, Robin; Verleysen, Katleen; Kas, Koen; François, Isabelle; Sandra, Pat

    2009-04-15

    Multidimensional liquid-based separation techniques are described for maximizing the resolution of the enormous number of peptides generated upon tryptic digestion of proteomes, and hence, reduce the spatial and temporal complexity of the sample to a level that allows successful mass spectrometric analysis. This review complements the previous contribution on unidimensional high performance liquid chromatography (HPLC). Both chromatography and electrophoresis will be discussed albeit with reversed-phase HPLC (RPLC) as the final separation dimension prior to MS analysis.

  11. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.

  12. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    PubMed Central

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-01

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144

  13. Low energy analysis techniques for CUORE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alduino, C.; Alfonso, K.; Artusa, D. R.

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable for searching for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  14. Low energy analysis techniques for CUORE

    DOE PAGES

    Alduino, C.; Alfonso, K.; Artusa, D. R.; ...

    2017-12-12

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable for searching for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  15. Event-based analysis of free-living behaviour.

    PubMed

    Granat, Malcolm H

    2012-11-01

    The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also on how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, dependent on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis on the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.

  16. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  17. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special, unresolved-object, processing techniques were disappointing due, at least in part, to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.

  18. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques to plant-wide wastewater treatment plant (WWTP) control strategy analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible i) to determine natural groups or clusters of control strategies with similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
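
    The same CA/PCA/DA chain can be sketched with scikit-learn on a stand-in evaluation matrix (LDA replaces the paper's unspecified DA variant; all data here are random placeholders):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        X = rng.random((30, 8))  # 30 control strategies x 8 evaluation criteria

        scores = PCA(n_components=3).fit_transform(X)                      # latent features
        groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
        lda = LinearDiscriminantAnalysis().fit(X, groups)                  # discriminants
        print(np.abs(lda.coef_).argmax(axis=1))  # most discriminant criterion per group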

  19. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing methods that are widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window-resizing analysis technique is employed. The results of the morphology and surface roughness analysis reveal micron- and nano-scale information characteristic of each rock type and its history. These could be used for mineral identification and for studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
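
    The Rq parameter itself is just the root-mean-square deviation of the measured height map, e.g.:

        import numpy as np

        z = np.random.default_rng(0).random((256, 256))  # height map stand-in, in microns
        dev = z - z.mean()
        rq = np.sqrt(np.mean(dev**2))  # RMS roughness
        print(f"Rq = {rq:.4f} um")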

  20. Analysis to feature-based video stabilization/registration techniques within application of traffic data collection

    NASA Astrophysics Data System (ADS)

    Sadat, Mojtaba T.; Viti, Francesco

    2015-02-01

    Machine vision is rapidly gaining popularity in the field of Intelligent Transportation Systems. In particular, advantages are foreseen from the exploitation of Aerial Vehicles (AVs), which deliver a superior view of traffic phenomena. However, vibration on AVs makes it difficult to extract moving objects on the ground. To partly overcome this issue, image stabilization/registration procedures are adopted to correct and stitch multiple frames taken of the same scene but from different positions, angles, or sensors. In this study, we examine the impact of multiple feature-based techniques for stabilization, and we show that the SURF detector outperforms the others in terms of time efficiency and output similarity.
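
    The shape of such a feature-based stabilization pipeline, sketched with OpenCV (ORB stands in for SURF, which lives in OpenCV's non-free module; the detect, match, estimate-homography, warp steps are the same):

        import cv2
        import numpy as np

        def register(prev_gray, cur_gray):
            """Warp the current frame into the previous frame's coordinates."""
            orb = cv2.ORB_create(1000)
            k1, d1 = orb.detectAndCompute(prev_gray, None)
            k2, d2 = orb.detectAndCompute(cur_gray, None)
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]
            src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # robust estimate
            return cv2.warpPerspective(cur_gray, H, cur_gray.shape[::-1])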

  1. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
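
    Of the indicators mentioned, the Modal Assurance Criterion is the simplest to state; a small numpy version for two real mode-shape matrices on matching degrees of freedom:

        import numpy as np

        def mac_matrix(phi_a, phi_b):
            """MAC[i, j] compares mode i of model A with mode j of model B.

            phi_a, phi_b: (n_dof, n_modes) mode-shape matrices on matching DOFs.
            """
            num = np.abs(phi_a.T @ phi_b) ** 2
            den = np.outer(np.sum(phi_a**2, axis=0), np.sum(phi_b**2, axis=0))
            return num / den

        # tracked pairing: for each mode of A, the best-matching mode of B
        # pairs = mac_matrix(phi_a, phi_b).argmax(axis=1)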

  2. Interest rate next-day variation prediction based on hybrid feedforward neural network, particle swarm optimization, and multiresolution techniques

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    2016-02-01

    Multiresolution analysis techniques including the continuous wavelet transform, empirical mode decomposition, and variational mode decomposition are tested in the context of interest rate next-day variation prediction. In particular, multiresolution analysis techniques are used to decompose the actual interest rate variation, and a feedforward neural network is used for training and prediction; a particle swarm optimization technique is adopted to optimize the network's initial weights. For comparison purposes, an autoregressive moving average model, a random walk process and the naive model are used as reference models. In order to show the feasibility of the presented hybrid models combining multiresolution analysis techniques and a feedforward neural network optimized by particle swarm optimization, we used a set of six illustrative interest rates: Moody's seasoned Aaa corporate bond yield, Moody's seasoned Baa corporate bond yield, 3-month, 6-month and 1-year treasury bills, and the effective federal funds rate. The forecasting results show that all multiresolution-based prediction systems outperform the conventional reference models on the criteria of mean absolute error, mean absolute deviation, and root mean-squared error. Therefore, it is advantageous to adopt hybrid multiresolution techniques and soft computing models to forecast interest rate daily variations as they provide good forecasting performance.
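
    A stripped-down version of the hybrid idea (wavelet decomposition feeding a neural network; the PSO weight initialization and the paper's exact settings are omitted, and all series here are synthetic):

        import numpy as np
        import pywt
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        y = np.cumsum(rng.standard_normal(512))  # stand-in for rate variations

        coeffs = pywt.wavedec(y, "db4", level=3)
        coeffs[-1][:] = 0                        # drop the finest-scale detail
        smooth = pywt.waverec(coeffs, "db4")[: len(y)]

        lags = 5                                 # predict from the last 5 values
        X = np.column_stack([smooth[i:len(smooth) - lags + i] for i in range(lags)])
        t = smooth[lags:]
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                             random_state=0).fit(X, t)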

  3. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.; Mahmood, Q.

    1978-01-01

    A sputtering system was developed to deposit aluminum and aluminum alloys by the dc sputtering technique. This system is designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation. It is now ready for studying the effects of deposition and annealing parameters upon double-level metal preparation. A technique recently applied to semiconductor analysis, the finite element method, was studied for use in computer modeling of two-dimensional MOS transistor structures. It was concluded that the method has not been sufficiently well developed for confident use at this time. An algorithm was therefore developed for implementing a computer study based upon the finite difference method. The program was modified and used to calculate redistribution data for boron and phosphorus which had been predeposited by ion implantation with given range and straggle conditions. Data were generated for (111)-oriented SOS films with redistribution in N2, dry O2 and steam ambients.

  4. Laser polymerization-based novel lift-off technique

    NASA Astrophysics Data System (ADS)

    Bhuian, B.; Winfield, R. J.; Crean, G. M.

    2009-03-01

    The fabrication of microstructures by two-photon polymerization has been widely reported as a means of directly writing three-dimensional nanoscale structures. In the majority of cases a single point serial writing technique is used to form a polymer model. Single layer writing can also be used to fabricate two-dimensional patterns and we report an extension of this capability by using two-photon polymerization to form a template that can be used as a sacrificial layer for a novel lift-off process. A Ti:sapphire laser, with wavelength 795 nm, 80 MHz repetition rate, 100 fs pulse duration and an average power of 700 mW, was used to write 2D grid patterns with pitches of 0.8 and 1.0 μm in a urethane acrylate resin that was spun on to a lift-off base layer. This was overcoated with gold and the grid lifted away to leave an array of gold islands. The optical transmission properties of the gold arrays were measured and found to be in agreement with a rigorous coupled-wave analysis simulation.

  5. An ionospheric occultation inversion technique based on epoch difference

    NASA Astrophysics Data System (ADS)

    Lin, Jian; Xiong, Jing; Zhu, Fuying; Yang, Jian; Qiao, Xuejun

    2013-09-01

    Of the ionospheric radio occultation (IRO) electron density profile (EDP) retrievals, the Abel-based calibrated TEC inversion (CTI) is the most widely used technique. In order to eliminate the contribution from altitudes above the RO satellite, it is necessary to utilize the calibrated TEC to retrieve the EDP, which introduces an error due to the coplanar assumption. In this paper, a new technique based on epoch difference inversion (EDI) is first proposed to eliminate this error. Comparisons between CTI and EDI were carried out using simulated and real COSMIC data. The following conclusions can be drawn: the EDI technique can successfully retrieve EDPs without non-occultation side measurements and shows better performance than the CTI method, especially for lower orbit missions; no matter which technique is used, the inversion results at higher altitudes are better than those at lower altitudes, which can be explained theoretically.

  6. [Development of sample pretreatment techniques-rapid detection coupling methods for food security analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes the recent developments of the rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assay, portable gas chromatograph, etc. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide references to establish the selective, precise and quantitative rapid detection methods in food security analysis.

  7. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  8. Nanomaterials-Based Optical Techniques for the Detection of Acetylcholinesterase and Pesticides

    PubMed Central

    Xia, Ning; Wang, Qinglong; Liu, Lin

    2015-01-01

    The large amount of pesticide residues in the environment is a threat to global health by inhibition of acetylcholinesterase (AChE). Biosensors for inhibition of AChE have been thus developed for the detection of pesticides. In line with the rapid development of nanotechnology, nanomaterials have attracted great attention and have been intensively studied in biological analysis due to their unique chemical, physical and size properties. The aim of this review is to provide insight into nanomaterial-based optical techniques for the determination of AChE and pesticides, including colorimetric and fluorescent assays and surface plasmon resonance. PMID:25558991

  9. Using cognitive task analysis to develop simulation-based training for medical tasks.

    PubMed

    Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette

    2013-10-01

    Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  10. Space shuttle booster multi-engine base flow analysis

    NASA Technical Reports Server (NTRS)

    Tang, H. H.; Gardiner, C. R.; Anderson, W. A.; Navickas, J.

    1972-01-01

    A comprehensive review of currently available techniques pertinent to several prominent aspects of the base thermal problem of the space shuttle booster is given along with a brief review of experimental results. A tractable engineering analysis, capable of predicting the power-on base pressure, base heating, and other base thermal environmental conditions, such as base gas temperature, is presented and used for an analysis of various space shuttle booster configurations. The analysis consists of a rational combination of theoretical treatments of the prominent flow interaction phenomena in the base region. These theories consider jet mixing, plume flow, axisymmetric flow effects, base injection, recirculating flow dynamics, and various modes of heat transfer. Such effects as initial boundary layer expansion at the nozzle lip, reattachment, recompression, choked vent flow, and nonisoenergetic mixing processes are included in the analysis. A unified method was developed and programmed to numerically obtain compatible solutions for the various flow field components in both flight and ground test conditions. A preliminary prediction of the base thermal environment for a 12-engine space shuttle booster was obtained for a typical trajectory history. Theoretical predictions were also obtained for some clustered-engine experimental conditions. Results indicate good agreement between the data and theoretical predictions.

  11. Non-Conventional Techniques for the Study of Phase Transitions in NiTi-Based Alloys

    NASA Astrophysics Data System (ADS)

    Nespoli, Adelaide; Villa, Elena; Passaretti, Francesca; Albertini, Franca; Cabassi, Riccardo; Pasquale, Massimo; Sasso, Carlo Paolo; Coïsson, Marco

    2014-07-01

    Differential scanning calorimetry and electrical resistance measurements are the two most common techniques for the study of the phase transition path and temperatures of shape memory alloys (SMA) in the stress-free condition. It is also well known that internal friction measurements are useful for this purpose. There are some further techniques which are seldom used for the basic characterization of SMA transitions: dilatometric analysis, magnetic measurements, and Seebeck coefficient studies. In this work, we discuss the suitability of these techniques for the study of NiTi-based phase transitions. Measurements were conducted on several Ni50-xTi50Cux samples ranging from 3 to 10 at.% in Cu content, fully annealed at 850 °C for 1 h in vacuum and quenched in water at room temperature. Results show that all these techniques are sensitive to the phase transition, and they provide significant information about the existence of intermediate phases.

  12. Wavelet-based polarimetry analysis

    NASA Astrophysics Data System (ADS)

    Ezekiel, Soundararajan; Harrity, Kyle; Farag, Waleed; Alford, Mark; Ferris, David; Blasch, Erik

    2014-06-01

    Wavelet transformation has become a cutting-edge and promising approach in the field of image and signal processing. A wavelet is a waveform of effectively limited duration that has an average value of zero. Wavelet analysis is done by breaking up the signal into shifted and scaled versions of the original signal. The key advantage of a wavelet is that it is capable of revealing smaller changes, trends, and breakdown points that are not revealed by other techniques such as Fourier analysis. The phenomenon of polarization has been studied for quite some time and is a very useful tool for target detection and tracking. Long Wave Infrared (LWIR) polarization is beneficial for detecting camouflaged objects and is a useful approach when identifying and distinguishing manmade objects from natural clutter. In addition, the Stokes polarization parameters, which are calculated from 0°, 45°, 90°, 135°, right-circular, and left-circular intensity measurements, provide spatial orientations of target features and suppress natural features. In this paper, we propose a wavelet-based polarimetry analysis (WPA) method to analyze Long Wave Infrared polarimetry imagery to discriminate targets such as dismounts and vehicles from background clutter. These parameters can be used for image thresholding and segmentation. Experimental results show the wavelet-based polarimetry analysis is efficient and can be used in a wide range of applications such as change detection, shape extraction, target recognition, and feature-aided tracking.
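
    The Stokes parameters follow directly from those six intensity measurements; a small helper for reference:

        import numpy as np

        def stokes(i0, i45, i90, i135, ircp, ilcp):
            """Stokes vector from the six standard intensity measurements."""
            s0 = i0 + i90      # total intensity
            s1 = i0 - i90      # horizontal vs. vertical linear preference
            s2 = i45 - i135    # +45 vs. -45 linear preference
            s3 = ircp - ilcp   # right vs. left circular preference
            return np.array([s0, s1, s2, s3])

        s = stokes(1.0, 0.6, 0.2, 0.6, 0.5, 0.7)
        dolp = np.hypot(s[1], s[2]) / s[0]  # degree of linear polarization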

  13. Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique

    ERIC Educational Resources Information Center

    Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

    2005-01-01

    A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…
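
    Although the abstract is truncated, the underlying calculation is a textbook one: at an analytical wavelength the measured absorbance fixes the [In-]/[HIn] ratio, and the Henderson-Hasselbalch relation then gives the pKa at each pH. A minimal sketch with invented absorbance values:

```python
# Hedged sketch of the standard spectrophotometric pKa estimate for an
# indicator HIn <-> H+ + In-; all absorbance values below are invented.
import numpy as np

A_HIn, A_In = 0.10, 0.85    # limiting absorbances of pure acid/base forms
pH = np.array([3.8, 4.2, 4.6, 5.0, 5.4])
A  = np.array([0.18, 0.30, 0.47, 0.63, 0.74])   # measured absorbances

ratio = (A - A_HIn) / (A_In - A)   # [In-]/[HIn] from the absorbances
pKa = pH - np.log10(ratio)         # Henderson-Hasselbalch, rearranged
print(pKa.round(2), "mean pKa =", pKa.mean().round(2))
```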

  14. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    PubMed

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. A device consisting of an array of linear polarizers is introduced to achieve linear truncation of the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced in order to perform elliptical polarized light reconstruction from two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method against a known public key attack is also provided.

  15. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications, including industry and science, virtual reality and film, and medicine and sports. For most applications, the reliability and accuracy of the data obtained, along with convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed, potential for high accuracy, and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capturing process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes two to four machine vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms, both for detecting, identifying, and tracking similar targets and for marker-less object motion capture, has been developed and tested. The evaluation results show high robustness and reliability for various motion analysis tasks in technical and biomechanical applications.

  16. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can range from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining socially acceptable, it must be possible to manage the system so as to reduce possible impacts in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate the ground motion levels on different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate as well as the propagation medium properties are not constant in time. For modeling purposes we use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving time window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average the peak ground motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and introduced directly into the hazard integral. We applied the proposed
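
    The moving-window residual idea lends itself to a compact illustration. The sketch below (all coefficients and data are synthetic assumptions, not from the study) computes peak-ground-motion residuals against a simple attenuation model in moving time windows; a drift in the mean residual would then point to changes in the propagation medium:

```python
# Illustrative sketch, all values assumed: moving-time-window means of
# peak-ground-motion residuals against a simple attenuation model
# log10(PGA) = a + b*M - c*log10(R); a drifting mean residual would hint
# at changes in the propagation medium.
import numpy as np

rng = np.random.default_rng(1)
n = 500
t = np.sort(rng.uniform(0, 100, n))          # event times, days
M = rng.uniform(0.5, 2.5, n)                 # induced-event magnitudes
R = rng.uniform(1.0, 10.0, n)                # hypocentral distances, km
a, b, c = -2.0, 1.0, 1.5                     # hypothetical coefficients

log_pga = a + b*M - c*np.log10(R) + rng.normal(0, 0.2, n)
log_pga += 0.003 * t                         # synthetic slow medium change

resid = log_pga - (a + b*M - c*np.log10(R))
for tc in np.arange(10.0, 91.0, 10.0):       # 20-day moving windows
    mr = resid[np.abs(t - tc) < 10.0].mean()
    print(f"t = {tc:5.1f} d   mean residual = {mr:+.3f}")
```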

  17. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
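
    As a concrete, hedged illustration of the surrogate idea (not the paper's injector application), the following sketch fits a quadratic response surface to a handful of samples of an invented "expensive" objective and then evaluates the cheap surrogate at new design points:

```python
# Hedged surrogate-modeling sketch: fit a quadratic response surface to a
# few samples of an invented "expensive" objective, then predict cheaply
# at new design points. Not the paper's injector application.
import numpy as np

def expensive_objective(X):      # stand-in for a costly high-fidelity model
    return np.sin(3 * X[:, 0]) + X[:, 1]**2 + 0.5 * X[:, 0] * X[:, 1]

def basis(X):                    # quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (30, 2))  # design of experiments (random here)
coef, *_ = np.linalg.lstsq(basis(X), expensive_objective(X), rcond=None)

Xnew = rng.uniform(-1, 1, (5, 2))
print("surrogate:", (basis(Xnew) @ coef).round(3))
print("truth:    ", expensive_objective(Xnew).round(3))
```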

  18. Breast volumetric analysis for aesthetic planning in breast reconstruction: a literature review of techniques

    PubMed Central

    Rozen, Warren Matthew; Spychal, Robert T.; Hunter-Smith, David J.

    2016-01-01

    Background Accurate volumetric analysis is an essential component of preoperative planning in both reconstructive and aesthetic breast procedures towards achieving symmetrization and a patient-satisfactory outcome. Numerous comparative studies and reviews of individual techniques have been reported. However, a unifying review of all techniques comparing their accuracy, reliability, and practicality has been lacking. Methods A review of the published English literature dating from 1950 to 2015 using databases, such as PubMed, Medline, Web of Science, and EMBASE, was undertaken. Results Since Bouman's first description of the water displacement method, a range of volumetric assessment techniques have been described: thermoplastic casting, direct anthropomorphic measurement, two-dimensional (2D) imaging, and computed tomography (CT)/magnetic resonance imaging (MRI) scans. However, most have been unreliable, difficult to execute, and of limited practicability. The introduction of 3D surface imaging has revolutionized the field due to its ease of use, fast speed, accuracy, and reliability. However, its widespread use has been limited by its high cost and a lack of high-level evidence. Recent developments have unveiled the first web-based 3D surface imaging program, 4D imaging, and 3D printing. Conclusions Despite its importance, an accurate, reliable, and simple breast volumetric analysis tool had been elusive until the introduction of 3D surface imaging technology. However, its high cost has limited its wide usage. Novel adjunct technologies, such as the web-based 3D surface imaging program, 4D imaging, and 3D printing, appear promising. PMID:27047788

  19. Optimization Techniques for Analysis of Biological and Social Networks

    DTIC Science & Technology

    2012-03-28

    analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: Implement the proposed algorithms, test and fine...alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters...systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational

  20. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
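
    A minimal sketch of the underlying idea, under stated assumptions (a toy power equation, not the SPACE model): sample the uncertain inputs, propagate them through the model, and report percentiles of power capability instead of a single deterministic value:

```python
# Hedged illustration, not the SPACE model: Monte Carlo propagation of input
# uncertainties through a toy solar-array power equation, giving a power
# capability distribution instead of a single-valued deterministic result.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
area   = rng.normal(100.0, 2.0, n)     # array area, m^2 (uncertain)
eff    = rng.normal(0.14, 0.005, n)    # cell efficiency (uncertain)
degrad = rng.uniform(0.90, 0.98, n)    # degradation factor (uncertain)
flux   = 1367.0                        # solar flux, W/m^2

power = flux * area * eff * degrad     # toy power-capability model, W
p5, p50, p95 = np.percentile(power, [5, 50, 95])
print(f"P5 = {p5:.0f} W, median = {p50:.0f} W, P95 = {p95:.0f} W")
```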

  1. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  2. Techniques for the Analysis of Human Movement.

    ERIC Educational Resources Information Center

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photographic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  3. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
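
    The weight-uncertainty portion of such an analysis can be sketched compactly. In the fragment below (layer values, nominal weights, and the Dirichlet concentration are all invented assumptions), criteria weights are perturbed around their nominal AHP-like values and the resulting spread of the weighted-linear-combination susceptibility score is mapped per cell:

```python
# Sketch of the weight-uncertainty idea; layer values, nominal weights, and
# the Dirichlet concentration are invented assumptions, not the study's data.
import numpy as np

rng = np.random.default_rng(4)
n_cells, n_runs = 1000, 2000
criteria = rng.random((n_cells, 5))             # standardized criteria layers
w0 = np.array([0.35, 0.25, 0.20, 0.12, 0.08])   # nominal weights (sum to 1)

# Dirichlet samples centered on the nominal weights; larger concentration
# means less weight uncertainty.
weights = rng.dirichlet(w0 * 100.0, size=n_runs)        # (n_runs, 5)
scores = criteria @ weights.T                           # (n_cells, n_runs)

mean_map = scores.mean(axis=1)    # expected susceptibility per cell
std_map = scores.std(axis=1)      # variability attributable to the weights
print("most weight-sensitive cell:", std_map.argmax(),
      "std =", round(float(std_map.max()), 4))
```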

  4. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  5. Artificial Intelligence Techniques: Applications for Courseware Development.

    ERIC Educational Resources Information Center

    Dear, Brian L.

    1986-01-01

    Introduces some general concepts and techniques of artificial intelligence (natural language interfaces, expert systems, knowledge bases and knowledge representation, heuristics, user-interface metaphors, and object-based environments) and investigates ways these techniques might be applied to analysis, design, development, implementation, and…

  6. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    NASA Astrophysics Data System (ADS)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and the clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in the analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
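
    A rough sketch of the PCA-plus-fuzzy-c-means portion of such a workflow, under assumptions (random stand-in profiles and a hand-rolled c-means loop rather than a validated library):

```python
# Rough sketch under assumptions: PCA (via SVD) on synthetic stand-in ADCP
# current profiles, then a minimal fuzzy c-means loop in the reduced space.
# A real study would use validated implementations plus validity indices.
import numpy as np

rng = np.random.default_rng(5)
profiles = rng.normal(size=(300, 40))          # 300 profiles x 40 depth bins
X = profiles - profiles.mean(axis=0)           # center before PCA
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Z = X @ Vt[:2].T                               # scores on first two components

def fuzzy_cmeans(Z, c=3, m=2.0, iters=100):
    u = rng.dirichlet(np.ones(c), size=len(Z))  # random initial memberships
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ Z) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(Z[:, None, :] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))           # standard membership update
        u = inv / inv.sum(axis=1, keepdims=True)
    return u, centers

u, centers = fuzzy_cmeans(Z)
print("hard labels of first 10 profiles:", u.argmax(axis=1)[:10])
```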

  7. A Sensitivity Analysis of Circular Error Probable Approximation Techniques

    DTIC Science & Technology

    1992-03-01

    SENSITIVITY ANALYSIS OF CIRCULAR ERROR PROBABLE APPROXIMATION TECHNIQUES THESIS Presented to the Faculty of the School of Engineering of the Air Force...programming skills. Major Paul Auclair patiently advised me in this endeavor, and Major Andy Howell added numerous insightful contributions. I thank my...techniques. The two most accurate techniques require numerical integration and can take several hours to run on a personal computer [2:1-2,4-6]. Some

  8. Autonomous facial recognition system inspired by human visual system based logarithmical image visualization technique

    NASA Astrophysics Data System (ADS)

    Wan, Qianwen; Panetta, Karen; Agaian, Sos

    2017-05-01

    Autonomous facial recognition systems are widely used in real-life applications, such as homeland border security, law enforcement identification and authentication, and video-based surveillance analysis. Issues such as low image quality, non-uniform illumination, and variations in pose and facial expression can impair the performance of recognition systems. To address the non-uniform illumination challenge, we present a novel robust autonomous facial recognition system built on the so-called human visual system based logarithmical image visualization technique. In this paper, the proposed method for the first time utilizes the logarithmical image visualization technique coupled with the local binary pattern to perform discriminative feature extraction for a facial recognition system. The Yale database, the Yale-B database, and the ATT database are used for accuracy and efficiency testing in computer simulations. The extensive computer simulations demonstrate the method's efficiency, accuracy, and robustness to illumination variation for facial recognition.
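
    The local-binary-pattern step can be sketched briefly. In the fragment below, the logarithmical enhancement is reduced to a plain log transform, which is an assumption standing in for the authors' operator, and a basic 3x3 LBP histogram is computed as the feature vector:

```python
# Hedged sketch of the local-binary-pattern step; the "logarithmical"
# enhancement is reduced to a plain log transform, an assumption standing
# in for the authors' operator.
import numpy as np

def lbp_8neighbors(img):
    """Basic 3x3 LBP code image for a 2-D grayscale array."""
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(c.shape, dtype=np.int32)
    for bit, (dy, dx) in enumerate(shifts):
        neighbor = p[1 + dy: 1 + dy + c.shape[0], 1 + dx: 1 + dx + c.shape[1]]
        code |= (neighbor >= c).astype(np.int32) << bit
    return code

rng = np.random.default_rng(6)
face = rng.random((64, 64))                   # stand-in for a face image
enhanced = np.log1p(face)                     # assumed enhancement step
hist, _ = np.histogram(lbp_8neighbors(enhanced), bins=256, range=(0, 256))
feature = hist / hist.sum()                   # normalized LBP histogram
print(feature[:8].round(4))
```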

  9. Histogram analysis for smartphone-based rapid hematocrit determination

    PubMed Central

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based "Histogram" app for the detection of hematocrits has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis is effective in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for the quantification of blood hematocrit under both equal and varying optical conditions. The rapid determination of blood hematocrits carries enormous information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  10. Nonlinear adaptive control based on fuzzy sliding mode technique and fuzzy-based compensator.

    PubMed

    Nguyen, Sy Dzung; Vo, Hoang Duy; Seo, Tae-Il

    2017-09-01

    It is difficult to control nonlinear systems efficiently in the presence of uncertainty and disturbance (UAD). One of the main reasons is the negative impact of the unknown features of UAD, as well as of the response delay of the control system, on the real-time accuracy of the control signal. In order to deal with this, we propose a new controller named CO-FSMC for a class of nonlinear control systems subjected to UAD, which is constituted of a fuzzy sliding mode controller (FSMC) and a fuzzy-based compensator (CO). Firstly, the FSMC and CO are designed independently, and then an adaptive fuzzy structure is developed to combine them. Solutions for avoiding the singular cases of the fuzzy-based function approximation and reducing the computational cost are proposed. Based on these solutions, the fuzzy sliding mode technique, a lumped disturbance observer, and Lyapunov stability analysis, a closed-loop adaptive control law is formulated. Simulations along with a real application based on a semi-active train-car suspension are performed to fully evaluate the method. The obtained results show that vibration of the chassis mass is insensitive to UAD. Compared with other fuzzy sliding mode control strategies, the CO-FSMC provides the best control ability to reduce unwanted vibrations. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Infrared Thermography-based Biophotonics: Integrated Diagnostic Technique for Systemic Reaction Monitoring

    NASA Astrophysics Data System (ADS)

    Vainer, Boris G.; Morozov, Vitaly V.

    A distinctive branch of biophotonics is the measurement, visualisation, and quantitative analysis of infrared (IR) radiation emitted from the surfaces of living objects. Focal plane array (FPA)-based IR cameras make it possible to realize so-called interventional infrared thermal diagnostics in medicine. An integrated technique aimed at the advancement of this new approach in biomedical science and practice is described in this paper. The assembled system includes a high-performance short-wave (2.45-3.05 μm) or long-wave (8-14 μm) IR camera, two laser Doppler flowmeters (LDF), and additional equipment and complementary facilities implementing the monitoring of human cardiovascular status. All these means operate synchronously. The relationship between infrared thermography (IRT) and LDF data in humans with regard to systemic cardiovascular reactivity is ascertained for the first time. The real-time dynamics of blood supply in an anaesthetized patient are visualized and quantitatively represented during surgery for the first time, in order to observe how general hyperoxia influences thermoregulatory mechanisms; an abrupt increase in the temperature of the upper limb is observed using IRT. The IRT-based integrated technique may serve as a starting point for the elaboration of informative new methods directly applicable to medicine and the biomedical sciences.

  12. Geographically correlated errors observed from a laser-based short-arc technique

    NASA Astrophysics Data System (ADS)

    Bonnefond, P.; Exertier, P.; Barlier, F.

    1999-07-01

    The laser-based short-arc technique has been developed in order to avoid local errors which affect dynamical orbit computation, such as those due to mismodeling in the geopotential. It is based on a geometric method and consists of fitting short arcs (about 4000 km), issued from a global orbit, with satellite laser ranging tracking measurements from a ground station network. Ninety-two TOPEX/Poseidon (T/P) cycles of laser-based short-arc orbits have then been compared to JGM-2 and JGM-3 T/P orbits computed by the Precise Orbit Determination (POD) teams (Service d'Orbitographie Doris/Centre National d'Etudes Spatiales and Goddard Space Flight Center/NASA) over two areas: (1) the Mediterranean area and (2) a part of the Pacific (including California and Hawaii) called hereafter the U.S. area. Geographically correlated orbit errors in these areas are clearly evidenced: for example, -2.6 cm and +0.7 cm for the Mediterranean and U.S. areas, respectively, relative to JGM-3 orbits. However, geographically correlated errors (GCE), which are commonly linked to errors in the gravity model, can also be due to systematic errors in the reference frame and/or to biases in the tracking measurements. Although the short-arc technique is very sensitive to such error sources, our analysis demonstrates that the induced geographical systematic effects are at the level of 1-2 cm in the radial orbit component. Results are also compared with those obtained with the GPS-based reduced-dynamic technique. The time-dependent part of GCE has also been studied. Over 6 years of T/P data, coherent signals in the radial component of the T/P Precise Orbit Ephemeris (POE) are clearly evidenced, with a period of about 6 months. In addition, the impact of time-varying error sources coming from the reference frame and the tracking data accuracy has been analyzed, showing a possible linear trend of about 0.5-1 mm/yr in the radial component of the T/P POE.

  13. A human visual based binarization technique for histological images

    NASA Astrophysics Data System (ADS)

    Shreyas, Kamath K. M.; Rajendran, Rahul; Panetta, Karen; Agaian, Sos

    2017-05-01

    In the field of vision-based systems for object detection and classification, thresholding is a key pre-processing step. Thresholding is a well-known technique for image segmentation. Segmentation of medical images, such as Computed Axial Tomography (CAT), Magnetic Resonance Imaging (MRI), X-Ray, Phase Contrast Microscopy, and Histological images, presents problems such as high variability in human anatomy and variation across modalities. Recent advances in computer-aided diagnosis of histological images help facilitate the detection and classification of diseases. Since most pathology diagnosis depends on the expertise and ability of the pathologist, there is clearly a need for an automated assessment system. Histological images are stained to a specific color to differentiate each component in the tissue. Segmentation and analysis of such images is problematic, as they present high variability in terms of color and cell clusters. This paper presents an adaptive thresholding technique that aims at segmenting cell structures from Haematoxylin and Eosin stained images. The thresholded result can further be used by pathologists to perform effective diagnosis. The effectiveness of the proposed method is analyzed by visually comparing the results to state-of-the-art thresholding methods such as Otsu, Niblack, Sauvola, Bernsen, and Wolf. Computer simulations demonstrate the efficiency of the proposed method in segmenting critical information.
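
    For comparison methods of the Sauvola type, the local threshold has a closed form, T = m * (1 + k*(s/R - 1)), with local mean m and standard deviation s. A hedged sketch (the window size, k, and R follow common conventions, and the image is synthetic):

```python
# Sketch of a Sauvola-style local threshold, T = m * (1 + k*(s/R - 1)),
# with local mean m and standard deviation s; window, k, and R follow
# common conventions for images scaled to [0, 1], and the image is synthetic.
import numpy as np
from scipy.ndimage import uniform_filter

def sauvola_threshold(img, window=15, k=0.2, R=0.5):
    mean = uniform_filter(img, window)
    sq_mean = uniform_filter(img * img, window)
    std = np.sqrt(np.maximum(sq_mean - mean**2, 0.0))
    return img > mean * (1.0 + k * (std / R - 1.0))

rng = np.random.default_rng(7)
stained = rng.random((128, 128))     # stand-in for an H&E image channel
mask = sauvola_threshold(stained)
print("foreground fraction:", round(float(mask.mean()), 3))
```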

  14. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

    The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling for solid/shell connections (joints), which is significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their use. Such techniques currently used in practical applications were tested, especially to see which technique is best suited to the computer-aided design (CAD) environment. Some basic thoughts regarding each technique are also discussed. As a consequence, some suggestions based on the results are given to help obtain reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  15. Peak-to-average power ratio reduction in orthogonal frequency division multiplexing-based visible light communication systems using a modified partial transmit sequence technique

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Deng, Honggui; Ren, Shuang; Tang, Chengying; Qian, Xuewen

    2018-01-01

    We propose an efficient partial transmit sequence technique based on a genetic algorithm and a peak-value optimization algorithm (GAPOA) to reduce the high peak-to-average power ratio (PAPR) in visible light communication systems based on orthogonal frequency division multiplexing (VLC-OFDM). Based on an analysis of the hill-climbing algorithm's pros and cons, we propose the POA, which has excellent local search ability, to further process signals whose PAPR is still over the threshold after processing by the genetic algorithm (GA). To verify the effectiveness of the proposed technique and algorithm, we evaluate the PAPR performance and the bit error rate (BER) performance and compare them with the partial transmit sequence (PTS) technique based on GA (GA-PTS), the PTS technique based on genetic and hill-climbing algorithms (GH-PTS), and PTS based on the shuffled frog leaping algorithm and hill-climbing algorithm (SFLAHC-PTS). The results show that our technique and algorithm have not only better PAPR performance but also lower computational complexity and BER than the GA-PTS, GH-PTS, and SFLAHC-PTS techniques.
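
    The PTS mechanics are easy to show in miniature. The sketch below replaces the paper's GA/POA search with an exhaustive +/-1 phase search over four sub-blocks (an assumption made to keep the example short) and reports the PAPR before and after:

```python
# Minimal PTS illustration under assumptions: an exhaustive +/-1 phase search
# over four interleaved sub-blocks replaces the paper's GA/POA hybrid, just
# to show the mechanics on one QPSK OFDM frame.
import numpy as np
from itertools import product

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(8)
N, V = 256, 4                                        # subcarriers, sub-blocks
symbols = rng.choice(np.array([1+1j, 1-1j, -1+1j, -1-1j]), N)

blocks = np.zeros((V, N), dtype=complex)             # disjoint partition
for v in range(V):
    blocks[v, v::V] = symbols[v::V]
time_blocks = np.fft.ifft(blocks, axis=1)

best = np.fft.ifft(symbols)                          # unmodified frame
for phases in product([1, -1], repeat=V):            # 2^V phase vectors
    candidate = (np.array(phases)[:, None] * time_blocks).sum(axis=0)
    if papr_db(candidate) < papr_db(best):
        best = candidate

print(f"original PAPR: {papr_db(np.fft.ifft(symbols)):.2f} dB, "
      f"after PTS: {papr_db(best):.2f} dB")
```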

  16. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for the discovery, monitoring, and improvement of processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions in the management of hospital activities.

  17. Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chiou, Jin-Chern

    1990-01-01

    Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAE's) viewpoint. Constraint violations during the time integration process are minimized, and penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference scheme to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by using an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, parallel implementation of the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was carried out efficiently. The DAE's and the constraint treatment techniques were transformed into arrowhead matrices, from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speed-up of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.

  18. Biosensor-based microRNA detection: techniques, design, performance, and challenges.

    PubMed

    Johnson, Blake N; Mutharasan, Raj

    2014-04-07

    The current state of biosensor-based techniques for amplification-free microRNA (miRNA) detection is critically reviewed. Comparison with non-sensor and amplification-based molecular techniques (MTs), such as polymerase-based methods, is made in terms of transduction mechanism, associated protocol, and sensitivity. Challenges associated with miRNA hybridization thermodynamics which affect assay selectivity and amplification bias are briefly discussed. Electrochemical, electromechanical, and optical classes of miRNA biosensors are reviewed in terms of transduction mechanism, limit of detection (LOD), time-to-results (TTR), multiplexing potential, and measurement robustness. Current trends suggest that biosensor-based techniques (BTs) for miRNA assay will complement MTs due to the advantages of amplification-free detection, LOD being femtomolar (fM)-attomolar (aM), short TTR, multiplexing capability, and minimal sample preparation requirement. Areas of future importance in miRNA BT development are presented which include focus on achieving high measurement confidence and multiplexing capabilities.

  19. Some failure modes and analysis techniques for terrestrial solar cell modules

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Stern, K. H.

    1978-01-01

    Analysis data are presented on failed/defective silicon solar cell modules of various types produced by different manufacturers. The failure modes (e.g., internal short and open circuits, output power degradation, isolation resistance degradation, etc.) are discussed in detail and in many cases related to the type of technology used in the manufacture of the modules; wherever applicable, appropriate corrective actions are recommended. Consideration is also given to some failure analysis techniques that are applicable to such modules, including X-ray radiography, capacitance measurement, cell shunt resistance measurement by the shadowing technique, a steady-state illumination test station for module performance evaluation, laser scanning techniques, and the SEM.

  20. Identification of piecewise affine systems based on fuzzy PCA-guided robust clustering technique

    NASA Astrophysics Data System (ADS)

    Khanmirza, Esmaeel; Nazarahari, Milad; Mousavi, Alireza

    2016-12-01

    Hybrid systems are a class of dynamical systems whose behavior is based on the interaction between discrete and continuous dynamics. Since a general method for the analysis of hybrid systems is not available, some researchers have focused on specific types of hybrid systems. Piecewise affine (PWA) systems are one subset of hybrid systems. The identification of PWA systems includes the estimation of the parameters of the affine subsystems and of the coefficients of the hyperplanes defining the partition of the state-input domain. In this paper, we propose a PWA identification approach based on a modified clustering technique. By using a fuzzy PCA-guided robust k-means clustering algorithm along with neighborhood outlier detection, the two main drawbacks of the well-known clustering algorithms, i.e., poor initialization and the presence of outliers, are eliminated. Furthermore, this modified clustering technique enables us to determine the number of subsystems without any prior knowledge about the system. In addition, exploiting the structure of the state-input domain, that is, considering the time sequence of input-output pairs, provides a more efficient clustering algorithm, which is the other novelty of this work. Finally, the proposed algorithm has been evaluated by parameter identification of an IGV servo actuator. Simulations together with experimental analysis have proved the effectiveness of the proposed method.
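
    The two-stage structure of PWA identification (cluster, then fit affine submodels) can be illustrated with plain k-means in place of the paper's fuzzy PCA-guided algorithm; everything in this sketch, including the two-piece ground truth, is an invented assumption:

```python
# Simplified two-stage sketch (plain k-means instead of the paper's fuzzy
# PCA-guided algorithm): cluster the data, then fit one affine submodel per
# cluster by least squares. The two-piece ground truth is invented.
import numpy as np

rng = np.random.default_rng(9)
x = rng.uniform(-2, 2, 400)
y = np.where(x < 0, 1.0 + 0.5 * x, -1.0 + 2.0 * x) + rng.normal(0, 0.05, 400)

pts = np.column_stack([x, y])                 # cluster in joint (x, y) space
centers = pts[[x.argmin(), x.argmax()]]       # deterministic initialization
for _ in range(50):                           # basic k-means iterations
    labels = np.linalg.norm(pts[:, None] - centers[None], axis=2).argmin(axis=1)
    centers = np.array([pts[labels == k].mean(axis=0) for k in range(2)])

for k in range(2):                            # affine fit y = a*x + b per cluster
    A = np.column_stack([x[labels == k], np.ones((labels == k).sum())])
    (a, b), *_ = np.linalg.lstsq(A, y[labels == k], rcond=None)
    print(f"submodel {k}: y = {a:.2f}*x + {b:.2f}")
```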

  1. Imaging Analysis of Near-Field Recording Technique for Observation of Biological Specimens

    NASA Astrophysics Data System (ADS)

    Moriguchi, Chihiro; Ohta, Akihiro; Egami, Chikara; Kawata, Yoshimasa; Terakawa, Susumu; Tsuchimori, Masaaki; Watanabe, Osamu

    2006-07-01

    We present an analysis of the imaging properties of a near-field recording technique in comparison with simulation results. In this system, the optical field distributions localized near the specimens are recorded as surface topographic distributions on a photosensitive film. It is possible to observe both soft and moving specimens, because the system does not require a scanning probe to obtain the observed image. The imaging properties are evaluated using the fine structures of a paramecium, and we demonstrate that it is possible to observe minute differences in refractive indices.

  2. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines

    PubMed Central

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J.; Raboso, Mariano

    2015-01-01

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation—based on a Gaussian Mixture Model (GMM) to separate the person from the background, masking—to reduce the dimensions of images—and binarization—to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements. PMID:26091392
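
    A hedged end-to-end sketch of such a pipeline using generic scikit-learn components (the "acoustic images", the single mean-intensity feature, and all parameters are invented stand-ins for the paper's setup):

```python
# Hedged pipeline sketch with generic scikit-learn stand-ins; the synthetic
# "acoustic images", the single mean-intensity feature, and all parameters
# are invented and much simpler than the paper's setup.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC

rng = np.random.default_rng(10)

def segment_foreground(image):
    """Two-component GMM on pixel intensities; keep the brighter component."""
    gmm = GaussianMixture(n_components=2, random_state=0).fit(image.reshape(-1, 1))
    fg = gmm.means_.ravel().argmax()
    return (gmm.predict(image.reshape(-1, 1)) == fg).reshape(image.shape)

def make_image(person):
    """Background noise plus a person-specific blob intensity."""
    img = rng.normal(0.2, 0.05, (32, 32))
    img[10:20, 10:20] += 0.5 + 0.2 * person
    return img

images = [make_image(p) for p in (0, 1) for _ in range(40)]
labels = [p for p in (0, 1) for _ in range(40)]
feats = np.array([[img[segment_foreground(img)].mean()] for img in images])

clf = LinearSVC(dual=False).fit(feats, labels)
print("training accuracy:", clf.score(feats, labels))
```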

  3. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines.

    PubMed

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J; Raboso, Mariano

    2015-06-17

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation-based on a Gaussian Mixture Model (GMM) to separate the person from the background, masking-to reduce the dimensions of images-and binarization-to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.

  4. Three Techniques for Task Analysis: Examples from the Nuclear Utilities.

    ERIC Educational Resources Information Center

    Carlisle, Kenneth E.

    1984-01-01

    Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)

  5. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    PubMed

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and the interpersonal style

  6. CHROMATOGRAPHIC TECHNIQUES IN PHARMACEUTICAL ANALYSIS IN POLAND: HISTORY AND THE PRESENT ON THE BASIS OF PAPERS PUBLISHED IN SELECTED POLISH PHARMACEUTICAL JOURNALS IN THE XX CENTURY.

    PubMed

    Bilek, Maciej; Namieśnik, Jacek

    2016-01-01

    For a long time, chromatographic techniques and related techniques have stimulated the development of new procedures in the field of pharmaceutical analysis. The newly developed methods, characterized by improved metrological parameters, allow for more accurate testing of, among others, the composition of raw materials, intermediates, and final products. Chromatographic techniques also enable studies on waste generated in research laboratories and in factories producing pharmaceuticals and parapharmaceuticals. Based on a review of reports published in Polish pharmaceutical journals, we assessed the impact of chromatographic techniques on the development of pharmaceutical analysis. The first chromatographic technique used in pharmaceutical analysis was so-called capillary analysis. It was applied in the 1930s to control the identity of pharmaceutical formulations. In the 1940s and 1950s, chromatographic techniques were mostly the subject of review publications, while their use in experimental work was rare. Paper chromatography and thin layer chromatography were introduced in the 1960s and 1970s, respectively. These new analytical tools contributed to the intensive development of research in the field of phytochemistry and the analysis of herbal medicines. The development of column chromatography-based techniques, i.e., gas chromatography and high-performance liquid chromatography, took place at the end of the 20th century. Both aforementioned techniques were widely applied in pharmaceutical analysis, for example, to assess the stability of drugs, to test for impurities and degradation products, and in pharmacokinetic studies. The first decade of the 21st century brought new detection methods in gas and liquid chromatography. The information sources used to write this article were Polish pharmaceutical journals, both professional and scientific, originating from the interwar and post-war period, i.e., "Kronika Farmaceutyczna", "Farmacja Wsp

  7. Micropowder collecting technique for stable isotope analysis of carbonates.

    PubMed

    Sakai, Saburo; Kodan, Tsuyoshi

    2011-05-15

    Micromilling is a conventional technique used in the analysis of the isotopic composition of geological materials, which improves the spatial resolution of sample collection for analysis. However, a problem still remains concerning the recovery ratio of the milled sample. We constructed a simple apparatus consisting of a vacuum pump, a sintered metal filter, an electrically conductive rubber stopper, and a stainless steel tube for transferring the milled powder into a reaction vial. In our preliminary experiments on carbonate powder, we achieved rapid recovery of 5 to 100 µg of carbonate with a high recovery ratio (>90%). This technique shortens the sample preparation time, improves the recovery ratio, and homogenizes the sample quantity, which, in turn, improves the analytical reproducibility. Copyright © 2011 John Wiley & Sons, Ltd.

  8. Characterizing natural colloidal/particulate-protein interactions using fluorescence-based techniques and principal component analysis.

    PubMed

    Peiris, Ramila H; Ignagni, Nicholas; Budman, Hector; Moresoli, Christine; Legge, Raymond L

    2012-09-15

    Characterization of the interactions between natural colloidal/particulate- and protein-like matter is important for understanding their contribution to different physiochemical phenomena like membrane fouling, adsorption of bacteria onto surfaces and various applications of nanoparticles in nanomedicine and nanotoxicology. Precise interpretation of the extent of such interactions is however hindered due to the limitations of most characterization methods to allow rapid, sensitive and accurate measurements. Here we report on a fluorescence-based excitation-emission matrix (EEM) approach in combination with principal component analysis (PCA) to extract information related to the interaction between natural colloidal/particulate- and protein-like matter. Surface plasmon resonance (SPR) analysis and fiber-optic probe based surface fluorescence measurements were used to confirm that the proposed approach can be used to characterize colloidal/particulate-protein interactions at the physical level. This method has potential to be a fundamental measurement of these interactions with the advantage that it can be performed rapidly and with high sensitivity. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Space shuttle/food system. Volume 2, Appendix C: Food cooling techniques analysis. Appendix D: Package and stowage: Alternate concepts analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The relative penalties associated with various techniques for providing an onboard cold environment for storage of perishable food items, and for the development of packaging and vehicle stowage parameters were investigated in terms of the overall food system design analysis of space shuttle. The degrees of capability for maintaining both a 40 F to 45 F refrigerated temperature and a 0 F and 20 F frozen environment were assessed for the following cooling techniques: (1) phase change (heat sink) concept; (2) thermoelectric concept; (3) vapor cycle concept; and (4) expendable ammonia concept. The parameters considered in the analysis were weight, volume, and spacecraft power restrictions. Data were also produced for packaging and vehicle stowage parameters which are compatible with vehicle weight and volume specifications. Certain assumptions were made for food packaging sizes based on previously generated space shuttle menus. The results of the study are shown, along with the range of meal choices considered.

  10. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  11. Numerical analysis of thermal drilling technique on titanium sheet metal

    NASA Astrophysics Data System (ADS)

    Kumar, R.; Hynes, N. Rajesh Jesudoss

    2018-05-01

    Thermal drilling is a technique used for drilling sheet metal in various applications. It involves rotating a conical tool at high speed in order to penetrate the sheet metal, forming a hole with a bush below the surface of the sheet. This article investigates the finite element analysis of thermal drilling of Ti6Al4V alloy sheet metal. The analysis was carried out by means of the DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Owing to the role of high-temperature deformation in this technique, output characteristics which are difficult to measure experimentally can be obtained successfully by the finite element method. Therefore, the modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution, and temperature of the workpiece.

  12. Intelligent Techniques Using Molecular Data Analysis in Leukaemia: An Opportunity for Personalized Medicine Support System

    PubMed Central

    Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem

    2017-01-01

    The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses a patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligent techniques in leukaemia, with specific attention to particular categories of these studies to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligent techniques in leukaemia and to categorize these studies based on leukaemia type as well as the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management to deliver supportive medical information to the patient in clinical practice. PMID:28812013

  13. Intelligent Techniques Using Molecular Data Analysis in Leukaemia: An Opportunity for Personalized Medicine Support System.

    PubMed

    Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem

    2017-01-01

    The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses a patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligent techniques in leukaemia, with specific attention to particular categories of these studies to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligent techniques in leukaemia and to categorize these studies based on leukaemia type as well as the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management to deliver supportive medical information to the patient in clinical practice.

  14. Flow Injection Technique for Biochemical Analysis with Chemiluminescence Detection in Acidic Media

    PubMed Central

    Chen, Jing; Fang, Yanjun

    2007-01-01

    A review with 90 references is presented to show the development of acidic chemiluminescence methods for biochemical analysis by use of the flow injection technique over the last 10 years. A brief discussion of both chemiluminescence and the flow injection technique is given. The proposed methods for biochemical analysis are described and compared according to the chemiluminescence system used.

  15. Development and verification of local/global analysis techniques for laminated composites

    NASA Technical Reports Server (NTRS)

    Griffin, O. Hayden, Jr.

    1989-01-01

    Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years, and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, from either a resource or a time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, the use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique for global/local analysis, where a global analysis is run, with the results of that analysis applied to a smaller region as boundary conditions, in as many iterations as are required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement formulation elements which have well known behavior when used for the analysis of laminated composites.

  16. Application of sensitivity-analysis techniques to the calculation of topological quantities

    NASA Astrophysics Data System (ADS)

    Gilchrist, Stuart

    2017-08-01

    Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient in the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of "sensitivity analysis". The method is based on the calculation of a three-dimensional field-line mapping Jacobian, from which all the topological quantities of interest presented here can be derived. We present the algorithm and the details of a publicly available set of libraries that implement it.
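
    A minimal sketch of the Jacobian-based idea, assuming a hypothetical analytic footpoint mapping in place of traced field lines and using Titov's definition of the squashing factor Q:

      import numpy as np

      def squashing_factor(mapping, x, y, h=1e-4):
          # Central-difference Jacobian of the footpoint mapping (x, y) -> (X, Y).
          X1, Y1 = mapping(x + h, y)
          X0, Y0 = mapping(x - h, y)
          X3, Y3 = mapping(x, y + h)
          X2, Y2 = mapping(x, y - h)
          a, b = (X1 - X0) / (2 * h), (X3 - X2) / (2 * h)
          c, d = (Y1 - Y0) / (2 * h), (Y3 - Y2) / (2 * h)
          # Titov's squashing factor: squared Frobenius norm over |determinant|.
          return (a ** 2 + b ** 2 + c ** 2 + d ** 2) / abs(a * d - b * c)

      # Hypothetical analytic mapping standing in for traced field-line footpoints:
      # a shear layer concentrates the connectivity gradient near y = 0.
      shear = lambda x, y: (x + 5.0 * np.tanh(5.0 * y), y)
      print(squashing_factor(shear, 0.0, 0.0))   # large Q at the shear layer
      print(squashing_factor(shear, 0.0, 2.0))   # Q -> 2 far from it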

  17. Independent component analysis based digital signal processing in coherent optical fiber communication systems

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi

    2018-02-01

    In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. ICA-based channel equalization after both single-mode fiber and few-mode fiber transmission is investigated for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats, respectively. Performance comparisons with conventional channel equalization techniques are discussed.
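
    A minimal blind-source-separation sketch using scikit-learn's FastICA on synthetic mixtures rather than optical channel data; the mixing matrix and source signals are illustrative assumptions:

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 4000)

      # Two independent sources standing in for co-propagating tributaries.
      s = np.c_[np.sign(np.sin(2 * np.pi * 37 * t)),   # data-like square wave
                np.sin(2 * np.pi * 11 * t)]            # interfering tone
      s += 0.02 * rng.standard_normal(s.shape)

      mix = np.array([[0.9, 0.4], [0.3, 0.8]])         # unknown channel mixing
      x = s @ mix.T                                    # observed mixtures

      # Blind separation: sources are recovered up to scale, sign and order.
      s_hat = FastICA(n_components=2, random_state=0).fit_transform(x)
      corr = np.abs(np.corrcoef(s_hat.T, s.T)[:2, 2:])
      print(corr.round(2))   # each recovered component matches one source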

  18. Cell culture-based biosensing techniques for detecting toxicity in water.

    PubMed

    Tan, Lu; Schirmer, Kristin

    2017-06-01

    The significant increase of contaminants entering fresh water bodies calls for the development of rapid and reliable methods to monitor the aquatic environment and to detect water toxicity. Cell culture-based biosensing techniques utilise the overall cytotoxic response to external stimuli, mediated by a transduced signal, to specify the toxicity of aqueous samples. These biosensing techniques can effectively indicate water toxicity for human safety and aquatic organism health. In this review we account for the recent developments of the mainstream cell culture-based biosensing techniques for water quality evaluation, discuss their key features, potentials and limitations, and outline the future prospects of their development. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 48 (Federal Acquisition Regulations System), volume 3, revised as of 2010-10-01: Section 215.404-1, Proposal analysis techniques (Defense Acquisition...). Excerpt: "... reliability of its estimating and accounting systems. [63 FR 55040, Oct. 14, 1998, as amended at 71 FR 69494...]"

  20. Nitrous oxide-based techniques versus nitrous oxide-free techniques for general anaesthesia.

    PubMed

    Sun, Rao; Jia, Wen Qin; Zhang, Peng; Yang, KeHu; Tian, Jin Hui; Ma, Bin; Liu, Yali; Jia, Run H; Luo, Xiao F; Kuriyama, Akira

    2015-11-06

    anaesthesia (or both) with any general anaesthesia using a volatile anaesthetic or propofol-based maintenance of anaesthesia but no nitrous oxide for adults undergoing surgery. Our primary outcome was in-hospital case fatality rate. Secondary outcomes were complications and length of stay. Two review authors independently assessed trial quality and extracted the outcome data. We used meta-analysis for data synthesis. Heterogeneity was examined with the Chi² test and by calculating the I² statistic. We used a fixed-effect model if the measure of inconsistency was low for all comparisons (I² statistic < 50%); otherwise we used a random-effects model for measures with high inconsistency. We undertook subgroup analyses to explore inconsistency and sensitivity analyses to evaluate whether the results were robust. We assessed the quality of evidence of the main outcomes using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) system. We included 35 trials (13,872 adult participants). Seven included studies were at low risk of bias. We identified eight studies as awaiting classification since we could not obtain the full texts, and had insufficient information to include or exclude them. We included data from 24 trials for quantitative synthesis. The results of meta-analyses showed that nitrous oxide-based techniques increased the incidence of pulmonary atelectasis (odds ratio (OR) 1.57, 95% confidence interval (CI) 1.18 to 2.10, P = 0.002), but had no effects on the in-hospital case fatality rate, the incidence of pneumonia, myocardial infarction, stroke, severe nausea and vomiting, venous thromboembolism, wound infection, or the length of hospital stay. The sensitivity analyses suggested that the results of the meta-analyses were all robust except for the outcomes of pneumonia, and severe nausea and vomiting. Two trials reported length of intensive care unit (ICU) stay but the data were skewed so were not pooled. Both trials reported that nitrous oxide-based
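
    The heterogeneity-driven choice between fixed- and random-effects pooling described above can be sketched as follows; the per-trial log odds ratios and variances are hypothetical, and the DerSimonian-Laird estimator is one standard random-effects choice:

      import numpy as np

      def pool_odds_ratios(log_or, var):
          w = 1.0 / var                                  # inverse-variance weights
          fe = np.sum(w * log_or) / np.sum(w)            # fixed-effect estimate
          q = float(np.sum(w * (log_or - fe) ** 2))      # Cochran's Q
          df = len(log_or) - 1
          i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
          if i2 < 50:                                    # low inconsistency: fixed effect
              tau2 = 0.0
          else:                                          # high inconsistency: DerSimonian-Laird
              c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
              tau2 = max(0.0, (q - df) / c)
          w2 = 1.0 / (var + tau2)
          est = np.sum(w2 * log_or) / np.sum(w2)
          se = np.sqrt(1.0 / np.sum(w2))
          return np.exp([est, est - 1.96 * se, est + 1.96 * se]), i2

      # Hypothetical per-trial log odds ratios and variances.
      or_ci, i2 = pool_odds_ratios(np.array([0.50, 0.32, 0.61, 0.44]),
                                   np.array([0.04, 0.06, 0.05, 0.08]))
      print(or_ci.round(2), round(i2, 1))   # pooled OR with 95% CI, and I² [%]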

  1. Quantitative elemental analysis of an industrial mineral talc, using accelerator-based analytical technique

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, A. O.; Mazzoli, C.; Ceccato, D.; Ajayi, E. O. B.; De Poli, M.; Moschini, G.

    2005-10-01

    The accelerator-based technique of PIXE was employed for the determination of the elemental concentrations of an industrial mineral, talc. Talc is a very versatile mineral in industry, with several applications. Due to this, there is a need to know its constituents to ensure that workers are not exposed to health risks. Besides, microscopic tests on some talc samples in Nigeria confirm that they fall within the British Pharmacopoeia (BP) standard for tablet formation. However, for these samples to become a local source of raw material for pharmaceutical grade talc, the precise elemental compositions should be established, which is the focus of this work. A proton beam produced by the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy was used for the PIXE measurements. The results, which show the concentrations of different elements in the talc samples, their health implications and metabolic roles, are presented and discussed.

  2. Effects of interactive instructional techniques in a web-based peripheral nervous system component for human anatomy.

    PubMed

    Allen, Edwin B; Walls, Richard T; Reilly, Frank D

    2008-02-01

    This study investigated the effects of interactive instructional techniques in a web-based peripheral nervous system (PNS) component of a first year medical school human anatomy course. Existing data from 9 years of instruction involving 856 students were used to determine (1) the effect of web-based interactive instructional techniques on written exam item performance and (2) differences between student opinions of the benefit level of five different types of interactive learning objects used. The interactive learning objects included Patient Case studies, review Games, Simulated Interactive Patients (SIP), Flashcards, and unit Quizzes. Exam item analysis scores were found to be significantly higher (p < 0.05) for students receiving the instructional treatment incorporating the web-based interactive learning objects than for students not receiving this treatment. Questionnaires using a five-point Likert scale were analysed to determine student opinion ratings of the interactive learning objects. Students reported favorably on the benefit level of all learning objects. Students rated the benefit level of the Simulated Interactive Patients (SIP) highest, and this rating was significantly higher (p < 0.05) than all other learning objects. This study suggests that web-based interactive instructional techniques improve student exam performance. Students indicated a strong acceptance of Simulated Interactive Patient learning objects.

  3. The Use of a Context-Based Information Retrieval Technique

    DTIC Science & Technology

    2009-07-01

    provided in context. Latent Semantic Analysis (LSA) is a statistical technique for inferring contextual and structural information, and previous studies... WAIS). LSA, which is also known as latent semantic indexing (LSI), uses a statistical and... In contrast, natural language models apply algorithms that combine statistical information with semantic information. Semantic
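
    A minimal LSA sketch (truncated SVD of a TF-IDF term-document matrix) on a hypothetical three-document corpus; scikit-learn's TruncatedSVD is a common implementation choice, not necessarily what the report used:

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      docs = ["signals are retrieved in context",
              "context improves retrieval of documents",
              "the pump laser heats the sample"]          # hypothetical mini-corpus

      tfidf = TfidfVectorizer().fit_transform(docs)       # term-document matrix
      z = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

      # Documents sharing latent context score as similar even with little
      # exact term overlap; the third, off-topic document does not.
      print(cosine_similarity(z).round(2))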

  4. Modal parameter identification based on combining transmissibility functions and blind source separation techniques

    NASA Astrophysics Data System (ADS)

    Araújo, Iván Gómez; Sánchez, Jesús Antonio García; Andersen, Palle

    2018-05-01

    Transmissibility-based operational modal analysis is a recent and alternative approach used to identify the modal parameters of structures under operational conditions. This approach is advantageous compared with traditional operational modal analysis because it does not make any assumptions about the excitation spectrum (i.e., white noise with a flat spectrum). However, common methodologies do not include a procedure to extract closely spaced modes with low signal-to-noise ratios. This issue is relevant when considering that engineering structures generally have closely spaced modes and that their measured responses present high levels of noise. Therefore, to overcome these problems, a new combined method for modal parameter identification is proposed in this work. The proposed method combines blind source separation (BSS) techniques and transmissibility-based methods. Here, BSS techniques were used to recover source signals, and transmissibility-based methods were applied to estimate modal information from the recovered source signals. To achieve this combination, a new method to define a transmissibility function was proposed. The suggested transmissibility function is based on the relationship between the power spectral density (PSD) of mixed signals and the PSD of signals from a single source. The numerical responses of a truss structure with high levels of added noise and very closely spaced modes were processed using the proposed combined method to evaluate its ability to identify modal parameters in these conditions. Colored and white noise excitations were used for the numerical example. The proposed combined method was also used to evaluate the modal parameters of an experimental test on a structure containing closely spaced modes. The results showed that the proposed combined method is capable of identifying very closely spaced modes in the presence of noise and, thus, may be potentially applied to improve the identification of damping ratios.
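
    A minimal sketch of a PSD-based transmissibility estimate between two measured channels, using synthetic response signals; the cross- to auto-spectral-density ratio cancels the unknown excitation spectrum, which is the property the abstract exploits:

      import numpy as np
      from scipy.signal import csd, welch

      fs = 1024.0
      t = np.arange(0, 30, 1 / fs)
      rng = np.random.default_rng(1)

      # Unmeasured, non-white excitation (smoothed noise) drives channel j;
      # channel i sees it through a lightly damped resonant path plus noise.
      x_j = np.convolve(rng.standard_normal(t.size), np.ones(8) / 8, mode="same")
      h = np.exp(-0.02 * np.arange(256)) * np.sin(0.3 * np.arange(256))
      x_i = np.convolve(x_j, h, mode="same") + 0.05 * rng.standard_normal(t.size)

      # Transmissibility as the ratio of cross- to auto-spectral density: the
      # unknown excitation spectrum cancels between numerator and denominator.
      f, S_ij = csd(x_i, x_j, fs=fs, nperseg=2048)
      _, S_jj = welch(x_j, fs=fs, nperseg=2048)
      T = S_ij / S_jj
      print(f[np.argmax(np.abs(T))])   # peak near the resonance of the path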

  5. Model reduction of the numerical analysis of Low Impact Developments techniques

    NASA Astrophysics Data System (ADS)

    Brunetti, Giuseppe; Šimůnek, Jirka; Wöhling, Thomas; Piro, Patrizia

    2017-04-01

    Mechanistic models have proven to be accurate and reliable tools for the numerical analysis of the hydrological behavior of Low Impact Development (LID) techniques. However, their widespread adoption is limited by their complexity and computational cost. Recent studies have tried to address this issue by investigating the application of new techniques, such as surrogate-based modeling. However, current results are still limited and fragmented. One such approach, the Model Order Reduction (MOR) technique, can represent a valuable tool for reducing the computational complexity of a numerical problem by computing an approximation of the original model. While this technique has been extensively used in water-related problems, no studies have evaluated its use in LID modeling. Thus, the main aim of this study is to apply the MOR technique to the development of a reduced order model (ROM) for the numerical analysis of the hydrologic behavior of LIDs, in particular green roofs. The model should be able to correctly reproduce all the hydrological processes of a green roof while reducing the computational cost. The proposed model decouples the subsurface water dynamics of a green roof into a) one-dimensional (1D) vertical flow through the green roof itself and b) one-dimensional saturated lateral flow along the impervious rooftop. The green roof is horizontally discretized in N elements. Each element represents a vertical domain, which can have different properties or boundary conditions. The 1D Richards equation is used to simulate flow in the substrate and drainage layers. Simulated outflow from the vertical domain is used as a recharge term for saturated lateral flow, which is described using the kinematic wave approximation of the Boussinesq equation. The proposed model has been compared with the mechanistic model HYDRUS-2D, which numerically solves the Richards equation for the whole domain. The HYDRUS-1D code has been used for the description of vertical flow
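
    A deliberately toy sketch of the decoupling architecture only: a linear reservoir stands in for each 1D Richards-equation column, and a simple downslope routing step stands in for the kinematic-wave saturated lateral flow; all names and coefficients are invented:

      import numpy as np

      N, steps, dt = 10, 200, 60.0              # columns, time steps, step [s]
      k_vert, k_lat = 1e-3, 5e-4                # toy recession coefficients [1/s]

      storage = np.zeros(N)                     # water in each vertical column [mm]
      lateral = np.zeros(N)                     # saturated layer on the roof [mm]
      rain = np.zeros(steps)
      rain[10:40] = 0.02                        # hypothetical rain burst [mm/s]

      outflow = []
      for n in range(steps):
          storage += rain[n] * dt               # infiltration into every column
          recharge = k_vert * storage * dt      # column outflow -> recharge term
          storage -= recharge
          lateral += recharge                   # feeds the lateral layer
          flux = k_lat * lateral * dt           # downslope routing step
          lateral -= flux
          lateral[:-1] += flux[1:]              # water moves one element downslope
          outflow.append(flux[0])               # element 0 drains off the roof
      print(sum(outflow))                       # hydrograph volume at the drain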

  6. Validating Ultrasound-based HIFU Lesion-size Monitoring Technique with MR Thermometry and Histology

    NASA Astrophysics Data System (ADS)

    Zhou, Shiwei; Petruzzello, John; Anand, Ajay; Sethuraman, Shriram; Azevedo, Jose

    2010-03-01

    In order to control and monitor HIFU lesions accurately and cost-effectively in real time, we have developed an ultrasound-based therapy monitoring technique using acoustic radiation force to track the change in tissue mechanical properties. We validate our method with concurrent MR thermometry and histology. Comparison studies have been completed on in-vitro bovine liver samples. A single-element 1.1 MHz focused transducer was used to deliver HIFU and produce acoustic radiation force (ARF). A 5 MHz single-element transducer was placed co-axially with the HIFU transducer to acquire the RF data and track the tissue displacement induced by ARF. During therapy, the monitoring procedure was interleaved with HIFU. MR thermometry (Philips Panorama 1T system) and ultrasound monitoring were performed simultaneously. The tissue temperature and thermal dose (CEM43 = 240 mins) were computed from the MR thermometry data. The tissue displacement induced by the acoustic radiation force was calculated from the ultrasound RF data in real time using a cross-correlation based method. A normalized displacement difference (NDD) parameter was developed and calibrated to estimate the lesion size. The lesion size estimated by the NDD was compared with both the MR thermometry prediction and the histology analysis. For lesions smaller than 8 mm, the NDD-based lesion monitoring technique showed performance very similar to MR thermometry. The standard deviation of the lesion size error is 0.66 mm, which is comparable to the MR thermal dose contour prediction (0.42 mm). A phased array is needed for tracking displacement in 2D and monitoring lesions larger than 8 mm. The study demonstrates the potential of our ultrasound-based technique to achieve precise HIFU lesion control through real-time monitoring. The results compare well with histology and an established technique like MR thermometry. This approach provides feedback control in real time to terminate therapy when a pre-determined lesion size is
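
    The cross-correlation displacement estimate at the core of the monitoring technique can be sketched as follows, with a synthetic RF line and assumed sampling-rate and sound-speed values:

      import numpy as np

      def displacement(rf_pre, rf_post, fs=50e6, c=1540.0):
          # Lag of the cross-correlation peak between pre- and post-push RF lines.
          xc = np.correlate(rf_post - rf_post.mean(),
                            rf_pre - rf_pre.mean(), mode="full")
          lag = np.argmax(xc) - (len(rf_pre) - 1)
          return lag * c / (2 * fs)              # samples -> metres (round trip)

      rng = np.random.default_rng(2)
      pre = rng.standard_normal(4000)            # hypothetical speckle RF line
      post = np.roll(pre, 7) + 0.05 * rng.standard_normal(4000)   # ARF-shifted
      print(displacement(pre, post))             # ~= 7 * c / (2 * fs), about 0.1 mm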

  7. Correlation between histological outcome and surgical cartilage repair technique in the knee: A meta-analysis.

    PubMed

    DiBartola, Alex C; Everhart, Joshua S; Magnussen, Robert A; Carey, James L; Brophy, Robert H; Schmitt, Laura C; Flanigan, David C

    2016-06-01

    Compare histological outcomes after microfracture (MF), autologous chondrocyte implantation (ACI), and osteochondral autograft transfer (OATS). Literature review using PubMed MEDLINE, SCOPUS, Cumulative Index for Nursing and Allied Health Literature (CINAHL), and Cochrane Collaboration Library. Inclusion criteria were limited to English language studies using International Cartilage Repair Society (ICRS) grading criteria for cartilage analysis after ACI (autologous chondrocyte implantation), MF (microfracture), or OATS (osteochondral autografting) repair techniques. Thirty-three studies investigating 1511 patients were identified. Thirty evaluated ACI or one of its subtypes, six evaluated MF, and seven evaluated OATS. There was no evidence of publication bias (Begg's p=0.48). No statistically significant correlation was found between percent change in clinical outcome and percent of biopsies showing ICRS Excellent scores (R(2)=0.05, p=0.38). Percent change in clinical outcome and percent of biopsies showing only hyaline cartilage were significantly associated (R(2)=0.24, p=0.024). Mean lesion size and histological outcome were not correlated based either on percent ICRS Excellent (R(2)=0.03, p=0.50) or percent hyaline cartilage only (R(2)=0.01, p=0.67). Most common lesion location and histological outcome were not correlated based either on percent ICRS Excellent (R(2)=0.03, p=0.50) or percent hyaline cartilage only (R(2)=0.01, p=0.67). Microfracture has poorer histological outcomes than other cartilage repair techniques. OATS repairs are primarily comprised of hyaline cartilage, followed closely by cell-based techniques, but no significant difference in cartilage quality was found among OATS, ACI-C, MACI, and ACI-P using ICRS grading criteria. IV, meta-analysis. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  9. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    DTIC Science & Technology

    2013-03-01

    Technical report, Polytechnic Institute of New York University; dates covered: Oct 2010 - Oct 2012. Excerpt: "... schemes for a memristor-based reconfigurable architecture design have not been fully explored yet. Therefore, in this project, we investigated..."

  10. A preliminary structural analysis of space-base living quarters modules to verify a weight-estimating technique

    NASA Technical Reports Server (NTRS)

    Grissom, D. S.; Schneider, W. C.

    1971-01-01

    The determination of a base line (minimum weight) design for the primary structure of the living quarters modules in an earth-orbiting space base was investigated. Although the design is preliminary in nature, the supporting analysis is sufficiently thorough to provide a reasonably accurate weight estimate of the major components that are considered to comprise the structural weight of the space base.

  11. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method that uses multivariate reference Raman signals of fused silica and sapphire, generated from a ball-lens fiber-optic Raman probe, for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm(-1)); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms, which gives an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct for variations of laser excitation power and fiber coupling efficiency in situ, standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
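
    A hedged sketch of the PLS reference idea: synthetic spectra in which the assumed reference bands (positions taken from the abstract) scale with laser power, and a cross-validated PLS model predicts the power; scikit-learn stands in for whatever chemometrics software the authors used:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(3)
      powers = rng.uniform(5, 65, 100)                 # excitation power [mW]
      wn = np.linspace(100, 1800, 600)                 # Raman shift axis [cm-1]
      bands = [130, 380, 417, 646, 751]                # reference band positions

      # Synthetic spectra: reference bands scale linearly with laser power,
      # plus a baseline and noise standing in for tissue contributions.
      X = np.zeros((100, 600))
      for k, p in enumerate(powers):
          for b in bands:
              X[k] += p * np.exp(-0.5 * ((wn - b) / 8.0) ** 2)
          X[k] += 2.0 + 0.5 * rng.standard_normal(600)

      pred = cross_val_predict(PLSRegression(n_components=3), X, powers, cv=10).ravel()
      print(float(np.sqrt(np.mean((pred - powers) ** 2))))   # RMSECV in mW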

  12. Modern Micro and Nanoparticle-Based Imaging Techniques

    PubMed Central

    Ryvolova, Marketa; Chomoucka, Jana; Drbohlavova, Jana; Kopel, Pavel; Babula, Petr; Hynek, David; Adam, Vojtech; Eckschlager, Tomas; Hubalek, Jaromir; Stiborova, Marie; Kaiser, Jozef; Kizek, Rene

    2012-01-01

    The requirements for early diagnostics as well as effective treatment of insidious diseases such as cancer constantly increase the pressure on development of efficient and reliable methods for targeted drug/gene delivery as well as imaging of the treatment success/failure. One of the most recent approaches covering both the drug delivery and the imaging aspects is benefitting from the unique properties of nanomaterials. Therefore, a new field called nanomedicine is attracting continuously growing attention. Nanoparticles, including fluorescent semiconductor nanocrystals (quantum dots) and magnetic nanoparticles, have proven their excellent properties for in vivo imaging techniques in a number of modalities such as magnetic resonance and fluorescence imaging, respectively. In this article, we review the main properties and applications of nanoparticles in various in vitro imaging techniques, including microscopy and/or laser breakdown spectroscopy and in vivo methods such as magnetic resonance imaging and/or fluorescence-based imaging. Moreover, the advantages of drug delivery performed by nanocarriers such as iron oxides, gold, biodegradable polymers, dendrimers, and lipid-based carriers such as liposomes or micelles are also highlighted. PMID:23202187

  13. Single cell and single molecule techniques for the analysis of the epigenome

    NASA Astrophysics Data System (ADS)

    Wallin, Christopher Benjamin

    materialized. Reasons for this, including poor signal to background, are explained in detail. Third, development of mobility-SCAN, an analytical technique for measuring and analyzing single molecules based on their fluorescent signature and their electrophoretic mobility in nanochannels is described. We use the technique to differentiate biomolecules from complex mixtures and derive parameters such as diffusion coefficients and effective charges. Finally, the device is used to detect binding interactions of various complexes similar to affinity capillary electrophoresis, but on a single molecule level. Fourth, we conclude by briefly discussing SCAN-sort, a technique to sort individual chromatin molecules based on their fluorescent emissions for further downstream analysis such as DNA sequencing. We demonstrate a 2-fold enrichment of chromatin from sorting and discuss possible system modifications for better performance in the future.

  14. Microstructural study of the nickel-base alloy WAZ-20 using qualitative and quantitative electron optical techniques

    NASA Technical Reports Server (NTRS)

    Young, S. G.

    1973-01-01

    The NASA nickel-base alloy WAZ-20 was analyzed by advanced metallographic techniques to qualitatively and quantitatively characterize its phases and stability. The as-cast alloy contained primary gamma-prime, a coarse gamma-gamma prime eutectic, a gamma-fine gamma prime matrix, and MC carbides. A specimen aged at 870 C for 1000 hours contained these same constituents and a few widely scattered high W particles. No detrimental phases (such as sigma or mu) were observed. Scanning electron microscope, light metallography, and replica electron microscope methods are compared. The value of quantitative electron microprobe techniques such as spot and area analysis is demonstrated.

  15. Investigation of a Moire Based Crack Detection Technique for Propulsion Health Monitoring

    NASA Technical Reports Server (NTRS)

    Woike, Mark R.; Abudl-Aziz, Ali; Fralick, Gustave C.; Wrbanek, John D.

    2012-01-01

    The development of techniques for the health monitoring of the rotating components in gas turbine engines is of major interest to NASA's Aviation Safety Program. As part of this ongoing effort several experiments utilizing a novel optical Moiré-based concept along with external blade tip clearance and shaft displacement instrumentation were conducted on a simulated turbine engine disk as a means of demonstrating a potential optical crack detection technique. A Moiré pattern results from the overlap of two repetitive patterns with slightly different periods. With this technique, it is possible to detect very small differences in spacing and hence radial growth in a rotating disk due to a flaw such as a crack. The experiment involved etching a circular reference pattern on a subscale engine disk that had a 50.8 mm (2 in.) long notch machined into it to simulate a crack. The disk was operated at speeds up to 12,000 rpm and the Moiré pattern due to the shift with respect to the reference pattern was monitored as a means of detecting the radial growth of the disk due to the defect. In addition, blade displacement data were acquired using external blade tip clearance and shaft displacement sensors as a means of confirming the data obtained from the optical technique. The results of the crack detection experiments and the associated analysis are presented in this paper.
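
    The magnification that makes the technique sensitive follows from the beat-period relation for two overlaid gratings; a one-line worked example with assumed pitch values:

      # Beat (Moiré) period of two overlaid gratings with slightly different
      # pitches: 1/p_moire = |1/p1 - 1/p2|. Values are illustrative only.
      p1 = 1.000                        # reference pattern pitch [mm]
      p2 = 1.002                        # pitch stretched by radial disk growth
      p_moire = p1 * p2 / abs(p2 - p1)
      print(p_moire)                    # ~501 mm: a 0.2% change, magnified ~500x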

  16. Robotic-assisted laparoscopic repair of ureteral injury: an evidence-based review of techniques and outcomes.

    PubMed

    Tracey, Andrew T; Eun, Daniel D; Stifelman, Michael D; Hemal, Ashok K; Stein, Robert J; Mottrie, Alexandre; Cadeddu, Jeffrey A; Stolzenburg, J Uwe; Berger, Andre K; Buffi, Niccolò; Zhao, Lee C; Lee, Ziho; Hampton, Lance; Porpiglia, Francesco; Autorino, Riccardo

    2018-06-01

    Iatrogenic ureteral injuries represent a common surgical problem encountered by practicing urologists. With the rapidly expanding applications of robotic-assisted laparoscopic surgery, ureteral reconstruction has been an important field of recent advancement. This collaborative review sought to provide an evidence-based analysis of the latest surgical techniques and outcomes for robotic-assisted repair of ureteral injury. A systematic review of the literature up to December 2017 using PubMed/Medline was performed to identify relevant articles. Those studies included in the systematic review were selected according to Preferred Reporting Items for Systematic Reviews and Meta-analysis criteria. Additionally, expert opinions were included from study authors in order to critique outcomes and elaborate on surgical techniques. A cumulative outcome analysis was conducted analyzing comparative studies on robotic versus open ureteral repair. Thirteen case series have demonstrated the feasibility, safety, and success of robotic ureteral reconstruction. The surgical planning, timing of intervention, and various robotic reconstructive techniques need to be tailored to the specific case, depending on the location and length of the injury. Fluorescence imaging can represent a useful tool in this setting. Recently, three studies have shown the feasibility and technical success of robotic buccal mucosa grafting for ureteral repair. Soon, additional novel and experimental robotic reconstructive approaches might become available. The cumulative analysis of the three available comparative studies on robotic versus open ureteral repair showed no difference in operative time or complication rate, with a decreased blood loss and hospital length of stay favoring the robotic approach. Current evidence suggests that the robotic surgical platform facilitates complex ureteral reconstruction in a minimally invasive fashion. High success rates of ureteral repair using the robotic approach

  17. Refractive index sensor based on optical fiber end face using pulse reference-based compensation technique

    NASA Astrophysics Data System (ADS)

    Bian, Qiang; Song, Zhangqi; Zhang, Xueliang; Yu, Yang; Chen, Yuzhong

    2018-03-01

    We propose a refractive index sensor based on the optical fiber end face using a pulse reference-based compensation technique. With the good compensation effect of this technique, fluctuations in the light source power and changes in the transmission loss of optic components and in the coupler splitting ratio can be compensated, which largely reduces the background noise. The refractive index resolution can reach 3.8 × 10⁻⁶ RIU and 1.6 × 10⁻⁶ RIU in different refractive index regions.

  18. Cost minimisation analysis of using acellular dermal matrix (Strattice™) for breast reconstruction compared with standard techniques.

    PubMed

    Johnson, R K; Wright, C K; Gandhi, A; Charny, M C; Barr, L

    2013-03-01

    We performed a cost analysis (using UK 2011/12 NHS tariffs as a proxy for cost) comparing immediate breast reconstruction using the new one-stage technique of acellular dermal matrix (Strattice™) with implant versus the standard alternative techniques of tissue expander (TE)/implant as a two-stage procedure and latissimus dorsi (LD) flap reconstruction. Clinical report data were collected for operative time, length of stay, outpatient procedures, and number of elective and emergency admissions in our first consecutive 24 patients undergoing one-stage Strattice reconstruction. Total cost to the NHS based on tariff, assuming top-up payments to cover Strattice acquisition costs, was assessed and compared to the two historical control groups matched on key variables. Eleven patients having unilateral Strattice reconstruction were compared to 10 having TE/implant reconstruction and 10 having LD flap and implant reconstruction. Thirteen patients having bilateral Strattice reconstruction were compared to 12 having bilateral TE/implant reconstruction. Total costs were: unilateral Strattice, £3685; unilateral TE, £4985; unilateral LD and implant, £6321; bilateral TE, £5478; and bilateral Strattice, £6771. The cost analysis shows a financial advantage of using acellular dermal matrix (Strattice) in unilateral breast reconstruction versus alternative procedures. The reimbursement system in England (Payment by Results) is based on disease-related groups similar to that of many countries across Europe and tariffs are based on reported hospital costs, making this analysis of relevance in other countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Quadrant Analysis as a Strategic Planning Technique in Curriculum Development and Program Marketing.

    ERIC Educational Resources Information Center

    Lynch, James; And Others

    1996-01-01

    Quadrant analysis, a widely-used research technique, is suggested as useful in college or university strategic planning. The technique uses consumer preference data and produces information suitable for a wide variety of curriculum and marketing decisions. Basic quadrant analysis design is described, and advanced variations are discussed, with…

  20. Investigation of laser Doppler anemometry in developing a velocity-based measurement technique

    NASA Astrophysics Data System (ADS)

    Jung, Ki Won

    2009-12-01

    Acoustic properties, such as the characteristic impedance and the complex propagation constant, of porous materials have traditionally been characterized with pressure-based measurement techniques using microphones. Although microphone techniques have evolved since their introduction, the most general form employs two microphones to characterize the acoustic field in one continuous medium. The shortcomings of determining the acoustic field from only two microphones can be overcome by using numerous microphones; however, the use of a number of microphones requires a careful and intricate calibration procedure. This dissertation uses laser Doppler anemometry (LDA) to establish a new measurement technique which resolves the issues that microphone techniques have. First, it is based on a single sensor, so calibration is unnecessary when only the overall ratio of the acoustic field is required for the characterization of a system; this includes measurements of the characteristic impedance and the complex propagation constant. Second, it can handle multiple positional measurements without calibrating the signal at each position. Third, it can measure the three-dimensional components of velocity even in a system with complex geometry. Fourth, it adapts flexibly to different apparatus, provided the apparatus is transparent. LDA is known to possess several disadvantages, such as the requirement of a transparent apparatus, high cost, and the necessity of seeding particles. The technique based on LDA combined with a curve-fitting algorithm is validated through measurements on three systems. First, the complex propagation constant of air is measured in a rigidly terminated cylindrical pipe which has very low dissipation. Second, the radiation impedance of an open-ended pipe is measured. These two parameters can be characterized by the ratio of acoustic field measured at multiple

  1. [Rapid multi-elemental analysis on four precious Tibetan medicines based on LIBS technique].

    PubMed

    Liu, Xiao-na; Shi, Xin-yuan; Jia, Shuai-yun; Zhao, Na; Wu, Zhi-sheng; Qiao, Yan-jiang

    2015-06-01

    The laser-induced breakdown spectroscopy (LIBS) technique was applied to perform a qualitative elemental analysis of four precious Tibetan medicines, i.e., Renqing Mangjue, Renqing Changjue, 25-herb coral pills and 25-herb pearl pills. The specific spectra of the four Tibetan medicines were established. In the experiment, an Nd:YAG 1064 nm pulsed laser was adopted to collect the spectra. A laser beam focused on the surface of the samples to generate plasma, whose spectral signal was detected using a spectrograph. Based on the National Institute of Standards and Technology (NIST) database, the LIBS spectral lines were identified. The four Tibetan medicines mainly contained Ca, Na, K, Mg and other elements and the C-N molecular band. Specifically, Fe was detected in Renqing Changjue and 25-herb pearl pills; the heavy metal elements Hg and Cu were found in Renqing Mangjue and Renqing Changjue; Ag was found in Renqing Changjue. The results demonstrated that LIBS is a reliable and rapid technique for multi-element analysis of the four Tibetan medicines. With its real-time, rapid and nondestructive advantages, LIBS has wide application prospects in the elemental analysis of ethnic medicines.
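
    Line identification against a reference database can be sketched as a tolerance match; the mini line table below is a hypothetical stand-in for the NIST Atomic Spectra Database:

      # Hypothetical mini line table [nm]; a real analysis would consult the
      # NIST Atomic Spectra Database.
      LINES = {"Ca": [393.37, 396.85, 422.67], "Na": [589.00, 589.59],
               "K": [766.49, 769.90], "Mg": [285.21, 279.55], "Hg": [253.65]}

      def identify(peaks_nm, tol=0.3):
          # Assign each measured peak to any catalogued line within tol nm.
          hits = {}
          for peak in peaks_nm:
              for elem, lines in LINES.items():
                  if any(abs(peak - line) <= tol for line in lines):
                      hits.setdefault(elem, []).append(peak)
          return hits

      print(identify([393.4, 589.1, 766.6, 253.7]))   # -> Ca, Na, K, Hg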

  2. Analysis of meteorological variables in the Australasian region using ground- and space-based GPS techniques

    NASA Astrophysics Data System (ADS)

    Kuleshov, Yuriy; Choy, Suelynn; Fu, Erjiang Frank; Chane-Ming, Fabrice; Liou, Yuei-An; Pavelyev, Alexander G.

    2016-07-01

    Results of an analysis of meteorological variables (temperature and moisture) in the Australasian region using global positioning system (GPS) radio occultation (RO) and GPS ground-based observations, verified with in situ radiosonde (RS) data, are presented. The potential of using ground-based GPS observations for retrieving column-integrated precipitable water vapour (PWV) over the Australian continent has been demonstrated using the Australian ground-based GPS reference station network. Using data from the 15 ground-based GPS stations, the state of the atmosphere over Victoria during a significant weather event, the March 2010 Melbourne storm, has been investigated, and it has been shown that GPS observations have potential for monitoring the movement of a weather front that has a sharp moisture contrast. Temperature and moisture variability in the atmosphere over various climatic regions (the Indian and the Pacific Oceans, the Antarctic and Australia) has been examined using satellite-based GPS RO and in situ RS observations. Investigating recent atmospheric temperature trends over Antarctica, the time series of the collocated GPS RO and RS data were examined, and strong cooling in the lower stratosphere and warming through the troposphere over Antarctica have been identified, in agreement with outputs of climate models. With further expansion of the Global Navigation Satellite Systems (GNSS), it is expected that GNSS satellite- and ground-based measurements will be able to provide an order of magnitude more data, which in turn could significantly advance weather forecasting services, climate monitoring and analysis in the Australasian region.

  3. Cost minimization analysis for combinations of sampling techniques in bronchoscopy of endobronchial lesions.

    PubMed

    Roth, Kjetil; Hardie, Jon Andrew; Andreassen, Alf Henrik; Leh, Friedemann; Eagan, Tomas Mikal Lind

    2009-06-01

    The choice of sampling techniques in bronchoscopy with sampling from a visible lesion will depend on the expected diagnostic yields and the costs of the sampling techniques. The aim of this study was to determine the most economical combination of sampling techniques when approaching endobronchial visible lesions. A cost minimization analysis was performed. All bronchoscopies from 2003 and 2004 at Haukeland University Hospital, Bergen, Norway, were reviewed retrospectively for diagnostic yields. 162 patients with endobronchial disease were included. Potential sampling techniques used were biopsy, brushing, endobronchial needle aspiration (EBNA) and washings. Costs were estimated based on registration of equipment costs and personnel costs. Sensitivity analyses were performed to determine threshold values. The combination of biopsy, brushing and EBNA was the most economical strategy, with an average cost of Euro 893 (95% CI: 657, 1336). The cost of brushing had to be below Euro 83, and it had to increase the diagnostic yield by more than 2.2%, for biopsy and brushing to be more economical than biopsy alone. The combination of biopsy, brushing and EBNA was more economical than biopsy and brushing when the cost of EBNA was below Euro 205 and the increase in diagnostic yield was above 5.2%. In the current study setting, biopsy, brushing and EBNA was the most economical combination of sampling techniques for endobronchial visible lesions.

  4. An algol program for dissimilarity analysis: a divisive-omnithetic clustering technique

    USGS Publications Warehouse

    Tipper, J.C.

    1979-01-01

    Clustering techniques are properly used to generate hypotheses about patterns in data. Of the hierarchical techniques, those which are divisive and omnithetic possess many theoretically optimal properties. One such method, dissimilarity analysis, is implemented here in ALGOL 60, and determined to be competitive computationally with most other methods. © 1979.

  5. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin' it REAL curriculum

    PubMed Central

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin' it REAL (kiR) substance use prevention curriculum. Each of the 10, 40–45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers' delivery techniques (e.g. lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify a typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention. PMID:25274721

  6. Application of a sensitivity analysis technique to high-order digital flight control systems

    NASA Technical Reports Server (NTRS)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. The SVA is a fully public domain program, running on the NASA/Dryden Elxsi computer.
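
    A numerical sketch of the singular-value sensitivity idea on a toy two-loop system: the minimum singular value of the return difference matrix I + L(jω) is differentiated with respect to a controller gain by central differences; the plant and gain below are invented, and the paper derives such gradients analytically rather than numerically:

      import numpy as np

      def min_sv(k, w=2.0):
          # Minimum singular value of the return difference I + L(jw) for a
          # toy two-loop system with scalar controller gain k.
          s = 1j * w
          plant = np.array([[1.0 / (s + 1.0), 0.1], [0.0, 1.0 / (s + 2.0)]])
          return np.linalg.svd(np.eye(2) + k * plant, compute_uv=False).min()

      # Singular-value gradient with respect to the gain, by central difference.
      k0, h = 2.0, 1e-6
      grad = (min_sv(k0 + h) - min_sv(k0 - h)) / (2 * h)
      print(min_sv(k0), grad)   # relative-stability measure and its sensitivity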

  7. Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques

    DTIC Science & Technology

    2018-04-30

    Monthly progress report. The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program's first month consisted of improvements to data processing code and inclusion of additional arctic sea ice...
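
    Dynamic mode decomposition (DMD) is the standard numerical route to approximate Koopman eigenvalues and modes from snapshot data; a minimal sketch on a synthetic traveling-wave "ice field", not the program's actual code:

      import numpy as np

      def dmd(X, r=8):
          # Exact dynamic mode decomposition on a snapshot matrix X
          # (rows = spatial points, columns = successive times).
          X1, X2 = X[:, :-1], X[:, 1:]
          U, s, Vh = np.linalg.svd(X1, full_matrices=False)
          U, s, V = U[:, :r], s[:r], Vh[:r].conj().T
          A_tilde = U.conj().T @ X2 @ V / s        # operator in POD coordinates
          eigvals, W = np.linalg.eig(A_tilde)
          modes = X2 @ V / s @ W                   # spatial Koopman/DMD modes
          return eigvals, modes

      # Synthetic "ice field": a traveling wave over 500 grid cells, 60 months.
      rng = np.random.default_rng(4)
      x = np.linspace(0, np.pi, 500)[:, None]
      months = np.arange(60)[None, :]
      X = np.sin(2 * x - 0.5 * months) + 0.1 * rng.standard_normal((500, 60))

      eigvals, modes = dmd(X)
      idx = np.argsort(-np.abs(eigvals))
      print(np.angle(eigvals[idx[:2]]))   # dominant pair near +/-0.5 rad/snapshot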

  8. Highly efficient full-wave electromagnetic analysis of 3-D arbitrarily shaped waveguide microwave devices using an integral equation technique

    NASA Astrophysics Data System (ADS)

    Vidal, A.; San-Blas, A. A.; Quesada-Pereira, F. D.; Pérez-Soler, J.; Gil, J.; Vicente, C.; Gimeno, B.; Boria, V. E.

    2015-07-01

    A novel technique for the full-wave analysis of 3-D complex waveguide devices is presented. This new formulation, based on the Boundary Integral-Resonant Mode Expansion (BI-RME) method, allows the rigorous full-wave electromagnetic characterization of 3-D arbitrarily shaped metallic structures making use of extremely low CPU resources (both time and memory). The unknown electric current density on the surface of the metallic elements is represented by means of Rao-Wilton-Glisson basis functions, and an algebraic procedure based on a singular value decomposition is applied to transform such functions into the classical solenoidal and nonsolenoidal basis functions needed by the original BI-RME technique. The developed tool also provides an accurate computation of the electromagnetic fields at an arbitrary observation point of the considered device, so it can be used for predicting high-power breakdown phenomena. In order to validate the accuracy and efficiency of this novel approach, several new designs of band-pass waveguide filters are presented. The obtained results (S-parameters and electromagnetic fields) are successfully compared both to experimental data and to numerical simulations provided by a commercial software based on the finite element technique. The results obtained show that the new technique is especially suitable for the efficient full-wave analysis of complex waveguide devices considering an integrated coaxial excitation, where the coaxial probes may be in contact with the metallic insets of the component.

  9. A fluctuation-induced plasma transport diagnostic based upon fast-Fourier transform spectral analysis

    NASA Technical Reports Server (NTRS)

    Powers, E. J.; Kim, Y. C.; Hong, J. Y.; Roth, J. R.; Krawczonek, W. M.

    1978-01-01

    A diagnostic, based on fast-Fourier-transform spectral analysis techniques, that provides experimental insight into the relationship between the experimentally observable spectral characteristics of the fluctuations and the fluctuation-induced plasma transport is described. The model upon which the diagnostic technique is based and its experimental implementation are discussed. Some characteristic results obtained during the course of an experimental study of fluctuation-induced transport in the electric-field-dominated NASA Lewis bumpy torus plasma are presented.
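
    A sketch of the core of such a diagnostic: the in-phase part of the cross-spectrum between density and velocity fluctuations integrates to the fluctuation-induced flux <ñṽ>; the signals and numbers below are synthetic, not the bumpy torus data:

      import numpy as np
      from scipy.signal import csd

      fs = 1e5
      t = np.arange(0, 1.0, 1 / fs)
      rng = np.random.default_rng(5)

      # Hypothetical density and radial-velocity fluctuations sharing a coherent
      # 5 kHz component at a 30-degree phase lag, plus incoherent noise.
      n_f = np.sin(2 * np.pi * 5e3 * t) + 0.5 * rng.standard_normal(t.size)
      v_f = np.sin(2 * np.pi * 5e3 * t - np.pi / 6) + 0.5 * rng.standard_normal(t.size)

      # Only the in-phase (real) part of the cross-spectrum carries transport;
      # its integral over frequency recovers the time-averaged flux <n~ v~>.
      f, S_nv = csd(n_f, v_f, fs=fs, nperseg=4096)
      print(np.sum(S_nv.real) * (f[1] - f[0]))   # spectral estimate
      print(np.mean(n_f * v_f))                  # direct estimate, ~0.43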

  10. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    PubMed

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial, in which locally advanced breast cancer patients were randomised to either taxane- or anthracycline-based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane- versus anthracycline-based chemotherapies on progression-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  11. Feathering effect detection and artifact agglomeration index-based video deinterlacing technique

    NASA Astrophysics Data System (ADS)

    Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo

    2018-03-01

    Several video deinterlacing techniques have been developed, and each one performs better under certain conditions. Occasionally, even the most modern deinterlacing techniques create frames with worse quality than primitive deinterlacing processes. This paper shows that the final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy is able to select between two types of deinterlaced frames and, if necessary, make local corrections of the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified, and, if deemed appropriate, these areas are replaced by pixels generated through the "edge-based line average" method. Test results show that the proposed technique is able to produce video frames with higher quality than a single deinterlacing technique alone, taking what is good from intra- and interfield methods.
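
    A minimal sketch of the edge-based line average (ELA) interpolation used for the local corrections, choosing among three interpolation directions by luminance difference; the implementation details are assumptions, not the authors' code:

      import numpy as np

      def edge_based_line_average(field):
          # Double an interlaced field vertically, filling each missing line by
          # averaging along the direction (45 deg / vertical / 135 deg) with the
          # smallest luminance difference between the lines above and below.
          h, w = field.shape
          out = np.repeat(field.astype(float), 2, axis=0)
          for y in range(h - 1):
              above, below = out[2 * y], out[2 * y + 2]
              diffs, means = [], []
              for d in (-1, 0, 1):
                  a, b = np.roll(above, d), np.roll(below, -d)
                  diffs.append(np.abs(a - b))
                  means.append((a + b) / 2.0)
              best = np.argmin(np.stack(diffs), axis=0)
              out[2 * y + 1] = np.stack(means)[best, np.arange(w)]
          return out

      print(edge_based_line_average(np.eye(4)).shape)   # (8, 4) progressive frame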

  12. Thoracolumbar imbalance analysis for osteotomy planification using a new method: FBI technique.

    PubMed

    Le Huec, J C; Leijssen, P; Duarte, M; Aunoble, S

    2011-09-01

    Treatment of spine imbalance by posterior osteotomy is a valuable technique. Several surgical techniques have been developed and proposed to restore the vertebral column to a harmonious kyphosis in order to recreate correct sagittal alignment. Although the surgical techniques have proved adequate, preoperative planning is still mediocre. Multiple approaches have been proposed, from cutting tracing paper to ingenious mathematical formulas and computerised models. Analysis of the pelvic parameters, to try to recover the initial shape of the spine before the imbalance occurred, is very important to avoid mistakes during osteotomy planning. The authors propose their method for osteotomy planning, paying attention to the pelvic and spine parameters and in accordance with Roussouly's classification. The preoperative planning is based on a full-body X-ray including the spine from C1 to the femoral head and the first 10 cm of the femur shaft. Using all the balance parameters provided, a formula named FBI is proposed. Calculation of the osteotomy is basic goniometry: the midpoint of the C7 inferior plateau (point a) is transposed horizontally onto the projected future C7 plumb line crossing the posterior S1 plateau (point b) on a sagittal X-ray. These are the first two reference points. A third reference point is placed on the anterior wall of the vertebra selected for osteotomy, at mid-height of the pedicle (point c), mainly the L4 vertebra. These three points form a triangle whose tip is the third reference point; the angle at this tip is the theoretical angle of the osteotomy. Two more angles should be measured and eventually added: the femur angulation, measured as the inclination of the femoral axis to the vertical, and a third angle, named the compensatory pelvic tilt, to integrate the type of pelvis. If the pelvic tilt is between 15 and 25° or is higher than 25°, you must add 5 or 10°, respectively. This compensatory tilt is based on a
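
    The goniometry reduces to the angle at vertex c of the a-b-c triangle plus the two correction angles; a small sketch with invented sagittal coordinates:

      import numpy as np

      def angle_at(c, a, b):
          # Angle (degrees) at vertex c of the triangle a-b-c.
          u, v = np.subtract(a, c), np.subtract(b, c)
          cosang = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
          return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

      # Invented sagittal coordinates [mm]: a = C7 inferior plateau midpoint,
      # b = its horizontal transposition onto the planned C7 plumb line,
      # c = osteotomy site at the L4 pedicle mid-height.
      a, b, c = (95.0, 520.0), (18.0, 520.0), (35.0, 150.0)

      theta = angle_at(c, a, b)    # theoretical osteotomy angle
      theta += 7.0                 # + femur angulation (hypothetical value)
      theta += 5.0                 # + compensatory tilt for pelvic tilt 15-25 deg
      print(round(theta, 1))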

  13. A CHARTING TECHNIQUE FOR THE ANALYSIS OF BUSINESS SYSTEMS,

    DTIC Science & Technology

    This paper describes a charting technique useful in the analysis of business systems and in studies of the information economics of the firm. The...planning advanced systems. It is not restricted to any particular kind of business or information system. (Author)

  14. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, George

    1993-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  15. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, Stanislav

    1992-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  16. The association of placenta previa and assisted reproductive techniques: a meta-analysis.

    PubMed

    Karami, Manoochehr; Jenabi, Ensiyeh; Fereidooni, Bita

    2018-07-01

    Several epidemiological studies have determined that assisted reproductive techniques (ART) can increase the risk of placenta previa. To date, only one meta-analysis has been performed assessing the relationship between placenta previa and ART. This meta-analysis was conducted to estimate the association between placenta previa and ART in singleton and twin pregnancies. A literature search was performed in the major databases PubMed, Web of Science, and Scopus from the earliest possible year to April 2017. The heterogeneity across studies was explored by the Q-test and the I² statistic. The publication bias was assessed using Begg's and Egger's tests. The results were reported as odds ratio (OR) and relative risk (RR) estimates with their 95% confidence intervals (CI) using a random-effects model. The literature search yielded 1529 publications until September 2016 with 1,388,592 participants. The overall estimate of the OR was 2.67 (95% CI: 2.01, 3.34) and the RR was 3.62 (95% CI: 0.21, 7.03) based on singleton pregnancies. The overall estimate of the OR was 1.50 (95% CI: 1.26, 1.74) based on twin pregnancies. Based on the odds ratios reported in observational studies, we showed that ART procedures are a risk factor for placenta previa.

  17. A diagnostic analysis of the VVP single-doppler retrieval technique

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis J.

    1995-01-01

    A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
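
    The collinearity failure mode can be made concrete by conditioning a simplified VVP-like basis on a shrinking azimuthal sector; the three-column basis below is a stand-in for the full VVP parameter set:

      import numpy as np

      def vvp_condition(sector_deg, n=64):
          # Condition number of a simplified VVP-like basis (two mean-wind terms
          # and one product term) sampled over a limited azimuthal sector.
          az = np.radians(np.linspace(0.0, sector_deg, n))
          G = np.c_[np.sin(az), np.cos(az), np.sin(az) * np.cos(az)]
          return np.linalg.cond(G)

      for sector in (360, 180, 90, 30):
          print(sector, round(vvp_condition(sector), 1))
      # As the sector shrinks the columns become nearly collinear, the condition
      # number blows up, and the fitted parameters grow unstable and noise-prone.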

  18. Ultrasonic non invasive techniques for microbiological instrumentation

    NASA Astrophysics Data System (ADS)

    Elvira, L.; Sierra, C.; Galán, B.; Resa, P.

    2010-01-01

    Non-invasive techniques based on ultrasound have advantageous features for studying, characterizing and monitoring microbiological and enzymatic reactions. These processes may change the sound speed, viscosity or particle size distribution of the medium where they take place, which makes their analysis possible using ultrasonic techniques. In this work, two different ultrasound-based systems for the analysis of microbiological liquid media are presented. First, an industrial application based on an ultrasonic monitoring technique for the detection of microbiological growth in milk is shown. Such a system may improve quality control strategies in food production factories, being able to decrease the time required to detect possible contamination in packed products. Second, a study of the growth of Escherichia coli DH5α under different conditions is presented. It is shown that the use of ultrasonic non-invasive characterization techniques in combination with other conventional measurements, like optical density, provides complementary information about the metabolism of these bacteria.

  19. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar at different PRFs in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of the PRFs. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  20. [Analysis of color gamut of LCD system based on LED backlight with area-controlling technique].

    PubMed

    Li, Fu-Wen; Jin, Wei-Qi; Shao, Xi-Bin; Zhang, Li-Lei; Wan, Li-Fang

    2010-05-01

    Color gamut is a significant performance index for a display system, describing its ability to reproduce the colors of real scenes. Liquid crystal display (LCD) is the most popular flat panel display technology. However, the conventional cold cathode fluorescent lamp (CCFL) backlight of an LCD cannot achieve a color gamut as high as that of a cathode ray tube (CRT). The commonly used method of measuring the color gamut of an LCD system is introduced first. According to the inner structure and display principle of an LCD system, three major factors determine an LCD's color gamut: the spectral properties of the backlight, the transmittance properties of the color filters, and the performance of the liquid crystal panel. Instead of the conventional CCFL backlight, an RGB-LED backlight is used to improve the color reproduction of the LCD display system. Due to the imperfect match between the RGB-LEDs' spectra and the color filters' transmittance, the color filters reduce the color gamut of the LCD system to some extent. Therefore, an LCD system based on an LED backlight with an area-control technique is introduced, which modifies the backlight control signal according to the input signal. After analyzing and calculating the spectra of the LED backlight passing through the color filters using colorimetric methods, the areas of the color gamut triangles of LCD systems with and without area-controlled RGB-LED backlights are compared, and the relationship between color gamut and the varying contrast of the liquid crystal panel is analyzed. It is indicated that the LED backlight with area-control technique can avoid a drop in color saturation and has little effect on the contrast variation of the liquid crystal panel. In other words, the LED backlight with area-control technique relaxes the requirements on both color filter performance and the liquid crystal panel. Thus, it is of importance for improving the color gamut of current LCD systems with area-controlled LED backlights.

  1. Classification of damage in structural systems using time series analysis and supervised and unsupervised pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; de Lautour, Oliver R.

    2010-04-01

    Developed for studying long, periodic records of various measured quantities, time series analysis methods are inherently suited and offer interesting possibilities for Structural Health Monitoring (SHM) applications. However, their use in SHM can still be regarded as an emerging application and deserves more studies. In this research, Autoregressive (AR) models were used to fit experimental acceleration time histories from two experimental structural systems, a 3-storey bookshelf-type laboratory structure and the ASCE Phase II SHM Benchmark Structure, in healthy and several damaged states. The coefficients of the AR models were chosen as damage sensitive features. Preliminary visual inspection of the large, multidimensional sets of AR coefficients to check the presence of clusters corresponding to different damage severities was achieved using Sammon mapping - an efficient nonlinear data compression technique. Systematic classification of damage into states based on the analysis of the AR coefficients was achieved using two supervised classification techniques: Nearest Neighbor Classification (NNC) and Learning Vector Quantization (LVQ), and one unsupervised technique: Self-organizing Maps (SOM). This paper discusses the performance of AR coefficients as damage sensitive features and compares the efficiency of the three classification techniques using experimental data.
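
    To make the feature-extraction idea concrete, the following minimal sketch fits AR coefficients by least squares and feeds them to a nearest-neighbour classifier (scikit-learn's KNeighborsClassifier standing in here for NNC). The acceleration records are synthetic stand-ins, not the benchmark data:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsClassifier

        def ar_coefficients(x, order=10):
            """Fit an AR(order) model by least squares; return coefficients."""
            rows = [x[i:i + order][::-1] for i in range(len(x) - order)]
            X, y = np.array(rows), x[order:]
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coef

        rng = np.random.default_rng(0)

        def record(a1):
            """Synthetic 'acceleration history' from a stable AR(2) process."""
            x = np.zeros(2000)
            for t in range(2, 2000):
                x[t] = a1 * x[t - 1] - 0.5 * x[t - 2] + rng.normal()
            return x

        # Two hypothetical damage states differ slightly in their dynamics.
        feats = [ar_coefficients(record(a)) for a in [1.2] * 20 + [1.0] * 20]
        labels = [0] * 20 + [1] * 20
        X_tr, X_te, y_tr, y_te = train_test_split(feats, labels,
                                                  test_size=0.25, random_state=0)
        clf = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))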

  2. Data analysis techniques used at the Oak Ridge Y-12 plant flywheel evaluation laboratory

    NASA Astrophysics Data System (ADS)

    Steels, R. S., Jr.; Babelay, E. F., Jr.

    1980-07-01

    Some of the more advanced data analysis techniques applied to the problem of experimentally evaluating the performance of high performance composite flywheels are presented. Real-time applications include polar plots of runout with interruptions relating to balance and relative motions between parts, radial growth measurements, and temperature of the spinning part. The technique used to measure torque applied to a containment housing during flywheel failure is also presented. The discussion of pre- and post-test analysis techniques includes resonant frequency determination with modal analysis, waterfall charts, and runout signals at failure.

  3. A program to form a multidisciplinary data base and analysis for dynamic systems

    NASA Technical Reports Server (NTRS)

    Taylor, L. W.; Suit, W. T.; Mayo, M. H.

    1984-01-01

    Diverse sets of experimental data and analysis programs have been assembled for the purpose of facilitating research in systems identification, parameter estimation and state estimation techniques. The data base analysis programs are organized to make it easy to compare alternative approaches. Additional data and alternative forms of analysis will be included as they become available.

  4. Judging complex movement performances for excellence: a principal components analysis-based technique applied to competitive diving.

    PubMed

    Young, Cole; Reinkensmeyer, David J

    2014-08-01

    Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard half pipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis (PCA) has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring - gross body path, splash area, and board tip motion - to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization. Copyright © 2014 Elsevier B.V. All rights reserved.
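
    A much-simplified sketch of the eigenposture-plus-regression idea follows: PCA compresses pose data into a low-dimensional weight space, and a linear model maps those weights to judges' scores. The randomly generated dives and scores are placeholders, and this illustrates the pipeline's shape only, not the paper's exact eigendive construction:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        # Hypothetical data: 50 dives, each flattened from 200 frames of
        # 12 joint angles, plus made-up judges' scores.
        poses = rng.normal(size=(50, 200 * 12))
        scores = rng.uniform(4, 9, size=50)

        # "Eigenpostures": principal components of the pose data; their
        # weights give a low-dimensional description of each dive.
        pca = PCA(n_components=8)
        weights = pca.fit_transform(poses)

        # Predict judges' scores from the eigen-space representation.
        model = LinearRegression().fit(weights, scores)
        print("R^2 on training dives:", model.score(weights, scores))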

  5. The composite sequential clustering technique for analysis of multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
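
    The two-part structure can be sketched as follows: a crude sequential pass proposes initial centroids, which then seed a generalized K-means refinement. scikit-learn's KMeans is used for illustration, and the threshold and data are arbitrary placeholders:

        import numpy as np
        from sklearn.cluster import KMeans

        def sequential_clusters(X, threshold=2.0):
            """Crude sequential pass: start a new cluster whenever a sample
            is farther than `threshold` from every existing centroid."""
            centroids, counts = [X[0].copy()], [1]
            for x in X[1:]:
                d = [np.linalg.norm(x - c) for c in centroids]
                j = int(np.argmin(d))
                if d[j] < threshold:
                    counts[j] += 1
                    centroids[j] += (x - centroids[j]) / counts[j]  # running mean
                else:
                    centroids.append(x.copy())
                    counts.append(1)
            return np.array(centroids)

        rng = np.random.default_rng(2)
        X = np.vstack([rng.normal(m, 0.5, size=(100, 4)) for m in (0, 3, 6)])
        init = sequential_clusters(X)
        # Stage 2: K-means refinement seeded with the sequential clusters
        # (n_init=1 keeps the supplied seeds).
        km = KMeans(n_clusters=len(init), init=init, n_init=1).fit(X)
        print(len(init), "clusters; inertia:", round(km.inertia_, 1))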

  6. Development of Styrene-Grafted Polyurethane by Radiation-Based Techniques

    PubMed Central

    Jeong, Jin-Oh; Park, Jong-Seok; Lim, Youn-Mook

    2016-01-01

    Polyurethane (PU) is the fifth most common polymer in the general consumer market, following Polypropylene (PP), Polyethylene (PE), Polyvinyl chloride (PVC), and Polystyrene (PS), and the most common polymer for thermosetting resins. In particular, polyurethane has excellent hardness and heat resistance, is a widely used material for electronic products and automotive parts, and can be used to create products of various physical properties, including rigid and flexible foams, films, and fibers. However, the use of the polar polymer polyurethane as an impact modifier for non-polar polymers is limited due to poor combustion resistance and impact resistance. In this study, we used gamma irradiation at 25 and 50 kGy to graft styrene, a hydrophobic monomer, onto polyurethane as an impact modifier for non-polar polymers. To verify the grafted styrene, we confirmed the phenyl group of styrene at 690 cm−1 by Attenuated Total Reflection Fourier Transform Infrared Spectroscopy (ATR-FTIR) and at 6.4–6.8 ppm by 1H-Nuclear Magnetic Resonance (1H-NMR). Scanning Electron Microscopy (SEM), X-ray Photoelectron Spectroscopy (XPS), Thermogravimetric Analysis (TGA) and contact angle analysis were also used to confirm the styrene introduction. This study has confirmed the possibility of applying high-functional composites through radiation-based techniques. PMID:28773561

  7. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  8. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
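
    One simple construction of a noise-based analytical threshold, in the spirit described above though not necessarily the paper's exact definition, is the mean background count plus a multiple of its standard deviation; the counts below are hypothetical:

        import numpy as np

        # Hypothetical read counts at positions where no true allele is
        # expected; these represent instrument/background noise.
        noise_counts = np.array([3, 5, 2, 7, 4, 6, 3, 5, 8, 4])

        # Mean noise plus k standard deviations (k = 3 here); calls below
        # this analytical threshold are treated as indistinguishable
        # from noise.
        k = 3
        threshold = noise_counts.mean() + k * noise_counts.std(ddof=1)
        print("analytical threshold: %.1f reads" % threshold)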

  9. Measurements of strain at plate boundaries using space based geodetic techniques

    NASA Technical Reports Server (NTRS)

    Robaudo, Stefano; Harrison, Christopher G. A.

    1993-01-01

    We have used the space-based geodetic techniques of Satellite Laser Ranging (SLR) and VLBI to study strain along subduction and transform plate boundaries, and have interpreted the results using a simple elastic dislocation model. Six stations located behind island arcs were analyzed as representative of subduction zones, while 13 sites located on either side of the San Andreas fault were used for the transcurrent zones. The deformation length scale was then calculated for both tectonic margins by fitting the relative strain to an exponentially decreasing function of distance from the plate boundary. Results show that space-based data for the transcurrent boundary along the San Andreas fault help to better define the deformation length scale in the area, while fitting the elastic half-space earth model nicely. For subduction-type boundaries, the analysis indicates that there is no single length scale which uniquely describes the deformation. This is mainly due to differences in subduction characteristics between the different areas.
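
    The exponential-decay fit described above can be sketched in a few lines; the strain-versus-distance values below are placeholders rather than SLR/VLBI results:

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical relative strain vs. distance from the plate
        # boundary (km).
        dist = np.array([10., 30., 60., 100., 150., 250.])
        strain = np.array([0.95, 0.80, 0.55, 0.35, 0.20, 0.08])

        # Exponentially decreasing function of distance from the boundary
        def model(d, L):
            return np.exp(-d / L)

        (L_fit,), _ = curve_fit(model, dist, strain, p0=[100.0])
        print("deformation length scale: %.0f km" % L_fit)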

  10. Review of Fluorescence-Based Velocimetry Techniques to Study High-Speed Compressible Flows

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Johansen, Craig; Inman, Jennifer A.; Jones, Stephen B.; Danehy, Paul M.

    2013-01-01

    This paper reviews five laser-induced fluorescence-based velocimetry techniques that have been used to study high-speed compressible flows at NASA Langley Research Center. The techniques discussed in this paper include nitric oxide (NO) molecular tagging velocimetry (MTV), nitrogen dioxide photodissociation (NO2-to-NO) MTV, and NO and atomic oxygen (O-atom) Doppler-shift-based velocimetry. Measurements of both single-component and two-component velocity have been performed using these techniques. This paper details the specific application and experiment for which each technique has been used, the facility in which the experiment was performed, the experimental setup, sample results, and a discussion of the lessons learned from each experiment.

  11. Hybrid 3D reconstruction and image-based rendering techniques for reality modeling

    NASA Astrophysics Data System (ADS)

    Sequeira, Vitor; Wolfart, Erik; Bovisio, Emanuele; Biotti, Ester; Goncalves, Joao G. M.

    2000-12-01

    This paper presents a component approach that seamlessly combines the strong features of laser range acquisition with the visual quality of purely photographic approaches. The relevant components of the system are: (i) Panoramic images for distant background scenery where parallax is insignificant; (ii) Photogrammetry for background buildings; and (iii) Highly detailed laser-based models for the primary environment, the structure of building exteriors and room interiors. These techniques have a wide range of applications in visualization, virtual reality, cost-effective as-built analysis of architectural and industrial environments, building facilities management, real estate, e-commerce, remote inspection of hazardous environments, TV production and many others.

  12. Evaluation of analysis techniques for low frequency interior noise and vibration of commercial aircraft

    NASA Technical Reports Server (NTRS)

    Landmann, A. E.; Tillema, H. F.; Marshall, S. E.

    1989-01-01

    The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation: finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (the computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used for comparison with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to warrant caution and to lead to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to the complexity of the aircraft structure and low modal densities.

  13. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  14. A Practical Cryogen-Free CO2 Purification and Freezing Technique for Stable Isotope Analysis.

    PubMed

    Sakai, Saburo; Matsuda, Shinichi

    2017-04-18

    Since isotopic analysis by mass spectrometry began in the early 1900s, sample gas for light-element isotopic measurements has been purified by the use of cryogens and vacuum-line systems. However, this conventional purification technique can achieve only certain temperatures that depend on the cryogens, and can be sustained only as long as there is a continuous cryogen supply. Here, we demonstrate a practical cryogen-free CO2 purification technique using an electrically operated cryocooler for stable isotope analysis. This approach is based on portable free-piston Stirling cooling technology and controls the temperature to an accuracy of 0.1 °C over a range from room temperature to -196 °C (liquid-nitrogen temperature). The lowest temperature can be reached in as little as 10 min. We successfully purified CO2 gas generated by the reaction of carbonates with phosphoric acid and found its sublimation point to be -155.6 °C at 0.1 Torr in the vacuum line. This means that the temperature required for CO2 trapping is much higher than the liquid-nitrogen temperature. Our portable cooling system frees stable isotope analysis from the inconvenience of cryogen use. It also offers a new cooling method applicable to a number of fields that use gas measurements.

  15. Wavelet-based techniques for the gamma-ray sky

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias

    2016-07-01

    Here, we demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
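
    A minimal sketch of scale separation by wavelet decomposition, assuming the PyWavelets package and a synthetic counts map in place of real gamma-ray data:

        import numpy as np
        import pywt  # PyWavelets, assumed installed

        rng = np.random.default_rng(3)
        # Stand-in for a gamma-ray counts map: a broad gradient background
        # plus one compact extended feature.
        y, x = np.mgrid[0:128, 0:128]
        sky = 10 + 0.05 * y + 5 * np.exp(-((x - 64)**2 + (y - 64)**2) / 20)
        sky = rng.poisson(sky).astype(float)

        # Multi-level 2-D wavelet decomposition separates emission on
        # different angular scales.
        coeffs = pywt.wavedec2(sky, "bior1.3", level=4)

        # Zero the approximation and the two coarsest detail levels,
        # keeping the two finest: a "small-scale" emission component.
        filtered = [np.zeros_like(coeffs[0])] + [
            tuple(np.zeros_like(d) for d in detail) if i < 2 else detail
            for i, detail in enumerate(coeffs[1:])
        ]
        small_scale = pywt.waverec2(filtered, "bior1.3")
        print(small_scale.shape)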

  16. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  17. Problem based learning with scaffolding technique on geometry

    NASA Astrophysics Data System (ADS)

    Bayuningsih, A. S.; Usodo, B.; Subanti, S.

    2018-05-01

    Geometry, as one of the branches of mathematics, has an important role in the study of mathematics. This research aims to explore the effectiveness of Problem Based Learning (PBL) with a scaffolding technique, viewed from self-regulated learning, on students' mathematics learning achievement. The research data were obtained through a mathematics learning achievement test and a self-regulated learning (SRL) questionnaire. This research employed a quasi-experimental design. The subjects of this research are students of a junior high school in Banyumas, Central Java. The results showed that the problem-based learning model with scaffolding technique is more effective in generating students' mathematics learning achievement than direct learning (DL). This is because in the PBL model students are more able to think actively and creatively. Students in the high SRL category have better mathematics learning achievement than those in the middle and low SRL categories, and the middle SRL category performs better than the low SRL category. Thus, there are interactions between the learning model and self-regulated learning in increasing mathematics learning achievement.

  18. Recent advances in capillary electrophoretic migration techniques for pharmaceutical analysis.

    PubMed

    Deeb, Sami El; Wätzig, Hermann; El-Hady, Deia Abd; Albishri, Hassan M; de Griend, Cari Sänger-van; Scriba, Gerhard K E

    2014-01-01

    Since their introduction about 30 years ago, CE techniques have gained a significant impact in pharmaceutical analysis. The present review covers recent advances and applications of CE for the analysis of pharmaceuticals. Both small molecules and biomolecules such as proteins are considered. The applications range from the determination of drug-related substances to the analysis of counterions and the determination of physicochemical parameters. Furthermore, general considerations of CE methods in pharmaceutical analysis are described. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Noble-TLBO MPPT Technique and its Comparative Analysis with Conventional methods implemented on Solar Photo Voltaic System

    NASA Astrophysics Data System (ADS)

    Patsariya, Ajay; Rai, Shiwani; Kumar, Yogendra, Dr.; Kirar, Mukesh, Dr.

    2017-08-01

    The energy crisis, particularly in countries with developing GDPs, has opened up a new panorama of sustainable power sources such as solar energy, which has experienced huge growth. Increasingly high penetration levels of photovoltaic (PV) generation are emerging in the smart grid. Solar power is intermittent and variable, as the solar resource at ground level is highly dependent on cloud cover variability, atmospheric aerosol levels, and other weather parameters. The inherent variability of large-scale solar generation introduces significant challenges for smart grid energy management, and accurate forecasting of solar power/irradiance is essential to secure the economic operation of the smart grid. In this paper, a novel TLBO-MPPT technique is proposed to address the variability of solar energy. A comparative analysis is presented between the conventional perturb-and-observe (PO) and incremental conductance (IC) methods and the proposed MPPT technique. The research was carried out in Matlab Simulink software, version 2013.
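
    For reference, the conventional perturb-and-observe (PO) baseline against which such MPPT methods are compared can be sketched in a few lines. The PV power curve here is a hypothetical placeholder, and this is the baseline algorithm, not the proposed TLBO method:

        def perturb_and_observe(measure, v0=30.0, dv=0.5, steps=50):
            """Conventional P&O MPPT: perturb the operating voltage and
            keep moving in the direction that increases power."""
            v, p_prev, direction = v0, 0.0, 1
            for _ in range(steps):
                p = measure(v)              # measured PV power at voltage v
                if p < p_prev:
                    direction = -direction  # power dropped: reverse the step
                p_prev = p
                v += direction * dv
            return v

        # Hypothetical PV power curve with a maximum near 35 V (placeholder
        # for a real panel/irradiance model).
        pv_power = lambda v: max(0.0, 150 - 0.4 * (v - 35.0) ** 2)

        print("operating point: %.1f V" % perturb_and_observe(pv_power))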

  20. Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.

    PubMed

    Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko

    2017-11-01

    A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and the sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within these granulations, the practical applicability of two particle size analysis techniques, sieve analysis (SA) and a fast imaging technique (Flashsizer, FS), was also tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules. In addition, the GARC granules had better flowability; for example, the tablet weight variation of GARC granules was close to 2%, indicating good flow and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions, while FS showed narrower and mono-modal distributions. However, both techniques gave good estimates of mean granule size. Overall, SA was a time-consuming but accurate technique that provided reliable information on the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution but nevertheless yielded acceptable estimates of mean particle size. In general, FS was two to three orders of magnitude faster than SA.

  1. DCT-based cyber defense techniques

    NASA Astrophysics Data System (ADS)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect multimedia content from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content in order to attack the end user. Most attack algorithms are robust to basic image processing techniques such as filtering, compression, and noise addition. Hence, in this article two novel, real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.

  2. A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    2016-07-18

    This paper describes a framework for incorporating smart sampling techniques into a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps to reflect the impact of uncertainties, caused by variable generation and load, on potential violations of transmission limits.

  3. Early Detection of Severe Apnoea through Voice Analysis and Automatic Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández, Ruben; Blanco, Jose Luis; Díaz, David; Hernández, Luis A.; López, Eduardo; Alcázar, José

    This study is part of an on-going collaborative effort between the medical and the signal processing communities to promote research on applying voice analysis and Automatic Speaker Recognition techniques (ASR) for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based diagnosis could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we present and discuss the possibilities of using generative Gaussian Mixture Models (GMMs), generally used in ASR systems, to model distinctive apnoea voice characteristics (i.e. abnormal nasalization). Finally, we present experimental findings regarding the discriminative power of speaker recognition techniques applied to severe apnoea detection. We have achieved an 81.25% correct classification rate, which is very promising and underpins the interest in this line of inquiry.
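
    The GMM-based decision rule underlying such systems can be sketched as follows, with randomly generated vectors standing in for real acoustic features (scikit-learn's GaussianMixture assumed):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        # Hypothetical per-frame feature vectors (e.g., cepstral features)
        # for control and apnoea speakers; real features would come from
        # a speech front-end.
        healthy = rng.normal(0.0, 1.0, size=(300, 12))
        apnoea = rng.normal(0.8, 1.2, size=(300, 12))

        # One GMM per class, as in GMM-based speaker recognition.
        gmm_h = GaussianMixture(n_components=4, random_state=0).fit(healthy)
        gmm_a = GaussianMixture(n_components=4, random_state=0).fit(apnoea)

        # Classify a recording by comparing average log-likelihoods.
        test = rng.normal(0.8, 1.2, size=(50, 12))  # frames of one speaker
        label = "apnoea" if gmm_a.score(test) > gmm_h.score(test) else "healthy"
        print("decision:", label)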

  4. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that has the ability to detect transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for large enough time windows to ensure good statistical quality of the recurrence complexity measures needed to detect the transitions.
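
    The basic recurrence computation behind RQA can be sketched briefly; the trajectory windows below are random placeholders, and a full transition test would add bootstrap resampling of such measures:

        import numpy as np

        def recurrence_rate(X, eps):
            """Fraction of state-pair distances below eps: the simplest
            recurrence quantification analysis (RQA) complexity measure."""
            d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
            return (d < eps).mean()

        rng = np.random.default_rng(5)
        # Hypothetical per-frame coordinates from two trajectory windows;
        # a genuine transition would show as a statistically significant
        # change in RQA measures between windows.
        window_a = rng.normal(0, 1, size=(200, 3))
        window_b = rng.normal(0, 2, size=(200, 3))
        print(recurrence_rate(window_a, 1.0), recurrence_rate(window_b, 1.0))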

  5. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.

  6. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    NASA Astrophysics Data System (ADS)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, Efa Maydhona; Musa, Ardiansyah

    2018-03-01

    To support the growth of wireless geolocation development as a key technology for the future, this paper proposes a theoretical bound derivation, i.e., the Cramer-Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound derivation is crucially important for evaluating whether the energy-efficient RSS-based factor graph wireless geolocation technique is effective, as well as for opening the opportunity for further innovation on the technique. The CRLB is derived in this paper using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation results show that the derived CRLB has the highest accuracy as a bound, exhibiting the lowest root mean squared error (RMSE) curve compared to the RMSE curve of the RSS-based factor graph geolocation technique. Hence, the derived CRLB becomes the lower bound for the energy-efficient technique of RSS-based factor graph wireless geolocation.
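
    For orientation, the standard CRLB construction for RSS-based positioning under a log-distance path-loss model takes the following form (a generic textbook formulation, not necessarily the paper's exact derivation):

        P_i = P_0 - 10\eta\,\log_{10}\!\frac{d_i}{d_0} + n_i, \qquad
        n_i \sim \mathcal{N}(0,\sigma^2), \qquad
        d_i = \lVert \mathbf{x} - \mathbf{x}_i \rVert,

        \mathbf{J}(\mathbf{x}) = \Bigl(\frac{10\,\eta}{\sigma \ln 10}\Bigr)^{2}
        \sum_{i=1}^{N} \frac{(\mathbf{x}-\mathbf{x}_i)(\mathbf{x}-\mathbf{x}_i)^{\mathsf{T}}}{d_i^{4}},
        \qquad \mathrm{RMSE} \ge \sqrt{\operatorname{tr}\,\mathbf{J}^{-1}(\mathbf{x})}.

    Here \mathbf{x} is the unknown position, \mathbf{x}_i are the anchor positions, \eta is the path-loss exponent, and \sigma is the shadowing standard deviation; the Jacobian of the mean RSS with respect to \mathbf{x} is what generates the FIM.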

  7. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with restrictive access, in one case submerged. Visual-based inspections at stations are typically qualitative in nature; for example, a video system may be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure the ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water of the primary heat transport system flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods used to perform them. The first inspection procedure is a method to remotely measure the gap between an FC and other in-core horizontal components. The technique involves vertically delivering a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components, with compensation for image perspective and viewing elevation. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is

  8. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    NASA Astrophysics Data System (ADS)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    Electrocardiogram (ECG) signal is a nonlinear, non-stationary weak signal which reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal becomes a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart diseases. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the ensemble EMD (EEMD) technique. The EMD technique is a promising and prospective, but not yet perfect, method for processing nonlinear and non-stationary signals like the ECG. Combining EMD with other algorithms is a good solution for improving the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in ECG signal denoising based on the EMD technique are clarified.
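
    In the simplest EMD denoising scheme reviewed above, the signal is decomposed into intrinsic mode functions (IMFs) and the noisiest (highest-frequency) IMFs are discarded before reconstruction. A sketch assuming the PyEMD ("EMD-signal") package and a synthetic trace in place of a real ECG:

        import numpy as np
        from PyEMD import EMD  # PyEMD ("EMD-signal") package, assumed installed

        rng = np.random.default_rng(6)
        t = np.linspace(0, 1, 1000)
        # Stand-in for an ECG trace: low-frequency content plus noise.
        clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
        noisy = clean + 0.4 * rng.normal(size=t.size)

        # Decompose into IMFs; the first IMFs carry the highest-frequency
        # content, where most of the noise lives.
        imfs = EMD().emd(noisy, t)

        # Simplest EMD denoising: drop the first IMF, sum the rest.
        denoised = imfs[1:].sum(axis=0)
        print(imfs.shape[0], "IMFs; residual RMS:",
              round(float(np.sqrt(np.mean((denoised - clean) ** 2))), 3))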

  9. Effective gene prediction by high resolution frequency estimator based on least-norm solution technique

    PubMed Central

    2014-01-01

    The linear algebraic concept of subspace plays a significant role in recent techniques of spectrum estimation. In this article, the authors have utilized the noise subspace concept for finding hidden periodicities in DNA sequences. With the vast growth of genomic sequences, the demand to accurately identify the protein-coding regions in DNA is rising. Several DNA feature extraction techniques involving various cross-disciplinary fields have come up in the recent past, among which the application of digital signal processing tools is of prime importance. It is known that coding segments have a 3-base periodicity, while non-coding regions do not have this unique feature. One of the most important spectrum analysis techniques based on the concept of subspace is the least-norm method. The least-norm estimator developed in this paper shows sharp period-3 peaks in coding regions, completely eliminating background noise. A comparison of the proposed method with the existing sliding discrete Fourier transform (SDFT) method, popularly known as the modified periodogram method, has been drawn on several genes from various organisms, and the results show that the proposed method provides a better and more effective approach to gene prediction. Resolution, quality factor, sensitivity, specificity, miss rate, and wrong rate are used to establish the superiority of the least-norm gene prediction method over the existing method. PMID:24386895
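
    The underlying period-3 idea is easy to demonstrate with a plain windowed DFT (the baseline that the least-norm method improves upon, not the least-norm estimator itself); the sequence below is a synthetic placeholder:

        import numpy as np

        def period3_power(seq, window=351, step=30):
            """Slide a window along the DNA string; at each position sum
            the DFT power of the four base-indicator signals at k = N/3."""
            scores, centers = [], []
            for start in range(0, len(seq) - window, step):
                w = seq[start:start + window]
                k = window // 3  # period-3 frequency bin (window divisible by 3)
                p = 0.0
                for b in "ACGT":
                    u = np.array([1.0 if c == b else 0.0 for c in w])
                    p += abs(np.fft.fft(u)[k]) ** 2
                scores.append(p)
                centers.append(start + window // 2)
            return centers, scores

        rng = np.random.default_rng(7)
        # Hypothetical sequence: random "non-coding" flanks around a
        # 3-base-periodic repeat standing in for an exon.
        seq = ("".join(rng.choice(list("ACGT"), 600)) + "ATG" * 300
               + "".join(rng.choice(list("ACGT"), 600)))
        centers, scores = period3_power(seq)
        print("peak near position:", centers[int(np.argmax(scores))])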

  10. A study of data analysis techniques for the multi-needle Langmuir probe

    NASA Astrophysics Data System (ADS)

    Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.

    2018-06-01

    In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Even though the linear fitting technique compares favorably with measurements from incoherent scatter radar and other in situ instruments, m-NLP probes can be made longer and can be cleaned during operation to improve instrument performance. The non-linear least squares fitting technique would be more reliable provided that a higher number of probes are deployed.

  11. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  12. Algal Biomass Analysis by Laser-Based Analytical Techniques—A Review

    PubMed Central

    Pořízka, Pavel; Prochazková, Petra; Prochazka, David; Sládková, Lucia; Novotný, Jan; Petrilak, Michal; Brada, Michal; Samek, Ota; Pilát, Zdeněk; Zemánek, Pavel; Adam, Vojtěch; Kizek, René; Novotný, Karel; Kaiser, Jozef

    2014-01-01

    Algal biomass that is represented mainly by commercially grown algal strains has recently found many potential applications in various fields of interest. Its utilization has been found advantageous in the fields of bioremediation, biofuel production and the food industry. This paper reviews recent developments in the analysis of algal biomass with the main focus on the Laser-Induced Breakdown Spectroscopy, Raman spectroscopy, and partly Laser-Ablation Inductively Coupled Plasma techniques. The advantages of the selected laser-based analytical techniques are revealed and their fields of use are discussed in detail. PMID:25251409

  13. Web image retrieval using an effective topic and content-based technique

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Cheng; Prabhakara, Rashmi

    2005-03-01

    There has been exponential growth in the amount of image data available on the World Wide Web since the early development of the Internet. With such a large amount of information and imagery available, and given its usefulness, an effective image retrieval system is greatly needed. In this paper, we present an effective approach with both image matching and indexing techniques that improves on existing integrated image retrieval methods. This technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. In the first phase, topic-based image retrieval is performed by using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. This technique consists of a focused crawler that allows the user to enter not only the keyword for the topic-based search but also the scope in which the user wants to find the images. In the second phase, we use query-by-example specification to perform a low-level content-based image match in order to retrieve a smaller set of results relatively closer to the example image. For this, information related to image features is automatically extracted from the query image. The main objective of our approach is to develop a functional image search and indexing technique and to demonstrate that better retrieval results can be achieved.

  14. A Survey of Shape Parameterization Techniques

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.

  15. Regional cardiac function analysis from tagged MRI images. Comparison of techniques: Harmonic-Phase (HARP) versus Sinusoidal-Modeling (SinMod) analysis.

    PubMed

    Ibrahim, El-Sayed H; Stojanovska, Jadranka; Hassanein, Azza; Duvernoy, Claire; Croisille, Pierre; Pop-Busui, Rodica; Swanson, Scott D

    2018-05-16

    Cardiac MRI tagging is a valuable technique for evaluating regional heart function. Currently, there are a number of different techniques for analyzing the tagged images. Specifically, k-space-based analysis techniques have been shown to be much faster than image-based techniques, with harmonic-phase (HARP) and sine-wave modeling (SinMod) being two well-known techniques of the former group, which are frequently used in clinical studies. In this study, we compared HARP and SinMod and studied inter-observer variability between the two techniques for evaluating myocardial strain and apical-to-base torsion in a numerical phantom, nine healthy controls, and thirty diabetic patients. Based on the ground-truth numerical phantom measurements (strain = -20% and rotation angle = -4.4°), HARP and SinMod resulted in overestimation (in absolute value terms) of strain by 1% and 5% (strain values), and of rotation angle by 0.4° and 2.0°, respectively. For the in-vivo results, global strain and torsion ranges were -10.6 to -35.3% and 1.8-12.7°/cm in patients, and -17.8 to -32.7% and 1.8-12.3°/cm in volunteers. On average, SinMod overestimated strain measurements by 5.7% and 5.9% (strain values) in the patients and volunteers, respectively, compared to HARP, and overestimated torsion measurements by 2.9°/cm and 2.5°/cm in the patients and volunteers, respectively. Location-wise, the ranges for basal, mid-ventricular, and apical strain in patients (volunteers) were -8.4 to -31.5% (-11.6 to -33.3%), -6.3 to -37.2% (-17.8 to -33.3%), and -5.2 to -38.4% (-20.0 to -33.2%), respectively. SinMod overestimated strain in the basal, mid-ventricular, and apical slices by 4.7% (5.7%), 5.9% (5.5%), and 8.9% (6.8%), respectively, compared to HARP in the patients (volunteers). Nevertheless, there was good correlation between the HARP and SinMod measurements. Finally, there were no significant strain or torsion measurement differences between patients and volunteers.

  16. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  17. Measured extent of agricultural expansion depends on analysis technique

    DOE PAGES

    Dunn, Jennifer B.; Merz, Dylan; Copenhaver, Ken L.; ...

    2017-01-31

    Concern is rising that ecologically important, carbon-rich natural lands in the United States are losing ground to agriculture. We investigate how quantitative assessments of historical land use change to address this concern differ in their conclusions depending on the data set used. We examined land use change between 2006 and 2014 in 20 counties in the Prairie Pothole Region using the Cropland Data Layer, a modified Cropland Data Layer, data from the National Agricultural Imagery Program, and in-person ground-truthing. The Cropland Data Layer analyses overwhelmingly returned the largest amount of land use change, with associated error that limits drawing conclusions from it. Analysis with visual imagery estimated a fraction of this land use change. Clearly, analysis technique drives understanding of the measured extent of land use change; different techniques produce vastly different results that would inform land management policy in strikingly different ways. As a result, best practice guidelines are needed.

  18. Measured extent of agricultural expansion depends on analysis technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, Jennifer B.; Merz, Dylan; Copenhaver, Ken L.

    Concern is rising that ecologically important, carbon-rich natural lands in the United States are losing ground to agriculture. We investigate how quantitative assessments of historical land use change to address this concern differ in their conclusions depending on the data set used. We examined land use change between 2006 and 2014 in 20 counties in the Prairie Pothole Region using the Cropland Data Layer, a modified Cropland Data Layer, data from the National Agricultural Imagery Program, and in-person ground-truthing. The Cropland Data Layer analyses overwhelmingly returned the largest amount of land use change, with associated error that limits drawing conclusions from it. Analysis with visual imagery estimated a fraction of this land use change. Clearly, analysis technique drives understanding of the measured extent of land use change; different techniques produce vastly different results that would inform land management policy in strikingly different ways. As a result, best practice guidelines are needed.

  19. Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse

    NASA Astrophysics Data System (ADS)

    Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa

    1994-01-01

    Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared spectroscopy (GC/FT-IR) can be successfully employed in urinalysis for drugs of abuse, with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence that infrared analysis can offer has been helpful in resolving ambiguous results, particularly in the case of amphetamines, where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples is in question. Functional group mapping can further assist the analysis and track drug use versus time.

  20. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss, with each leg monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be tracked. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper; it provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
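
    The statistical budgeting idea can be sketched as a Monte Carlo sum of per-mechanism attenuations in dB; all numbers below are placeholders, not SIM design values:

        import numpy as np

        rng = np.random.default_rng(8)
        # Hypothetical loss mechanisms along one metrology beam path, each
        # modelled as an attenuation in dB with its own uncertainty.
        means_db = np.array([0.5, 1.2, 0.3, 2.0, 0.8])   # nominal losses
        sigmas_db = np.array([0.1, 0.4, 0.1, 0.6, 0.2])  # degradation spread

        # Monte Carlo: total attenuation is the sum of the per-mechanism
        # dB losses in each random draw.
        totals = rng.normal(means_db, sigmas_db, size=(100000, 5)).sum(axis=1)

        source_dbm, required_dbm = 0.0, -5.5  # assumed source power and need
        delivered = source_dbm - totals
        confidence = (delivered > required_dbm).mean()
        print("P(sufficient optical power) = %.3f" % confidence)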

  1. An Analysis of Losses to the Southern Commercial Timberland Base

    Treesearch

    Ian A. Munn; David Cleaves

    1998-01-01

    Demographic and physical factors influencing the conversion of commercial timberland in the South to non-forestry uses between the last two Forest Inventory Analysis (FIA) surveys were investigated. GIS techniques linked Census data and FIA plot-level data. Multinomial logit regression identified factors associated with losses to the timberland base. Conversion to...

  2. Peptidomics: the integrated approach of MS, hyphenated techniques and bioinformatics for neuropeptide analysis.

    PubMed

    Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane

    2008-02-01

    MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims to identify all the peptides present in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.

  3. ELICIT: An alternative imprecise weight elicitation technique for use in multi-criteria decision analysis for healthcare.

    PubMed

    Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard

    2016-01-01

    In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis for healthcare. The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using principal component analysis; and the estimation of criteria weights and their descriptive statistics using variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. The criteria were ranked from 1 to 5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated, as well as the standard deviation and 95% credibility interval. ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights.
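
    The Monte Carlo step can be sketched in Python: given only a strict rank order, sample weight vectors uniformly from the simplex, enforce the ordering by sorting, and summarize each rank's weight distribution. This reproduces only the rank-constrained sampling idea; ELICIT's PCA-based ranking and variable interdependent analysis are not shown, and all numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)
        n_criteria, n_draws = 5, 50_000

        # Uniform samples on the weight simplex, sorted in descending order so
        # w1 >= w2 >= ... >= w5 respects the strict rank order given by the DMs.
        w = rng.dirichlet(np.ones(n_criteria), size=n_draws)
        w = -np.sort(-w, axis=1)

        mean, sd = w.mean(axis=0), w.std(axis=0)
        ci = np.percentile(w, [2.5, 97.5], axis=0)
        for i in range(n_criteria):
            print(f"criterion ranked {i + 1}: weight {mean[i]:.3f} "
                  f"(sd {sd[i]:.3f}, 95% interval {ci[0, i]:.3f}-{ci[1, i]:.3f})")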

  4. Predicting Effective Course Conduction Strategy Using Datamining Techniques

    ERIC Educational Resources Information Center

    Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.

    2017-01-01

    Data analysis techniques can be used to analyze the pattern of data in different fields. Based on the results of such analyses, suggestions can be provided to decision-making authorities. The data mining techniques can be used in the educational domain to improve the outcomes of the educational sectors. The authors carried out this research…

  5. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be <4.3% and <2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  6. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    PubMed

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be <4.3% and <2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Self-Alignment MEMS IMU Method Based on the Rotation Modulation Technique on a Swing Base

    PubMed Central

    Chen, Zhiyong; Yang, Haotian; Wang, Chengbin; Lin, Zhihui; Guo, Meifeng

    2018-01-01

    The micro-electro-mechanical-system (MEMS) inertial measurement unit (IMU) has been widely used in the field of inertial navigation due to its small size, low cost, and light weight, but aligning MEMS IMUs remains a challenge for researchers. MEMS IMUs have conventionally been aligned on a static base, requiring other sensors, such as magnetometers or satellites, to provide auxiliary information, which limits their application range to some extent. Therefore, improving the alignment accuracy of the MEMS IMU as much as possible under swing conditions is of considerable value. This paper proposes an alignment method based on the rotation modulation technique (RMT), which is completely self-aligned, unlike the existing alignment techniques. The effect of the inertial sensor errors is mitigated by rotating the IMU. Then, inertial frame-based alignment using the rotation modulation technique (RMT-IFBA) achieved coarse alignment on the swing base. The strong tracking filter (STF) further improved the alignment accuracy. The performance of the proposed method was validated with a physical experiment, and the results of the alignment showed that the standard deviations of pitch, roll, and heading angle were 0.0140°, 0.0097°, and 0.91°, respectively, which verified the practicality and efficacy of the proposed method for the self-alignment of the MEMS IMU on a swing base. PMID:29649150

  8. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
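
    The DWT embedding step can be illustrated with a short Python sketch using the PyWavelets library. This shows only additive embedding in the level-3 approximation band; the paper's full scheme additionally uses image interlacing and Arnold-transform encryption, and the embedding strength alpha below is an assumption.

        import numpy as np
        import pywt  # PyWavelets

        def embed_watermark(frame, watermark, alpha=0.05):
            """Additively embed a watermark in the level-3 DWT approximation band."""
            coeffs = pywt.wavedec2(frame.astype(float), "haar", level=3)
            approx = coeffs[0]
            wm = np.resize(watermark.astype(float), approx.shape)
            coeffs[0] = approx + alpha * wm  # embedding strength alpha is assumed
            return pywt.waverec2(coeffs, "haar")

        frame = np.random.rand(256, 256)            # stand-in for one video frame
        watermark = (np.random.rand(32, 32) > 0.5)  # binary watermark image
        marked = embed_watermark(frame, watermark)
        print(np.abs(marked - frame).max())         # small, bounded distortion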

  9. Proteomic data analysis of glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke

    2016-05-01

    Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells, may be refractory to radiation and chemotherapy, and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the application of novel approaches for visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information is insufficient for classifying GSCs and paving the way to improved therapeutics for the heterogeneous glioma.

  10. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying users' hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using our simulation algorithm, also described in this paper. PMID:19796403

  11. Nano-Al Based Energetics: Rapid Heating Studies and a New Preparation Technique

    NASA Astrophysics Data System (ADS)

    Sullivan, Kyle; Kuntz, Josh; Gash, Alex; Zachariah, Michael

    2011-06-01

    Nano-Al based thermites have become an attractive alternative to traditional energetic formulations due to their increased energy density and high reactivity. Understanding the intrinsic reaction mechanism has been a difficult task, largely due to the lack of experimental techniques capable of rapidly and uniformly heating a sample (~10^4-10^8 K/s). The current work presents several studies on nano-Al based thermites using rapid heating techniques. A new mechanism, termed a Reactive Sintering Mechanism, is proposed for nano-Al based thermites. In addition, new experimental techniques for nanocomposite thermite deposition onto thin Pt electrodes will be discussed. This combined technique will offer more precise control of the deposition and will serve to further our understanding of the intrinsic reaction mechanism of rapidly heated energetic systems. An improved mechanistic understanding will lead to the development of optimized formulations and architectures. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  12. Low Power LDPC Code Decoder Architecture Based on Intermediate Message Compression Technique

    NASA Astrophysics Data System (ADS)

    Shimizu, Kazunori; Togawa, Nozomu; Ikenaga, Takeshi; Goto, Satoshi

    Reducing the power dissipation of LDPC code decoders is a major challenge in applying them to practical digital communication systems. In this paper, we propose a low power LDPC code decoder architecture based on an intermediate message-compression technique with the following features: (i) an intermediate message compression technique enables the decoder to reduce the required memory capacity and write power dissipation; (ii) a clock-gated shift-register-based intermediate message memory architecture enables the decoder to decompress the compressed messages in a single clock cycle while reducing the read power dissipation. The combination of the above two techniques enables the decoder to reduce the power dissipation while maintaining the decoding throughput. The simulation results show that the proposed architecture improves the power efficiency by up to 52% and 18% compared to that of decoders based on the overlapped schedule and the rapid convergence schedule without the proposed techniques, respectively.

  13. Traditional versus rule-based programming techniques - Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    A traditional programming technique for controlling the display of optional flight information in a civil transport cockpit is compared to a rule-based technique for the same function. This application required complex decision logic and a frequently modified rule base. The techniques are evaluated for execution efficiency and implementation ease: execution efficiency is measured as the total number of steps required to isolate the hypotheses that were true, while implementability is evaluated in terms of ease of modification, ease of verification, and explanation capability. It is observed that the traditional program is more efficient than the rule-based program; however, the rule-based programming technique is more applicable for improving programmer productivity.

  14. TL and ESR based identification of gamma-irradiated frozen fish using different hydrolysis techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Shahbaz, Hafiz Muhammad; Kwon, Joong-Ho

    2014-12-01

    Frozen fish fillets (walleye pollock and Japanese Spanish mackerel) were selected as samples for irradiation (0-10 kGy) detection trials using different hydrolysis methods. Photostimulated luminescence (PSL)-based screening analysis of the gamma-irradiated frozen fillets showed low sensitivity due to the limited silicate mineral content on the samples. The same limitation was found in the thermoluminescence (TL) analysis of mineral samples isolated by the density separation method. However, acid (HCl) and alkali (KOH) hydrolysis methods were effective in recovering enough minerals to carry out TL analysis, which was reconfirmed through the normalization step by calculating the TL ratios (TL1/TL2). For improved electron spin resonance (ESR) analysis, alkali and enzyme (alcalase) hydrolysis methods were compared for separating minute bone fractions. The enzymatic method provided clearer radiation-specific hydroxyapatite radical signals than the alkaline method. Different hydrolysis methods could extend the application of TL and ESR techniques in identifying the irradiation history of frozen fish fillets.

  15. Determination of minor and trace elements concentration in kidney stones using elemental analysis techniques

    NASA Astrophysics Data System (ADS)

    Srivastava, Anjali

    The determination of the accurate material composition of a kidney stone is crucial for understanding its formation as well as for preventive therapeutic strategies. Radiation-based instrumental activation analysis techniques are excellent tools for identifying the materials present in a kidney stone. X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different material detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of kidney stones. In the present work, experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stones. The open-source program Python Multi-Channel Analyzer was utilized to unfold the XRF spectrum. A new type of experimental set-up was developed and utilized for XRF and NAA analysis of the kidney stones. To verify the experimental results against analytical calculation, several sets of kidney stones were analyzed using the XRF and NAA techniques. The elements identified by the XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y, Zr, and those identified by neutron activation analysis (NAA) are Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb, Zn. This thesis presents a new approach for the detection of the accurate material composition of kidney stone materials using XRF and NAA instrumental activation analysis techniques.

  16. Techniques for the analysis of data from coded-mask X-ray telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.

    1987-01-01

    Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed, and ways of using FFT techniques to perform the deconvolution are considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
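
    The FFT-based deconvolution amounts to cross-correlating the detector image with a balanced decoding array derived from the mask. A minimal Python sketch under assumed conditions (a random 0/1 mask, circular shift coding, a single point source) is:

        import numpy as np

        def decode(detector, mask):
            """FFT cross-correlation of the detector image with a balanced
            decoding array G = 2*mask - 1 (+1/-1)."""
            G = 2.0 * mask - 1.0
            corr = np.fft.ifft2(np.fft.fft2(detector) * np.conj(np.fft.fft2(G)))
            return np.real(corr)

        mask = (np.random.rand(64, 64) > 0.5).astype(float)  # random mask (assumed)
        sky = np.zeros((64, 64)); sky[20, 30] = 100.0        # single point source
        # Detector counts: circular convolution of the sky with the mask, plus noise.
        detector = np.real(np.fft.ifft2(np.fft.fft2(sky) * np.fft.fft2(mask)))
        detector += np.random.poisson(5.0, detector.shape)
        est = decode(detector, mask)
        print(np.unravel_index(est.argmax(), est.shape))     # recovers ~(20, 30)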

  17. Application of Behavior Change Techniques in a Personalized Nutrition Electronic Health Intervention Study: Protocol for the Web-Based Food4Me Randomized Controlled Trial

    PubMed Central

    Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout RH; Stewart-Knox, Barbara J; Mathers, John C

    2018-01-01

    Background To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. Objective The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. Methods The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype–based, and intake+phenotype+gene–based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Results Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of

  18. Validation of Design and Analysis Techniques of Tailored Composite Structures

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades, structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed form analysis based on a theoretical model of a single cell tailored box beam and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed form code is consistently able to predict the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed form analysis. Differences between the closed form code representation and the wing box specimen caused large errors in the twist prediction. The closed form analysis prediction of twist has not been validated from this test.

  19. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search for an address only by address matching based on descriptive data. In addition, there are some limitations in displaying search results. Resolving these limitations can enhance the efficiency of the existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, searching for places based on their location, non-point representation of results, as well as displaying search results based on their priority.
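
    The core idea of fuzzy distance maps and a fuzzy overlay can be sketched in a few lines of Python. The membership breakpoints, the grid, and the choice of a minimum operator for the overlay are illustrative assumptions, not details from the paper:

        import numpy as np

        def nearness(dist_m, full=200.0, none=2000.0):
            """Fuzzy 'near' membership: 1 within `full` metres, fading to 0 at `none`.
            The breakpoints are illustrative assumptions."""
            return np.clip((none - dist_m) / (none - full), 0.0, 1.0)

        # Distances (m) from each grid cell to two places referenced by a
        # multi-part address, e.g. "near the station, near the park".
        d_station = np.random.uniform(0, 3000, (100, 100))
        d_park = np.random.uniform(0, 3000, (100, 100))

        # One fuzzy distance map per referenced place, combined by a fuzzy AND
        # (minimum) overlay; cells can then be ranked by membership as priorities.
        overlay = np.minimum(nearness(d_station), nearness(d_park))
        best = np.unravel_index(overlay.argmax(), overlay.shape)
        print(best, overlay[best])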

  20. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
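
    In Python, the PLS calibration and PCA projection steps described here look roughly as follows (scikit-learn, with synthetic stand-in spectra and an assumed SiO2 response; SIMCA itself, which builds one PCA model per class, is not shown):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(2)
        spectra = rng.random((18, 1024))      # 18 rock spectra x 1024 channels
        sio2_wt = rng.uniform(40, 75, 18)     # assumed SiO2 content to calibrate

        # PLS calibration model: predicts composition for unknown spectra.
        pls = PLSRegression(n_components=5).fit(spectra, sio2_wt)
        print("calibration R^2:", pls.score(spectra, sio2_wt))

        # PCA scores for exploring rock-type groupings; SIMCA would build one
        # such PCA model per class and classify by distance to each model.
        scores = PCA(n_components=3).fit_transform(spectra)
        print(scores.shape)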

  1. Techniques of lumbar-sacral spine fusion in spondylosis: systematic literature review and meta-analysis of randomized clinical trials.

    PubMed

    Umeta, Ricardo S G; Avanzi, Osmar

    2011-07-01

    Spine fusions can be performed through different techniques and are used to treat a number of vertebral pathologies. However, there seems to be no consensus regarding which technique of fusion is best suited to each distinct spinal disease or group of diseases. To study the effectiveness and complications of the different techniques used for spinal fusion in patients with lumbar spondylosis. Systematic literature review and meta-analysis. Randomized clinical studies comparing the most commonly performed surgical techniques for spine fusion in lumbar-sacral spondylosis, as well as those reporting patient outcome, were selected. To identify which technique, if any, presents the best clinical, functional, and radiographic outcome. Systematic literature review and meta-analysis based on scientific articles published and indexed in the following databases: PubMed (1966-2009), Cochrane Collaboration-CENTRAL, EMBASE (1980-2009), and LILACS (1982-2009). The general search strategy focused on the surgical treatment of patients with lumbar-sacral spondylosis. Eight studies met the inclusion criteria and were selected, with a total of 1,136 patients. Meta-analysis showed that patients who underwent interbody fusion presented significantly smaller blood loss (p=.001) and a greater rate of bone fusion (p=.02). Patients submitted to fusion using the posterolateral approach had a significantly shorter operative time (p=.007) and fewer perioperative complications (p=.03). No statistically significant difference was found for the other studied variables (pain, functional impairment, and return to work). The most commonly used techniques for lumbar spine fusion in patients with spondylosis were interbody fusion and the posterolateral approach. Both techniques were comparable in final outcome, but the former presented better rates of fusion and the latter fewer complications. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  3. Hyperspectral techniques in analysis of oral dosage forms.

    PubMed

    Hamilton, Sara J; Lowell, Amanda E; Lodder, Robert A

    2002-10-01

    Pharmaceutical oral dosage forms are used in this paper to test the sensitivity and spatial resolution of hyperspectral imaging instruments. The first experiment tested the hypothesis that a near-infrared (IR) tunable diode-based remote sensing system is capable of monitoring degradation of hard gelatin capsules at a relatively long distance (0.5 km). Spectra from the capsules were used to differentiate among capsules exposed to an atmosphere containing 150 ppb formaldehyde for 0, 2, 4, and 8 h. Robust median-based principal component regression with Bayesian inference was employed for outlier detection. The second experiment tested the hypothesis that near-IR imaging spectrometry permits the identity and composition of multiple individual tablets to be determined simultaneously. A near-IR camera was used to collect thousands of spectra simultaneously from a field of blister-packaged tablets. The number of tablets that a typical near-IR camera can currently analyze simultaneously was estimated to be approximately 1300. The bootstrap error-adjusted single-sample technique chemometric-imaging algorithm was used to draw probability-density contour plots that revealed tablet composition. The single-capsule analysis provides an indication of how far apart the sample and instrumentation can be while still maintaining an adequate signal-to-noise ratio (S/N), while the multiple-tablet imaging experiment gives an indication of how many samples can be analyzed simultaneously while maintaining an adequate S/N and pixel coverage on each sample.

  4. Decision Analysis Techniques for Adult Learners: Application to Leadership

    ERIC Educational Resources Information Center

    Toosi, Farah

    2017-01-01

    Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…

  5. Analysis of questioning technique during classes in medical education.

    PubMed

    Cho, Young Hye; Lee, Sang Yeoup; Jeong, Dong Wook; Im, Sun Ju; Choi, Eun Jung; Lee, Sun Hee; Baek, Sun Yong; Kim, Yun Jin; Lee, Jeong Gyu; Yi, Yu Hyone; Bae, Mi Jin; Yune, So Jung

    2012-06-12

    Questioning is one of the essential techniques used by lecturers to make lectures more interactive and effective. This study surveyed the perception of questioning techniques by medical school faculty members and analyzed how the questioning technique is used in actual classes. Data on the perceptions of the questioning skills used during lectures were collected using a self-questionnaire for faculty members (N = 33) during the second semester of 2008. The questionnaire consisted of 18 items covering the awareness and characteristics of questioning skills. Recorded videotapes were used to observe the faculty members' questioning skills. Most faculty members regarded the questioning technique during classes as important and expected positive outcomes in terms of the students' participation in class, concentration in class and understanding of the class contents. In the 99 classes analyzed, the median number of questions per class was 1 (0-29). Among them, 40 classes (40.4%) did not use questioning techniques. The frequency of questioning per lecture was similar regardless of the faculty members' perception. On the other hand, the faculty members perceived their usual wait time after a question to be approximately 10 seconds, compared to only 2.5 seconds as measured from the video analysis. More lecture-experienced faculty members tended to ask more questions in class. There were some discrepancies regarding the questioning technique between the faculty members' perceptions and reality, even though they had positive opinions of the technique. The questioning skills used during a lecture need to be emphasized to faculty members.

  6. Graphene-based terahertz photodetector by noise thermometry technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ming-Jye, E-mail: mingjye@asiss.sinica.edu.tw; Institute of Physics, Academia Sinica, Taipei 11529, Taiwan; Wang, Ji-Wun

    2014-01-20

    We report the characteristics of a graphene-based terahertz (THz) photodetector investigated by the noise thermometry technique, measuring its noise power at frequencies from 4 to 6 GHz. A hot electron system in the graphene microbridge is generated by THz photon pumping and creates extra noise power. The equivalent noise temperature and electron temperature increase rapidly in the low THz pumping regime and saturate gradually in the high THz power regime, which is attributed to a faster energy relaxation process involving stronger electron-phonon interaction. Based on this detector, a conversion efficiency of around 0.15 from THz power to noise power in the 4-6 GHz span has been achieved.

  7. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  8. Vision-based system identification technique for building structures using a motion capture system

    NASA Astrophysics Data System (ADS)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures using a motion capture system (MCS). The MCS, with outstanding capabilities for dynamic response measurement, can provide gage-free measurements of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequency, mode shape, and damping ratio) of building structures are extracted from the dynamic displacement responses measured by the MCS, after converting the displacements from the MCS to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI directly applying the MCS-measured displacements to FDD was performed and showed results identical to those of the conventional SI method.
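
    The FDD step itself is compact: estimate the cross-spectral density matrix of the (displacement-derived) accelerations and take the SVD at each frequency; peaks of the first singular value mark the natural frequencies. A Python sketch with SciPy on synthetic three-channel data (all signal parameters are assumed, not from the experiment):

        import numpy as np
        from scipy.signal import csd

        def fdd_first_singular_values(acc, fs, nperseg=1024):
            """SVD of the cross-spectral density matrix at each frequency;
            peaks of the first singular value indicate natural frequencies."""
            n = acc.shape[1]
            f, _ = csd(acc[:, 0], acc[:, 0], fs=fs, nperseg=nperseg)
            G = np.empty((len(f), n, n), dtype=complex)
            for i in range(n):
                for j in range(n):
                    _, G[:, i, j] = csd(acc[:, i], acc[:, j], fs=fs, nperseg=nperseg)
            s1 = np.array([np.linalg.svd(Gf, compute_uv=False)[0] for Gf in G])
            return f, s1

        # Synthetic three-channel record standing in for marker-derived data.
        fs = 200.0
        t = np.arange(0, 60, 1 / fs)
        acc = np.column_stack([
            np.sin(2 * np.pi * 2.5 * t) + 0.5 * np.sin(2 * np.pi * 7.1 * t + p)
            for p in (0.0, 0.4, 0.8)
        ]) + 0.1 * np.random.randn(len(t), 3)
        f, s1 = fdd_first_singular_values(acc, fs)
        print(f[np.argmax(s1)])  # ~2.5 Hz, the dominant mode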

  9. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for the characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to characterise the sample better than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that such combinations of complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  10. Efficient grid-based techniques for density functional theory

    NASA Astrophysics Data System (ADS)

    Rodriguez-Hernandez, Juan Ignacio

    Understanding the chemical and physical properties of molecules and materials at a fundamental level often requires quantum-mechanical models of these substances' electronic structure. This type of many-body quantum mechanics calculation is computationally demanding, hindering its application to substances with more than a few hundred atoms. The overarching goal of much research in quantum chemistry---and the topic of this dissertation---is to develop more efficient computational algorithms for electronic structure calculations. In particular, this dissertation develops two new numerical integration techniques for computing molecular and atomic properties within conventional Kohn-Sham Density Functional Theory (KS-DFT) of molecular electronic structure. The first of these grid-based techniques is based on the transformed sparse grid construction. In this construction, a sparse grid is generated in the unit cube and then mapped to real space according to the pro-molecular density using the conditional distribution transformation. The transformed sparse grid was implemented in the program deMon2k, where it is used as the numerical integrator for the exchange-correlation energy and potential in the KS-DFT procedure. We tested our grid by computing ground state energies, equilibrium geometries, and atomization energies. The accuracy of these test calculations shows that our grid is more efficient than some previous integration methods: our grids use fewer points to obtain the same accuracy. The transformed sparse grids were also tested for integrating, interpolating and differentiating in different dimensions (n = 1, 2, 3, 6). The second technique is a grid-based method for computing atomic properties within QTAIM. It was also implemented in deMon2k. The performance of the method was tested by computing QTAIM atomic energies, charges, dipole moments, and quadrupole moments. For medium accuracy, our method is the fastest one we know of.
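
    The density-adapted mapping can be illustrated in one dimension: push a uniform grid through the inverse CDF of a reference "pro-density" so that quadrature points concentrate where the density is large. A short Python sketch with an assumed exponential pro-density and an illustrative test integrand (not a real exchange-correlation kernel):

        import numpy as np

        a = 2.0                            # decay rate of the pro-density (assumed)
        u = (np.arange(64) + 0.5) / 64     # uniform grid on the unit interval
        r = -np.log(1.0 - u) / a           # inverse CDF of rho(r) = a * exp(-a r)
        rho = a * np.exp(-a * r)

        f = r**2 * np.exp(-3.0 * r)        # illustrative integrand on [0, inf)
        # With u uniform, mean(f/rho) estimates the integral of f over [0, inf).
        approx = np.mean(f / rho)
        exact = 2.0 / 27.0                 # closed form of this test integral
        print(approx, exact)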

  11. Differentiation of four Aspergillus species and one Zygosaccharomyces with two electronic tongues based on different measurement techniques.

    PubMed

    Söderström, C; Rudnitskaya, A; Legin, A; Krantz-Rülcker, C

    2005-09-29

    Two electronic tongues based on different measurement techniques were applied to the discrimination of four molds and one yeast. The chosen microorganisms were different species of Aspergillus and the yeast species Zygosaccharomyces bailii, which are known as food contaminants. The electronic tongue developed at Linköping University was based on voltammetry; four working electrodes made of noble metals were used in a standard three-electrode configuration in this case. The St. Petersburg electronic tongue consisted of 27 potentiometric chemical sensors with enhanced cross-sensitivity. Sensors with chalcogenide glass and plasticized PVC membranes were used. Two sets of samples were measured using both electronic tongues. First, broths were measured in which one of the molds or the yeast had grown to the late logarithmic phase or the border of the stationary phase. Second, broths inoculated with one of the molds or the yeast were measured at five different times during microorganism growth. Data were evaluated using principal component analysis (PCA), partial least squares regression (PLS) and linear discriminant analysis (LDA). It was found that both measurement techniques could differentiate between the fungal species. Merging data from both electronic tongues improved differentiation of the samples in selected cases.

  12. A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.

    PubMed

    Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian

    2018-01-19

    This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage caused by the high pressure in the headspace vial during the headspace sampling process has a great impact on the measurement precision of conventional headspace analysis (i.e., the single sealing technique). The results (using ethanol solution as the model sample) show that the present technique is effective in minimizing this problem. The double sealing technique has excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that of earlier HS-GC work that used the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. An image registration-based technique for noninvasive vascular elastography

    NASA Astrophysics Data System (ADS)

    Valizadeh, Sina; Makkiabadi, Bahador; Mirbagheri, Alireza; Soozande, Mehdi; Manwar, Rayyan; Mozaffarzadeh, Moein; Nasiriavanaki, Mohammadreza

    2018-02-01

    Non-invasive vascular elastography is an emerging technique in vascular tissue imaging. During the past decades, several techniques have been suggested to estimate tissue elasticity by measuring the displacement of the carotid vessel wall. Cross correlation-based methods are the most prevalent approaches for measuring the strain exerted on the vessel wall by the blood pressure. In the case of low pressure, the displacement is too small to be apparent in ultrasound imaging, especially in regions far from the center of the vessel, causing a high displacement measurement error. On the other hand, increasing the compression leads to a relatively large displacement in the regions near the center, which reduces the performance of cross correlation-based methods. In this study, a non-rigid image registration-based technique is proposed to measure tissue displacement under relatively large compression. The results show that the error of the displacement measurement obtained by the proposed method is reduced by increasing the amount of compression, while the error of the cross correlation-based method rises for relatively large compression. We also used the synthetic aperture imaging method, benefiting from the directivity diagram, to improve the image quality, especially in superficial regions. The best relative root-mean-square errors (RMSE) of the proposed method and the adaptive cross correlation method were 4.5% and 6%, respectively. Consequently, the proposed algorithm outperforms the conventional method and reduces the relative RMSE by 25%.
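
    The baseline the paper compares against, block-wise cross-correlation displacement estimation, fits in a few lines of Python. Block and search sizes are illustrative assumptions; a real implementation would work on RF ultrasound lines and add sub-sample interpolation:

        import numpy as np

        def block_displacement(pre, post, block=32, search=8):
            """Block-wise normalized cross-correlation displacement estimate."""
            shifts = []
            for start in range(0, len(pre) - block, block):
                ref = pre[start:start + block]
                best_c, best_s = -np.inf, 0
                for s in range(-search, search + 1):
                    if start + s < 0 or start + s + block > len(post):
                        continue
                    seg = post[start + s:start + s + block]
                    c = np.dot(ref, seg) / (np.linalg.norm(ref) * np.linalg.norm(seg))
                    if c > best_c:
                        best_c, best_s = c, s
                shifts.append(best_s)
            return np.array(shifts)  # strain ~ axial gradient of displacement

        pre = np.random.randn(512)   # stand-in for a pre-compression signal line
        post = np.roll(pre, 3)       # synthetic 3-sample displacement
        print(block_displacement(pre, post))  # ~3 in every block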

  14. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    PubMed Central

    Lueke, Jonathan; Moussa, Walied A.

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving the functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability to implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient. PMID:22319362

  15. MEMS-based power generation techniques for implantable biosensing applications.

    PubMed

    Lueke, Jonathan; Moussa, Walied A

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving the functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability to implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  16. A comparative analysis of soft computing techniques for gene prediction.

    PubMed

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Combination Base64 Algorithm and EOF Technique for Steganography

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Nurdiyanto, Heri; Hidayat, Rahmat; Saleh Ahmar, Ansari; Siregar, Dodi; Putera Utama Siahaan, Andysah; Faisal, Ilham; Rahman, Sayuti; Suita, Diana; Zamsuri, Ahmad; Abdullah, Dahlan; Napitupulu, Darmawan; Ikhsan Setiawan, Muhammad; Sriadhi, S.

    2018-04-01

    The steganography process combines mathematics and computer science. Steganography consists of a set of methods and techniques for embedding data into another medium so that the contents are unreadable to anyone who does not have the authority to read them. The main objective of using the Base64 method is to encode any file so as to achieve privacy. This paper discusses a steganography and encoding method using Base64, a set of encoding schemes that convert binary data into a series of ASCII codes. The EOF (end-of-file) technique is then used to embed the text encoded by Base64. As an example of the mechanism, a file is used to carry the hidden text; using the two methods together increases the security level for protecting the data. This research aims to secure many types of files in a given cover medium with good security, without damaging either the stored files or the cover medium used.
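
    A minimal Python sketch of the combined approach described above: Base64 encodes the secret, and the EOF technique appends it past the cover file's logical end. The marker string and file handling are illustrative assumptions, not details from the paper:

        import base64

        MARKER = b"--STEGO--"  # assumed delimiter, not from the paper

        def embed_after_eof(cover_path, secret_text, out_path):
            """Append Base64-encoded data past the cover file's logical end."""
            payload = base64.b64encode(secret_text.encode("utf-8"))
            with open(cover_path, "rb") as f:
                cover = f.read()
            with open(out_path, "wb") as f:
                f.write(cover + MARKER + payload)

        def extract_after_eof(stego_path):
            """Recover and decode the hidden text from a stego file."""
            with open(stego_path, "rb") as f:
                data = f.read()
            payload = data.split(MARKER, 1)[1]
            return base64.b64decode(payload).decode("utf-8")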

  18. Techniques for Analysis of DSN 64-meter Antenna Azimuth Bearing Film Height Records

    NASA Technical Reports Server (NTRS)

    Stevens, R.; Quach, C. T.

    1983-01-01

    The DSN 64-m antennas use oil pad azimuth thrust bearings. Instrumentation on the bearing pads measures the height of the oil film between the pad and the bearing runner. Techniques to analyze the film height records are developed and discussed. The analysis techniques present the unwieldy data in a compact form for assessment of bearing condition. The techniques are illustrated by analysis of a small sample of film height records from each of the three 64-m antennas. The results show the general condition of the bearings at DSS 43 and DSS 63 to be good to excellent, and that at DSS 14 to be marginal.

  19. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with repair and maintenance become more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply and lengthen the time required for the crew to abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan, and of how that demand changes as a result of changes to the system architecture, enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and the probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly more complex systems by gradually expanding the system boundary.
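
    The simplest of the techniques compared, the binomial/Poisson approach, can be sketched in a few lines of Python: model failures of a component as a Poisson process and compute the probability that a given number of spares covers the mission. The failure rate and mission length below are illustrative assumptions, not CRS values:

        from scipy.stats import poisson

        def p_sufficient(spares, failures_per_day, mission_days):
            """Probability that `spares` units cover all failures of one
            component, with failures modeled as a Poisson process."""
            return poisson.cdf(spares, failures_per_day * mission_days)

        # Notional component: one failure expected per 500 days, 1000-day mission.
        for k in range(6):
            print(k, "spares ->", round(p_sufficient(k, 1 / 500, 1000), 4))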

  20. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
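
    A minimal Python sketch of the blackboard pattern the thesis formalizes (the knowledge sources and data here are illustrative toys, not the thesis's concurrent flight-systems application): independent specialists read a shared blackboard and post partial results until no source can contribute further.

```python
class Blackboard:
    """Shared store that all knowledge sources read from and write to."""
    def __init__(self, text):
        self.data = {"text": text}

def tokenizer(bb):
    # Specialist 1: contributes a token list once the raw text is available.
    if "tokens" not in bb.data:
        bb.data["tokens"] = bb.data["text"].split()

def counter(bb):
    # Specialist 2: depends only on the tokens posted by another source.
    if "tokens" in bb.data and "count" not in bb.data:
        bb.data["count"] = len(bb.data["tokens"])

def controller(bb, sources):
    # Control loop: keep firing sources until the blackboard stops changing.
    changed = True
    while changed:
        before = dict(bb.data)
        for source in sources:
            source(bb)
        changed = bb.data != before

bb = Blackboard("blackboard systems coordinate independent specialists")
controller(bb, [counter, tokenizer])  # order of sources does not matter
print(bb.data["count"])               # -> 5
```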

  1. The influence of polishing techniques on pre-polymerized CAD\CAM acrylic resin denture bases

    PubMed Central

    Alammari, Manal Rahma

    2017-01-01

    Background Lately, computer-aided design and computer-aided manufacturing (CAD/CAM) has been broadly and successfully employed in dentistry. The CAD/CAM systems have recently become commercially available for the fabrication of complete dentures, and are considered an alternative technique to conventionally processed acrylic resin bases. However, they have not yet been fully investigated. Objective The purpose of this study was to inspect the effects of mechanical polishing and chemical polishing on the surface roughness (Ra) and contact angle (wettability) of heat-cured, auto-cured and CAD/CAM denture base acrylic resins. Methods This study was conducted at the Advanced Dental Research Laboratory Center of King Abdulaziz University from March to June 2017. Three denture base materials were selected: heat-cured poly-methylmethacrylate resin, thermoplastic (polyamide) resin and CAD/CAM denture base resin. Sixty specimens were prepared and divided into three groups of twenty each. Each group was divided according to the polishing technique into mechanically polished (Mech P) and chemically polished (Chem P) subgroups of ten specimens each; surface roughness and wettability were investigated. Data were analyzed with SPSS version 22, using one-way ANOVA and the Pearson coefficient. Results One-way analysis of variance (ANOVA) and post hoc tests used to compare the surface roughness values between the three groups revealed a statistically significant difference between them (p < 0.001). The heat-cured denture base material (Group I) showed the highest mean surface roughness values with both methods (2.44±0.07 and 2.72±0.09 for Mech P and Chem P, respectively), while the CAD/CAM denture base material (Group III) showed the lowest mean values (1.08±0.23 and 1.39±0.31 for Mech P and Chem P, respectively). CAD/CAM showed the lowest contact angle with both polishing methods, statistically significant at the 5% level (p=0.034 and p<0.001). Conclusion Mechanical polishing produced lower surface roughness of CAD/CAM denture base resin with

  2. The influence of polishing techniques on pre-polymerized CAD\CAM acrylic resin denture bases.

    PubMed

    Alammari, Manal Rahma

    2017-10-01

    Lately, computer-aided design and computer-aided manufacturing (CAD/CAM) has been broadly and successfully employed in dentistry. The CAD/CAM systems have recently become commercially available for the fabrication of complete dentures, and are considered an alternative technique to conventionally processed acrylic resin bases. However, they have not yet been fully investigated. The purpose of this study was to inspect the effects of mechanical polishing and chemical polishing on the surface roughness (Ra) and contact angle (wettability) of heat-cured, auto-cured and CAD/CAM denture base acrylic resins. This study was conducted at the Advanced Dental Research Laboratory Center of King Abdulaziz University from March to June 2017. Three denture base materials were selected: heat-cured poly-methylmethacrylate resin, thermoplastic (polyamide) resin and CAD/CAM denture base resin. Sixty specimens were prepared and divided into three groups of twenty each. Each group was divided according to the polishing technique into mechanically polished (Mech P) and chemically polished (Chem P) subgroups of ten specimens each; surface roughness and wettability were investigated. Data were analyzed with SPSS version 22, using one-way ANOVA and the Pearson coefficient. One-way analysis of variance (ANOVA) and post hoc tests used to compare the surface roughness values between the three groups revealed a statistically significant difference between them (p < 0.001). The heat-cured denture base material (Group I) showed the highest mean surface roughness values with both methods (2.44±0.07 and 2.72±0.09 for Mech P and Chem P, respectively), while the CAD/CAM denture base material (Group III) showed the lowest mean values (1.08±0.23 and 1.39±0.31 for Mech P and Chem P, respectively). CAD/CAM showed the lowest contact angle with both polishing methods, statistically significant at the 5% level (p=0.034 and p<0.001). Mechanical polishing produced lower surface roughness of CAD/CAM denture base resin with a superior smooth surface compared to chemical

  3. ELICIT: An alternative imprecise weight elicitation technique for use in multi-criteria decision analysis for healthcare

    PubMed Central

    Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard

    2015-01-01

    Objective In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis for healthcare. Methods The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using principal component analysis; and the estimation of criteria weights and their descriptive statistics using variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. Results The criteria were ranked from 1 to 5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated, as well as the standard deviation and 95% credibility interval. Conclusions ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights. PMID:26361235
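
    A minimal sketch of the general idea behind imprecise weight elicitation from ordinal preferences (a generic rank-constrained Monte Carlo sampler, not the ELICIT algorithm itself, which additionally uses principal component and variable interdependent analysis): sample weight vectors uniformly over the simplex consistent with a strict ranking and summarize each criterion's weight distribution.

```python
import random
import statistics

N_CRITERIA = 5     # criterion 1 most preferred, criterion 5 least
SAMPLES = 20_000

def ranked_weights(n):
    # Uniform point on the simplex via stick-breaking; sorting it descending
    # yields a uniform sample over weight vectors consistent with the ranking.
    cuts = sorted(random.random() for _ in range(n - 1))
    points = [0.0] + cuts + [1.0]
    w = [points[i + 1] - points[i] for i in range(n)]
    return sorted(w, reverse=True)

draws = [ranked_weights(N_CRITERIA) for _ in range(SAMPLES)]
for i in range(N_CRITERIA):
    ws = [w[i] for w in draws]
    print(f"criterion {i + 1}: mean weight = {statistics.mean(ws):.3f}, "
          f"sd = {statistics.stdev(ws):.3f}")
```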

  4. Advanced analysis technique for the evaluation of linear alternators and linear motors

    NASA Technical Reports Server (NTRS)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated, such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  5. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    NASA Astrophysics Data System (ADS)

    Chockalingam, Letchumanan

    2005-01-01

    LANDSAT data for the Gunung Ledang region of Malaysia are used to map certain hydrogeological features. To map these features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods are evaluated in terms of their validity in properly isolating features of hydrogeological interest. Because these techniques exploit the spectral aspects of the images, they have several limitations with respect to the objectives. To address these limitations, a morphological transformation, which considers the structural rather than the spectral aspects of the image, is applied, and comparisons are made between the results derived from the spectrally based and the structurally based filtering techniques.

  6. Hypergraph Based Feature Selection Technique for Medical Diagnosis.

    PubMed

    Somu, Nivethitha; Raman, M R Gauthama; Kirthivasan, Kannan; Sriram, V S Shankar

    2016-11-01

    The impact of the internet and information systems across various domains has resulted in the substantial generation of multidimensional datasets. The use of data mining and knowledge discovery techniques to extract the original information contained in multidimensional datasets plays a significant role in exploiting the complete benefit they provide. The presence of a large number of features in high-dimensional datasets incurs high computational cost in terms of computing power and time. Hence, feature selection techniques have commonly been used to build robust machine learning models by selecting a subset of relevant features which projects the maximal information content of the original dataset. In this paper, a novel Rough Set based K-Helly feature selection technique (RSKHT), which hybridizes Rough Set Theory (RST) and the K-Helly property of hypergraph representation, is designed to identify the optimal feature subset or reduct for medical diagnostic applications. Experiments carried out using medical datasets from the UCI repository prove the dominance of RSKHT over other feature selection techniques with respect to reduct size, classification accuracy and time complexity. The performance of RSKHT was validated using the WEKA tool, which shows that RSKHT is computationally attractive and flexible over massive datasets.

  7. Uranium Detection - Technique Validation Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colletti, Lisa Michelle; Garduno, Katherine; Lujan, Elmer J.

    As a LANL activity for DOE/NNSA in support of SHINE Medical Technologies™ 'Accelerator Technology' we have been investigating the application of UV-vis spectroscopy for uranium analysis in solution. While the technique has been developed specifically for sulfate solutions, the proposed SHINE target solutions, it can be adapted to a range of different solution matrices. The FY15 work scope incorporated technical development that would improve accuracy, specificity, linearity and range, precision and ruggedness, and comparative analysis. Significant progress was achieved throughout FY15 addressing these technical challenges, as summarized in this report. In addition, comparative analysis of unknown samples using the Davies-Gray titration technique highlighted the importance of controlling temperature during analysis (impacting both technique accuracy and linearity/range). To fully understand the impact of temperature, additional experimentation and data analyses were performed during FY16. The results from this FY15/FY16 work were presented in a detailed presentation, LA-UR-16-21310, and an update of that presentation is included with this short report summarizing the key findings. The technique is based on analysis of the most intense U(VI) absorbance band in the visible region of the uranium spectrum in 1 M H2SO4, at λmax = 419.5 nm.

  8. Learning Physics through Project-Based Learning Game Techniques

    ERIC Educational Resources Information Center

    Baran, Medine; Maskan, Abdulkadir; Yasar, Seyma

    2018-01-01

    The aim of the present study, in which project and game techniques are used together, is to examine the impact of project-based learning games on students' physics achievement. Participants of the study consist of 34 9th grade students (N = 34). The data were collected using achievement tests and a questionnaire. Throughout the applications, the…

  9. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (C-arm X-ray machine) is described.
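
    A minimal sketch of the grey relational part of such an improved FMEA (the failure modes and severity/occurrence/detection scores are invented, and the fuzzy front end of the paper's method is omitted): each failure mode's score vector is compared to an ideal low-risk reference, and modes are ranked by grey relational grade.

```python
# Failure modes scored for (severity, occurrence, detection) on a 1-10 scale.
modes = {
    "display freeze":   (7, 4, 6),
    "dose mis-setting": (9, 2, 5),
    "cable wear":       (4, 6, 3),
}
REFERENCE = (1, 1, 1)  # ideal: lowest risk on every factor
RHO = 0.5              # distinguishing coefficient, conventionally 0.5

# Absolute deviation from the reference for every mode and factor.
diffs = {m: [abs(x - r) for x, r in zip(s, REFERENCE)] for m, s in modes.items()}
flat = [d for ds in diffs.values() for d in ds]
dmin, dmax = min(flat), max(flat)

def grey_grade(ds):
    # Grey relational coefficients per factor, averaged into a single grade.
    return sum((dmin + RHO * dmax) / (d + RHO * dmax) for d in ds) / len(ds)

# A lower grade means farther from the ideal, i.e. a higher-priority risk.
for mode in sorted(diffs, key=lambda m: grey_grade(diffs[m])):
    print(f"{mode}: grey relational grade = {grey_grade(diffs[mode]):.3f}")
```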

  10. The application analysis of the multi-angle polarization technique for ocean color remote sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Yongchao; Zhu, Jun; Yin, Huan; Zhang, Keli

    2017-02-01

    The multi-angle polarization technique, which uses the intensity of polarized radiation as the observed quantity, is a new remote sensing means for earth observation. With this method, not only can the multi-angle light intensity data be provided, but also the multi-angle information of polarized radiation can be obtained. So, the technique may solve the problems, those could not be solved with the traditional remote sensing methods. Nowadays, the multi-angle polarization technique has become one of the hot topics in the field of the international quantitative research on remote sensing. In this paper, we firstly introduce the principles of the multi-angle polarization technique, then the situations of basic research and engineering applications are particularly summarized and analysed in 1) the peeled-off method of sun glitter based on polarization, 2) the ocean color remote sensing based on polarization, 3) oil spill detection using polarization technique, 4) the ocean aerosol monitoring based on polarization. Finally, based on the previous work, we briefly present the problems and prospects of the multi-angle polarization technique used in China's ocean color remote sensing.

  11. Visualization and Analysis of Wireless Sensor Network Data for Smart Civil Structure Applications Based On Spatial Correlation Technique

    NASA Astrophysics Data System (ADS)

    Chowdhry, Bhawani Shankar; White, Neil M.; Jeswani, Jai Kumar; Dayo, Khalil; Rathi, Manorma

    2009-07-01

    Disasters affecting infrastructure, such as the earthquakes in India (2001), Pakistan (2005) and China (2008) and the 2004 tsunami in Asia, provide a common need for intelligent buildings and smart civil structures. Now, imagine massive reductions in the time to get infrastructure working again, real-time information on damage to buildings, massive reductions in the cost and time to certify that structures are undamaged and can still be operated, and reductions in the number of structures to be rebuilt (if they are known not to be damaged). Achieving these ideas would lead to huge, quantifiable, long-term savings to government and industry. Wireless sensor networks (WSNs) can be deployed in buildings to make any civil structure both smart and intelligent. WSNs have recently gained much attention in both the public and research communities because they are expected to bring a new paradigm to the interaction between humans, environment, and machines. This paper presents the deployment of WSN nodes in the Top Quality Centralized Instrumentation Centre (TQCIC). We created an ad hoc networking application to collect real-time data sensed from the nodes that were randomly distributed throughout the building. If the sensors are relocated, the application automatically reconfigures itself in the light of the new routing topology. WSNs are event-based systems that rely on the collective effort of several micro-sensor nodes, which are continuously observing a physical phenomenon. WSN applications require spatially dense sensor deployment in order to achieve satisfactory coverage. The degree of spatial correlation increases with decreasing inter-node separation. Energy consumption is reduced dramatically by having only those sensor nodes with unique readings transmit their data. We report on an algorithm based on a spatial correlation technique that assures high QoS (in terms of SNR) of the network as well as proper utilization of energy, by suppressing redundant data transmission.
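
    A minimal sketch of the redundancy-suppression idea (node positions, readings, and thresholds are invented; the paper's SNR-based QoS control is omitted): a node stays silent when a nearby node has already transmitted an essentially identical reading.

```python
import math

# (x, y) position and sensed value for each node; values are illustrative.
nodes = {
    "n1": ((0.0, 0.0), 21.4),
    "n2": ((1.0, 0.2), 21.5),   # close to n1 -> spatially correlated reading
    "n3": ((9.0, 8.5), 27.9),   # far away -> carries unique information
}
RADIUS = 2.0      # nodes closer than this are assumed spatially correlated
TOLERANCE = 0.5   # readings within this band are considered redundant

transmitted = {}  # node -> value actually sent toward the sink
for name, (pos, value) in nodes.items():
    redundant = any(
        math.dist(pos, nodes[other][0]) < RADIUS and abs(value - sent) < TOLERANCE
        for other, sent in transmitted.items()
    )
    if not redundant:
        transmitted[name] = value  # only unique readings cost transmit energy

print(transmitted)  # -> {'n1': 21.4, 'n3': 27.9}; n2 is suppressed
```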

  12. Use of different spectroscopic techniques in the analysis of Roman age wall paintings.

    PubMed

    Agnoli, Francesca; Calliari, Irene; Mazzocchin, Gian-Antonio

    2007-01-01

    In this paper the analysis of samples of Roman age wall paintings coming from Pordenone, Vicenza and Verona is carried out using three different techniques: energy-dispersive X-ray spectroscopy (EDS), X-ray fluorescence (XRF) and proton-induced X-ray emission (PIXE). The features of the three spectroscopic techniques in the analysis of samples of archaeological interest are discussed. The studied pigments were: cinnabar, yellow ochre, green earth, Egyptian blue and carbon black.

  13. Analysis of base fuze functioning of HESH ammunitions through high-speed photographic technique

    NASA Astrophysics Data System (ADS)

    Biswal, T. K.

    2007-01-01

    High-speed photography plays a major role in a test range, where direct access to a dynamic process is possible through imaging, allowing the process to be understood thoroughly; both qualitative and quantitative data are then obtained through image processing and analysis. In one set of trials it was difficult to understand the performance of HESH ammunition on rolled homogeneous armour: there was no consistency in scab formation even though all other parameters, such as propellant charge mass, charge temperature and impact velocity, were kept constant. To understand the event thoroughly, high-speed photography was deployed to obtain a frontal view of the whole process. Clear information on shell impact, embedding of the HE propellant on the armour and base fuze initiation was obtained. In scab-forming rounds these three processes are clearly observed in sequence. In non-scab rounds, however, the base fuze is initiated before the completion of the embedding process, so the threshold thrust onto the armour needed to cause a scab is not available. This was revealed in two rounds in which scab formation failed. As a quantitative measure, the fuze delay was calculated for each round, and premature functioning of the base fuze was thereby ascertained for the non-scab rounds. This paper depicts such capabilities of high-speed photography in detail.

  14. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features which often go undetected due to limitations of the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed, and the multiple technique approach has been evaluated as a whole.

  15. Creep-Rupture Data Analysis - Engineering Application of Regression Techniques. Ph.D. Thesis - North Carolina State Univ.

    NASA Technical Reports Server (NTRS)

    Rummler, D. R.

    1976-01-01

    The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.

  16. Analytical transmissibility based transfer path analysis for multi-energy-domain systems using four-pole parameter theory

    NASA Astrophysics Data System (ADS)

    Mashayekhi, Mohammad Jalali; Behdinan, Kamran

    2017-10-01

    The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated a renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most of the existing vibration transfer path analysis techniques are empirical and are suitable for diagnosis and troubleshooting purposes. The lack of an analytical transfer path analysis that can be used in the design stage is the main motivation behind this research. In this paper an analytical transfer path analysis based on the four-pole theory is proposed for multi-energy-domain systems. The bond graph modeling technique, an effective approach to modeling multi-energy-domain systems, is used to develop the system model. An electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical system based on the corresponding bond graph model is also presented in this paper.
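
    A minimal numeric sketch of the four-pole (transmission matrix) formalism the paper builds on (a textbook mass-on-isolator example with invented element values; the bond-graph machinery is omitted): each element is a 2x2 matrix relating input force and velocity to output force and velocity, cascaded elements multiply, and the force transmissibility into a rigid foundation is read off the combined matrix.

```python
import math

def matmul2(a, b):
    # 2x2 complex matrix product for cascading four-pole elements.
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def mass(m, w):
    # Rigid mass: F1 = F2 + j*w*m*v, v1 = v2.
    return [[1.0, 1j * w * m], [0.0, 1.0]]

def isolator(k, eta, w):
    # Massless spring with complex stiffness k*(1 + j*eta) (structural damping).
    ks = k * (1 + 1j * eta)
    return [[1.0, 0.0], [1j * w / ks, 1.0]]

M, K, ETA = 50.0, 2.0e5, 0.05   # kg, N/m, loss factor (illustrative values)
wn = math.sqrt(K / M)           # undamped natural frequency

for w in (0.2 * wn, wn, 3.0 * wn):
    A = matmul2(mass(M, w), isolator(K, ETA, w))[0][0]
    # With a rigid foundation (output velocity = 0), transmissibility = 1/A.
    print(f"w/wn = {w / wn:.1f}: |F_out/F_in| = {abs(1 / A):.3f}")
```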

  17. Bandwidth-Tunable Fiber Bragg Gratings Based on UV Glue Technique

    NASA Astrophysics Data System (ADS)

    Fu, Ming-Yue; Liu, Wen-Feng; Chen, Hsin-Tsang; Chuang, Chia-Wei; Bor, Sheau-Shong; Tien, Chuen-Lin

    2007-07-01

    In this study, we have demonstrated that a uniform fiber Bragg grating (FBG) can be transformed into a chirped fiber grating by a simple UV glue adhesive technique without shifting the reflection band with respect to the center wavelength of the FBG. The technique is based on the induced strain of an FBG due to the UV glue adhesive force on the fiber surface that causes a grating period variation and an effective index change. This technique can provide a fast and simple method of obtaining the required chirp value of a grating for applications in the dispersion compensators, gain flattening in erbium-doped fiber amplifiers (EDFAs) or optical filters.

  18. Density-cluster NMA: A new protein decomposition technique for coarse-grained normal mode analysis.

    PubMed

    Demerdash, Omar N A; Mitchell, Julie C

    2012-07-01

    Normal mode analysis has emerged as a useful technique for investigating protein motions on long time scales. This is largely due to the advent of coarse-graining techniques, particularly Hooke's Law-based potentials and the rotational-translational blocking (RTB) method for reducing the size of the force-constant matrix, the Hessian. Here we present a new method for domain decomposition for use in RTB that is based on hierarchical clustering of atomic density gradients, which we call Density-Cluster RTB (DCRTB). The method reduces the number of degrees of freedom by 85-90% compared with the standard blocking approaches. We compared the normal modes from DCRTB against standard RTB using 1-4 residues in sequence in a single block, with good agreement between the two methods. We also show that Density-Cluster RTB and standard RTB perform well in capturing the experimentally determined direction of conformational change. Significantly, we report superior correlation of DCRTB with B-factors compared with 1-4 residue per block RTB. Finally, we show significant reduction in computational cost for Density-Cluster RTB that is nearly 100-fold for many examples. Copyright © 2012 Wiley Periodicals, Inc.

  19. Vision based techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to the automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight, are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except in regions close to the focus of expansion (FOE). Closer to the FOE, the error in range increases because the magnitude of the disparity gets smaller, resulting in a low SNR.

  20. Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory

    ERIC Educational Resources Information Center

    Fiester, Herbert R.

    2010-01-01

    The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…

  1. Different techniques of multispectral data analysis for vegetation fraction retrieval

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. For farmlands, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and the various spectral estimators.
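
    A minimal sketch of the linear spectral unmixing technique listed above (the endmember reflectances and the mixed pixel are invented two-endmember data): solve the linear mixing model by least squares and normalize the resulting fractions.

```python
import numpy as np

# Endmember reflectances in four hypothetical bands (green, red, NIR, SWIR).
soil       = np.array([0.12, 0.18, 0.25, 0.30])
vegetation = np.array([0.08, 0.05, 0.45, 0.20])

# Observed mixed pixel: 60% vegetation + 40% soil plus a little noise.
observed = 0.6 * vegetation + 0.4 * soil + np.array([0.003, -0.002, 0.004, 0.0])

# Linear mixing model: observed = E @ fractions.
E = np.column_stack([vegetation, soil])
fractions, *_ = np.linalg.lstsq(E, observed, rcond=None)
fractions = np.clip(fractions, 0.0, None)
fractions /= fractions.sum()      # enforce non-negativity and sum-to-one

print(f"estimated vegetation fraction ~ {fractions[0]:.2f}")  # ~0.60
```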

  2. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques

    PubMed Central

    Rosebrock, Adrian; Caban, Jesus J.; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2014-01-01

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we computed the morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared the results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant. PMID:25722829

  3. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to use FORM directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is demonstrated on a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function in truly representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased, with an error of 24%.
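
    A minimal sketch of FORM applied to an explicit performance function (the quadratic g below stands in for a fitted response surface; its coefficients are invented): the Hasofer-Lind/Rackwitz-Fiessler iteration locates the design point in standard normal space, and the reliability index is its distance from the origin.

```python
import math
import numpy as np

def g(u):
    # Hypothetical response surface in standard normal variables u
    # (e.g., normalized friction angle and cohesion); g < 0 means failure.
    return 3.0 - u[0] - 0.8 * u[1] + 0.05 * u[0] ** 2

def grad(u, h=1e-6):
    # Central-difference gradient of the performance function.
    return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                     for e in np.eye(len(u))])

# Hasofer-Lind / Rackwitz-Fiessler iteration for the design point.
u = np.zeros(2)
for _ in range(100):
    gv, gr = g(u), grad(u)
    u_new = (gr @ u - gv) * gr / (gr @ gr)
    if np.linalg.norm(u_new - u) < 1e-10:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)                   # reliability index
pf = 0.5 * math.erfc(beta / math.sqrt(2))  # standard normal tail probability
print(f"beta = {beta:.3f}, probability of failure ~ {pf:.2e}")
```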

  4. Evaluation of the filling ability of artificial lateral canals using calcium silicate-based and epoxy resin-based endodontic sealers and two gutta-percha filling techniques.

    PubMed

    Fernández, R; Restrepo, J S; Aristizábal, D C; Álvarez, L G

    2016-04-01

    To evaluate the ability of a calcium silicate-based sealer (iRoot SP) and an epoxy resin-based sealer (Topseal), used with two gutta-percha filling techniques, to fill artificial lateral canals (ALCs). Seventy single-rooted human teeth were selected; ten of these were used to obtain pilot data. Three ALCs were produced on the mesial and distal surfaces of each root, one in each third, using size 10 engine reamers. The roots were randomly assigned to four experimental groups according to the filling technique and sealer used: 1, cold gutta-percha (single-point technique) with iRoot SP (SP-iR); 2, cold gutta-percha (single-point technique) with Topseal (SP-T); 3, continuous wave of condensation technique with iRoot SP (CWC-iR); and 4, continuous wave of condensation technique with Topseal (CWC-T). Digital periapical radiographs were taken. After the sealer had set, the roots were demineralized, cleared in methyl salicylate and examined under a stereomicroscope. The depth of penetration of sealer and/or gutta-percha into the ALCs was scored using a 5-point system, with the analysis conducted on four surfaces. Filling scores of 0-1 were considered not acceptable, whilst scores of 2-4 were considered acceptable. Pearson's chi-square test was used to compare the experimental groups (P < 0.05). CWC-T was associated with the highest acceptable filling rate (57.8%), followed by CWC-iR (53.3%), SP-T (48.9%) and SP-iR (36.7%); only the comparisons between SP-iR and the other groups were significant (P < 0.05). The apical third was associated with the lowest acceptable filling rate (37.5%), followed, in ascending order, by the middle (51.6%) and coronal thirds (58.3%); these differences were significant only when the apical thirds were compared to the other root thirds (P < 0.05). The calcium silicate-based sealer was more effective in filling artificial lateral canals with the continuous wave of condensation technique than with the single-point technique. The epoxy resin-based sealer with both filling

  5. Mobility based key management technique for multicast security in mobile ad hoc networks.

    PubMed

    Madhusudhanan, B; Chitra, S; Rajan, C

    2015-01-01

    In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques cope poorly with frequent mobility and disconnections. So there is a need to design a multicast key management technique to overcome these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated based on link availability and mobility. A multicast tree is constructed such that for every weak node there is a strong parent node. A session key-based encryption technique is utilized to transmit multicast data. The rekeying process is performed periodically by the initiator node. The rekeying interval is fixed depending on the node category, so this technique greatly minimizes the rekeying overhead. Through simulation results, we show that our proposed approach reduces the packet drop rate and improves data confidentiality.

  6. New methods for time-resolved fluorescence spectroscopy data analysis based on the Laguerre expansion technique--applications in tissue diagnosis.

    PubMed

    Jo, J A; Marcu, L; Fang, Q; Papaioannou, T; Qiao, J H; Fishbein, M C; Beseth, B; Dorafshar, A H; Reil, T; Baker, D; Freischlag, J

    2007-01-01

    A new deconvolution method for the analysis of time-resolved laser-induced fluorescence spectroscopy (TR-LIFS) data is introduced and applied to tissue diagnosis. The intrinsic TR-LIFS decays are expanded on a Laguerre basis, and the computed Laguerre expansion coefficients (LEC) are used to characterize the sample fluorescence emission. The method was applied to the diagnosis of atherosclerotic vulnerable plaques. In a first stage, using a rabbit atherosclerotic model, 73 TR-LIFS in-vivo measurements were taken from the normal and atherosclerotic aorta segments of eight rabbits. The Laguerre deconvolution technique was able to accurately deconvolve the TR-LIFS measurements. More interestingly, the LEC reflected the changes in the arterial biochemical composition and provided discrimination of lesions rich in macrophages/foam cells with high sensitivity (>85%) and specificity (>95%). In a second stage, 348 TR-LIFS measurements were obtained from the explanted carotid arteries of 30 patients. Lesions with significant inflammatory cells (macrophages/foam cells and lymphocytes) were detected with high sensitivity (>80%) and specificity (>90%) using LEC-based classifiers. This study has demonstrated the potential of using TR-LIFS information by means of LEC for in vivo tissue diagnosis, and specifically for detecting inflammation in atherosclerotic lesions, a key marker of plaque vulnerability.
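
    A minimal sketch of the Laguerre-expansion idea (the decay is a synthetic single exponential; real TR-LIFS analysis also deconvolves the instrument response, which is omitted here): expand the decay on Laguerre functions exp(-t/2)L_j(t) and use the least-squares coefficients as the compact descriptor.

```python
import numpy as np
from numpy.polynomial.laguerre import lagval

# Synthetic fluorescence decay (lifetime 3 ns) with a little noise.
t = np.linspace(0.0, 20.0, 200)               # time axis in ns
rng = np.random.default_rng(0)
decay = np.exp(-t / 3.0) + 0.01 * rng.normal(size=t.size)

# Laguerre function basis: phi_j(t) = exp(-t/2) * L_j(t), j = 0..ORDER-1.
ORDER = 5
basis = np.column_stack([
    np.exp(-t / 2.0) * lagval(t, np.eye(ORDER)[j]) for j in range(ORDER)
])

# The expansion coefficients characterize the decay shape; classifiers
# like those in the paper operate on such coefficient vectors.
coeffs, *_ = np.linalg.lstsq(basis, decay, rcond=None)
fit = basis @ coeffs
print("Laguerre coefficients:", np.round(coeffs, 4))
print(f"RMS fit error: {np.sqrt(np.mean((fit - decay) ** 2)):.4f}")
```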

  7. Application of Avco data analysis and prediction techniques (ADAPT) to prediction of sunspot activity

    NASA Technical Reports Server (NTRS)

    Hunter, H. E.; Amato, R. A.

    1972-01-01

    The results are presented of the application of Avco Data Analysis and Prediction Techniques (ADAPT) to the derivation of new algorithms for the prediction of future sunspot activity. The ADAPT-derived algorithms show a factor of 2 to 3 reduction in the expected 2-sigma errors in the estimates of the 81-day running average of the Zurich sunspot numbers. The report presents: (1) the best estimates for sunspot cycles 20 and 21, (2) a comparison of the ADAPT performance with conventional techniques, and (3) specific approaches to further reduction of the errors in estimated sunspot activity and to recovery of earlier sunspot historical data. The ADAPT programs are used both to derive regression algorithms for prediction of the entire 11-year sunspot cycle from the preceding two cycles and to derive extrapolation algorithms for extrapolating a given sunspot cycle based on any available portion of the cycle.

  8. Reduced-Smoke Solid Propellant Combustion Products Analysis. Development of a Micromotor Combustor Technique.

    DTIC Science & Technology

    1976-10-01

    A low-cost micromotor combustor technique has been devised to support the development of reduced-smoke solid propellant formulations. The technique...includes a simple, reusable micromotor capable of high chamber pressures, a combustion products collection system, and procedures for analysis of

  9. Optical transmission testing based on asynchronous sampling techniques: images analysis containing chromatic dispersion using convolutional neural network

    NASA Astrophysics Data System (ADS)

    Mrozek, T.; Perlicki, K.; Tajmajer, T.; Wasilewski, P.

    2017-08-01

    The article presents an image analysis method, based on data obtained from the asynchronous delay tap sampling (ADTS) technique, which is used for simultaneous monitoring of various impairments occurring in the physical layer of an optical network. The ADTS method enables visualization of the optical signal in the form of characteristics (so-called phase portraits) that change their shape under the influence of impairments such as chromatic dispersion, polarization mode dispersion and ASE noise. Using this method, a simulation model was built with OptSim 4.0. After the simulation study, data were obtained in the form of images that were further analyzed using a convolutional neural network algorithm. The main goal of the study was to train a convolutional neural network to recognize a selected impairment (distortion), and then to test its accuracy and estimate the impairment for a selected set of test images. The input data consisted of processed binary images in the form of two-dimensional matrices indexed by pixel position. This article focuses only on the analysis of images containing chromatic dispersion.

  10. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Matthew W.

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques that apply evanescent fields, standing waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  11. A comparison of solute-transport solution techniques based on inverse modelling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2000-01-01

    Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.
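
    A minimal sketch of the numerical-dispersion effect the study measures (grid, velocity, and dispersion coefficient are invented): for an explicit upwind finite-difference scheme, the leading truncation error acts like extra dispersion of about v*dx/2, which here exceeds the physical coefficient and visibly flattens the simulated peak.

```python
import numpy as np

# 1D advection-dispersion: explicit upwind advection, central dispersion.
L, nx = 1.0, 200
dx = L / nx
v, D = 1.0e-3, 1.0e-6          # velocity (m/s), physical dispersion (m^2/s)
dt = 0.25 * min(dx / v, dx * dx / (2 * D))   # stable explicit time step

print(f"physical D = {D:.1e}, numerical dispersion ~ v*dx/2 = {v * dx / 2:.1e}")

c = np.zeros(nx)
c[:10] = 1.0                    # initial tracer slug at the inlet

for _ in range(400):            # advect the slug roughly half the domain
    adv = -v * (c - np.roll(c, 1)) / dx                        # upwind term
    dif = D * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2 # central term
    c += dt * (adv + dif)
    c[0], c[-1] = c[1], c[-2]   # crude boundary treatment

# The peak drops well below 1.0 mostly due to numerical, not physical, dispersion.
print(f"peak concentration after transport: {c.max():.3f}")
```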

  12. Errorless-based techniques can improve route finding in early Alzheimer's disease: a case study.

    PubMed

    Provencher, Véronique; Bier, Nathalie; Audet, Thérèse; Gagnon, Lise

    2008-01-01

    Topographical disorientation is a common and early manifestation of dementia of Alzheimer type, which threatens independence in activities of daily living. Errorless-based techniques appear to be effective in helping patients with amnesia to learn routes, but little is known about their effectiveness in early dementia of Alzheimer type. A 77-year-old woman with dementia of Alzheimer type had difficulty in finding her way around her seniors residence, which reduced her social activities. This study used an ABA design (A is the baseline and B is the intervention) with multiple baselines across routes for going to the rosary (target), laundry, and game rooms (controls). The errorless-based technique intervention was applied to 2 of the 3 routes. Analyses showed significant improvement only for the routes learned with errorless-based techniques. Following the study, the participant increased her topographical knowledge of her surroundings. Route learning interventions based on errorless-based techniques appear to be a promising approach for improving the independence in early dementia of Alzheimer type.

  13. Interferometric Dynamic Measurement: Techniques Based on High-Speed Imaging or a Single Photodetector

    PubMed Central

    Fu, Yu; Pedrini, Giancarlo

    2014-01-01

    In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured in any instant and the Nyquist sampling theorem has to be satisfied along the time axis on each measurement point. Two types of techniques were developed for such measurements: one is based on high-speed cameras and the other uses a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capturing rate, while the photodetector-based technology can only do the measurement on a single point. In this paper, several aspects of these two technologies are discussed. For the camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For the detector-based interferometry, the discussion mainly focuses on the single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the effort done by researchers for the improvement of the measurement capabilities using interferometry-based techniques to cover the requirements needed for the industrial applications. PMID:24963503

  14. Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis

    PubMed Central

    Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.

    2011-01-01

    Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples, the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93, for the analysis of a certified reference material, using the standardized methodologies. Conclusion The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
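
    A minimal sketch of the direct ion-selective-electrode calculation that such standardized methods build on (the calibration readings are invented; real protocols also involve ionic-strength buffering and temperature control): fit the Nernstian line of potential versus log10(concentration) and invert it for an unknown sample.

```python
import math

# Calibration standards: fluoride concentration (ppm) -> electrode potential (mV).
# A Nernstian fluoride electrode slope is roughly -59 mV per decade at 25 C.
standards = {0.1: 120.0, 1.0: 61.5, 10.0: 2.8}

xs = [math.log10(c) for c in standards]
ys = list(standards.values())
n = len(xs)

# Ordinary least squares for E = slope * log10(C) + intercept.
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar
print(f"electrode slope = {slope:.1f} mV/decade (ideal ~ -59.2)")

# Invert the calibration line for an unknown sample reading.
e_unknown = 35.0  # mV, hypothetical sample
conc = 10 ** ((e_unknown - intercept) / slope)
print(f"unknown sample ~ {conc:.2f} ppm fluoride")
```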

  15. Analysis and Identification of Acid-Base Indicator Dyes by Thin-Layer Chromatography

    ERIC Educational Resources Information Center

    Clark, Daniel D.

    2007-01-01

    Thin-layer chromatography (TLC) is a very simple and effective technique that is used by chemists for different purposes, including monitoring the progress of a reaction. TLC can also be easily used for the analysis and identification of various acid-base indicator dyes.

  16. Molecular-Based Optical Measurement Techniques for Transition and Turbulence in High-Speed Flow

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Cutler, Andrew D.

    2013-01-01

    High-speed laminar-to-turbulent transition and turbulence affect the control of flight vehicles, the heat transfer rate to a flight vehicle's surface, the material selected to protect such vehicles from high heating loads, the ultimate weight of a flight vehicle due to the presence of thermal protection systems, the efficiency of fuel-air mixing processes in high-speed combustion applications, etc. Gaining a fundamental understanding of the physical mechanisms involved in the transition process will lead to the development of predictive capabilities that can identify transition location and its impact on parameters like surface heating. Currently, there is no general theory that can completely describe the transition-to-turbulence process. However, transition research has led to the identification of the predominant pathways by which this process occurs. For a truly physics-based model of transition to be developed, the individual stages in the paths leading to the onset of fully turbulent flow must be well understood. This requires that each pathway be computationally modeled and experimentally characterized and validated. This may also lead to the discovery of new physical pathways. This document is intended to describe molecular based measurement techniques that have been developed, addressing the needs of the high-speed transition-to-turbulence and high-speed turbulence research fields. In particular, we focus on techniques that have either been used to study high speed transition and turbulence or techniques that show promise for studying these flows. This review is not exhaustive. In addition to the probe-based techniques described in the previous paragraph, several other classes of measurement techniques that are, or could be, used to study high speed transition and turbulence are excluded from this manuscript. For example, surface measurement techniques such as pressure and temperature paint, phosphor thermography, skin friction measurements and

  17. A cost comparison analysis of adjuvant radiation therapy techniques after breast-conserving surgery.

    PubMed

    Lanni, Thomas; Keisch, Martin; Shah, Chirag; Wobb, Jessica; Kestin, Larry; Vicini, Frank

    2013-01-01

    The aim of this study is to perform a cost analysis to compare adjuvant radiation therapy schedules following breast conserving surgery. Treatment planning and delivery utilization data were modeled for a series of 10 different breast RT techniques. The whole breast (WB) regimens consisted of: (1) Wedge based WB (25 fractions [fx]), (2) WB using IMRT, (3) WBRT with a boost (B), (4) WBRT using IMRT with a B, (5) Canadian WB (16 fx) with 3D-CRT, and (6) Canadian using IMRT. The accelerated partial breast irradiation (APBI) regimens included (7): APBI using 3D-CRT, (8) IMRT, (9) single channel balloon, and (10) multi-channel balloon. Costs incurred by the payer (i.e., direct medical costs) were taken from the 2011 Medicare Fee Schedule. Among all the different regimens examined, Canadian 3D-CRT and APBI 3D-CRT were the least costly whereas WB using IMRT with a B was the most expensive. Both APBI brachytherapy techniques were less costly than conventional WB with a B. In terms of direct medical costs, the technical component accounted for most, if not all, of the disparity among the various treatments. A general trend of decreasing RT costs was observed with further reductions in overall treatment time for WBRT techniques, but not all of the alternative treatment regimens led to similar total cost savings. APBI using brachytherapy techniques was less costly than conventional WBRT with a standard boost. © 2013 Wiley Periodicals, Inc.

  18. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modelling techniques. This article presents research on the differences between business process modelling techniques; for each technique, the definition and the structure are explained. This paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented in Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion gives a recommendation of business process modelling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.

  19. Three-Dimensional Inverse Transport Solver Based on Compressive Sensing Technique

    NASA Astrophysics Data System (ADS)

    Cheng, Yuxiong; Wu, Hongchun; Cao, Liangzhi; Zheng, Youqi

    2013-09-01

    Based on direct exposure measurements from a flash radiographic image, a compressive sensing-based method for the three-dimensional inverse transport problem is presented. The linear absorption coefficients and interface locations of objects are reconstructed directly at the same time. It is always very expensive to obtain enough measurements. With limited measurements, the compressive sensing sparse reconstruction technique orthogonal matching pursuit is applied to obtain the sparse coefficients by solving an optimization problem. A three-dimensional inverse transport solver is developed based on this compressive sensing technique. There are three features in this solver: (1) AutoCAD is employed as a geometry preprocessor due to its powerful graphics capabilities. (2) The forward projection matrix, rather than a Gauss matrix, is constructed by the visualization tool generator. (3) The Fourier transform and the Daubechies wavelet transform are adopted to convert an underdetermined system into a well-posed system in the algorithm. Simulations are performed, and the numerical results for the pseudo-sine absorption problem, the two-cube problem and the two-cylinder problem obtained using the compressive sensing-based solver agree well with the reference values.
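
    A minimal sketch of the orthogonal matching pursuit step named above (random Gaussian measurements stand in for the solver's forward projection matrix): greedily select the column most correlated with the residual and re-solve least squares on the growing support.

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal matching pursuit for y ~ A @ x with x sparse."""
    residual, support = y.copy(), []
    x_s = np.zeros(0)
    for _ in range(sparsity):
        # Choose the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit on the chosen support and update the residual.
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(1)
n, m, k = 50, 200, 5                      # measurements, unknowns, sparsity
A = rng.normal(size=(n, m)) / np.sqrt(n)  # stand-in for the projection matrix
x_true = np.zeros(m)
x_true[rng.choice(m, size=k, replace=False)] = rng.normal(size=k)
y = A @ x_true                            # underdetermined measurement vector

x_hat = omp(A, y, k)
print("recovered support:", np.flatnonzero(x_hat))
print(f"max coefficient error: {np.abs(x_hat - x_true).max():.2e}")
```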

  20. Comparative Effect of Different Polymerization Techniques on the Flexural and Surface Properties of Acrylic Denture Bases.

    PubMed

    Gad, Mohammed M; Fouda, Shaimaa M; ArRejaie, Aws S; Al-Thobity, Ahmad M

    2017-05-22

    Polymerization techniques have been modified to improve the physical and mechanical properties of polymethylmethacrylate (PMMA) denture bases, as have the laboratory procedures that facilitate denture construction techniques. The purpose of the present study was to investigate the effect of autoclave polymerization on the flexural strength, elastic modulus, surface roughness, and hardness of PMMA denture base resins. Major Base and Vertex Implacryl heat-polymerized acrylic resins were used to fabricate 180 specimens. According to the polymerization technique, the tested groups were divided into: group I (water-bath polymerization), group II (short autoclave polymerization cycle, 60°C for 30 minutes, then 130°C for 10 minutes), and group III (long autoclave polymerization cycle, 60°C for 30 minutes, then 130°C for 20 minutes). Each group was divided into two subgroups based on the materials used. Flexural strength and elastic modulus were determined by a three-point bending test. Surface roughness and hardness were evaluated with a profilometer and the Vickers hardness (VH) test, respectively. One-way ANOVA and the Tukey-Kramer multiple-comparison test were used for the analysis of results, which were considered statistically significant at p ≤ 0.05. Autoclave polymerization showed a significant increase in the flexural strength and hardness of the two resins (p < 0.05). The elastic modulus showed a significant increase for the Major Base resin, while a significant decrease was seen for Vertex Implacryl in all groups (p < 0.05); however, there was no significant difference in surface roughness between autoclave polymerization and water-bath polymerization (p > 0.05). Autoclave polymerization significantly increased the flexural properties and hardness of PMMA denture bases, while the surface roughness remained within acceptable clinical limits. The long autoclave polymerization cycle could be used as an alternative to water-bath polymerization. © 2017 by the American College of Prosthodontists.

  1. Control volume based hydrocephalus research; analysis of human data

    NASA Astrophysics Data System (ADS)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume, and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct, and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
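
    The integral conservation laws invoked here take the standard control-volume form; the statement below is the generic textbook formulation (not necessarily the authors' exact notation), with V the control volume, S its surface, rho the fluid density, u the velocity, and n the outward unit normal:

        \frac{d}{dt}\int_{V}\rho \, dV \;+\; \oint_{S}\rho\,(\mathbf{u}\cdot\hat{\mathbf{n}})\, dA \;=\; 0
        \qquad
        \frac{d}{dt}\int_{V}\rho\,\mathbf{u}\, dV \;+\; \oint_{S}\rho\,\mathbf{u}\,(\mathbf{u}\cdot\hat{\mathbf{n}})\, dA \;=\; \sum \mathbf{F}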

  2. Fast classification and compositional analysis of cornstover fractions using Fourier transform near-infrared techniques.

    PubMed

    Philip Ye, X; Liu, Lu; Hayes, Douglas; Womac, Alvin; Hong, Kunlun; Sokhansanj, Shahab

    2008-10-01

    The objectives of this research were to determine the variation of chemical composition across botanical fractions of cornstover and to probe the potential of Fourier transform near-infrared (FT-NIR) techniques both for qualitatively classifying separated cornstover fractions and for quantitatively analyzing chemical composition by developing calibration models that predict cornstover composition from FT-NIR spectra. The wide calibration ranges required for a reliable calibration model were achieved by manually separating the cornstover samples into six botanical fractions, whose chemical compositions were determined by conventional wet chemical analyses; these confirmed that chemical composition varies significantly among the botanical fractions of cornstover. In descending order of total saccharide content, the fractions are husk, sheath, pith, rind, leaf, and node. Based on FT-NIR spectra acquired from the biomass, Soft Independent Modeling of Class Analogy (SIMCA) was employed for qualitative classification of the cornstover fractions, and partial least squares (PLS) regression was used for quantitative chemical composition analysis. SIMCA successfully classified the botanical fractions of cornstover. The developed PLS model yielded root mean square errors of prediction (RMSEP, %w/w) of 0.92, 1.03, 0.17, 0.27, 0.21, 1.12, and 0.57 for glucan, xylan, galactan, arabinan, mannan, lignin, and ash, respectively. The results show the potential of FT-NIR techniques combined with multivariate analysis to be utilized by biomass feedstock suppliers, bioethanol manufacturers, and bio-power producers to better manage bioenergy feedstocks and enhance bioconversion.
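
    A minimal sketch of the PLS calibration workflow described above, assuming synthetic stand-ins for the NIR spectra and wet-chemistry reference values; it fits a PLS model and reports an RMSEP on held-out samples.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.standard_normal((120, 500))      # 120 spectra x 500 wavenumbers (placeholder)
        y = 2.0 * X[:, 100] + X[:, 300] + rng.normal(0, 0.1, 120)  # e.g. glucan %w/w

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        pls = PLSRegression(n_components=10)
        pls.fit(X_tr, y_tr)

        rmsep = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
        print(f"RMSEP: {rmsep:.3f} %w/w")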

  3. A Corpus-Based Approach for Automatic Thai Unknown Word Recognition Using Boosting Techniques

    NASA Astrophysics Data System (ADS)

    Techo, Jakkrit; Nattee, Cholwich; Theeramunkong, Thanaruk

    While classification techniques can be applied to automatic unknown word recognition in a language without word boundaries, they face the problem of imbalanced datasets, where the number of positive unknown word candidates is much smaller than that of negative candidates. To solve this problem, this paper presents a corpus-based approach that introduces a so-called group-based ranking evaluation technique into ensemble learning in order to generate a sequence of classification models that later collaborate to select the most probable unknown word from multiple candidates. Given a classification model, group-based ranking evaluation (GRE) is applied to construct a training dataset for learning the succeeding model by weighting each of its candidates according to their ranks and correctness when the candidates of an unknown word are considered as one group. A number of experiments were conducted on a large Thai medical text to evaluate the performance of the proposed approach, namely V-GRE, against the conventional naïve Bayes classifier and our vanilla version without ensemble learning. As a result, the proposed method achieves an accuracy of 90.93±0.50% when the first rank is selected, and 97.26±0.26% when the top ten candidates are considered; these are improvements of 8.45% and 6.79%, respectively, over the conventional record-based naïve Bayes classifier and the vanilla version. Applying only the best features yields 93.93±0.22% and up to 98.85±0.15% accuracy for top-1 and top-10, respectively, improvements of 3.97% and 9.78% over naïve Bayes and the vanilla version. Finally, an error analysis is given.

  4. Analysis of filter tuning techniques for sequential orbit determination

    NASA Technical Reports Server (NTRS)

    Lee, T.; Yee, C.; Oza, D.

    1995-01-01

    This paper examines filter tuning techniques for a sequential orbit determination (OD) covariance analysis. Recently, there has been renewed interest in sequential OD, primarily due to the successful flight qualification of the Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) using Doppler data extracted onboard the Extreme Ultraviolet Explorer (EUVE) spacecraft. TONS computes highly accurate orbit solutions onboard the spacecraft in real time using a sequential filter. As a result of the successful TONS-EUVE flight qualification experiment, the Earth Observing System (EOS) AM-1 Project has selected TONS as the prime navigation system. In addition, sequential OD methods can be used successfully for ground OD. Whether data are processed onboard or on the ground, a sequential OD procedure is generally favored over a batch technique when a real-time automated OD system is desired. Recently, OD covariance analyses were performed for the TONS-EUVE and TONS-EOS missions using the sequential processing options of the Orbit Determination Error Analysis System (ODEAS). ODEAS is the primary covariance analysis system used by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). The results of these analyses revealed a high sensitivity of the OD solutions to the state process noise filter tuning parameters. The covariance analysis results show that the state estimate error contributions from measurement-related error sources, especially those due to the random noise and satellite-to-satellite ionospheric refraction correction errors, increase rapidly as the state process noise increases. These results prompted an in-depth investigation of the role of the filter tuning parameters in sequential OD covariance analysis. This paper analyzes how the spacecraft state estimate errors due to dynamic and measurement-related error sources are affected by the process noise level used. This information is then used to establish
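
    An illustrative sketch (not the ODEAS implementation): a one-dimensional constant-velocity Kalman filter showing how the steady-state estimate covariance grows with the process noise level q, the tuning sensitivity discussed above. All values are made up.

        import numpy as np

        def steady_state_pos_var(q, r=25.0, dt=10.0, steps=500):
            F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition
            Q = q * np.array([[dt**3 / 3, dt**2 / 2],   # white-acceleration process noise
                              [dt**2 / 2, dt]])
            H = np.array([[1.0, 0.0]])                  # range-like measurement
            P = np.eye(2) * 1e6                         # diffuse initial covariance
            for _ in range(steps):
                P = F @ P @ F.T + Q                     # time update
                S = H @ P @ H.T + r                     # innovation covariance
                K = P @ H.T / S                         # Kalman gain
                P = (np.eye(2) - K @ H) @ P             # measurement update
            return P[0, 0]

        for q in (1e-8, 1e-6, 1e-4):
            print(f"q={q:.0e}  steady-state position variance={steady_state_pos_var(q):.3f}")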

  5. The Views of the Teachers about the Mind Mapping Technique in the Elementary Life Science and Social Studies Lessons Based on the Constructivist Method

    ERIC Educational Resources Information Center

    Seyihoglu, Aysegul; Kartal, Ayca

    2010-01-01

    The purpose of this study is to reveal the opinions of teachers on using the mind mapping technique in Life Science and Social Studies lessons. The participants of the study are 20 primary education teachers. In this study, a semi-structured interview technique was used. For content analysis, the themes and codes were defined, based on the views…

  6. [Concept analysis "Competency-based education"].

    PubMed

    Loosli, Clarence

    2016-03-01

    Competency-based education (CBE) stands out at the global level as a best educational practice. Indeed, CBE is supposed to improve the quality of care provided by newly graduated nurses. Yet there is a dearth of knowledge in the nursing literature regarding the definition of the CBE concept, and CBE is implemented differently in each institution, even within the same discipline in a single country. What constitutes CBE in nursing education? The aim of this study was to clarify the meaning of the CBE concept through a literature review and to propose a definition. Wilson's concept analysis method framed our review of two databases: CINAHL and ERIC. Following Wilson's eleven-step technique, we identified CBE as a multidimensional concept clustering three dimensions: learning, teaching, and assessment. Nurse educators are accountable for providing society with competent newly graduated professionals, and schools should strive for visibility and transparency in the means they use to accomplish their educational activities. This first attempt to understand the CBE concept opens a debate concerning its further development and clarification, and this first description is a step toward its identification and assessment.

  7. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety, and quality of medicinal plants are the main concerns, and they depend strongly on comprehensive analysis of the plants' chemical components. With advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of medicinal plant analysis. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and of their applications in the comprehensive analysis of medicinal plants. Applications of various MS and hyphenated techniques for the analysis of medicinal plants, including but not limited to one-dimensional and multidimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, are reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a major role as the platform for further research aimed at understanding both their empirical therapeutic efficacy and their quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  8. Flame analysis using image processing techniques

    NASA Astrophysics Data System (ADS)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques using fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experiments were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Nonlinearities such as thermoacoustic oscillations and background noise affect flame stability. Flame velocity is one of the important characteristics that determine flame stability, and an image processing method is proposed here to determine it. The power spectral density (PSD) is a good tool for vibration analysis from which flame stability can be approximated; however, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability, and a neural network is used to test the performance of the fuzzy inference system.
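
    A sketch under stated assumptions: estimating the power spectral density of a flame luminosity time series with Welch's method, as a stand-in for the PSD-based stability analysis mentioned above. The signal, sampling rate, and oscillation frequency are synthetic.

        import numpy as np
        from scipy.signal import welch

        fs = 1000.0                                   # assumed sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)
        # synthetic luminosity: 120 Hz thermoacoustic oscillation + broadband noise
        lum = 1.0 + 0.3 * np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.randn(t.size)

        f, psd = welch(lum, fs=fs, nperseg=2048)
        print(f"dominant oscillation at {f[np.argmax(psd[1:]) + 1]:.1f} Hz")  # skip DC bin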

  9. A technique based on droplet evaporation to recognize alcoholic drinks

    NASA Astrophysics Data System (ADS)

    González-Gutiérrez, Jorge; Pérez-Isidoro, Rosendo; Ruiz-Suárez, J. C.

    2017-07-01

    Chromatography is at present the most widely used technique for determining the purity of alcoholic drinks. It involves a careful separation of the liquid's components. However, since chromatography requires sophisticated instrumentation, alternative techniques such as conductivity measurements and UV-Vis and infrared spectrometry have been explored. We report here a method based on the salt-induced crystallization patterns formed during the evaporation of alcoholic drops. We found that droplets of different samples form different structures upon drying, which we characterize by their radial density profiles. We show that, using the dried deposit of a spirit as a control sample, our method can differentiate between pure and adulterated drinks. As a proof of concept, we study tequila.
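
    A minimal sketch of the radial density profile used above to characterize dried deposits: pixel intensities of a deposit image (a random placeholder array here) are azimuthally averaged at each integer radius from the pattern center.

        import numpy as np

        img = np.random.rand(256, 256)               # placeholder deposit image
        cy, cx = np.array(img.shape) / 2.0
        y, x = np.indices(img.shape)
        r = np.hypot(y - cy, x - cx).astype(int)     # integer radius of each pixel

        # mean intensity in each radius bin = radial density profile
        sums = np.bincount(r.ravel(), weights=img.ravel())
        counts = np.bincount(r.ravel())
        profile = sums / np.maximum(counts, 1)       # guard against empty bins
        print(profile[:10])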

  10. Improvement in QEPAS system utilizing a second harmonic based wavelength calibration technique

    NASA Astrophysics Data System (ADS)

    Zhang, Qinduan; Chang, Jun; Wang, Fupeng; Wang, Zongliang; Xie, Yulei; Gong, Weihua

    2018-05-01

    A simple laser wavelength calibration technique based on the second harmonic signal is demonstrated in this paper to improve the performance of a quartz-enhanced photoacoustic spectroscopy (QEPAS) gas sensing system, e.g., its signal-to-noise ratio (SNR), detection limit, and long-term stability. A constant current corresponding to the gas absorption line, combined with a sinusoidal signal at half the detection frequency (f/2), is used to drive the laser (constant driving mode), and a software-based real-time wavelength calibration technique is developed to eliminate wavelength drift due to ambient fluctuations. Compared with conventional wavelength modulation spectroscopy (WMS), this method allows a narrower filtering bandwidth and an averaging algorithm to be applied to the QEPAS system, improving the SNR and detection limit. In addition, the real-time wavelength calibration guarantees that the laser output remains modulated at the gas absorption line. Water vapor was chosen as the target gas to evaluate the system's performance against the constant driving mode and a conventional WMS system. The water vapor sensor was made insensitive to incoherent external acoustic noise by a numerical averaging technique. As a result, the SNR of the calibration-based system is 12.87 times that of the conventional WMS system. The new system achieved a better linear response (R² = 0.9995) over the concentration range from 300 to 2000 ppmv and a minimum detection limit (MDL) of 630 ppbv.
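
    An illustrative sketch of generic second-harmonic (2f) demodulation, the signal underlying the calibration technique above: the detector signal is multiplied by in-phase and quadrature references at twice the modulation frequency and low-pass filtered. All parameters are synthetic, not the QEPAS system's.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs, f_mod = 100_000.0, 500.0                 # assumed sampling and modulation rates, Hz
        t = np.arange(0, 0.2, 1 / fs)
        # toy detector signal: a 2f component of amplitude 0.05 buried in noise
        sig = 0.05 * np.sin(2 * np.pi * 2 * f_mod * t + 0.3) + 0.02 * np.random.randn(t.size)

        ref_i = np.sin(2 * np.pi * 2 * f_mod * t)    # in-phase 2f reference
        ref_q = np.cos(2 * np.pi * 2 * f_mod * t)    # quadrature 2f reference
        b, a = butter(4, 50.0 / (fs / 2))            # 50 Hz low-pass filter
        i_out = filtfilt(b, a, sig * ref_i)
        q_out = filtfilt(b, a, sig * ref_q)

        amp_2f = 2 * np.hypot(i_out, q_out).mean()   # recovered 2f amplitude (~0.05)
        print(f"2f amplitude: {amp_2f:.3f}")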

  11. "Techniques d'expression,""approche communicative," meme combat? ("Expressive Techniques,""Communicative Approach," Same Struggle?)

    ERIC Educational Resources Information Center

    Vives, Robert

    1983-01-01

    Based on a literature review and analysis of teaching methods and objectives, it is proposed that the emphasis on communicative competence ascendant in French foreign language instruction is closely related to, and borrows from, expressive techniques taught in French native language instruction in the 1960s. (MSE)

  12. Heart Sound Biometric System Based on Marginal Spectrum Analysis

    PubMed Central

    Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin

    2013-01-01

    This work presents a heart sound biometric system based on marginal spectrum analysis, a new feature extraction technique for identification purposes. The heart sound identification system comprises signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients yield a significant increase in the recognition rate, 94.40% compared with 84.32% for the traditional Fourier spectrum, based on a database of 280 heart sounds from 40 participants. PMID:23429515
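
    A hedged sketch of the marginal spectrum idea: in Hilbert-Huang analysis it is the Hilbert spectrum integrated over time. The full method first decomposes the signal into intrinsic mode functions via empirical mode decomposition; the sketch below shows only the Hilbert stage for a single synthetic component.

        import numpy as np
        from scipy.signal import hilbert

        fs = 2000.0
        t = np.arange(0, 2, 1 / fs)
        x = np.sin(2 * np.pi * 40 * t) * np.exp(-t)        # toy heart-sound component

        analytic = hilbert(x)
        amp = np.abs(analytic)                             # instantaneous amplitude
        inst_f = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

        # accumulate amplitude into frequency bins -> marginal spectrum h(f)
        bins = np.arange(0, 100, 1.0)
        marginal, _ = np.histogram(inst_f, bins=bins, weights=amp[:-1])
        print(f"peak of marginal spectrum: {bins[np.argmax(marginal)]:.0f} Hz")  # ~40 Hz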

  13. Performance Analysis of Ranging Techniques for the KPLO Mission

    NASA Astrophysics Data System (ADS)

    Park, Sungjoon; Moon, Sangman

    2018-03-01

    In this study, the performance of ranging techniques for the Korea Pathfinder Lunar Orbiter (KPLO) space communication system is investigated. KPLO is the first lunar mission of Korea, and pseudo-noise (PN) ranging will be used to support the mission along with sequential ranging. We compared the performance of both ranging techniques using the criteria of accuracy, acquisition probability, and measurement time. First, we investigated the end-to-end accuracy error of each ranging technique, incorporating all error sources, such as those from the ground stations and the spacecraft communication system. This study demonstrates that increasing the clock frequency of the ranging system is not required when the dominant accuracy error is independent of the thermal noise of the ranging technique in use. Based on this understanding of ranging accuracy, the measurement times of PN and sequential ranging were further investigated and compared, with both techniques satisfying the accuracy and acquisition requirements. We demonstrated that PN ranging performs better than sequential ranging in the signal-to-noise ratio (SNR) regime where KPLO will be operating, and we found that the T2B (weighted-voting balanced Tausworthe, voting v = 2) code is the best choice among the PN codes available for the KPLO mission.

  14. Using high speed smartphone cameras and video analysis techniques to teach mechanical wave physics

    NASA Astrophysics Data System (ADS)

    Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano

    2017-07-01

    We propose the use of smartphone-based slow-motion video analysis as a valuable tool for investigating the physics concepts governing mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allow one to measure, in a simple yet rigorous way, the speed of pulses along a spring and the period of transverse standing waves generated in the same spring. These experiments can be helpful in addressing several relevant concepts in the physics of mechanical waves and in overcoming some typical student misconceptions in this field.
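
    A minimal sketch of the pulse-speed measurement: pulse positions tracked frame by frame (fabricated values here) are fit against time, and the slope gives the propagation speed. The frame rate is an assumed typical smartphone slow-motion setting.

        import numpy as np

        fps = 240.0                                              # assumed slow-motion frame rate
        positions_m = np.array([0.10, 0.14, 0.18, 0.22, 0.26])   # pulse position per frame
        times_s = np.arange(positions_m.size) / fps

        speed, _ = np.polyfit(times_s, positions_m, 1)           # slope = speed (m/s)
        print(f"pulse speed ~ {speed:.1f} m/s")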

  15. Comparative analysis of nonlinear dimensionality reduction techniques for breast MRI segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akhbardeh, Alireza; Jacobs, Michael A.

    2012-04-15

    Purpose: Visualization of anatomical structures using radiological imaging methods is an important tool in medicine to differentiate normal from pathological tissue and can generate large amounts of data for a radiologist to read. Integrating these large data sets is difficult and time-consuming. A new approach uses both supervised and unsupervised advanced machine learning techniques to visualize and segment radiological data. This study describes the application of a novel hybrid scheme, based on combining wavelet transform and nonlinear dimensionality reduction (NLDR) methods, to breast magnetic resonance imaging (MRI) data using three well-established NLDR techniques, namely, ISOMAP, local linear embedding (LLE), and diffusion maps (DfM), to perform a comparative performance analysis. Methods: Twenty-five breast lesion subjects were scanned using a 3T scanner. MRI sequences used were T1-weighted, T2-weighted, diffusion-weighted imaging (DWI), and dynamic contrast-enhanced (DCE) imaging. The hybrid scheme consisted of two steps: preprocessing and postprocessing of the data. The preprocessing step was applied for B1 inhomogeneity correction, image registration, and wavelet-based image compression to match and denoise the data. In the postprocessing step, MRI parameters were considered data dimensions and the NLDR-based hybrid approach was applied to integrate the MRI parameters into a single image, termed the embedded image. This was achieved by mapping all pixel intensities from the higher dimension to a lower dimensional (embedded) space. For validation, the authors compared the hybrid NLDR with linear methods of principal component analysis (PCA) and multidimensional scaling (MDS) using synthetic data. For the clinical application, the authors used breast MRI data; comparison was performed using the postcontrast DCE MRI image and evaluating the congruence of the segmented lesions. Results: The NLDR-based hybrid approach was able to define and
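
    A minimal sketch of the NLDR step named above: multiparametric voxel vectors are embedded into a low-dimensional space with ISOMAP, LLE, and a spectral embedding used here as a diffusion-map-like stand-in (scikit-learn does not ship diffusion maps). The input is a synthetic placeholder for stacked MRI parameters, not clinical data.

        import numpy as np
        from sklearn.manifold import Isomap, LocallyLinearEmbedding, SpectralEmbedding

        X = np.random.rand(1000, 4)    # 1000 voxels x 4 MRI-derived parameters (placeholder)

        embeddings = {
            "ISOMAP": Isomap(n_neighbors=10, n_components=1),
            "LLE": LocallyLinearEmbedding(n_neighbors=10, n_components=1),
            "spectral (diffusion-map-like)": SpectralEmbedding(n_components=1),
        }
        for name, model in embeddings.items():
            embedded = model.fit_transform(X)    # single channel of the "embedded image"
            print(name, embedded.shape)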

  16. Techniques for water demand analysis and forecasting: Puerto Rico, a case study

    USGS Publications Warehouse

    Attanasi, E.D.; Close, E.R.; Lopez, M.A.

    1975-01-01

    The rapid economic growth of the Commonwealth of Puerto Rico since 1947 has brought public pressure on government agencies for rapid development of public water supply and waste treatment facilities. Since 1945 the Puerto Rico Aqueduct and Sewer Authority has had the responsibility for planning, developing, and operating water supply and waste treatment facilities on a municipal basis. The purpose of this study was to develop operational techniques whereby a planning agency, such as the Puerto Rico Aqueduct and Sewer Authority, could project the temporal and spatial distribution of future water demands. This report is part of a 2-year cooperative study between the U.S. Geological Survey and the Environmental Quality Board of the Commonwealth of Puerto Rico for the development of systems analysis techniques for use in water resources planning. While the Commonwealth was assisted in the development of techniques to facilitate ongoing planning, the U.S. Geological Survey sought insights that would better interface its data collection efforts with the planning process. The report reviews the institutional structure associated with water resources planning for the Commonwealth. A brief description of alternative water demand forecasting procedures is presented, and specific techniques and analyses of Puerto Rico demand data are discussed. Water demand models for a specific area of Puerto Rico are then developed. These models provide a framework for making several sets of water demand forecasts based on alternative economic and demographic assumptions. In the second part of this report, the historical impact of water resources investment on regional economic development is analyzed and related to water demand forecasting. Conclusions and future data needs are given in the last section.

  17. Ivory species identification using electrophoresis-based techniques.

    PubMed

    Kitpipit, Thitika; Thanakiatkrai, Phuvadol; Penchart, Kitichaya; Ouithavon, Kanita; Satasook, Chutamas; Linacre, Adrian

    2016-12-01

    Despite continuous conservation efforts by national and international organizations, the populations of the three extant elephant species are still declining dramatically due to the illegal trade in ivory and the associated killing of elephants. A requirement to aid investigations and prosecutions is the accurate identification of the elephant species from which ivory was removed. We report the development of the first fully validated multiplex PCR-electrophoresis assay for ivory DNA analysis that can be used as a screening or confirmatory test. SNPs from the NADH dehydrogenase 5 and cytochrome b gene loci were identified and used in the development of the assay. The three extant elephant species could be identified based on three peaks/bands: Elephas maximus exhibited two distinct PCR fragments at approximately 129 and 381 bp; Loxodonta cyclotis showed two PCR fragments at 89 and 129 bp; and Loxodonta africana showed a single fragment of 129 bp. The assay correctly identified the elephant species for all 113 ivory and blood samples used in this report. We also report the high sensitivity and specificity of the assay. All single-blinded samples were correctly classified, demonstrating the assay's suitability for real casework. In addition, the assay can be used in conjunction with the technique of direct amplification. We propose that the test will benefit wildlife forensic laboratories and aid in the transition to the criminal justice system. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
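
    A simple sketch encoding the diagnostic band patterns reported above: the species call is made from the set of PCR fragment lengths (bp) observed on the electropherogram. This is an illustration of the decision rule only, not the validated assay workflow.

        # diagnostic fragment patterns from the abstract, in base pairs
        PATTERNS = {
            frozenset({129, 381}): "Elephas maximus",
            frozenset({89, 129}): "Loxodonta cyclotis",
            frozenset({129}): "Loxodonta africana",
        }

        def identify_species(fragments_bp):
            """Return the elephant species matching the observed fragment set."""
            return PATTERNS.get(frozenset(fragments_bp), "no call / retest")

        print(identify_species([129, 381]))   # Elephas maximus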

  18. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    NASA Astrophysics Data System (ADS)

    'Ainullotfi, A. A.; Ibrahim, A. L.; Masron, T.

    2014-02-01

    This study was conducted to establish a community-based flood management system integrated with remote sensing techniques. To understand local knowledge, the demographics of the local society were obtained using a survey approach. The local authorities were approached first for information about the society in the study areas, such as the population, gender, and distribution of settlements; information about age, religion, ethnicity, occupation, and years of experience facing floods in the area was recorded to better understand how local knowledge emerges. Geographic data such as rainfall, land use, land elevation, and river discharge were then obtained and used to establish a hydrological model of flooding in the study area. The survey data were analyzed to understand the patterns of society and how people react to floods, while the geographic data were analyzed to assess the water extent and the damage done by floods. The final goal of this research is to produce a flood mitigation method with a community-based framework for the state of Kelantan. With flood mitigation that draws on the community's understanding of floods, together with remote sensing techniques for forecasting heavy rainfall and flood occurrence, it is hoped that casualties and damage to society and infrastructure in the study area can be reduced.

  19. Technique Feature Analysis or Involvement Load Hypothesis: Estimating Their Predictive Power in Vocabulary Learning.

    PubMed

    Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan

    2018-02-05

    Vocabulary learning has always been a great concern and has attracted the attention of many researchers. Among vocabulary learning hypotheses, the involvement load hypothesis and technique feature analysis have been proposed, which attempt to bring concepts like noticing, motivation, and generation into focus. In the current study, 90 high-proficiency EFL students were assigned to three vocabulary tasks, sentence making, composition, and reading comprehension, in order to examine the power of the involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. The results revealed that the involvement load hypothesis was not a good predictor, whereas technique feature analysis was a good predictor of pretest-to-posttest score change, though not of during-task activity. The implications of these results are discussed in light of preparing vocabulary tasks.

  20. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    PubMed

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were not productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale, and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. Hospital productivity rates generally showed an increasing trend; however, the overall average productivity decreased. Among the components of total productivity, variation in technological efficiency had the greatest impact on the decrease in average total productivity.
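
    A hedged sketch of the DEA building block behind such analyses: the input-oriented CCR efficiency of one hospital (decision-making unit) from the standard envelopment linear program, solved with scipy. Inputs and outputs are toy values; the study itself used DEAP.2 and Malmquist indices over panel data, which additionally compare efficiencies across periods.

        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[5.0, 8.0, 4.0, 6.0],       # inputs (rows) x 4 hospitals, e.g. beds
                      [14.0, 15.0, 12.0, 20.0]])  # e.g. staff
        Y = np.array([[9.0, 5.0, 4.0, 16.0]])     # outputs, e.g. admissions

        def ccr_efficiency(k):
            m, n = X.shape[0], X.shape[1]
            # decision variables: [theta, lambda_1..lambda_n]; minimize theta
            c = np.r_[1.0, np.zeros(n)]
            # sum_j lambda_j * x_ij - theta * x_ik <= 0 for each input i
            A_in = np.c_[-X[:, [k]], X]
            # -sum_j lambda_j * y_rj <= -y_rk for each output r
            A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[:, k]],
                          bounds=[(0, None)] * (n + 1), method="highs")
            return res.x[0]

        for k in range(X.shape[1]):
            print(f"hospital {k + 1}: efficiency = {ccr_efficiency(k):.3f}")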

  1. Silica nanoparticle based techniques for extraction, detection, and degradation of pesticides.

    PubMed

    Bapat, Gandhali; Labade, Chaitali; Chaudhari, Amol; Zinjarde, Smita

    2016-11-01

    Silica nanoparticles (SiNPs) find applications in drug delivery, catalysis, immobilization, and sensing. Their synthesis is facile, and they display broad compatibility and stability. Their existence in the form of spheres, wires, and sheets renders them suitable for varied purposes. This review summarizes the use of silica nanostructures in developing techniques for the extraction, detection, and degradation of pesticides. Owing to their sorbent properties, porous nature, and large surface area, silica nanostructures allow effective extraction of pesticides. They can be modified (with ionic liquids, silanes, or amines), coated with molecularly imprinted polymers, or magnetized to improve pesticide extraction, and they can be altered to increase their sensitivity and stability. In addition to the analysis of pesticides by sophisticated techniques such as high-performance liquid chromatography or gas chromatography, simple SiNP-based detection methods are also proving effective. Electrochemical and optical detection schemes based on enzymes (acetylcholinesterase and organophosphate hydrolase) or antibodies have been developed, and pesticide sensors dependent on fluorescence, chemiluminescence, or surface-enhanced Raman spectroscopic responses are also SiNP-based. Moreover, degradative enzymes (organophosphate hydrolases, carboxyesterases, and laccases) and bacterial cells that produce recombinant enzymes have been immobilized on SiNPs to mediate pesticide degradation; after immobilization, these systems show increased stability and improved degradation. Because of their chemical inertness and amenability to surface modification, SiNPs are significant tools for developing systems for effective extraction, detection, and degradation of pesticides and for fabricating devices for 'on-site' applications. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Independent Component Analysis applied to Ground-based observations

    NASA Astrophysics Data System (ADS)

    Martins-Filho, Walter; Griffith, Caitlin; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert Thomas

    2018-01-01

    Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independently of the planet's temperature profile. However, measurements of hot-Jupiter transits must achieve a level of accuracy in the flux sufficient to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to separate systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind source separation studies, which can be used to de-trend several independent source signals in a data set (Hyvarinen and Oja, 2000). The technique requires no additional prior knowledge of the data set and has the further advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61" Kuiper Telescope and compare the results to those of a previous analysis by Zellem et al. (2015), which did not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise, and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve from exoplanetary photometry measurements.
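
    A minimal sketch under stated assumptions: FastICA unmixing a transit-like signal from a shared smooth systematic, using synthetic light curves rather than the Kuiper Telescope XO-2b photometry.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(2)
        t = np.linspace(0, 1, 500)
        transit = np.where((t > 0.4) & (t < 0.6), -1.0, 0.0)     # box-shaped transit
        airmass = 0.8 * (t - 0.5) ** 2                           # smooth systematic trend
        sources = np.c_[transit, airmass]

        # two observed light curves = different mixtures of the same sources + noise
        mixing = np.array([[1.0, 0.7], [0.4, 1.0]])
        observed = sources @ mixing.T + 0.01 * rng.standard_normal((t.size, 2))

        ica = FastICA(n_components=2, random_state=0)
        recovered = ica.fit_transform(observed)    # columns ~ transit and systematic
        print(recovered.shape)                     # (500, 2)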

  3. Analysis of hairy root culture of Rauvolfia serpentina using direct analysis in real time mass spectrometric technique.

    PubMed

    Madhusudanan, K P; Banerjee, Suchitra; Khanuja, Suman P S; Chattopadhyay, Sunil K

    2008-06-01

    The applicability of a new mass spectrometric technique, DART (direct analysis in real time), has been studied in the analysis of hairy root cultures of Rauvolfia serpentina. The intact hairy roots were analyzed by holding them in the gap between the DART source and the mass spectrometer for measurements. Two nitrogen-containing compounds, vomilenine and reserpine, were characterized from the analysis of the hairy roots almost instantaneously. The structures of the identified compounds were confirmed through accurate molecular formula determinations. This is the first report of the application of the DART technique for the characterization of compounds expressed in hairy root cultures of Rauvolfia serpentina, and also the first report of the expression of reserpine in the hairy root culture of Rauvolfia serpentina. Copyright (c) 2008 John Wiley & Sons, Ltd.

  4. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  5. Enhanced Analysis Techniques for an Imaging Neutron and Gamma Ray Spectrometer

    NASA Astrophysics Data System (ADS)

    Madden, Amanda C.

    The presence of gamma rays and neutrons is a strong indicator of the presence of Special Nuclear Material (SNM). The imaging Neutron and gamma ray SPECTrometer (NSPECT), developed by the University of New Hampshire and Michigan Aerospace Corporation, detects the fast neutrons and prompt gamma rays from fissile material and the gamma rays from radioactive material. The instrument operates as a double-scatter device, requiring a neutron or a gamma ray to interact twice in the instrument. While this detection requirement decreases the efficiency of the instrument, it offers superior background rejection and the ability to measure the energy and momentum of the incident particle. These measurements yield energy spectra and images of the emitting source for source identification and localization. The dual-species instrument provides better detection than either species alone. In realistic detection scenarios, few particles are detected from a potential threat due to source shielding, detection at a distance, high background, and weak sources. This results in a small signal-to-noise ratio, and threat detection becomes difficult. To address these difficulties, several enhanced data analysis tools were developed. A Receiver Operating Characteristic (ROC) curve helps set instrument alarm thresholds and identify the presence of a source; analysis of a dual-species ROC curve provides superior detection capabilities. Bayesian analysis helps detect and identify the presence of a source through model comparisons, and helps create a background-corrected count spectrum for enhanced spectroscopy. Development of an instrument response using simulations and numerical analyses will help perform spectral and image deconvolution. This thesis outlines the principles of operation of the NSPECT instrument using the double-scatter technology, traditional analysis techniques, and enhanced analysis techniques as applied to data from the NSPECT instrument, and an
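
    A sketch of setting an alarm threshold from a ROC curve, as described above; count-rate scores for background-only and source-present intervals are simulated, not instrument data.

        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(3)
        background = rng.normal(10.0, 2.0, 1000)    # counts with no source present
        with_source = rng.normal(13.0, 2.0, 1000)   # counts with a weak source

        scores = np.r_[background, with_source]
        labels = np.r_[np.zeros(1000), np.ones(1000)]
        fpr, tpr, thresholds = roc_curve(labels, scores)

        # e.g. choose the threshold giving the best detection at <5% false-alarm rate
        mask = fpr < 0.05
        idx = np.argmax(tpr[mask])
        print(f"threshold={thresholds[mask][idx]:.2f}, detection prob={tpr[mask][idx]:.2f}")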

  6. Categorizing natural disaster damage assessment using satellite-based geospatial techniques

    NASA Astrophysics Data System (ADS)

    Myint, S. W.; Yuan, M.; Cerveny, R. S.; Giri, C.

    2008-07-01

    Remote sensing of a natural disaster's damage offers an exciting backup and/or alternative to traditional means of on-site damage assessment. Although necessary for complete assessment of damage areas, ground-based damage surveys conducted in the aftermath of natural hazard passage can sometimes be complicated by on-site difficulties (e.g., interaction with various authorities and emergency services) and hazards (e.g., downed power lines, gas lines, etc.), the need for rapid mobilization (particularly for remote locations), and the increasing cost of rapid physical transportation of manpower and equipment. Satellite image analysis, because of its global ubiquity, its ability for repeated independent analysis, and, as we demonstrate here, its ability to verify on-site damage assessment, provides an interesting new perspective and investigative aid to researchers. Using one of the strongest tornado events in US history, the 3 May 1999 Oklahoma City Tornado, as a case example, we digitized the tornado damage path and co-registered the damage path using pre- and post-Landsat Thematic Mapper image data to perform a damage assessment. We employed several geospatial approaches, specifically the Getis index, Geary's C, and two lacunarity approaches, to categorize damage characteristics according to the original Fujita tornado damage scale (F-scale). Our results indicate strong relationships between spatial indices computed within a local window and tornado F-scale damage categories identified through the ground survey. Consequently, linear regression models, even incorporating just a single band, appear effective in identifying F-scale damage categories using satellite imagery. This study demonstrates that satellite-based geospatial techniques can effectively add spatial perspectives to natural disaster damage, in particular, for this case study, tornado damage.
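
    A hedged sketch of Geary's C, one of the spatial indices named above, computed for a gridded damage-severity raster with rook adjacency and unit weights. The raster is random here; values well below 1 indicate positive spatial autocorrelation, values near 1 indicate spatial randomness.

        import numpy as np

        x = np.random.rand(50, 50)                     # placeholder damage raster
        mean, n = x.mean(), x.size

        num = 0.0    # sum of w_ij * (x_i - x_j)^2 over rook-adjacent pairs (w_ij = 1)
        w_sum = 0.0  # total weight W
        for a, b in ((x[:, :-1], x[:, 1:]), (x[:-1, :], x[1:, :])):
            num += ((a - b) ** 2).sum() * 2            # each pair counted both ways
            w_sum += 2 * a.size

        # Geary's C = (N-1) * sum_ij w_ij (x_i - x_j)^2 / (2 W sum_i (x_i - mean)^2)
        geary_c = (n - 1) * num / (2 * w_sum * ((x - mean) ** 2).sum())
        print(f"Geary's C = {geary_c:.3f}")            # ~1 for random data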

  7. Categorizing natural disaster damage assessment using satellite-based geospatial techniques

    USGS Publications Warehouse

    Myint, S.W.; Yuan, M.; Cerveny, R.S.; Giri, C.

    2008-01-01

    Remote sensing of a natural disaster's damage offers an exciting backup and/or alternative to traditional means of on-site damage assessment. Although necessary for complete assessment of damage areas, ground-based damage surveys conducted in the aftermath of natural hazard passage can sometimes be complicated by on-site difficulties (e.g., interaction with various authorities and emergency services) and hazards (e.g., downed power lines, gas lines, etc.), the need for rapid mobilization (particularly for remote locations), and the increasing cost of rapid physical transportation of manpower and equipment. Satellite image analysis, because of its global ubiquity, its ability for repeated independent analysis, and, as we demonstrate here, its ability to verify on-site damage assessment, provides an interesting new perspective and investigative aid to researchers. Using one of the strongest tornado events in US history, the 3 May 1999 Oklahoma City Tornado, as a case example, we digitized the tornado damage path and co-registered the damage path using pre- and post-Landsat Thematic Mapper image data to perform a damage assessment. We employed several geospatial approaches, specifically the Getis index, Geary's C, and two lacunarity approaches, to categorize damage characteristics according to the original Fujita tornado damage scale (F-scale). Our results indicate strong relationships between spatial indices computed within a local window and tornado F-scale damage categories identified through the ground survey. Consequently, linear regression models, even incorporating just a single band, appear effective in identifying F-scale damage categories using satellite imagery. This study demonstrates that satellite-based geospatial techniques can effectively add spatial perspectives to natural disaster damage, in particular, for this case study, tornado damage.

  8. High-throughput, 384-well, LC-MS/MS CYP inhibition assay using automation, cassette-analysis technique, and streamlined data analysis.

    PubMed

    Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus

    2011-08-01

    Here we describe a high-capacity, high-throughput, automated, 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC50 values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC50 values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under optimized incubation conditions. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we utilize the timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching (reducing the amount of MS runtime). We also compare IC50 results for the five major CYP isoforms obtained with our method to values reported in the literature.
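
    An illustrative sketch (not the published pipeline): estimating an IC50 from four-concentration inhibition data by fitting a four-parameter logistic curve with scipy. Concentrations and activities are fabricated.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, bottom, top, ic50, hill):
            """Four-parameter logistic inhibition curve."""
            return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

        conc = np.array([0.1, 1.0, 10.0, 50.0])           # inhibitor concentration, uM
        activity = np.array([98.0, 85.0, 42.0, 12.0])     # % CYP activity remaining

        params, _ = curve_fit(four_pl, conc, activity, p0=[0.0, 100.0, 5.0, 1.0])
        print(f"IC50 ~ {params[2]:.2f} uM")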

  9. Techniques in helical scanning, dynamic imaging and image segmentation for improved quantitative analysis with X-ray micro-CT

    NASA Astrophysics Data System (ADS)

    Sheppard, Adrian; Latham, Shane; Middleton, Jill; Kingston, Andrew; Myers, Glenn; Varslot, Trond; Fogden, Andrew; Sawkins, Tim; Cruikshank, Ron; Saadatfar, Mohammad; Francois, Nicolas; Arns, Christoph; Senden, Tim

    2014-04-01

    This paper reports on recent advances at the micro-computed tomography facility at the Australian National University. Since 2000 this facility has been a significant centre for developments in imaging hardware and associated software for image reconstruction, image analysis and image-based modelling. In 2010 a new instrument was constructed that utilises theoretically-exact image reconstruction based on helical scanning trajectories, allowing higher cone angles and thus better utilisation of the available X-ray flux. We discuss the technical hurdles that needed to be overcome to allow imaging with cone angles in excess of 60°. We also present dynamic tomography algorithms that enable the changes between one moment and the next to be reconstructed from a sparse set of projections, allowing higher speed imaging of time-varying samples. Researchers at the facility have also created a sizeable distributed-memory image analysis toolkit with capabilities ranging from tomographic image reconstruction to 3D shape characterisation. We show results from image registration and present some of the new imaging and experimental techniques that it enables. Finally, we discuss the crucial question of image segmentation and evaluate some recently proposed techniques for automated segmentation.

  10. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

    Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel relatively inexperienced in remote sensing. A number of products were created from 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The techniques yielded products of varying adequacy depending on the specific application and on ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
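
    A minimal sketch of univariate image differencing, the simplest technique compared above: co-registered band values from the two dates are subtracted and pixels beyond a standard-deviation threshold are flagged as change. The arrays are random placeholders for the TM bands.

        import numpy as np

        band_1987 = np.random.rand(400, 400)                            # placeholder TM band, date 1
        band_1996 = band_1987 + np.random.normal(0, 0.05, (400, 400))   # date 2

        diff = band_1996 - band_1987
        threshold = 2.0 * diff.std()                      # common +/- 2 sigma rule
        change_mask = np.abs(diff - diff.mean()) > threshold
        print(f"flagged change pixels: {change_mask.mean():.1%}")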

  11. Application of Behavior Change Techniques in a Personalized Nutrition Electronic Health Intervention Study: Protocol for the Web-Based Food4Me Randomized Controlled Trial.

    PubMed

    Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout Rh; Stewart-Knox, Barbara J; Mathers, John C; Lovegrove, Julie A

    2018-04-09

    To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype-based, and intake+phenotype+gene-based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. Smoking cessation texts were adapted

  12. Inquiry-Based Approach to a Carbohydrate Analysis Experiment

    NASA Astrophysics Data System (ADS)

    Senkbeil, Edward G.

    1999-01-01

    The analysis of an unknown carbohydrate in an inquiry-based learning format has proven to be a valuable and interesting undergraduate biochemistry laboratory experiment. Students are given a list of carbohydrates and a list of references for carbohydrate analysis. The references contain a variety of well-characterized wet chemistry and instrumental techniques for carbohydrate identification, but the students must develop an appropriate sequential protocol for unknown identification. The students are required to provide a list of chemicals and procedures and a flow chart for identification before the lab. During the 3-hour laboratory period, they utilize their accumulated information and knowledge to classify and identify their unknown. Advantages of the inquiry-based format are (i) students must be well prepared in advance to be successful in the laboratory, (ii) students feel a sense of accomplishment in both designing and carrying out a successful experiment, and (iii) the carbohydrate background information digested by the students significantly decreases the amount of lecture time required for this topic.

  13. Estimating Driving Performance Based on EEG Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Wu, Ruei-Cheng; Jung, Tzyy-Ping; Liang, Sheng-Fu; Huang, Teng-Yi

    2005-12-01

    The growing number of traffic accidents in recent years has become a serious societal concern. Accidents caused by the driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the driver's level of alertness and delivering effective feedback to maintain maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
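
    A hedged sketch of the estimation chain described above: log subband power features from EEG epochs feed a linear regression predicting lane deviation. The signals, band choices, and deviations are synthetic stand-ins for the driving-simulator data.

        import numpy as np
        from scipy.signal import welch
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)
        fs, n_epochs = 250.0, 200
        eeg = rng.standard_normal((n_epochs, int(30 * fs)))   # 30 s epochs (placeholder)

        f, psd = welch(eeg, fs=fs, nperseg=512, axis=1)
        theta = np.log(psd[:, (f >= 4) & (f < 8)].mean(axis=1))    # log theta-band power
        alpha = np.log(psd[:, (f >= 8) & (f < 13)].mean(axis=1))   # log alpha-band power
        features = np.c_[theta, alpha]

        # toy "driving performance": lane deviation loosely tied to alpha power
        deviation = 0.5 * alpha + 0.1 * rng.standard_normal(n_epochs)

        model = LinearRegression().fit(features, deviation)
        print("R^2 on training data:", round(model.score(features, deviation), 3))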

  14. Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.

    PubMed

    Latha, Indu; Reichenbach, Stephen E; Tao, Qingping

    2011-09-23

    Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample, so near real-time analysis (a few minutes) is possible. The technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation, preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  16. A new metaphor for projection-based visual analysis and data exploration

    NASA Astrophysics Data System (ADS)

    Schreck, Tobias; Panse, Christian

    2007-01-01

    In many important application domains such as Business and Finance, Process Monitoring, and Security, huge and quickly increasing volumes of complex data are collected. Strong efforts are underway to develop automatic and interactive analysis tools for mining useful information from these data repositories. Many data analysis algorithms require an appropriate definition of similarity (or distance) between data instances to allow meaningful clustering, classification, and retrieval, among other analysis tasks. Projection-based data visualization is highly interesting (a) for visual discrimination analysis of a data set within a given similarity definition, and (b) for comparative analysis of the similarity characteristics of a given data set represented by different similarity definitions. We introduce an intuitive and effective novel approach to projection-based similarity visualization for interactive discrimination analysis, data exploration, and visual evaluation of metric space effectiveness. The approach is based on the convex hull metaphor for visually aggregating sets of points in projected space, and it can be used with a variety of different projection techniques. The effectiveness of the approach is demonstrated by application to two well-known data sets. Statistical evidence supporting the validity of the hull metaphor is presented. We advocate the hull-based approach over the standard symbol-based approach to projection visualization, as it allows a more effective perception of similarity relationships and class distribution characteristics.
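
    A sketch of the hull metaphor under simple assumptions: labeled data are projected to 2-D (PCA here, though the paper's approach works with a variety of projection techniques) and each class is drawn as the convex hull of its projected points rather than as individual symbols.

        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.spatial import ConvexHull
        from sklearn.decomposition import PCA
        from sklearn.datasets import load_iris

        X, y = load_iris(return_X_y=True)
        proj = PCA(n_components=2).fit_transform(X)   # any projection technique works here

        for label in np.unique(y):
            pts = proj[y == label]
            hull = ConvexHull(pts)
            poly = pts[np.append(hull.vertices, hull.vertices[0])]  # close the polygon
            plt.fill(poly[:, 0], poly[:, 1], alpha=0.3, label=f"class {label}")
        plt.legend()
        plt.savefig("hulls.png")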

  17. Characterizing a New Surface-Based Shortwave Cloud Retrieval Technique, Based on Transmitted Radiance for Soil and Vegetated Surface Types

    NASA Technical Reports Server (NTRS)

    Coddington, Odele; Pilewskie, Peter; Schmidt, K. Sebastian; McBride, Patrick J.; Vukicevic, Tomislava

    2013-01-01

    This paper presents an approach that uses the GEneralized Nonlinear Retrieval Analysis (GENRA) tool and general inverse-theory diagnostics, including the maximum likelihood solution and the Shannon information content, to investigate the performance of a new spectral technique for retrieving cloud optical properties from surface-based transmittance measurements. The cumulative retrieval information over broad ranges of cloud optical thickness (tau), droplet effective radius (r(sub e)), and sun angle is quantified under two conditions known to affect transmitted radiation: variability in land surface albedo and in atmospheric water vapor content. Our conclusions are: (1) the retrieved cloud properties are more sensitive to the natural variability in land surface albedo than to water vapor content; (2) the new spectral technique is more accurate (though still imprecise) than a standard approach, in particular for tau between 5 and 60 and r(sub e) less than approximately 20 µm; and (3) the retrieved cloud properties depend on sun angle for clouds of tau from 5 to 10 and r(sub e) less than 10 µm, with maximum sensitivity obtained for an overhead sun.
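
    The Shannon information content used as a diagnostic here has a standard inverse-theory form, H = (1/2) log2(det(S_a) / det(S_hat)), where S_a is the prior covariance of the state (tau, r(sub e)) and S_hat its posterior covariance. The sketch below computes it with NumPy; the covariance matrices are illustrative, not values from the study.

        # Bits of information a measurement adds about the retrieved state.
        import numpy as np

        def shannon_information_content(S_prior, S_posterior):
            _, logdet_prior = np.linalg.slogdet(S_prior)
            _, logdet_post = np.linalg.slogdet(S_posterior)
            return 0.5 * (logdet_prior - logdet_post) / np.log(2.0)

        # Example: a retrieval that halves the uncertainty (std. dev.) in both
        # tau and r_e gains one bit per state element, i.e. 2.0 bits in total.
        S_a = np.diag([4.0, 4.0])      # prior variances
        S_hat = np.diag([1.0, 1.0])    # posterior variances
        print(shannon_information_content(S_a, S_hat))  # -> 2.0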

  18. Insights into the prominent effect of mahanimbine on Acanthamoeba castellanii: Cell profiling analysis based on microscopy techniques

    NASA Astrophysics Data System (ADS)

    Hashim, Fatimah; Amin, Nakisah Mat

    2017-02-01

    Mahanimbine (MH) has been shown to have anti-amoebic properties. The aim of this study was therefore to assess the growth-inhibitory mechanisms of MH on Acanthamoeba castellanii, a causative agent of Acanthamoeba keratitis. The IC50 value obtained for MH against A. castellanii was 1.18 µg/ml. Light and scanning electron microscopy showed that most treated cells had a cystic appearance, while transmission electron microscopy revealed changes at the ultrastructural level, and fluorescence microscopy indicated the induction of apoptosis and autophagic activity in the amoeba cytoplasm. In conclusion, MH has very potent anti-amoebic activity against A. castellanii, as shown by cytotoxicity analyses based on microscopy techniques.

  19. Performance of dental impression materials: Benchmarking of materials and techniques by three-dimensional analysis.

    PubMed

    Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G

    2015-01-01

    Among other factors, the precision of dental impressions is an important determinant of the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of 24 material-method combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and the 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, mean values ranged from +10.9/-10.0 µm (SD 2.8/2.3) to +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for each impression technique. Three-dimensional analysis provided a comprehensive picture of the achievable precision; processing aspects and impression technique had a significant influence.
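
    The one-way analysis of variance used here is straightforward to reproduce in outline. The sketch below assumes SciPy, with three hypothetical material-method combinations and made-up deviation values standing in for the study's 24 groups.

        # One-way ANOVA across impression material-method combinations.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical 3D deviation values (micrometres), 10 dies per group.
        group_a = rng.normal(10.9, 2.8, size=10)
        group_b = rng.normal(13.0, 5.0, size=10)
        group_c = rng.normal(16.5, 11.8, size=10)

        f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")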

  20. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.
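
    A minimal sketch of such a PCA comparison reads as follows, assuming NumPy and scikit-learn; the composition matrix is random placeholder data standing in for the measured amino acid distributions.

        # Project amino acid composition vectors onto their first two
        # principal components for comparison across sample sources.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        # Rows: samples; columns: relative abundances of 20 amino acids.
        compositions = rng.dirichlet(np.ones(20), size=104)

        pca = PCA(n_components=2)
        scores = pca.fit_transform(compositions)
        print("explained variance:", pca.explained_variance_ratio_)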