Science.gov

Sample records for analysis technique based

  1. New Flutter Analysis Technique for CFD-based Unsteady Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Jutte, Christine V.

    2009-01-01

    This paper presents a flutter analysis technique for the transonic flight regime. The technique uses an iterative approach to determine the critical dynamic pressure for a given Mach number. Unlike other CFD-based flutter analysis methods, each iteration solves for the critical dynamic pressure and uses this value in subsequent iterations until the value converges. This process reduces the iterations required to determine the critical dynamic pressure. To improve the accuracy of the analysis, the technique employs a known structural model, leaving only the aerodynamic model as the unknown. The aerodynamic model is estimated using unsteady aeroelastic CFD analysis combined with a parameter estimation routine. The technique executes as follows. The known structural model is represented as a finite element model. Modal analysis determines the frequencies and mode shapes for the structural model. At a given Mach number and dynamic pressure, the unsteady CFD analysis is performed. The output time history of the surface pressure is converted to a nodal aerodynamic force vector. The forces are then normalized by the given dynamic pressure. ERA, a multi-input multi-output parameter estimation software package, estimates the aerodynamic model from the time histories of nodal aerodynamic forces and structural deformations. The critical dynamic pressure is then calculated using the known structural model and the estimated aerodynamic model. This output is used as the dynamic pressure in subsequent iterations until the critical dynamic pressure is determined. This technique is demonstrated on the Aerostructures Test Wing-2 model at NASA's Dryden Flight Research Center.
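
    The iteration described above amounts to a fixed-point search on the dynamic pressure. Below is a minimal, hedged sketch of that loop; the callable standing in for the CFD run, the ERA estimation, and the flutter solution is a toy contraction, not NASA's actual tool chain.

    ```python
    def find_critical_q(flutter_q_at, q0, tol=1e-3, max_iter=20):
        """Fixed-point iteration on dynamic pressure.

        flutter_q_at(q) stands in for one full cycle of the method above:
        run unsteady CFD at dynamic pressure q, normalize forces by q,
        estimate the aerodynamic model with ERA, and solve for a new
        critical dynamic pressure.
        """
        q = q0
        for _ in range(max_iter):
            q_new = flutter_q_at(q)
            if abs(q_new - q) <= tol * q:   # converged critical dynamic pressure
                return q_new
            q = q_new                       # feed the estimate into the next pass
        return q

    # Toy stand-in: a contraction whose fixed point (the "critical q") is 40.0.
    print(find_critical_q(lambda q: 0.5 * q + 20.0, q0=10.0))  # ~40.0
    ```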

  2. Comparative analysis of affinity-based 5-hydroxymethylation enrichment techniques

    PubMed Central

    Thomson, John P.; Hunter, Jennifer M.; Nestor, Colm E.; Dunican, Donncha S.; Terranova, Rémi; Moggs, Jonathan G.; Meehan, Richard R.

    2013-01-01

    The epigenetic modification of 5-hydroxymethylcytosine (5hmC) is receiving great attention due to its potential role in DNA methylation reprogramming and as a cell state identifier. Given this interest, it is important to identify reliable and cost-effective methods for the enrichment of 5hmC-marked DNA for downstream analysis. We tested three commonly used affinity-based enrichment techniques: (i) antibody, (ii) chemical capture and (iii) protein affinity enrichment, and assessed their ability to accurately and reproducibly report 5hmC profiles in mouse tissues containing high (brain) and lower (liver) levels of 5hmC. The protein-affinity technique is a poor reporter of 5hmC profiles, delivering 5hmC patterns that are incompatible with the other methods. Both antibody and chemical capture-based techniques generate highly similar genome-wide patterns for 5hmC, which are independently validated by standard quantitative PCR (qPCR) and glucosyl-sensitive restriction enzyme digestion (gRES-qPCR). Profiles generated by both antibody and chemical capture reproducibly link to unique chromatin modification profiles associated with 5hmC. However, there appears to be a slight bias of the antibody to bind to regions of DNA rich in simple repeats. Ultimately, the increased specificity observed with chemical capture-based approaches makes this an attractive method for the analysis of locus-specific or genome-wide patterns of 5hmC. PMID:24214958

  3. Computer Based Economic Analysis Techniques to Support Functional Economic Analysis

    DTIC Science & Technology

    1993-09-01

    is one of the most frequently used tools to uncover and explore profit potential. B. CALCULATION OF BREAK EVEN ANALYSIS Haga and Lang (1992) state... BENEFITS For benefits that are quantifiable, Haga and Lang (1992) express BCR in the following notation: BCR = QOM/UAC (Equation 9-1), where QOM is a... emulation. In addition to the software requirements, FEAM has the following hardware criteria: a mouse, 2MB of RAM, 20MB of hard disk space, and an EGA
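
    The benefit-cost ratio in the snippet reduces to a one-line calculation. The sketch below is illustrative only; the expansions of QOM (quantifiable output measure) and UAC (uniform annual cost) and the figures used are assumptions, not values from the FEAM documentation.

    ```python
    # Benefit-cost ratio per the snippet above: BCR = QOM / UAC.
    # The acronym expansions and the numbers below are assumptions.
    def benefit_cost_ratio(qom: float, uac: float) -> float:
        return qom / uac

    print(benefit_cost_ratio(qom=1_250_000.0, uac=800_000.0))  # 1.5625 > 1: benefits exceed costs
    ```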

  4. Image analysis techniques associated with automatic data base generation.

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters, which are much faster than FFT implementations, a 'sequential similarity detection' technique for implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented, including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.
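
    A minimal sketch of the 'sequential similarity detection' idea mentioned above: absolute differences are accumulated in a random pixel order, and a candidate location is abandoned as soon as the running error exceeds a threshold, which is what makes the matched filtering fast. Array sizes and the threshold are illustrative.

    ```python
    import numpy as np

    def ssda_match(image, template, threshold):
        """Sequential similarity detection: early-abandoning template match."""
        th, tw = template.shape
        ih, iw = image.shape
        order = np.random.permutation(th * tw)      # test pixels in random order
        flat_t = template.ravel()
        best, best_pos = np.inf, None
        for r in range(ih - th + 1):
            for c in range(iw - tw + 1):
                window = image[r:r + th, c:c + tw].ravel()
                err = 0.0
                for k in order:
                    err += abs(float(window[k]) - float(flat_t[k]))
                    if err > threshold:             # abandon this location early
                        break
                if err < best:
                    best, best_pos = err, (r, c)
        return best_pos, best

    img = np.random.rand(32, 32)
    tpl = img[10:18, 5:13].copy()                   # embed a known template
    print(ssda_match(img, tpl, threshold=1.0))      # expect position (10, 5)
    ```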

  5. Improved mesh based photon sampling techniques for neutron activation analysis

    SciTech Connect

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-07-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
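
    For readers unfamiliar with it, the alias method referenced above can be sketched in a few lines (Vose's construction): an O(n) preprocessing pass builds a probability/alias table, after which each voxel (or photon energy bin) draw costs O(1). This is a generic illustration, not the authors' MCNP subroutine.

    ```python
    import random

    def build_alias_table(probs):
        """O(n) construction of Vose's alias table for a discrete distribution."""
        n = len(probs)
        scaled = [p * n for p in probs]
        alias, prob = [0] * n, [0.0] * n
        small = [i for i, p in enumerate(scaled) if p < 1.0]
        large = [i for i, p in enumerate(scaled) if p >= 1.0]
        while small and large:
            s, l = small.pop(), large.pop()
            prob[s], alias[s] = scaled[s], l        # column s: keep s or fall back to l
            scaled[l] -= 1.0 - scaled[s]            # donate mass from the large entry
            (small if scaled[l] < 1.0 else large).append(l)
        for i in small + large:                     # leftovers are exactly full columns
            prob[i] = 1.0
        return prob, alias

    def sample(prob, alias):
        i = random.randrange(len(prob))             # pick a column uniformly
        return i if random.random() < prob[i] else alias[i]   # O(1) per draw

    prob, alias = build_alias_table([0.1, 0.2, 0.3, 0.4])
    counts = [0] * 4
    for _ in range(100_000):
        counts[sample(prob, alias)] += 1
    print([c / 100_000 for c in counts])            # ~[0.1, 0.2, 0.3, 0.4]
    ```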

  6. GC-Based Techniques for Breath Analysis: Current Status, Challenges, and Prospects.

    PubMed

    Xu, Mingjun; Tang, Zhentao; Duan, Yixiang; Liu, Yong

    2016-07-03

    Breath analysis is a noninvasive diagnostic method that profiles a person's physical state via the volatile organic compounds in the breath. It has huge potential in the field of disease diagnosis. In order to offer opportunities for practical applications, various GC-based techniques have been investigated for on-line breath analysis, since GC is the preferred technique for mixed gas separation. This article reviews the development of breath analysis and of GC-based techniques in basic breath research, covering sampling methods, preconcentration methods, conventional GC-based techniques, and newly developed GC techniques for breath analysis. The combination of GC with newly developed detection techniques takes advantage of the virtues of each. In addition, portable GC and micro GC are poised to become field GC-based techniques in breath analysis. Challenges faced by GC-based techniques for breath analysis are discussed candidly. Effective cooperation of experts from different fields is urgently needed to promote the development of breath analysis.

  7. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture analysis based method for diagnosing Basal Cell Carcinoma (BCC) skin cancer using optical images taken from suspicious skin regions. We first extracted Run Length Matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a Multi-Layer Perceptron (MLP) classifier to classify the images as BCC or normal cases. Experiments showed that detecting BCC cancer based on optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.
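
    A hedged sketch of this kind of texture pipeline, using Haralick-style GLCM statistics and an MLP on synthetic patches; the paper's Run Length Matrix features and its feature selection step are omitted here, and all data are toy stand-ins, not clinical images.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
    from sklearn.neural_network import MLPClassifier

    def haralick_features(patch):
        """GLCM texture statistics for one uint8 grayscale patch."""
        glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        props = ["contrast", "homogeneity", "energy", "correlation"]
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    rng = np.random.default_rng(0)
    # Synthetic stand-ins: "lesion" patches are noisier than "normal" ones.
    normal = [rng.integers(100, 130, (32, 32), dtype=np.uint8) for _ in range(40)]
    lesion = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(40)]
    X = np.array([haralick_features(p) for p in normal + lesion])
    y = np.array([0] * 40 + [1] * 40)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X, y)
    print(clf.score(X, y))   # training accuracy on the toy data
    ```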

  8. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    NASA Astrophysics Data System (ADS)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One prominent approach to drift measurement relies on instrumentation, e.g. an ionosonde. Drift estimation with an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even Europe, is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to ionosondes, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real electron density profile is obtained on which ray tracing can be performed. These profiles can be constructed periodically, with a period as low as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of interest. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
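
    The core of the estimate is the rate of change of the ray-traced propagation path between two consecutive snapshots. A minimal sketch under assumed values (path lengths, a 30 s cadence, a 5 MHz carrier) follows; the factor of two assumes a simple vertical up-and-back path.

    ```python
    # Illustrative numbers only: path lengths, cadence, and carrier frequency
    # are assumptions, not values from the study.
    C = 299_792_458.0          # speed of light, m/s

    def apparent_drift(path_len_1, path_len_2, dt, carrier_freq):
        dp_dt = (path_len_2 - path_len_1) / dt       # path-length rate, m/s
        doppler = -carrier_freq * dp_dt / C          # resulting Doppler shift, Hz
        drift = dp_dt / 2.0                          # up-and-back path -> halve
        return doppler, drift

    print(apparent_drift(412_000.0, 411_940.0, dt=30.0, carrier_freq=5e6))
    ```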

  9. Fluorometric discrimination technique of phytoplankton population based on wavelet analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Shanshan; Su, Rongguo; Duan, Yali; Zhang, Cui; Song, Zhijie; Wang, Xiulin

    2012-09-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600 to 750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis, and a fluorometric discrimination technique for determining phytoplankton populations was developed. For laboratory-mixed samples prepared from the 43 algal species (in which the algae of one division accounted for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the division level were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for samples mixed from 32 red-tide algal species (in which the dominant species accounted for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the genus level were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory-mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the division and genus levels, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algal fluorescence auto-analyzer for phytoplankton
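
    The discrimination idea can be sketched as: compress each emission spectrum to coarse wavelet coefficients, keep those vectors as a reference database, and assign an unknown spectrum to the most correlated reference. The wavelet, decomposition level, and Gaussian reference shapes below are illustrative, not the paper's 43-species database.

    ```python
    import numpy as np
    import pywt   # PyWavelets

    def wavelet_signature(spectrum, wavelet="db4", level=3):
        """Coarse approximation coefficients as a compact spectral signature."""
        return pywt.wavedec(spectrum, wavelet, level=level)[0]

    rng = np.random.default_rng(0)
    x = np.linspace(600, 750, 128)             # emission wavelengths, nm
    ref_shapes = {"diatom": np.exp(-((x - 680) / 12.0) ** 2),
                  "cyanobacteria": np.exp(-((x - 655) / 10.0) ** 2)}
    library = {k: wavelet_signature(v) for k, v in ref_shapes.items()}

    unknown = ref_shapes["diatom"] * 0.8 + 0.02 * rng.normal(size=x.size)
    sig = wavelet_signature(unknown)
    best = max(library, key=lambda k: np.corrcoef(sig, library[k])[0, 1])
    print(best)    # -> "diatom"
    ```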

  10. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    NASA Astrophysics Data System (ADS)

    Festa, G.; Pietropaolo, A.; Grazzi, F.; Sutton, L. F.; Scherillo, A.; Bognetti, L.; Bini, A.; Barzagli, E.; Schooneveld, E.; Andreani, C.

    2013-09-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation of materials. The aim of this study is to show the potential of the approach using multiple, integrated neutron-based techniques on musical instruments. Such samples, within the broad scenario of cultural heritage, represent an exciting research field and an interesting link between different disciplines such as nuclear physics, metallurgy and acoustics.

  11. Analysis of High Contrast Imaging Techniques for Space Based Direct Planetary Imaging

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Gezari, Dan Y.; Nisenson, P.; Fisher, Richard R. (Technical Monitor)

    2001-01-01

    We report on our ongoing investigations of a number of techniques for direct detection and imaging of Earth-like planets around nearby stellar sources. Herein, we give a quantitative analysis of these techniques and compare and contrast them via computer simulations. The techniques we report on are Bracewell interferometry, the Nisenson apodized square aperture, and coronagraphic masking techniques. We parameterize our results with respect to wavelength, aperture size, the effects of mirror speckle (both mid- and high-spatial frequency), detector and photon noise, as well as pointing error. The numerous recent detections of Jupiter- and Saturn-like planets have driven a resurgence in research on space based high contrast imaging techniques for direct planetary imaging. Work is currently ongoing on concepts for NASA's Terrestrial Planet Finder mission, and a number of study teams have been funded. The authors are members of one team.
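
    As a toy illustration of why apodization helps contrast, the sketch below compares the PSF of a plain square aperture with a cosine-apodized one via FFT; the sampling, zero-padding, and sidelobe probe offset are arbitrary choices, not the paper's simulation parameters.

    ```python
    import numpy as np

    n, pad = 64, 8                      # aperture samples, zero-padding factor
    u = np.linspace(-0.5, 0.5, n)
    square = np.ones((n, n))
    apodized = np.outer(np.cos(np.pi * u), np.cos(np.pi * u))  # separable apodizer

    def psf(aperture):
        """PSF = squared magnitude of the aperture's Fourier transform."""
        field = np.fft.fftshift(np.fft.fft2(aperture, s=(n * pad, n * pad)))
        p = np.abs(field) ** 2
        return p / p.max()

    for name, ap in [("square", square), ("apodized", apodized)]:
        p = psf(ap)
        row = p[p.shape[0] // 2]
        sidelobe = row[p.shape[1] // 2 + 3 * pad:].max()   # probe past the core
        print(name, f"max sidelobe ~ {sidelobe:.1e}")       # apodized << square
    ```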

  12. Video-based ergonomic analysis to evaluate thoracostomy tube placement techniques.

    PubMed

    Seagull, F Jacob; Mackenzie, Colin F; Xiao, Yan; Bochicchio, Grant V

    2006-01-01

    Thoracostomy for relief of pneumo- or hemothorax may be performed emergently at the bedside, in the emergency department or trauma area, often in nonideal circumstances. We hypothesized that ergonomic analysis of thoracostomy techniques can identify areas for potential improvement in patient and operator safety. Interviews with subject matter experts (SMEs) provided the steps in the task of thoracostomy; 44 thoracostomies (emergent and elective) were video-recorded and reviewed by SMEs. Ergonomic analyses evaluated surgical performance techniques using video clips. Risks to the patient and operator included instrument-tray positioning and instrument content. Analyses of video records revealed that, despite SME-survey consensus, operators inconsistently followed recommended techniques. Discrepancies between SME-recommended and observed practice are prevalent, with simple ergonomic problems impeding performance and creating risks for patients and operators. Video-based ergonomic analysis is a rich source for identifying task performance problems and potential solutions.

  13. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-10-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  14. A fuzzy-based histogram analysis technique for skin lesion discrimination in dermatology clinical images

    PubMed Central

    Stanley, R. Joe; Moss, Randy Hays; Van Stoecker, William; Aggarwal, Chetna

    2011-01-01

    A fuzzy logic-based color histogram analysis technique is presented for discriminating benign skin lesions from malignant melanomas in dermatology clinical images. The approach utilizes a fuzzy set for benign skin lesion color, and alpha-cut and support set cardinality for quantifying a fuzzy ratio skin lesion color feature. Skin lesion discrimination results are reported for the fuzzy ratio and fusion with a previously determined percent melanoma color feature over a data set of 258 clinical images. For the fusion technique, alpha-cuts for the fuzzy ratio can be chosen to recognize over 93.30% of melanomas with approximately 15.67% false positive lesions. PMID:12821032
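
    A hedged sketch of the fuzzy ratio feature: membership of each lesion pixel in a fuzzy "benign color" set, an alpha-cut, and the ratio of alpha-cut cardinality to support-set cardinality. The triangular membership function and reference color below are illustrative, not the set trained on the paper's 258-image data set.

    ```python
    import numpy as np

    def benign_membership(pixels, ref_color, width=80.0):
        """Triangular fuzzy membership in a 'benign color' set (illustrative)."""
        dist = np.linalg.norm(pixels.astype(float) - ref_color, axis=1)
        return np.clip(1.0 - dist / width, 0.0, 1.0)

    def fuzzy_ratio(pixels, ref_color, alpha=0.5):
        mu = benign_membership(pixels, ref_color)
        support = np.count_nonzero(mu > 0.0)            # support set cardinality
        alpha_cut = np.count_nonzero(mu >= alpha)       # alpha-cut cardinality
        return alpha_cut / support if support else 0.0

    rng = np.random.default_rng(1)
    lesion_pixels = rng.integers(0, 256, (500, 3))      # toy RGB lesion pixels
    print(fuzzy_ratio(lesion_pixels, ref_color=np.array([150.0, 90.0, 70.0])))
    ```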

  15. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.

  16. Sex-based differences in lifting technique under increasing load conditions: A principal component analysis.

    PubMed

    Sheppard, P S; Stevenson, J M; Graham, R B

    2016-05-01

    The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopted a similar lifting technique. In addition, as load increased male and female participants changed their lifting technique in a similar manner.
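
    The waveform PCA workflow above can be sketched compactly: rows are joint-angle waveforms, components are retained by the 90% trace criterion, and a single-component reconstruction isolates one mode of technique variation. The sinusoidal "lifts" below are toy data, not motion-capture waveforms.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 101)
    # 50 toy "lift" waveforms: a common shape plus subject-specific variation.
    X = np.array([np.sin(np.pi * t) * rng.normal(1.0, 0.2)
                  + rng.normal(0, 0.05, t.size) for _ in range(50)])

    pca = PCA()
    scores = pca.fit_transform(X)
    # 90% trace criterion: smallest number of PCs explaining >= 90% variance.
    n_keep = np.searchsorted(np.cumsum(pca.explained_variance_ratio_), 0.90) + 1
    print(f"PCs retained by the 90% trace criterion: {n_keep}")

    # Single component reconstruction: keep only PC1's contribution.
    X_pc1 = np.outer(scores[:, 0], pca.components_[0]) + pca.mean_
    print(X_pc1.shape)   # reconstructed waveforms, same shape as X
    ```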

  17. Robust and discriminating method for face recognition based on correlation technique and independent component analysis model.

    PubMed

    Alfalou, A; Brosseau, C

    2011-03-01

    We demonstrate a novel technique for face recognition. Our approach relies on the performance of a strongly discriminating optical correlation method combined with the robustness of the independent component analysis (ICA) model. Simulations were performed to illustrate how this algorithm can identify a face using images from the Pointing Head Pose Image Database. While maintaining algorithmic simplicity, this approach based on ICA representation significantly increases the true recognition rate compared to that obtained using our previously developed all-numerical ICA identity recognition method and another method based on optical correlation and a standard composite filter.

  18. [Survival analysis techniques].

    PubMed

    Bustamante-Teixeira, Maria Teresa; Faerstein, Eduardo; Latorre, Maria do Rosário

    2002-01-01

    Statistical methods known as survival analyses are useful for analyzing time-related events, in which time from a benchmark event to an endpoint is the focus of interest. Survival analysis describes not only patient survival statistics (as suggested by the name), but also other dichotomous outcomes such as time of remission, time of breastfeeding, etc. This paper discusses survival analysis techniques, commenting and comparing their utilization, especially in the field of oncology. It also presents and discusses types of epidemiological studies and data sources to which this type of analysis is applied. The authors take into account the difference between hospital-based or clinical series and population-based approaches. Interpretation of results is also discussed.
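
    As a concrete instance of the methods surveyed, here is a minimal Kaplan-Meier estimator on toy remission times (0 = censored, 1 = event); censored subjects stay in the risk set until they drop out but never decrement the curve. The data are invented for illustration.

    ```python
    import numpy as np

    def kaplan_meier(times, events):
        """Product-limit survival estimate for distinct event/censoring times."""
        order = np.argsort(times)
        times, events = np.asarray(times)[order], np.asarray(events)[order]
        surv, out, n = 1.0, [], len(times)
        for i, (t, e) in enumerate(zip(times, events)):
            at_risk = n - i                 # subjects still under observation
            if e:                           # event (e.g. death, relapse)
                surv *= 1.0 - 1.0 / at_risk
            out.append((t, surv))
        return out

    # Toy remission times in months; 0 marks a censored observation.
    for t, s in kaplan_meier([6, 7, 9, 10, 13, 15], [1, 0, 1, 1, 0, 1]):
        print(f"t = {t:>2} months  S(t) = {s:.3f}")
    ```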

  19. Advanced SuperDARN meteor wind observations based on raw time series analysis technique

    NASA Astrophysics Data System (ADS)

    Tsutsumi, M.; Yukimatu, A. S.; Holdsworth, D. A.; Lester, M.

    2009-04-01

    The meteor observation technique based on SuperDARN raw time series analysis has been upgraded. This technique extracts meteor information as a byproduct and does not degrade the quality of normal SuperDARN operations. In the upgrade, the radar operating system (RADOPS) was modified so that it can oversample every 15 km during normal operations, which have a range resolution of 45 km. As an alternative method for better range determination, a frequency domain interferometry (FDI) capability was also coded into RADOPS, where the operating radio frequency can be changed every pulse sequence. Test observations were conducted using the CUTLASS Iceland East and Finland radars, where oversampling and FDI operation (two frequencies separated by 3 kHz) were carried out simultaneously. Meteor ranges obtained with the two ranging techniques agreed very well. The ranges were then combined with the interferometer data to estimate meteor echo reflection heights. Although there were still some ambiguities in the arrival angles of echoes because of the rather long antenna spacing of the interferometers, the heights and arrival angles of most meteor echoes were determined more accurately than previously. Wind velocities were successfully estimated over the height range of 84 to 110 km. The FDI technique developed here can further be applied to common SuperDARN operations, and the study of fine horizontal structures of F region plasma irregularities is expected in the future.
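
    The FDI ranging used above has a compact core: with two pulses Δf = 3 kHz apart, the inter-frequency echo phase difference maps to range as r = c·Δφ/(4π·Δf), with an unambiguous span of c/(2Δf) ≈ 50 km, conveniently close to the 45 km range gate. The sketch below is a generic illustration, not the RADOPS code.

    ```python
    import numpy as np

    C = 299_792_458.0       # speed of light, m/s
    DF = 3_000.0            # FDI frequency separation, Hz

    def fdi_range(phase_diff_rad):
        """Two-way FDI range from the inter-frequency echo phase difference."""
        return C * phase_diff_rad / (4.0 * np.pi * DF)

    print(C / (2 * DF) / 1000, "km unambiguous span")   # ~50 km, cf. 45 km gates
    print(fdi_range(np.pi / 2) / 1000, "km")            # example echo range
    ```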

  20. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    NASA Astrophysics Data System (ADS)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served as an important primary approach to diagnosing cardiovascular diseases (CVDs) for centuries. Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet, most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck problem, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences of heart valves. Using the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five types of abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
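
    A hedged sketch of the envelope step: a DWT keeps a sub-band of the signal (the level choice here is illustrative, not the paper's murmur band), then the normalized Shannon energy, -x²·log(x²), yields a smooth envelope from which morphological features could be read. The test signal is a toy burst, not a real heart sound.

    ```python
    import numpy as np
    import pywt   # PyWavelets

    def shannon_envelope(x, wavelet="db4", level=4, win=64):
        coeffs = pywt.wavedec(x, wavelet, level=level)
        # Keep approximation + coarsest detail; zero the rest (illustrative band).
        coeffs = [coeffs[0], coeffs[1]] + [np.zeros_like(c) for c in coeffs[2:]]
        band = pywt.waverec(coeffs, wavelet)[: len(x)]
        band = band / (np.max(np.abs(band)) + 1e-12)
        se = -band**2 * np.log(band**2 + 1e-12)          # Shannon energy
        kernel = np.ones(win) / win
        return np.convolve(se, kernel, mode="same")      # smoothed envelope

    fs = 2000
    t = np.arange(0, 1.0, 1 / fs)
    toy = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.3) ** 2) / 0.002)  # toy burst
    print(shannon_envelope(toy).max())
    ```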

  21. Applications of synchrotron-based micro-imaging techniques for the analysis of Cultural Heritage materials

    SciTech Connect

    Cotte, Marine; Chilida, Javier; Walter, Philippe; Taniguchi, Yoko; Susini, Jean

    2009-01-29

    The analysis of Cultural Heritage objects is often technically challenging. When analyzing micro-fragments, the amount of matter is usually very tiny, hence requiring sensitive techniques. These samples, in particular painting fragments, may present multi-layered structures with layer thicknesses of ~10 μm. This favors micro-imaging techniques with good lateral resolution (about one micrometer) that permit the discriminative study of each layer. Besides, samples are usually very complex in terms of chemistry, as they are made of mineral and organic matter, amorphous and crystallized phases, and major and minor elements. Accordingly, a multi-modal approach is generally essential to resolve the chemical complexity of such hybrid materials. Different examples will be given to illustrate the various possibilities of synchrotron-based micro-imaging techniques, such as micro X-ray diffraction, micro X-ray fluorescence, micro X-ray absorption spectroscopy and micro FTIR spectroscopy. Focus will be on paintings, but the whole range of museum objects (from soft matter like paper or wood to hard matter like metal and glass) will also be considered.

  22. Photon-based techniques for nondestructive subsurface analysis of painted cultural heritage artifacts.

    PubMed

    Janssens, K; Dik, J; Cotte, M; Susini, J

    2010-06-15

    Often, just micrometers below a painting's surface lies a wealth of information, both with Old Masters such as Peter Paul Rubens and Rembrandt van Rijn and with more recent artists of great renown such as Vincent Van Gogh and James Ensor. Subsurface layers may include underdrawing, underpainting, and alterations, and in a growing number of cases conservators have discovered abandoned compositions on paintings, illustrating artists' practice of reusing a canvas or panel. The standard methods for studying the inner structure of cultural heritage (CH) artifacts are infrared reflectography and X-ray radiography, techniques that are optionally complemented with the microscopic analysis of cross-sectioned samples. These methods have limitations, but recently, a number of fundamentally new approaches for fully imaging the buildup of hidden paint layers and other complex three-dimensional (3D) substructures have been put into practice. In this Account, we discuss these developments and their recent practical application with CH artifacts. We begin with a tabular summary of 14 IR- and X-ray-based imaging methods and then continue with a discussion of each technique, illustrating CH applications with specific case studies. X-ray-based tomographic and laminographic techniques can be used to generate 3D renditions of artifacts of varying dimensions. These methods are proving invaluable for exploring inner structures, identifying the conservation state, and postulating the original manufacturing technology of metallic and other sculptures. In the analysis of paint layers, terahertz time-domain spectroscopy (THz-TDS) can highlight interfaces between layers in a stratigraphic buildup, whereas macroscopic scanning X-ray fluorescence (MA-XRF) has been employed to measure the distribution of pigments within these layers. This combination of innovative methods provides topographic and color information at the micrometer depth scale, allowing us to look "into" paintings in an

  23. Team activity analysis and recognition based on Kinect depth map and optical imagery techniques

    NASA Astrophysics Data System (ADS)

    Elangovan, Vinayak; Bandaru, Vinod K.; Shirkhodaie, Amir

    2012-06-01

    Kinect cameras produce low-cost depth map video streams applicable to conventional surveillance systems. However, commonly applied image processing techniques are not directly applicable to depth map video processing. Kinect depth map images contain range measurements of objects at the expense of having the spatial features of objects suppressed. For example, typical object attributes such as textures, color tones, intensity, and other characteristic attributes cannot be fully realized by processing depth map imagery. In this paper, we demonstrate the application of Kinect depth map and optical imagery for characterization of indoor and outdoor group activities. A Causal-Events State Inference (CESI) technique is proposed for spatiotemporal recognition and reasoning about group activities. CESI uses an ontological scheme for representation of the causal distinctiveness of a priori known group activities. By tracking and serializing distinctive atomic group activities, CESI allows discovery of more complex group activities. A Modified Sequential Hidden Markov Model (MS-HMM) is implemented for trail analysis of atomic events representing correlated group activities. CESI reasons about five levels of group activities: Merging, Planning, Cooperation, Coordination, and Dispersion. In this paper, we present results on the capability of the CESI approach for characterization of group activities taking place both indoors and outdoors. Based on spatiotemporal pattern matching of atomic activities representing known group activities, CESI is able to discriminate suspicious group activities from normal activities. This paper also presents technical details of the imagery techniques implemented for detection, tracking, and characterization of atomic events based on Kinect depth map and optical imagery data sets. Various experimental scenarios indoors and outdoors (e.g., loading and unloading of objects, human-vehicle interactions, etc.) are carried out to demonstrate effectiveness and

  24. Vehicle-bridge Interaction Analysis Based on the ANCF Quasi-conforming Plate Technique

    NASA Astrophysics Data System (ADS)

    Wang, Bingjian; He, Hua

    2017-06-01

    A new plate element is developed for the analysis of plate structures in vehicle-bridge interaction (VBI) analysis, based on combining the absolute nodal coordinate formulation (ANCF) with the quasi-conforming technique (QCT). In order to simulate complex contact and large deformation during vehicle-bridge interaction for a slender bridge, new curvature strains and an explicit formulation of internal forces are developed for the shell elements of the bridge deck. The developed QCT_ANCF shell element is compared with the original ANCF element to verify its locking remedies. Compared with the original model, the new QCT_ANCF element shows better convergence and curvature continuity and is more accurate for the same number of elements. Numerical cases are analyzed using the QCT_ANCF element in comparison with analytical solutions and the original ANCF shell element. The velocity and acceleration curves also exhibit less high-frequency vibration than those of the original model. Furthermore, the vehicle-bridge interaction is parametrically analyzed using the new QCT_ANCF element under a series of road roughness indices and vehicle speeds. The impact factors based on the displacements and strains over the transverse sections of the bridge are investigated.

  25. Dynamic programming based time-delay estimation technique for analysis of time-varying time-delay

    SciTech Connect

    Gupta, Deepak K.; McKee, George R.; Fonck, Raymond J.

    2010-01-15

    A new time-delay estimation (TDE) technique based on dynamic programming is developed to measure the time-varying time-delay between two signals. The dynamic programming based TDE technique provides a frequency response five to ten times better than previously known TDE techniques, namely those based on time-lag cross-correlation or wavelet analysis. The effects of the frequency spectrum, signal-to-noise ratio, and amplitude of the time-delay on the response of the TDE technique (represented as a transfer function) are studied using simulated data signals. The transfer function of the technique decreases with increasing noise in the signal; however, it is independent of the signal's spectrum shape. The dynamic programming based TDE technique is applied to beam emission spectroscopy diagnostic data to measure poloidal velocity fluctuations, which led to the observation of theoretically predicted zonal flows in high-temperature tokamak plasmas.
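
    In the spirit of the method above, the sketch below estimates a time-varying delay with dynamic programming: per-window cross-correlation scores over candidate lags, then a DP pass that rewards correlation while penalizing abrupt lag jumps. The window length, lag range, and smoothness penalty are illustrative, not the authors' published settings.

    ```python
    import numpy as np

    def dp_tde(x, y, win=64, max_lag=8, jump_penalty=0.5):
        """Time-varying delay of y relative to x via dynamic programming."""
        lags = np.arange(-max_lag, max_lag + 1)
        n_win = (len(x) - win - 2 * max_lag) // win
        score = np.empty((n_win, len(lags)))
        for w in range(n_win):
            s = max_lag + w * win
            a = x[s : s + win]
            a = a - a.mean()
            for j, lag in enumerate(lags):
                b = y[s + lag : s + lag + win]
                score[w, j] = np.dot(a, b - b.mean())   # correlation score
        cost = score.copy()
        back = np.zeros_like(score, dtype=int)
        for w in range(1, n_win):                       # smooth lag trajectory
            for j in range(len(lags)):
                prev = cost[w - 1] - jump_penalty * np.abs(lags - lags[j])
                back[w, j] = int(np.argmax(prev))
                cost[w, j] += prev[back[w, j]]
        path = [int(np.argmax(cost[-1]))]
        for w in range(n_win - 1, 0, -1):               # backtrack best path
            path.append(back[w, path[-1]])
        return lags[path[::-1]]                         # delay per window

    rng = np.random.default_rng(0)
    x = rng.normal(size=2048)
    y = np.roll(x, 5) + 0.1 * rng.normal(size=2048)     # true delay: 5 samples
    print(dp_tde(x, y)[:8])                             # ~[5 5 5 ...]
    ```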

  26. Subdivision based isogeometric analysis technique for electric field integral equations for simply connected structures

    NASA Astrophysics Data System (ADS)

    Li, Jie; Dault, Daniel; Liu, Beibei; Tong, Yiying; Shanker, Balasubramaniam

    2016-08-01

    The analysis of electromagnetic scattering has long been performed on a discrete representation of the geometry. This representation is typically continuous but not differentiable. The need to define physical quantities on this geometric representation has led to the development of sets of basis functions that need to satisfy constraints at the boundaries of the elements/tessellations (viz., continuity of normal or tangential components across element boundaries). For electromagnetics, these result in curl- or div-conforming basis sets. The geometric representation used for analysis is in stark contrast with that used for design, wherein the surface representation is higher order differentiable. Using this representation for both the geometry and the physics on the geometry has several advantages, as elucidated in Hughes et al. (2005) [7]. Until now, the bulk of the literature on isogeometric methods has been limited to solid mechanics, with some effort to create NURBS-based basis functions for electromagnetic analysis. In this paper, we present the first complete isogeometric solution methodology for the electric field integral equation as applied to simply connected structures. This paper systematically proceeds through surface representation using subdivision, definition of vector basis functions on this surface, to fidelity in the solution of integral equations. We also present techniques to stabilize the solution at low frequencies and to impose a Calderón preconditioner. Several results presented serve to validate the proposed approach as well as demonstrate some of its capabilities.

  27. Nanostructural defects evidenced in failing silicon-based NMOS capacitors by advanced failure analysis techniques

    NASA Astrophysics Data System (ADS)

    Faivre, Emilie; Llido, Roxane; Putero, Magali; Fares, Lahouari; Muller, Christophe

    2014-04-01

    An experimental methodology compliant with industrial constraints was deployed to uncover the origin of soft breakdown events in large planar silicon-based NMOS capacitors. Complementary advanced failure analysis techniques were advantageously employed to localize, isolate and observe structural defects at the nanoscale. After accurate localization of the failing area by optical beam-induced resistance change (OBIRCH), the focused ion beam (FIB) technique enabled the preparation of thin specimens adequate for transmission electron microscopy (TEM). Characterization of the gate oxide microstructure was performed by high-resolution TEM imaging and energy-filtered spectroscopy. A dedicated experimental protocol relying on iterative FIB thinning and TEM observation enabled improving the quality of electron imaging of defects at the atomic scale. In that way, the gate oxide integrity was evaluated, and electrical stress-induced silicon epitaxy was detected concomitantly with soft breakdown events appearing during constant voltage stress. The growth of silicon hillocks consumes part of the breakdown energy and may prevent a soft breakdown event from evolving towards a hard breakdown that is catastrophic for device functionality.

  28. An Evaluation of Microcomputer-Based Strain Analysis Techniques on Meteoritic Chondrules

    NASA Astrophysics Data System (ADS)

    Hill, H. G. M.

    1995-09-01

    Introduction: Chondrule flattening and distinct foliation are preserved in certain chondrites [1] and have been interpreted, by some, as evidence of shock-induced pressure through hypervelocity impacts on parent bodies [2]. Recently, mean aspect ratios of naturally and artificially shocked chondrules in the Allende (CV3) chondrite have been correlated with shock intensity [3] using established shock stage criteria [4]. Clearly, quantification of chondrule deformation and appropriate petrographic criteria can be useful tools for constraining parent body shock history and, possibly, post-shock heating [3]. Here, strain analysis techniques (R_f/φ and Fry) normally employed in structural geology have been adapted and evaluated [5] for measuring mean chondrule strain and orientation. In addition, the possible use of such strain data for partial shock stage classification is considered. R_f/φ and Fry Analysis: The relationship between displacement and shape changes in rocks is known as strain [6]; the analysis assumes that an initial circle of unit radius is deformed into an ellipse, the finite strain ellipse (R_f). The strain ratio (R_s) is an expression of the change of shape. The orientation of the strain ellipse (φ) is the angle subtended between the semi-major axis and the direction of a fixed point of reference. Generally, the log mean R_f ≈ R_s and, therefore, the approximation R_f = R_s is valid. For chondrules this is reasonable, as they were originally molten, or partially molten, droplets [7]. Fry's 'center-to-center' geological strain analysis technique [8] is based on the principle that the distribution of particle centers in rocks can sometimes be used to determine the state of finite strain (R_f). Experimental Techniques: The Bovedy (L3) chondrite was chosen for investigation as it contains abundant, oriented, elliptical chondrules [5]. The hardware employed consisted of a Macintosh microcomputer and a flat-bed scanner. Chondrule outlines, obtained
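
    The R_f/φ measurement reduces to fitting each outline's aspect ratio and long-axis orientation; one standard way, sketched below on a toy ellipse (axial ratio 2, rotated 30°), uses the second moments of the boundary points. This is a generic illustration, not the authors' software.

    ```python
    import numpy as np

    def rf_phi(points):
        """Aspect ratio R_f and long-axis angle phi (deg) from an outline (n, 2)."""
        cov = np.cov((points - points.mean(axis=0)).T)
        evals, evecs = np.linalg.eigh(cov)          # ascending eigenvalues
        rf = np.sqrt(evals[1] / evals[0])           # semi-axis ratio
        phi = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1]))  # major-axis angle
        return rf, phi % 180.0

    # Toy outline: ellipse with axial ratio 2, rotated by 30 degrees.
    t = np.linspace(0, 2 * np.pi, 200)
    ellipse = np.column_stack([2 * np.cos(t), np.sin(t)])
    rot = np.radians(30.0)
    R = np.array([[np.cos(rot), -np.sin(rot)], [np.sin(rot), np.cos(rot)]])
    print(rf_phi(ellipse @ R.T))   # ~(2.0, 30.0)
    ```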

  29. Digital Instrumentation and Control Failure Events Derivation and Analysis by Frame-Based Technique

    SciTech Connect

    Hui-Wen Huang; Chunkuan Shih; Swu Yih; Yen-Chang Tzeng; Ming-Huei Chen

    2006-07-01

    A frame-based technique, comprising physical, logical, and cognitive frames, was adopted to perform digital I&C failure event derivation and analysis for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I&C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I&C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I&C software failure events. The case study of this research includes (1) software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I&C software failure events derived from actual occurrences of non-ABWR digital I&C software failure events, which were reported to the LER of the USNRC or the IRS of the IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  30. Electromechanical actuators affected by multiple failures: Prognostic method based on spectral analysis techniques

    NASA Astrophysics Data System (ADS)

    Belmonte, D.; Vedova, M. D. L. Dalla; Ferro, C.; Maggiore, P.

    2017-06-01

    The proposal of prognostic algorithms able to identify precursors of incipient failures of primary flight command electromechanical actuators (EMA) is beneficial for anticipating the incoming failure: an early and correct interpretation of the failure degradation pattern can trigger an early alert to the maintenance crew, who can properly schedule the servomechanism replacement. An innovative prognostic model-based approach, able to recognize EMA progressive degradations before their anomalous behaviors become critical, is proposed: the fault detection and identification (FDI) of the considered incipient failures is performed by analyzing proper system operational parameters, able to put in evidence the corresponding degradation path, by means of a numerical algorithm based on spectral analysis techniques. Subsequently, these operational parameters are correlated with the actual EMA health condition by means of failure maps created by a reference monitoring model-based algorithm. In this work, the proposed method has been tested on EMAs affected by combined progressive failures: in particular, a partial stator single-phase turn-to-turn short circuit and rotor static eccentricity are considered. In order to evaluate the prognostic method, a numerical test bench has been conceived. Results show that the method exhibits adequate robustness and a high degree of confidence in the ability to identify an eventual malfunction early, minimizing the risk of false alarms or unannounced failures.
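
    A hedged sketch of the spectral-indicator idea: track the amplitudes of characteristic frequency lines in a monitored signal and compare them against a healthy baseline. The frequencies, fault signatures, and amplitudes below are invented for illustration and are not the paper's failure maps.

    ```python
    import numpy as np

    fs = 10_000
    t = np.arange(0, 1.0, 1 / fs)

    def phase_current(ecc=0.0, short=0.0, seed=1):
        """Toy motor phase current with optional fault-related tones."""
        rng = np.random.default_rng(seed)
        healthy = np.sin(2 * np.pi * 400 * t)
        fault = (ecc * np.sin(2 * np.pi * 350 * t)      # "eccentricity" tone
                 + short * np.sin(2 * np.pi * 800 * t)) # "short-circuit" tone
        return healthy + fault + 0.01 * rng.normal(size=t.size)

    def line_amplitude(x, freq):
        spec = np.abs(np.fft.rfft(x)) * 2 / len(x)      # amplitude spectrum
        return spec[int(round(freq * len(x) / fs))]

    baseline = phase_current()
    degraded = phase_current(ecc=0.05, short=0.08)
    for f in (350.0, 800.0):   # assumed fault signature frequencies
        print(f, round(line_amplitude(baseline, f), 4),
                 round(line_amplitude(degraded, f), 4))
    ```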

  31. Prioritization of sub-watersheds based on morphometric analysis using geospatial technique in Piperiya watershed, India

    NASA Astrophysics Data System (ADS)

    Chandniha, Surendra Kumar; Kansal, Mitthan Lal

    2017-03-01

    Hydrological investigation and the behavior of a watershed depend upon the geo-morphometric characteristics of its catchment. Morphometric analysis is commonly used for the development of regional hydrological models of ungauged watersheds. A critical evaluation and assessment of geo-morphometric parameters has been carried out. Prioritization of sub-watersheds based on the water plot capacity of the Piperiya watershed has been evaluated through linear, areal and relief aspects. Morphometric analysis has been attempted for the prioritization of nine sub-watersheds of the Piperiya watershed in the Hasdeo river basin, which is a tributary of the Mahanadi. Sub-watersheds were delineated with ArcMap 9.3 software from a digital elevation model (DEM). Assessment of drainages and their related parameters, such as stream order, stream length, stream frequency, drainage density, texture ratio, form factor, circulatory ratio, elongation ratio, bifurcation ratio and compactness ratio, was carried out separately for each sub-watershed using remote sensing (RS) and geospatial techniques. Finally, a prioritized score based on the morphometric behavior of each sub-watershed was assigned, and thereafter consolidated scores were estimated to identify the most sensitive parameters. The analysis reveals that stream order varies from 1 to 5; the first-order streams cover the maximum area, about 87.7%. The total number of stream segments of all orders is 1,264 in the watershed. The study emphasizes the prioritization of the sub-watersheds on the basis of morphometric analysis. The final scores of all nine sub-watersheds were assigned as per erosion threat. The sub-watershed with the lowest compound parameter value was assigned the highest priority. The sub-watersheds were categorized into three classes, high (4.1-4.7), medium (4.8-5.3) and low (>5.4) priority, on the basis of their maximum (6.0) and minimum (4.1) prioritized scores.
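
    The prioritization step reduces to rank arithmetic, sketched below: each sub-watershed is ranked on every morphometric parameter (descending where larger values imply greater erosion risk, ascending for inversely related parameters such as form factor), the ranks are averaged into a compound parameter, and the lowest compound value takes the highest priority. All parameter values here are invented, not the study's data.

    ```python
    import numpy as np

    names = ["SW1", "SW2", "SW3"]
    # Columns: drainage density and bifurcation ratio (higher = more erodible),
    # then form factor (lower = more erodible, so ranked ascending).
    direct = np.array([[2.1, 3.8], [1.6, 4.4], [2.7, 3.1]])
    inverse = np.array([[0.45], [0.61], [0.38]])

    def rank(col, descending):
        """1-based ranks of a parameter column."""
        return np.argsort(np.argsort(-col if descending else col)) + 1

    ranks = np.column_stack(
        [rank(direct[:, j], True) for j in range(direct.shape[1])]
        + [rank(inverse[:, j], False) for j in range(inverse.shape[1])]
    )
    compound = ranks.mean(axis=1)                 # compound parameter per basin
    for n, c in sorted(zip(names, compound), key=lambda p: p[1]):
        print(n, round(c, 2))                     # lowest value = top priority
    ```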

  32. Instanton-based techniques for analysis and reduction of error floor of LDPC codes

    SciTech Connect

    Chertkov, Michael; Chilappagari, Shashi K; Stepanov, Mikhail G; Vasic, Bane

    2008-01-01

    We describe a family of instanton-based optimization methods developed recently for the analysis of the error floors of low-density parity-check (LDPC) codes. Instantons are the most probable configurations of the channel noise which result in decoding failures. We show that the general idea and the respective optimization technique are applicable broadly to a variety of channels, discrete or continuous, and a variety of sub-optimal decoders. Specifically, we consider iterative belief propagation (BP) decoders, Gallager-type decoders, and linear programming (LP) decoders performing over the additive white Gaussian noise channel (AWGNC) and the binary symmetric channel (BSC). The instanton analysis suggests that the underlying topological structures of the most probable instantons of the same code over different channels and decoders are related to each other. Armed with this understanding of the graphical structure of the instanton and its relation to decoding failures, we suggest a method to construct codes whose Tanner graphs are free of these structures and thus have less significant error floors.

  33. Operational modal analysis via image based technique of very flexible space structures

    NASA Astrophysics Data System (ADS)

    Sabatini, Marco; Gasbarri, Paolo; Palmerini, Giovanni B.; Monti, Riccardo

    2013-08-01

    Vibrations represent one of the most important topics in the engineering design of flexible structures. The importance of this problem increases when a very flexible system is considered, which is often the case for space structures. In order to identify the modal characteristics, in terms of natural frequencies and the relevant modal parameters, ground tests are performed. However, these parameters can vary under the operative conditions of the system. In order to continuously monitor the modal characteristics during the satellite's lifetime, operational modal analysis is mandatory. This kind of analysis is usually performed using classical accelerometers or strain gauges and properly analyzing the acquired output. In this paper, a different approach to vibration data acquisition is pursued, via an image-based technique. In order to simulate a flexible satellite, a free-flying platform is used; the problem is further complicated by the fact that the overall system, constituted by a highly rigid bus and very flexible panels, must necessarily be modeled as a multibody system. In the experimental campaign, a camera placed on the bus is used to identify the eigenfrequencies of the vibrating structure; in this case, thin aluminum plates simulate very flexible solar panels. The structure is excited by a hammer or studied during a fast attitude maneuver. The results of the experimental activity are investigated and compared with numerical simulations obtained via FEM-multibody software, and the relevant findings are proposed and discussed.

  34. An efficient technique for nuclei segmentation based on ellipse descriptor analysis and improved seed detection algorithm.

    PubMed

    Xu, Hongming; Lu, Cheng; Mandal, Mrinal

    2014-09-01

    In this paper, we propose an efficient method for segmenting cell nuclei in skin histopathological images. The proposed technique consists of four modules. First, it separates the nuclei regions from the background with an adaptive threshold technique. Next, an elliptical descriptor is used to detect isolated nuclei with elliptical shapes. This descriptor classifies the nuclei regions based on two ellipticity parameters. Nuclei clumps and nuclei with irregular shapes are then localized by an improved seed detection technique based on voting in the eroded nuclei regions. Finally, undivided nuclei regions are segmented by a marked watershed algorithm. Experimental results on 114 different image patches indicate that the proposed technique provides superior performance in nuclei detection and segmentation.
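
    A hedged sketch of a pipeline of this shape, with standard substitutes: Otsu thresholding in place of the adaptive threshold, region eccentricity as a crude ellipse descriptor, distance-transform peaks in place of the paper's voting-based seed detection, and marker-based watershed. The thresholds and the synthetic image are illustrative.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def segment_nuclei(gray):
        binary = gray < threshold_otsu(gray)            # nuclei darker than background
        lbl = label(binary)
        isolated = []
        clumps = np.zeros_like(binary)
        for r in regionprops(lbl):
            if r.area < 30:
                continue                                # discard specks
            if r.eccentricity < 0.6:
                isolated.append(r.label)                # ellipse-like single nucleus
            else:
                clumps |= lbl == r.label                # candidate nuclei clump
        dist = ndi.distance_transform_edt(clumps)
        peaks = peak_local_max(dist, min_distance=5, labels=label(clumps))
        markers = np.zeros(gray.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        split = watershed(-dist, markers, mask=clumps)  # split the clump at seeds
        return isolated, split

    yy, xx = np.mgrid[0:64, 0:64]
    img = np.ones((64, 64))
    for cy, cx in [(20, 20), (20, 30), (45, 45)]:       # two overlapping + one single
        img[(yy - cy) ** 2 + (xx - cx) ** 2 < 64] = 0.2
    iso, split = segment_nuclei(img)
    print(len(iso), int(split.max()))                   # e.g. 1 isolated, clump -> 2
    ```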

  35. Polyspectral signal analysis techniques for condition based maintenance of helicopter drive-train system

    NASA Astrophysics Data System (ADS)

    Hassan Mohammed, Mohammed Ahmed

    For efficient maintenance of a diverse fleet of air- and rotorcraft, effective condition-based maintenance (CBM) must be established based on the monitored vibration signals of rotating components. In this dissertation, we present the theory and applications of polyspectral signal processing techniques for condition monitoring of critical components in the AH-64D helicopter tail rotor drive train system. Currently available vibration-monitoring tools are mostly built around auto- and cross-power spectral analysis, which have limited performance in detecting frequency correlations higher than second order. Studying higher-order correlations and their Fourier transforms, the higher-order spectra, provides more information about the vibration signals, which helps in building more accurate diagnostic models of the mechanical system. Based on higher-order spectral analysis, different signal processing techniques are developed to assess the health conditions of different critical rotating components in the AH-64D helicopter drive train. Based on the cross-bispectrum, a quadratic nonlinear transfer function is presented to model second-order nonlinearity in a drive shaft running between the two hanger bearings. The quadratic-nonlinearity coupling coefficient between frequency harmonics of the rotating shaft is then used as a condition metric to study different seeded shaft faults compared to the baseline case, namely: shaft misalignment, shaft imbalance, and a combination of shaft misalignment and imbalance. The proposed quadratic-nonlinearity metric shows better capability in distinguishing the four studied shaft settings than the conventional linear coupling based on the cross-power spectrum. We also develop the new concept of a Quadratic-Nonlinearity Power-Index spectrum, QNLPI(f), that can be used in signal detection and classification, based on the bicoherence spectrum. The proposed QNLPI(f) is derived as a projection of the three-dimensional bicoherence spectrum into a two-dimensional spectrum that
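
    The central quantity behind these metrics is the segment-averaged bispectrum and its normalization, the bicoherence, b²(f1,f2) = |Σ X(f1)X(f2)X*(f1+f2)|² / (Σ|X(f1)X(f2)|² · Σ|X(f1+f2)|²). The direct estimator below is a generic sketch on a toy quadratically coupled signal, not the dissertation's code.

    ```python
    import numpy as np

    def bicoherence(x, nseg=64, seg_len=256):
        """Direct segment-averaged bicoherence estimate (Kim-Powers style)."""
        X = np.array([np.fft.rfft(x[i * seg_len:(i + 1) * seg_len]
                                  * np.hanning(seg_len)) for i in range(nseg)])
        nf = X.shape[1] // 2
        num = np.zeros((nf, nf), dtype=complex)
        den1 = np.zeros((nf, nf))
        den2 = np.zeros((nf, nf))
        for Xi in X:
            for f1 in range(nf):
                for f2 in range(nf):
                    num[f1, f2] += Xi[f1] * Xi[f2] * np.conj(Xi[f1 + f2])
                    den1[f1, f2] += abs(Xi[f1] * Xi[f2]) ** 2
                    den2[f1, f2] += abs(Xi[f1 + f2]) ** 2
        return np.abs(num) ** 2 / (den1 * den2 + 1e-20)

    # Toy coupled signal: tones at 64 and 96 Hz plus their sum at 160 Hz.
    np.random.seed(0)
    fs = 1024
    t = np.arange(64 * 256) / fs
    x = (np.sin(2 * np.pi * 64 * t) + np.sin(2 * np.pi * 96 * t)
         + 0.5 * np.sin(2 * np.pi * 160 * t) + 0.1 * np.random.randn(t.size))
    b = bicoherence(x)
    f1, f2 = np.unravel_index(np.argmax(b), b.shape)
    print(f1 * fs / 256, f2 * fs / 256)   # ~ (64, 96) Hz or (96, 64) Hz
    ```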

  36. Complex fluid flow and heat transfer analysis inside a calandria based reactor using CFD technique

    NASA Astrophysics Data System (ADS)

    Kulkarni, P. S.

    2017-04-01

    A series of numerical experiments has been carried out on a calandria-based reactor to optimize the design and increase the overall heat transfer efficiency using the computational fluid dynamics (CFD) technique. Fluid flow and heat transfer inside the calandria are governed by many geometric and flow parameters, such as the orientation of the inlet, the inlet mass flow rate, the fuel channel configuration (in-line, staggered, etc.), and the locations of the inlet and outlet. It is well established that heat transfer is greater wherever forced convection dominates, but for geometries like a calandria it is very difficult to achieve forced-convection flow everywhere; in turn, it depends strongly on the direction of the inlet jet. In the present paper, the initial design was optimized with respect to the inlet jet angle, and the optimized design has been numerically tested for different heat-load and mass-flow conditions. To further increase the heat removal capacity of the calandria, further numerical studies have been carried out for different inlet geometries. In all the analyses, the same overall geometry size and the same number of tubes have been considered. The work gives good insight into the fluid flow and heat transfer inside the calandria and offers a guideline for optimizing the design and/or enhancing the capacity of a present design.

  37. Singular value decomposition based feature extraction technique for physiological signal analysis.

    PubMed

    Chang, Cheng-Ding; Wang, Chien-Chih; Jiang, Bernard C

    2012-06-01

    Multiscale entropy (MSE) is one of the popular techniques to calculate and describe the complexity of a physiological signal. Many studies use this approach to detect changes in the physiological conditions of the human body. However, MSE results are easily affected by noise and trends, leading to incorrect estimation of MSE values. In this paper, singular value decomposition (SVD) is adopted in place of MSE to extract the features of physiological signals, and a support vector machine (SVM) is used to classify the different physiological states. A test data set from the PhysioNet website was used, and the classification results showed that using SVD to extract features of the physiological signal could attain a classification accuracy of 89.157%, which is higher than that obtained using MSE values (71.084%). The results show the proposed analysis procedure is effective and appropriate for distinguishing different physiological states. This promising result could serve as a reference for doctors in the diagnosis of congestive heart failure (CHF).
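
    One common way to realize "SVD features" for a 1-D signal, sketched below under assumptions: embed the signal in a Hankel-style trajectory matrix, take the normalized leading singular values as the feature vector, and feed them to an SVM. The window size, feature count, and toy signals are illustrative, not the paper's PhysioNet setup.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def svd_features(x, win=32, k=5):
        """Normalized leading singular values of a trajectory matrix."""
        hankel = np.array([x[i:i + win] for i in range(len(x) - win)])
        s = np.linalg.svd(hankel, compute_uv=False)
        return s[:k] / s.sum()

    rng = np.random.default_rng(0)
    t = np.linspace(0, 4 * np.pi, 256)
    healthy = [np.sin(t) + 0.1 * rng.normal(size=t.size) for _ in range(30)]
    chf_like = [np.sin(t) + 0.6 * rng.normal(size=t.size) for _ in range(30)]

    X = np.array([svd_features(x) for x in healthy + chf_like])
    y = np.array([0] * 30 + [1] * 30)
    clf = SVC(kernel="rbf").fit(X, y)
    print(clf.score(X, y))   # training accuracy on the toy data
    ```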

  19. Performance Analysis of SAC Optical PPM-CDMA System-Based Interference Rejection Technique

    NASA Astrophysics Data System (ADS)

    Alsowaidi, N.; Eltaif, Tawfig; Mokhtar, M. R.

    2016-03-01

In this paper, we theoretically analyse an optical code division multiple access (OCDMA) system based on successive interference cancellation (SIC) using pulse position modulation (PPM), considering the interference between users, the imperfect cancellation that occurs during the cancellation process, and receiver noises. A spectral amplitude coding (SAC) scheme is used to suppress the overlapping between users and to reduce the effect of receiver noises. The theoretical analysis of the multiple access interference (MAI)-limited performance of this approach indicates the influence of the size of M-ary PPM on the OCDMA system: the system performance improves with increasing M-ary PPM. It was found that the SIC/SAC-OCDMA system using the PPM technique, along with modified prime (MPR) codes used as the signature sequence code, offers a significant improvement over the system without cancellation; it can support up to 103 users at the benchmark bit error rate (BER) of 10^-9 with prime number p = 11, while the system without the cancellation scheme can support only up to 52 users.

  20. A neighbourhood analysis based technique for real-time error concealment in H.264 intra pictures

    NASA Astrophysics Data System (ADS)

    Beesley, Steven T. C.; Grecos, Christos; Edirisinghe, Eran

    2007-02-01

H.264's extensive use of context-based adaptive binary arithmetic or variable-length coding makes streams highly susceptible to channel errors, a common occurrence over networks such as those used by mobile devices. Even a single bit error will cause a decoder to discard all stream data up to the next fixed-length resynchronisation point; in the worst case, an entire slice is lost. In cases where retransmission and forward error correction are not possible, a decoder should conceal any erroneous data in order to minimise the impact on the viewer. Stream errors can often be spotted early in the decode cycle of a macroblock, which, if aborted, frees otherwise unused processor cycles; these can instead be used to conceal errors at minimal cost, even as part of a real-time system. This paper demonstrates a technique that utilises Sobel convolution kernels to quickly analyse the neighbourhood surrounding erroneous macroblocks before performing a weighted multi-directional interpolation. This generates significantly improved statistical (PSNR) and visual (IEEE structural similarity) results when compared to the commonly used weighted pixel value averaging. Furthermore, it is computationally scalable, both during analysis and concealment, achieving maximum performance from the spare processing power available.
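
    To illustrate the Sobel-based neighbourhood analysis step, the sketch below builds a magnitude-weighted histogram of edge directions around a lost macroblock; such weights could then steer a multi-directional interpolation. This is a simplified, assumed formulation (the function name and binning are illustrative), not the paper's exact algorithm.

        import numpy as np
        from scipy.ndimage import sobel

        def edge_direction_weights(neigh, n_dirs=8):
            """Magnitude-weighted histogram of edge directions in the pixels
            surrounding a lost macroblock (neigh: 2-D grayscale array)."""
            g = neigh.astype(float)
            gx, gy = sobel(g, axis=1), sobel(g, axis=0)
            mag = np.hypot(gx, gy)
            # edge direction is perpendicular to the gradient; fold into [0, pi)
            ang = (np.arctan2(gy, gx) + np.pi / 2) % np.pi
            bins = np.minimum((ang / (np.pi / n_dirs)).astype(int), n_dirs - 1)
            w = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_dirs)
            return w / (w.sum() + 1e-12)   # interpolation weights per direction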

  1. Spatio-temporal analysis of discharge regimes based on hydrograph classification techniques in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Chen, Xiaofei; Bloeschl, Guenter; Blaschke, Alfred Paul; Silasari, Rasmiaditya; Exner-Kittridge, Mike

    2016-04-01

Stream discharge and groundwater hydrographs integrate the spatial and temporal variations of small-scale hydrological responses. Characterizing the discharge response regime of drained farmland is essential for irrigation strategies and hydrologic modeling. For agricultural basins in particular, diurnal hydrographs from drainage discharges have been investigated to infer drainage processes of varying magnitudes. To explore the variability of discharge responses, we developed an objective method to characterize and classify discharge hydrographs based on features of magnitude and time series. Cluster analysis (hierarchical k-means) and principal component analysis techniques were applied to discharge time series and groundwater-level hydrographs to analyze their event characteristics, using 8 different discharge and 18 groundwater-level hydrographs as test data. Owing to the variability of rainfall activity, system location, discharge regime and pre-event soil moisture condition in the catchment, three main clusters of discharge hydrographs are identified from the test. The results show that: (1) the hydrographs from these drainage discharges had similar shapes but different magnitudes for an individual rainstorm, and the similarity also appeared in the overland flow discharge and the spring system; (2) within each cluster the similarity of shape persisted, but the rising slopes differed due to different antecedent wetness conditions, while the differences in regression slope can be explained by system location and discharge area; and (3) surface water always has a close proportional relation with soil moisture throughout the year, while only after soil moisture exceeds a certain threshold does the outflow of tile drainage systems have a direct relationship with soil moisture and an inverse relationship with groundwater levels. Finally, we discuss the potential application of hydrograph classification in a wider range of
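
    A minimal sketch of the classification workflow described above (PCA compression followed by k-means clustering), assuming scikit-learn; events is a hypothetical 2-D array with one row per event hydrograph, resampled to equal length.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        # events: hypothetical (n_events, n_timesteps) array of hydrograph ordinates
        Z = StandardScaler().fit_transform(events)            # remove magnitude bias
        scores = PCA(n_components=3).fit_transform(Z)         # compress shape information
        labels = KMeans(n_clusters=3, n_init=10).fit_predict(scores)

    The study combines hierarchical and k-means clustering; the plain k-means call here is a simplified stand-in for that two-stage scheme.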

  2. Application of multivariate data-analysis techniques to biomedical diagnostics based on mid-infrared spectroscopy.

    PubMed

    Wang, Liqun; Mizaikoff, Boris

    2008-07-01

The objective of this contribution is to review the application of advanced multivariate data-analysis techniques in the field of mid-infrared (MIR) spectroscopic biomedical diagnostics. MIR spectroscopy is a powerful chemical analysis tool for detecting biomedically relevant constituents such as DNA/RNA, proteins, carbohydrates and lipids, and even diseases or disease progression that may induce changes in the chemical composition or structure of biological systems including cells, tissues, and bio-fluids. However, MIR spectra of multiple constituents are usually characterized by strongly overlapping spectral features reflecting the complexity of biological samples. Consequently, MIR spectra of biological samples are frequently difficult to interpret with simple data-analysis techniques. Hence, with increasing complexity of the sample matrix, more sophisticated mathematical and statistical data-analysis routines are required for deconvoluting spectroscopic data and for extracting useful results from information-rich spectroscopic signals. A large body of work relates to the combination of multivariate data-analysis techniques with MIR spectroscopy, which has been applied by a variety of research groups to biomedically relevant areas such as cancer detection and analysis, artery diseases, biomarkers, and other pathologies. The reported results indeed reveal a promising perspective for more widespread application of multivariate data analysis in assisting MIR spectroscopy as a screening or diagnostic tool in biomedical research and clinical studies. While the authors do not mean to ignore relevant contributions to biomedical analysis across the entire electromagnetic spectrum, they confine the discussion in this contribution to the mid-infrared spectral range as a potentially very useful, yet underutilized, frequency region. Selected representative examples, without claiming completeness, will demonstrate a range of biomedical diagnostic applications with particular

  3. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality control and prediction of the GMAW process. Existing on-line approaches suffer from several disadvantages, such as high cost, low efficiency, complexity, and strong sensitivity to the environment. An enhanced, efficient technique for evaluating welding faults based on the Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances, such as the Euclidean distance, because the covariance matrix used for calculating MD takes into account correlations in the data and scaling. The values of MD obtained from the welding current and arc voltage are assumed to follow a normal distribution, which has two parameters: the mean µ and the standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD in the range from zero to µ + 3σ are regarded as "good". Two experiments, involving changing the flow of shielding gas and smearing paint on the surface of the substrate, were conducted to verify the sensitivity of the proposed evaluation technique and the feasibility of the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to on-line welding quality control and prediction, which is of great importance for designing novel equipment for weld quality detection.
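
    A minimal sketch of the MD-plus-threshold decision rule described above; ref and test are hypothetical arrays of per-weld feature vectors (e.g. statistics of welding current and arc voltage), and the µ + 3σ cut is applied to the MD values of the reference ("good weld") data.

        import numpy as np

        def mahalanobis(X, ref):
            """MD of each row of X from the distribution of the reference data."""
            mu = ref.mean(axis=0)
            vi = np.linalg.inv(np.cov(ref, rowvar=False))   # inverse covariance
            d = X - mu
            return np.sqrt(np.einsum("ij,jk,ik->i", d, vi, d))

        # ref: features of known-good welds; test: features of welds to judge
        md_ref = mahalanobis(ref, ref)
        threshold = md_ref.mean() + 3 * md_ref.std()        # "good" if MD <= mu + 3*sigma
        is_good = mahalanobis(test, ref) <= threshold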

  4. Aerodynamic measurement techniques. [laser based diagnostic techniques

    NASA Technical Reports Server (NTRS)

    Hunter, W. W., Jr.

    1976-01-01

Laser characteristics of intensity, monochromaticity, spatial coherence, and temporal coherence were exploited to advance laser-based diagnostic techniques for aerodynamics-related research. Two broad categories, visualization and optical measurements, were considered, and three techniques received significant attention: holography, laser velocimetry, and Raman scattering. Examples of quantitative laser velocimeter and Raman scattering measurements of velocity, temperature, and density indicated the potential of these nonintrusive techniques.

  5. Genetic analysis of seasonal runoff based on automatic techniques of hydrometeorological data processing

    NASA Astrophysics Data System (ADS)

    Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey

    2017-04-01

Detection of a river's feeding type is a complex and multifactor task. Such partitioning should be based, on the one hand, on the genesis of the feeding water and, on the other hand, on its physical path; at the same time it should consider the relationship of the feeding type with the corresponding phase of the water regime. Because of these difficulties and the complexity of the approach, there are many different variants of separating the flow hydrograph into feeding types. The most common method is the extraction of a so-called basic component, which in one way or another reflects the groundwater feeding of the river. In this case, the selection is most often based on the principle of local minima (a minimal sketch of this component is given after this record) or on graphical separation of this component. However, neither the origin of the water nor the corresponding phase of the water regime is then considered. In this paper, the authors offer a method of complex automated analysis of the genetic components of a river's feeding together with the separation of specific phases of the water regime. The objects of the study are medium and large rivers of European Russia that have a pronounced spring flood, formed by meltwater, and summer-autumn and winter low-water periods that are periodically interrupted by rain or thaw flooding. The method is based on the genetic separation of the hydrograph proposed in the 1960s by B. I. Kudelin. This technique is considered for large rivers having a hydraulic connection with groundwater horizons during floods. For better detection of flood genesis, the analysis involves reanalysis data on temperature and precipitation. Separation is based on the following fundamental graphic-analytical principles: • Ground feeding during the passage of the flood peak tends to zero • The beginning of the flood is determined as the exceedance of a critical low-water discharge • Flood periods are determined on the basis of exceeding the critical low-water discharge; they relate to thaw in the case of above-zero temperatures • During thaw and rain floods
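
    As referenced above, a minimal sketch of a local-minima baseflow component, assuming a daily discharge series q as a NumPy array; the window length and interpolation are illustrative choices, not Kudelin's full genetic separation.

        import numpy as np

        def local_minima_baseflow(q, window=5):
            """Connect local minima of daily discharge q into a simple baseflow line."""
            q = np.asarray(q, dtype=float)
            n = len(q)
            mins = [0] + [i for i in range(1, n - 1)
                          if q[i] == q[max(0, i - window):i + window + 1].min()] + [n - 1]
            base = np.interp(np.arange(n), mins, q[mins])   # piecewise-linear envelope
            return np.minimum(base, q)                      # baseflow never exceeds total flow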

  6. Synchrotron-Based Microspectroscopic Analysis of Molecular and Biopolymer Structures Using Multivariate Techniques and Advanced Multi-Components Modeling

    SciTech Connect

    Yu, P.

    2008-01-01

Recently, an advanced synchrotron radiation-based bioanalytical technique (SRFTIRM) has been applied as a novel non-invasive analysis tool to study molecular, functional group and biopolymer chemistry, nutrient make-up and structural conformation in biomaterials. This novel synchrotron technique, taking advantage of bright synchrotron light (millions of times brighter than sunlight), is capable of exploring biomaterials at the molecular and cellular levels. However, with the SRFTIRM technique, a large number of molecular spectral data are usually collected. The objective of this article is to illustrate how to use two multivariate statistical techniques, (1) agglomerative hierarchical cluster analysis (AHCA) and (2) principal component analysis (PCA), and two advanced multi-component modeling methods, (1) Gaussian and (2) Lorentzian multi-component peak modeling, for molecular spectrum analysis of bio-tissues. The studies indicated that the two multivariate analyses (AHCA, PCA) are able to discriminate molecular spectra by utilizing not just one intensity or frequency point of a molecular spectrum but the entire spectral information. Gaussian and Lorentzian modeling techniques are able to quantify the spectral component peaks of molecular structure, functional groups and biopolymers. By applying these four statistical and modeling methods, inherent molecular structures, functional groups and biopolymer conformations between and among biological samples can be quantified, discriminated and classified with great efficiency.
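
    A minimal sketch of the Gaussian/Lorentzian multi-component peak modeling mentioned above, using SciPy's least-squares fitter; wavenumber and absorbance are hypothetical arrays for one measured spectrum, and the initial guesses p0 (here amide-I-like band positions) are purely illustrative.

        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(x, a, c, w):
            return a * np.exp(-((x - c) / w) ** 2)

        def lorentzian(x, a, c, w):
            return a * w ** 2 / ((x - c) ** 2 + w ** 2)

        def two_peaks(x, a1, c1, w1, a2, c2, w2):
            # composite model: one Gaussian plus one Lorentzian component
            return gaussian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

        p0 = [1.0, 1655.0, 15.0, 0.6, 1630.0, 12.0]   # illustrative starting guesses
        popt, _ = curve_fit(two_peaks, wavenumber, absorbance, p0=p0)

    The fitted parameters (amplitude, center, width per component) are what quantify each sub-band's contribution.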

  7. Wilcoxon signed-rank-based technique for the pulse-shape analysis of HPGe detectors

    NASA Astrophysics Data System (ADS)

    Martín, S.; Quintana, B.; Barrientos, D.

    2016-07-01

    The characterization of the electric response of segmented-contact high-purity germanium detectors requires scanning systems capable of accurately associating each pulse with the position of the interaction that generated it. This process requires an algorithm sensitive to changes above the electronic noise in the pulse shapes produced at different positions, depending on the resolution of the Ge crystal. In this work, a pulse-shape comparison technique based on the Wilcoxon signed-rank test has been developed. It provides a method to distinguish pulses coming from different interaction points in the germanium crystal. Therefore, this technique is a necessary step for building a reliable pulse-shape database that can be used later for the determination of the position of interaction for γ-ray tracking spectrometry devices such as AGATA, GRETA or GERDA. The method was validated by comparison with a χ2 test using simulated and experimental pulses corresponding to a Broad Energy germanium detector (BEGe).
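
    A minimal sketch of a Wilcoxon signed-rank comparison of two pulses, assuming baseline-aligned, amplitude-normalized pulses sampled on the same grid; the actual scanning procedure involves more elaborate statistics, and the function name and threshold are illustrative.

        import numpy as np
        from scipy.stats import wilcoxon

        def pulses_match(pulse_a, pulse_b, alpha=0.05):
            """Signed-rank test on sample-wise differences of two normalized pulses.
            p > alpha (fail to reject H0): the shapes are statistically
            indistinguishable, i.e. consistent with the same interaction position.
            Assumes the two pulses are not identical (all-zero differences)."""
            diff = np.asarray(pulse_a, float) - np.asarray(pulse_b, float)
            stat, p = wilcoxon(diff)
            return p > alpha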

  8. Complexity analysis of sleep and alterations with insomnia based on non-invasive techniques.

    PubMed

    Holloway, Philip M; Angelova, Maia; Lombardo, Sara; St Clair Gibson, Alan; Lee, David; Ellis, Jason

    2014-04-06

For the first time, fractal analysis techniques are implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia, with comparisons made against healthy subjects. Analysis was carried out for 21 healthy individuals with no diagnosed sleep disorders and 26 subjects diagnosed with acute insomnia during night-time hours. Detrended fluctuation analysis was applied in order to look for 1/f-fluctuations indicative of high complexity. The aim is to investigate whether complexity analysis can differentiate between people who sleep normally and people who suffer from acute insomnia. We hypothesize that the complexity will be higher in subjects who suffer from acute insomnia owing to increased night-time arousals. This hypothesis, although contrary to much of the literature surrounding complexity in physiology, was found to be correct for our study. The complexity results for nearly all of the subjects fell within a 1/f-range, indicating the presence of underlying control mechanisms. The subjects with acute insomnia displayed significantly higher correlations, confirmed by significance testing, possibly a result of too much activity in the underlying regulatory systems. Moreover, we found a linear relationship between complexity and variability, both of which increased with the onset of insomnia. Complexity analysis is very promising and could prove to be a useful non-invasive identifier for people who suffer from sleep disorders such as insomnia.
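
    A compact sketch of detrended fluctuation analysis as used above, assuming an actigraphy count series x in a NumPy array; the scale set is an illustrative choice.

        import numpy as np

        def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
            """Detrended fluctuation analysis; alpha near 1 indicates 1/f-like fluctuations."""
            y = np.cumsum(np.asarray(x, float) - np.mean(x))   # integrated profile
            F = []
            for s in scales:
                n = len(y) // s
                segs = y[: n * s].reshape(n, s)
                t = np.arange(s)
                # residuals after removing the linear trend in each window
                resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
                F.append(np.sqrt(np.mean(np.square(resid))))
            # scaling exponent alpha from the log-log slope of F(s)
            return np.polyfit(np.log(scales), np.log(F), 1)[0]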

  9. Complexity analysis of sleep and alterations with insomnia based on non-invasive techniques

    PubMed Central

    Holloway, Philip M.; Angelova, Maia; Lombardo, Sara; St Clair Gibson, Alan; Lee, David; Ellis, Jason

    2014-01-01

For the first time, fractal analysis techniques are implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia, with comparisons made against healthy subjects. Analysis was carried out for 21 healthy individuals with no diagnosed sleep disorders and 26 subjects diagnosed with acute insomnia during night-time hours. Detrended fluctuation analysis was applied in order to look for 1/f-fluctuations indicative of high complexity. The aim is to investigate whether complexity analysis can differentiate between people who sleep normally and people who suffer from acute insomnia. We hypothesize that the complexity will be higher in subjects who suffer from acute insomnia owing to increased night-time arousals. This hypothesis, although contrary to much of the literature surrounding complexity in physiology, was found to be correct for our study. The complexity results for nearly all of the subjects fell within a 1/f-range, indicating the presence of underlying control mechanisms. The subjects with acute insomnia displayed significantly higher correlations, confirmed by significance testing, possibly a result of too much activity in the underlying regulatory systems. Moreover, we found a linear relationship between complexity and variability, both of which increased with the onset of insomnia. Complexity analysis is very promising and could prove to be a useful non-invasive identifier for people who suffer from sleep disorders such as insomnia. PMID:24501273

  10. Analysis to feature-based video stabilization/registration techniques within application of traffic data collection

    NASA Astrophysics Data System (ADS)

    Sadat, Mojtaba T.; Viti, Francesco

    2015-02-01

Machine vision is rapidly gaining popularity in the field of Intelligent Transportation Systems. In particular, advantages are foreseen in the exploitation of Aerial Vehicles (AVs) for delivering a superior view of traffic phenomena. However, vibration on AVs makes it difficult to extract moving objects on the ground. To partly overcome this issue, image stabilization/registration procedures are adopted to correct and stitch multiple frames taken of the same scene but from different positions, angles, or sensors. In this study, we examine the impact of multiple feature-based techniques for stabilization, and we show that the SURF detector outperforms the others in terms of time efficiency and output similarity.
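
    A minimal sketch of feature-based frame stabilization with OpenCV, assuming grayscale frames as NumPy arrays. ORB is used here because it ships freely with OpenCV; the study itself found SURF the best detector, so this is a stand-in pipeline rather than the authors' exact configuration.

        import cv2
        import numpy as np

        def stabilize_frame(ref_gray, cur_gray):
            """Warp the current frame onto the reference using matched local features."""
            orb = cv2.ORB_create(1000)
            k1, d1 = orb.detectAndCompute(ref_gray, None)
            k2, d2 = orb.detectAndCompute(cur_gray, None)
            matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
            src = np.float32([k2[m.trainIdx].pt for m in matches])   # points in current
            dst = np.float32([k1[m.queryIdx].pt for m in matches])   # points in reference
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)     # robust to outliers
            h, w = ref_gray.shape
            return cv2.warpPerspective(cur_gray, H, (w, h))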

  11. A discrete wavelet based feature extraction and hybrid classification technique for microarray data analysis.

    PubMed

    Bennet, Jaison; Ganaprakasam, Chilambuchelvan Arul; Arputharaj, Kannan

    2014-01-01

In earlier days, cancer classification by doctors and radiologists was based on morphological and clinical features and had limited diagnostic ability. The recent arrival of DNA microarray technology has enabled the concurrent monitoring of thousands of gene expressions on a single chip, which has stimulated progress in cancer classification. In this paper, we propose a hybrid approach for microarray data classification based on k-nearest neighbor (KNN), naive Bayes, and support vector machine (SVM) classifiers. Feature selection prior to classification plays a vital role, and a feature selection technique combining the discrete wavelet transform (DWT) and the moving window technique (MWT) is used. The performance of the proposed method is compared with that of the conventional classifiers: support vector machine, nearest neighbor, and naive Bayes. Experiments conducted on both real and benchmark datasets indicate that the ensemble approach produces higher classification accuracy than the conventional classifiers. This work serves as an automated system for the classification of cancer that can be applied by doctors in real cases, a boon to the medical community, and it further reduces the misclassification of cancers, which is unacceptable in cancer detection.
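
    A minimal sketch of DWT feature extraction followed by a KNN/naive-Bayes/SVM ensemble, assuming PyWavelets and scikit-learn; expression (a samples-by-genes matrix) and y (class labels) are hypothetical, and soft voting stands in for the paper's specific hybrid scheme.

        import numpy as np
        import pywt
        from sklearn.ensemble import VotingClassifier
        from sklearn.naive_bayes import GaussianNB
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC

        def dwt_features(profile, wavelet="db4", level=3, keep=16):
            """Leading coefficients of each DWT sub-band of one expression profile."""
            coeffs = pywt.wavedec(np.asarray(profile, float), wavelet, level=level)
            return np.concatenate([c[:keep] for c in coeffs])

        # expression, y: hypothetical microarray matrix and class labels
        X = np.array([dwt_features(row) for row in expression])
        ensemble = VotingClassifier(
            [("knn", KNeighborsClassifier()), ("nb", GaussianNB()),
             ("svm", SVC(probability=True))],
            voting="soft").fit(X, y)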

  12. Analysis of meteorological variables in the Australasian region using ground- and space-based GPS techniques

    NASA Astrophysics Data System (ADS)

    Kuleshov, Yuriy; Choy, Suelynn; Fu, Erjiang Frank; Chane-Ming, Fabrice; Liou, Yuei-An; Pavelyev, Alexander G.

    2016-07-01

Results are presented from an analysis of meteorological variables (temperature and moisture) in the Australasian region using global positioning system (GPS) radio occultation (RO) and ground-based GPS observations, verified with in situ radiosonde (RS) data. The potential of ground-based GPS observations for retrieving column-integrated precipitable water vapour (PWV) over the Australian continent has been demonstrated using the Australian ground-based GPS reference station network. Using data from 15 ground-based GPS stations, the state of the atmosphere over Victoria during a significant weather event, the March 2010 Melbourne storm, was investigated, and it was shown that GPS observations have potential for monitoring the movement of a weather front with a sharp moisture contrast. Temperature and moisture variability in the atmosphere over various climatic regions (the Indian and Pacific Oceans, the Antarctic and Australia) was examined using satellite-based GPS RO and in situ RS observations. Investigating recent atmospheric temperature trends over Antarctica, time series of collocated GPS RO and RS data were examined, and strong cooling in the lower stratosphere and warming through the troposphere over Antarctica were identified, in agreement with the outputs of climate models. With further expansion of the Global Navigation Satellite Systems (GNSS), it is expected that GNSS satellite- and ground-based measurements will provide an order-of-magnitude larger amount of data, which in turn could significantly advance weather forecasting services, climate monitoring and analysis in the Australasian region.

  13. Quantitative Techniques in Volumetric Analysis

    NASA Astrophysics Data System (ADS)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of it, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are:
    • Quantitative transfer of a solid with a weighing spoon
    • Quantitative transfer of a solid with a finger-held weighing bottle
    • Quantitative transfer of a solid with a paper-strap-held bottle
    • Quantitative transfer of a solid with a spatula
    • Examples of common quantitative weighing errors
    • Quantitative transfer of a solid from dish to beaker to volumetric flask
    • Quantitative transfer of a solid from dish to volumetric flask
    • Volumetric transfer pipet
    • A complete acid-base titration
    • Hand technique variations
    The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. [Figure 2: Transfer of a solid with a spatula.] Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to ensure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  14. Feature analysis of pathological speech signals using local discriminant bases technique.

    PubMed

    Umapathy, K; Krishnan, S

    2005-07-01

Speech is an integral part of the human communication system. Various pathological conditions affect vocal function, inducing speech disorders. Acoustic parameters of speech are commonly used for the assessment of speech disorders and for monitoring a patient's progress over the course of therapy. In the last two decades, signal-processing techniques have been successfully applied to the screening of speech disorders. In this paper, a novel approach is proposed to classify pathological speech signals using a local discriminant bases (LDB) algorithm and wavelet packet decompositions. The focus is to demonstrate the significance of identifying, in a computationally efficient way, the signal subspaces that contribute to the discriminatory characteristics of normal and pathological speech signals. Features were extracted from the target subspaces for classification, and time-frequency decomposition was used to eliminate the need for segmentation of the speech signals. The technique was tested on a database of 212 speech signals (51 normal and 161 pathological) using the Daubechies wavelet (db4). Classification accuracies of up to 96% were achieved for a two-group classification into normal and pathological speech signals, and 74% for a four-group classification into male normal, female normal, male pathological and female pathological signals.
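
    A minimal sketch in the spirit of LDB subspace selection, assuming PyWavelets: compute a normalized energy map over terminal wavelet-packet sub-bands per signal, then rank sub-bands by a simple Fisher-type separability score between the two classes. This simplified energy-map criterion is an assumption for illustration; the actual LDB algorithm uses a full best-basis search.

        import numpy as np
        import pywt

        def subband_energies(sig, wavelet="db4", level=4):
            """Normalized energy over the terminal wavelet-packet sub-bands."""
            wp = pywt.WaveletPacket(np.asarray(sig, float), wavelet, maxlevel=level)
            e = np.array([np.sum(np.square(n.data)) for n in wp.get_level(level, "freq")])
            return e / e.sum()

        def fisher_scores(E_norm, E_path):
            """Rank sub-bands by class separability of their energies.
            E_norm, E_path: (n_signals, n_subbands) energy maps per class."""
            return ((E_norm.mean(0) - E_path.mean(0)) ** 2
                    / (E_norm.var(0) + E_path.var(0) + 1e-12))

    The highest-scoring sub-bands would then supply the features passed to a classifier.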

  15. Sensitivity analysis of laboratory based mine overburden analytical techniques for the prediction of acidic mine drainage. Final report

    SciTech Connect

    Bradham, W.S.; Caruccio, F.T.

    1995-09-01

A three-part sensitivity analysis was conducted to evaluate commonly used mine overburden analytical techniques. The primary objectives of the study were to: (1) identify and evaluate the effects of variability in mine overburden geochemistry, as measured by pyrite weight percent and neutralization potential (NP), on the variability of contaminant production; (2) determine which acid/base accounting interpretation technique best predicts both qualitative and quantitative leachate quality in laboratory analytical testing; and (3) identify the predominant factors of weathering cells, soxhlet extraction, and column leaching tests, and evaluate the variability of contaminant production due to variations in storage conditions, leachant temperature, particle size, particle sorting efficiency, and leaching interval.

  16. A Block-matching based technique for the analysis of 2D gel images.

    PubMed

    Freire, Ana; Seoane, José A; Rodríguez, Alvaro; Ruiz-Romero, Cristina; López-Campos, Guillermo; Dorado, Julián

    2010-01-01

Research at the protein level is a useful practice in personalized medicine. More specifically, 2D gel images obtained after the electrophoresis process can lead to an accurate diagnosis. Several computational approaches try to help clinicians establish the correspondence between pairs of proteins across multiple 2D gel images. Most of them perform the alignment of a patient image against a reference image. In this work, an approach based on block-matching techniques is developed. Its main characteristic is that it does not need to perform a whole-image alignment between the two images; it considers each protein separately. A comparison with other published methods is presented. It can be concluded that this method works over a broad range of proteomic images, even those with a high level of difficulty.
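
    A minimal sketch of one block-matching step via normalized cross-correlation, assuming scikit-image; block (the pixels around one protein spot in the reference gel) and search_window (a sub-image of the patient gel around the expected position) are hypothetical inputs, and this simple criterion stands in for whatever matching cost the authors actually use.

        import numpy as np
        from skimage.feature import match_template

        def match_protein_block(block, search_window):
            """Locate a protein spot's block inside a search window of the second gel."""
            corr = match_template(search_window, block)       # normalized cross-correlation map
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            return dy, dx, corr.max()                         # best offset and its score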

  17. Quantitative elemental analysis of an industrial mineral talc, using accelerator-based analytical technique

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, A. O.; Mazzoli, C.; Ceccato, D.; Ajayi, E. O. B.; De Poli, M.; Moschini, G.

    2005-10-01

The accelerator-based technique of PIXE was employed for the determination of the elemental concentrations of an industrial mineral, talc. Talc is a very versatile mineral in industry, with several applications. There is therefore a need to know its constituents to ensure that workers are not exposed to health risks. Besides, microscopic tests on some talc samples in Nigeria confirm that they fall within the British Pharmacopoeia (BP) standard for tablet formulation. However, for these samples to become a local source of raw material for pharmaceutical-grade talc, their precise elemental compositions must be established, which is the focus of this work. The proton beam produced by the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy was used for the PIXE measurements. The results, which show the concentrations of the different elements in the talc samples, their health implications and their metabolic roles, are presented and discussed.

  18. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however, such observations are expensive and time-limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground-based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high-spatial-resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial-frequency-based analytical techniques, and these features were used to train a weighted k-nearest-neighbour (KNN) classifier to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously by a visible-wavelength colour camera at the same installation, with approximately the same field of view as the infrared device; these images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate: a Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
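
    A minimal sketch of texture-feature extraction plus a weighted KNN classifier, assuming scikit-image and scikit-learn; images (8-bit thermal sky images) and labels (observer-assigned cloud types) are hypothetical, and grey-level co-occurrence statistics stand in for the paper's full 45-feature set.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops
        from sklearn.neighbors import KNeighborsClassifier

        def glcm_features(img8):
            """Spatial texture statistics from grey-level co-occurrence matrices.
            img8: 2-D uint8 image."""
            g = graycomatrix(img8, distances=[1, 2], angles=[0, np.pi / 2],
                             levels=256, symmetric=True, normed=True)
            props = ("contrast", "homogeneity", "energy", "correlation")
            return np.hstack([graycoprops(g, p).ravel() for p in props])

        # images, labels: hypothetical training data
        X = np.array([glcm_features(im) for im in images])
        knn = KNeighborsClassifier(n_neighbors=5, weights="distance").fit(X, labels)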

  19. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

Automated and accurate capture of an object's spatial motion is necessary for a wide variety of applications in industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the data obtained, along with convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, which are based on different physical principles (accelerometry, magnetometry, time-of-flight, vision), optical motion capture systems have a set of advantages such as high acquisition speed and the potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capturing process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from two to four technical vision cameras for capturing video sequences of object motion. Original camera calibration and external orientation procedures provide the basis for highly accurate 3D measurements. A set of algorithms is developed and tested, both for detecting, identifying and tracking similar targets and for marker-less object motion capture. The results of the algorithms' evaluation show high robustness and reliability for various motion analysis tasks in technical and biomechanical applications.

  20. A Perspective Study of Koumiss Microbiome by Metagenomics Analysis Based on Single-Cell Amplification Technique

    PubMed Central

    Yao, Guoqiang; Yu, Jie; Hou, Qiangchuan; Hui, Wenyan; Liu, Wenjun; Kwok, Lai-Yu; Menghe, Bilige; Sun, Tiansong; Zhang, Heping; Zhang, Wenyi

    2017-01-01

Koumiss is a traditional fermented dairy product and a good source for isolating novel bacteria with biotechnological potential. In the present study, we applied the single-cell amplification technique in a metagenomics analysis of koumiss, an approach aimed at detecting the low-abundance bacteria in the koumiss. Briefly, each sample was first serially diluted to the level of approximately 100 cells, and three diluted bacterial suspensions were then randomly picked for further study. By analyzing 30 diluted koumiss suspensions, a total of 24 bacterial species were identified. In addition to the previously reported koumiss-associated species, such as Lactobacillus (L.) helveticus, Lactococcus lactis, L. buchneri, L. kefiranofaciens, and Acetobacter pasteurianus, we successfully detected three low-abundance taxa in the samples, namely L. otakiensis, Streptococcus macedonicus, and Ruminococcus torques. The functional koumiss metagenomes carried putative genes related to lactose metabolism and the synthesis of typical flavor compounds. Our study would encourage the use of modern metagenomics to discover novel species of bacteria that could be useful in food industries. PMID:28223973

  1. Analysis of Sediment Transport for Rivers in South Korea based on Data Mining technique

    NASA Astrophysics Data System (ADS)

    Jang, Eun-kyung; Ji, Un; Yeo, Woonkwang

    2017-04-01

The purpose of this study is to assess sediment discharge for rivers in South Korea using data mining. The Model Tree was selected for this study, as it is the data mining technique most suitable for explicitly analyzing the relationship between input and output variables in large and diverse databases. To derive a sediment discharge equation using the Model Tree, the dimensionless variables used in the Engelund and Hansen, Ackers and White, Brownlie, and van Rijn equations were adopted as analytical conditions. In total, 14 analytical conditions were set, considering dimensional variables and combinations of dimensionless and dimensional variables according to the relationship between the flow and the sediment transport. For each case, the results were evaluated by means of the discrepancy ratio, root-mean-square error, mean absolute percent error, and correlation coefficient. The results showed that the best fit was obtained by using five dimensional variables: velocity, depth, slope, width, and median grain diameter. The closest approximation to this best fit was estimated from the depth, slope, width, median grain size of the bed material, and dimensionless tractive force, excepting the slope among the single variables. In addition, when the three most appropriate Model Tree variants are compared with the Ackers and White equation, the best fit among the existing equations, both the mean discrepancy ratio and the correlation coefficient of the Model Tree are improved relative to the Ackers and White equation.
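
    A minimal sketch of fitting and scoring a tree-based regressor on the best-fit variable set named above; X (columns: velocity, depth, slope, width, median grain diameter) and y (measured sediment discharge) are hypothetical data. Note that scikit-learn's DecisionTreeRegressor fits piecewise-constant leaves, so it is only a simplified stand-in for an M5-style Model Tree, which fits linear models in its leaves.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeRegressor

        # X, y: hypothetical observations; log-transforming discharge tames its skew
        tree = DecisionTreeRegressor(max_depth=5, min_samples_leaf=10)
        print(cross_val_score(tree, X, np.log10(y), cv=5, scoring="r2").mean())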

  2. Numerical analysis of radiation propagation in innovative volumetric receivers based on selective laser melting techniques

    NASA Astrophysics Data System (ADS)

    Alberti, Fabrizio; Santiago, Sergio; Roccabruna, Mattia; Luque, Salvador; Gonzalez-Aguilar, Jose; Crema, Luigi; Romero, Manuel

    2016-05-01

Volumetric absorbers constitute one of the key elements for achieving high thermal conversion efficiencies in concentrating solar power plants. Regardless of the working fluid or thermodynamic cycle employed, design trends towards higher absorber output temperatures are widespread, leading to a general need for components with high solar absorptance, high conduction within the receiver material, high internal convection, low radiative and convective heat losses and high mechanical durability. In this context, the use of advanced manufacturing techniques, such as selective laser melting, has allowed the fabrication of intricate geometries that are capable of fulfilling these requirements. This paper presents a parametric design and analysis of the optical performance of volumetric absorbers of variable porosity, conducted by means of detailed numerical ray-tracing simulations. Sections of variable macroscopic porosity along the absorber depth were constructed by the fractal growth of single-cell structures. The measures of performance analyzed include optical reflection losses from the absorber front and rear faces, penetration of radiation into the absorber volume, and radiation absorption as a function of absorber depth. The effects of engineering design parameters such as absorber length and wall thickness, material reflectance and porosity distribution on the optical performance of the absorbers are discussed, and general design guidelines are given.

  3. AVR Microcontroller-based automated technique for analysis of DC motors

    NASA Astrophysics Data System (ADS)

    Kaur, P.; Chatterji, S.

    2014-01-01

This paper provides essential information on the development of a 'dc motor test and analysis control card' based on the AVR-series ATMega32 microcontroller. The card can be interfaced to a PC, calculates parameters such as motor losses and efficiency, and plots characteristics of dc motors. Presently, different tests and methods are available to evaluate motor parameters, but this paper discusses a single, universal, user-friendly automated set-up. It has been accomplished by designing data acquisition and SCR bridge-firing hardware based on the AVR ATMega32 microcontroller. This hardware can drive phase-controlled rectifiers and acquire real-time values of the motor's current, voltage, temperature and speed. The various analyses feasible with the designed hardware are of immense importance to dc motor manufacturers and quality-sensitive users. Through this paper, the authors aim to provide details of this AVR-based hardware, which can be used for dc motor parameter analysis and also for motor control applications.

  4. [Rapid multi-elemental analysis on four precious Tibetan medicines based on LIBS technique].

    PubMed

    Liu, Xiao-na; Shi, Xin-yuan; Jia, Shuai-yun; Zhao, Na; Wu, Zhi-sheng; Qiao, Yan-jiang

    2015-06-01

Laser-induced breakdown spectroscopy (LIBS) was applied to perform a qualitative elemental analysis of four precious Tibetan medicines, i.e., Renqing Mangjue, Renqing Changjue, 25-herb coral pills and 25-herb pearl pills, and their specific spectra were established. In the experiment, an Nd:YAG pulsed laser operating at its fundamental wavelength of 1064 nm was used to collect the spectra: a laser beam focused on the surface of the samples generated a plasma whose spectral signal was detected with a spectrograph. LIBS spectral lines were identified on the basis of the National Institute of Standards and Technology (NIST) database. The four Tibetan medicines mainly contained Ca, Na, K, Mg and other elements, as well as the C-N molecular band. Specifically, Fe was detected in Renqing Changjue and the 25-herb pearl pills; the heavy-metal elements Hg and Cu were found in Renqing Mangjue and Renqing Changjue; and Ag was found in Renqing Changjue. The results demonstrate that LIBS provides reliable and rapid multi-element analysis of the four Tibetan medicines. With its real-time, rapid and nondestructive advantages, LIBS has wide application prospects in the elemental analysis of ethnic medicines.

  5. OSSE spectral analysis techniques

    NASA Technical Reports Server (NTRS)

    Purcell, W. R.; Brown, K. M.; Grabelsky, D. A.; Johnson, W. N.; Jung, G. V.; Kinzer, R. L.; Kroeger, R. A.; Kurfess, J. D.; Matz, S. M.; Strickman, M. S.

    1992-01-01

Analysis of the spectra from the Oriented Scintillation Spectrometer Experiment (OSSE) is complicated because of the typically low signal-to-noise ratio (approx. 0.1 percent) and the large background variability. The OSSE instrument was designed to address these difficulties by periodically offset-pointing the detectors from the source to perform background measurements. These background measurements are used to estimate the background during each of the source observations. The resulting background-subtracted spectra can then be accumulated and fitted for spectral lines and/or continua. Data selection based on various environmental parameters can be performed at various stages of the analysis procedure. To achieve the instrument's statistical sensitivity, however, investigators will need to develop a detailed understanding of the instrument operation, the data collection, and the background spectrum and its variability. A brief description of the major steps in the OSSE spectral analysis process is given, including a discussion of the OSSE background spectrum and examples of several observational strategies.

  6. Analysis of base fuze functioning of HESH ammunitions through high-speed photographic technique

    NASA Astrophysics Data System (ADS)

    Biswal, T. K.

    2007-01-01

High-speed photography plays a major role at a test range, where direct access to a dynamic process is possible through imaging, and both qualitative and quantitative data are obtained thereafter through image processing and analysis. In one set of trials it was difficult to understand the performance of HESH ammunition against rolled homogeneous armour: there was no consistency in scab formation even though all other parameters, such as propellant charge mass, charge temperature and impact velocity, were maintained constant. To understand the event thoroughly, high-speed photography was deployed to give a frontal view of the total process. Clear information on shell impact, embedding of the HE propellant on the armour and base fuze initiation was obtained. In scab-forming rounds these three processes are clearly observed in sequence. In non-scab rounds, however, the base fuze initiated before the completion of the embedding process, so the threshold thrust on the armour needed to cause a scab was not attained; this was revealed in two rounds in which scab formation failed. As a quantitative measure, the fuze delay was calculated for each round, and thereafter premature functioning of the base fuze was ascertained in the non-scab rounds. This capability of high-speed photography is depicted in detail in this paper.

  7. Bond strength analysis of custom base variables in indirect bonding techniques.

    PubMed

    Thompson, Michael A; Drummond, James L; BeGole, Ellen A

    2008-01-01

Various methods are used to prepare the cured composite-adhesive interface for orthodontic indirect bonding. The intent of this study was to determine the effect on shear bond strength of the following variables: use of a filled flowable composite resin as an adhesive, light air-abrasion of the cured composite bracket pad, and wetting of the cured composite bracket pad with an unfilled resin. The sample of 240 brackets was divided into 2 groups of 120 each; the first group was further divided into 4 groups of 30 each. Brackets were bonded to bovine incisors with a filled flowable composite resin (Filtek, 3M ESPE, St Paul, Minn), but the bracket pads were prepared differently in the 4 groups: unfilled resin applied (Orthosolo, Ormco, Glendora, Calif), surface air-abraded, surface air-abraded followed by application of the unfilled resin (Orthosolo), and a control group. A matching sample of 120 brackets was bonded without the flowable composite as an adhesive. The different bracket pad preparations were chosen to represent the various techniques clinicians use in indirect bonding. Shear bond strength was measured on a universal testing machine. Two-way ANOVA showed significant differences in shear bond strength among the different surface preparations, but not between use and non-use of the flowable composite. The Scheffé test showed that the mean shear bond strength of the air-abraded surface was significantly higher than that of all other surface preparations. Air-abrading the composite bracket-pad surfaces in indirect bonding increased the shear bond strength, whereas the use of flowable composite did not affect bond strengths.

  8. A discrimination technique for extensive air showers based on multiscale, lacunarity and neural network analysis

    NASA Astrophysics Data System (ADS)

    Pagliaro, Antonio; D'Alí Staiti, G.; D'Anna, F.

    2011-03-01

We present a new method for the identification of extensive air showers initiated by different primaries. The method uses the multiscale concept and is based on the analysis of the multifractal behaviour and lacunarity of secondary particle distributions, together with a properly designed and trained artificial neural network. In the present work the method is discussed and applied to a set of fully simulated vertical showers, in the experimental framework of ARGO-YBJ, to obtain hadron-to-gamma primary separation. We show that the presented approach gives very good results, leading, in the 1-10 TeV energy range, to a clear improvement in discrimination power with respect to the existing figures for extended shower detectors.
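
    A minimal sketch of a gliding-box lacunarity estimate, one of the multiscale descriptors named above; density_map is a hypothetical 2-D histogram of secondary-particle hits on the detector plane, and the estimator Lambda(box) = <M^2> / <M>^2 over box masses M is one standard definition, assumed here for illustration.

        import numpy as np
        from numpy.lib.stride_tricks import sliding_window_view

        def lacunarity(density_map, box):
            """Gliding-box lacunarity of a 2-D particle-density map at one box size."""
            # mass M inside every box-sized window of the map
            masses = sliding_window_view(density_map, (box, box)).sum(axis=(2, 3))
            m = masses.mean()
            return (masses ** 2).mean() / (m ** 2 + 1e-30)

    Evaluating this at several box sizes yields a lacunarity curve, which (with the multifractal measures) could feed the neural network inputs.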

  9. An Analysis Technique for Active Neutron Multiplicity Measurements Based on First Principles

    SciTech Connect

    Evans, Louise G; Goddard, Braden; Charlton, William S; Peerani, Paolo

    2012-08-13

Passive neutron multiplicity counting is commonly used to quantify the total mass of plutonium in a sample without prior knowledge of the sample geometry. However, passive neutron counting is less applicable to uranium measurements due to the low spontaneous fission rates of uranium. Active neutron multiplicity measurements are therefore used to determine the 235U mass in a sample. Unfortunately, there are additional challenges to overcome for uranium measurements, such as the coupling of the active source and the uranium sample. Techniques such as the coupling method have been developed to help reduce the dependence of active uranium measurements on calibration curves, although they still require known standards of similar geometry. An advanced active neutron multiplicity measurement method is being developed by Texas A&M University, in collaboration with Los Alamos National Laboratory (LANL), in an attempt to overcome the calibration-curve requirements. This method can be used to quantify the 235U mass in a sample containing uranium without using calibration curves. Furthermore, it is based on existing detectors and nondestructive assay (NDA) systems, such as the LANL Epithermal Neutron Multiplicity Counter (ENMC). The method uses an inexpensive boron carbide liner to shield the uranium sample from thermal and epithermal neutrons while allowing fast neutrons to reach the sample. Due to the relatively low and nearly constant energy-dependent fission and absorption cross-sections of uranium isotopes at high neutron energies, fast neutrons can penetrate the sample without significant attenuation. Fast-neutron interrogation therefore creates a homogeneous fission rate in the sample, allowing first-principles methods to be used to determine the 235U mass in the sample. This paper discusses the measurement method concept and development, including measurements and simulations performed to date, as well as the potential

  10. Maid Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Tiator, Lothar; Kamalov, Sabit

    2006-06-01

MAID is a unitary isobar model for a partial-wave analysis of pion photo- and electroproduction in the resonance region. It is fitted to the world data and can give predictions for multipoles, amplitudes, cross sections and polarization observables in the energy range from the pion threshold up to W = 2 GeV and photon virtualities Q^2 < 5 GeV^2. Using more recent experimental results from Mainz, Bates, Bonn and JLab for Q^2 up to 4.0 GeV^2, the Q^2 dependence of the helicity couplings A_1/2, A_3/2, S_1/2 has been extracted for a series of four-star resonances. We compare single-Q^2 analyses with a superglobal fit in a new parametrization of Maid2005. Besides the (pion) MAID, at Mainz we maintain a collection of online programs for partial-wave analysis of η, η' and kaon photo- and electroproduction, which are all based on a similar footing, with a field-theoretical background and baryon excitations in Breit-Wigner form.

  11. Cost Analysis of MRI Services in Iran: An Application of Activity Based Costing Technique.

    PubMed

    Bayati, Mohsen; Mahboub Ahari, Alireza; Badakhshan, Abbas; Gholipour, Mahin; Joulaei, Hassan

    2015-10-01

Considerable development of MRI technology in diagnostic imaging, the high cost of MRI technology and controversial issues concerning official charges (tariffs) have been the main motivations for defining and implementing this study. The present study aimed to calculate the unit cost of MRI services using activity-based costing (ABC) as a modern cost accounting system and to fairly compare the calculated unit costs with official charges (tariffs). We included both direct and indirect costs of MRI services delivered in fiscal year 2011 in Shiraz Shahid Faghihi hospital. The direct allocation method was used for the distribution of overhead costs, and a micro-costing approach was used to calculate the unit cost of all the different MRI services. Clinical cost data were retrieved from the hospital registration system, and the straight-line method was used for depreciation cost estimation. To cope with uncertainty and to increase the robustness of the study results, the unit costs of 33 MRI services were calculated under two scenarios. The total annual cost of the MRI activity center (AC) was calculated at USD 400,746 and USD 532,104 based on the first and second scenarios, respectively. Ten percent of the total cost was allocated from supportive departments. The annual variable costs of the MRI center were calculated at USD 295,904. Capital costs of USD 104,842 and USD 236,200 resulted from the first and second scenarios, respectively. Existing tariffs for more than half of the MRI services were above the calculated costs. As a public hospital, Shahid Faghihi hospital has considerable limitations in both its financial and administrative databases. Labor cost has the greatest share of the total annual cost of Shahid Faghihi hospital. The gap between unit costs and tariffs implies that the claim for extra budget from health providers may not be relevant for all services delivered by the studied MRI center. With some adjustments, ABC could be implemented in MRI centers. With the settlement of a reliable cost accounting system

  12. Cost Analysis of MRI Services in Iran: An Application of Activity Based Costing Technique

    PubMed Central

    Bayati, Mohsen; Mahboub Ahari, Alireza; Badakhshan, Abbas; Gholipour, Mahin; Joulaei, Hassan

    2015-01-01

Background: Considerable development of MRI technology in diagnostic imaging, the high cost of MRI technology and controversial issues concerning official charges (tariffs) have been the main motivations for defining and implementing this study. Objectives: The present study aimed to calculate the unit cost of MRI services using activity-based costing (ABC) as a modern cost accounting system and to fairly compare the calculated unit costs with official charges (tariffs). Materials and Methods: We included both direct and indirect costs of MRI services delivered in fiscal year 2011 in Shiraz Shahid Faghihi hospital. The direct allocation method was used for the distribution of overhead costs, and a micro-costing approach was used to calculate the unit cost of all the different MRI services. Clinical cost data were retrieved from the hospital registration system, and the straight-line method was used for depreciation cost estimation. To cope with uncertainty and to increase the robustness of the study results, the unit costs of 33 MRI services were calculated under two scenarios. Results: The total annual cost of the MRI activity center (AC) was calculated at USD 400,746 and USD 532,104 based on the first and second scenarios, respectively. Ten percent of the total cost was allocated from supportive departments. The annual variable costs of the MRI center were calculated at USD 295,904. Capital costs of USD 104,842 and USD 236,200 resulted from the first and second scenarios, respectively. Existing tariffs for more than half of the MRI services were above the calculated costs. Conclusion: As a public hospital, Shahid Faghihi hospital has considerable limitations in both its financial and administrative databases. Labor cost has the greatest share of the total annual cost of Shahid Faghihi hospital. The gap between unit costs and tariffs implies that the claim for extra budget from health providers may not be relevant for all services delivered by the studied MRI center. With some adjustments, ABC could be implemented in MRI

  13. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Spatial non-uniformity means that different locations in an image are of different importance in terms of perception; in other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or by objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign a different importance to each location in the image. Still, none of these objective metrics utilizes an analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROI at fine quality while the rest of the image is reconstructed at low quality. The color images reconstructed by this ROI approach were compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
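
    To make the ROI-weighting idea concrete, the sketch below evaluates SSIM separately inside and outside an ROI mask, assuming scikit-image (0.19+ for the channel_axis argument); ref and test are hypothetical 8-bit colour images, roi is a hypothetical 2-D boolean mask, and the 0.8/0.2 weighting is purely illustrative. This is not one of the paper's metrics, only an example of how ROI information could enter an objective criterion.

        import numpy as np
        from skimage.metrics import structural_similarity

        # ref, test: uint8 (H, W, 3) images; roi: boolean (H, W) region-of-interest mask
        score, smap = structural_similarity(ref, test, channel_axis=2, full=True)
        roi_score = smap[roi].mean()           # local SSIM inside the ROI
        bg_score = smap[~roi].mean()           # local SSIM in the background
        weighted = 0.8 * roi_score + 0.2 * bg_score   # illustrative ROI weighting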

  14. Comparative analysis of H.263 resilience techniques for H.223-based video transmission over slow-fading channels

    NASA Astrophysics Data System (ADS)

    Garzelli, Andrea; Abrardo, Andrea; Barni, Mauro; Marotta, D.

    2000-11-01

    The objective of this work is to analyze and compare H.263 resilience techniques for H.223-based real-time video transmission over narrow-band slow-fading channels. These channel conditions, which are typical of pedestrian video communications, are very critical: they require Forward Error Correction (FEC), since data retransmission is not feasible due to high network delay, and at the same time the bursty nature of the channel reduces the effectiveness of FEC techniques. In this work, two different strategies for H.263 video protection against channel errors are considered and compared. The strategies are tested over a slow-fading wireless channel, over which the H.263 video streams, organized and multiplexed by the H.223 Multiplex Protocol, are transmitted. Both the standard FEC techniques specified by the H.223 recommendation for equal error protection of the video stream and unequal error protection (UEP) through GOB synchronization are tested. The experimental results of this comparative analysis prove the superiority of the UEP technique for H.223-based video transmission.

  15. A comparative analysis of conventional cytopreparatory and liquid based cytological techniques (Sure Path) in evaluation of serous effusion fluids.

    PubMed

    Dadhich, Hrishikesh; Toi, Pampa Ch; Siddaraju, Neelaiah; Sevvanthi, Kalidas

    2016-11-01

    Clinically, detection of malignant cells in serous body fluids is critical, as their presence implies upstaging of the disease. Cytology of body cavity fluids serves as an important tool when other diagnostic tests cannot be performed. In most laboratories, effusion fluid samples are currently analysed chiefly by the conventional cytopreparatory (CCP) technique. Although there are several studies comparing liquid-based cytology (LBC) with the CCP technique in the field of cervicovaginal cytology, the literature on such a comparison with respect to serous body fluid examination is sparse. One hundred samples of serous body fluids were processed by both the CCP and LBC techniques. Slides prepared by these techniques were studied using six parameters. A comparative analysis of the advantages and disadvantages of the techniques in the detection of malignant cells was carried out with appropriate statistical tests. The samples comprised 52 pleural, 44 peritoneal and four pericardial fluids. No statistically significant difference was noted with respect to cellularity (P = 0.22), cell distribution (P = 0.39) and diagnosis of malignancy (P = 0.20). As for the remaining parameters, LBC provided a significantly clearer smear background (P < 0.0001) and shorter screening time (P < 0.0001), while the CCP technique provided significantly better staining quality (P = 0.01) and sharper cytomorphologic features (P = 0.05). Although reduced screening time and a clearer smear background are the two major advantages of LBC, the CCP technique provides better staining quality with sharper cytomorphologic features, which is more critical from the cytologic interpretation point of view. Diagn. Cytopathol. 2016;44:874-879. © 2016 Wiley Periodicals, Inc.
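
    The abstract does not name the tests used; purely as an illustration, one natural choice for paired detection outcomes when the same fluids go through both techniques is McNemar's test, sketched here with statsmodels on hypothetical counts.

        # Hedged sketch: McNemar's test for paired binary outcomes (malignancy
        # detected / not detected) when the same 100 fluids are processed by
        # both CCP and LBC. Counts below are placeholders, not the study's data.
        import numpy as np
        from statsmodels.stats.contingency_tables import mcnemar

        #                 LBC positive   LBC negative
        table = np.array([[28,            3],          # CCP positive
                          [1,            68]])         # CCP negative

        result = mcnemar(table, exact=True)            # exact binomial version
        print("McNemar p-value: %.3f" % result.pvalue)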

  16. A new approach to the analysis of alpha spectra based on neural network techniques

    NASA Astrophysics Data System (ADS)

    Baeza, A.; Miranda, J.; Guillén, J.; Corbacho, J. A.; Pérez, R.

    2011-10-01

    The analysis of alpha spectra requires good radiochemical procedures in order to obtain well differentiated alpha peaks in the spectrum, and the easiest way to analyze them is by directly summing the counts obtained in the Regions of Interest (ROIs). However, the low-energy tails of the alpha peaks frequently make this simple approach unworkable because some peaks partially overlap. Many fitting procedures have been proposed to solve this problem, most of them based on semi-empirical mathematical functions that emulate the shape of a theoretical alpha peak. The main drawback of these methods is that the large number of fitting parameters involved means that their physical meaning is obscure or completely lacking. We propose another approach: the application of an artificial neural network (ANN). Instead of fitting the experimental data to a mathematical function, the fit is carried out by an ANN that has previously been trained to model the shape of an alpha peak, using as training patterns several polonium spectra obtained from actual samples analyzed in our laboratory. In this sense, the ANN is able to learn the shape of an actual alpha peak. We have designed such an ANN as a feed-forward multi-layer perceptron with supervised training based on a back-propagation algorithm. The fitting procedure is based on the experimental observables that are characteristic of alpha peaks: the number of counts at the maximum and several peak widths at different heights. Polonium isotope spectra were selected because the alpha peaks corresponding to 208Po, 209Po, and 210Po are monoenergetic and well separated. The uncertainties introduced by this fitting procedure were less than the counting uncertainties. This new approach was applied to the problem of resolving overlapping peaks. Firstly, a theoretical study was carried out by artificially overlapping alpha peaks from actual samples in order to test the ability of the ANN to resolve each peak. Then, the ANN
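
    A toy Python sketch of the idea, not the authors' network or training spectra: synthetic alpha-like peaks (a Gaussian with a low-energy exponential tail) are generated, the stated observables (peak maximum and widths at several heights) are extracted, and a small scikit-learn MLP learns the mapping from observables to the full peak profile.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)
        channels = np.arange(128)

        def alpha_like_peak(pos, amp, sigma, tail):
            """Gaussian with a low-energy exponential tail, a common alpha-peak shape."""
            gauss = amp * np.exp(-0.5 * ((channels - pos) / sigma) ** 2)
            low_tail = 0.3 * amp * np.exp((channels - pos) / tail) * (channels < pos)
            return gauss + low_tail

        def observables(y):
            """Peak maximum plus widths (in channels) at 50%, 25% and 10% of maximum."""
            return [y.max()] + [int(np.count_nonzero(y > f * y.max()))
                                for f in (0.5, 0.25, 0.1)]

        peaks = np.array([alpha_like_peak(64, rng.uniform(0.5, 1.5),
                                          rng.uniform(1.5, 4.0), rng.uniform(3.0, 10.0))
                          for _ in range(500)])
        X = np.array([observables(y) for y in peaks], dtype=float)
        X /= X.max(axis=0)                              # crude feature scaling

        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
        net.fit(X, peaks)                               # observables -> full peak profile
        resid = net.predict(X[:1])[0] - peaks[0]
        print("reconstruction RMS: %.4f" % np.sqrt((resid ** 2).mean()))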

  17. Experimental investigation of evanescence-based infrared biodetection technique for micro-total-analysis systems

    NASA Astrophysics Data System (ADS)

    Chandrasekaran, Arvind; Packirisamy, Muthukumaran

    2009-09-01

    The advent of microoptoelectromechanical systems (MOEMS) and their integration with other technologies such as microfluidics, microthermal devices, and immunoproteomics has led to the concept of the integrated micro-total-analysis system (μTAS), or Lab-on-a-Chip, for chemical and biological applications. Recently, research and development of μTAS has grown significantly across several biodetection sciences, in situ medical diagnoses, and point-of-care testing applications. However, it is essential to develop suitable biophysical label-free detection methods for the success, reliability, and ease of use of μTAS. We proposed an infrared (IR)-based evanescent wave detection system on the silicon-on-insulator platform for biodetection with μTAS. The system operates on the principle of the bio-optical interaction that occurs due to the evanescence of light from the waveguide device. The feasibility of biodetection has been experimentally investigated through the detection of horseradish peroxidase upon its reaction with hydrogen peroxide.

  18. End effect analysis of linear induction motor based on the wavelet transform technique

    SciTech Connect

    Mori, Yoshihiko; Torii, Susumu; Ebihara, Daiki

    1999-09-01

    HSST (High Speed Surface Transport) is currently being developed for urban railway transportation systems in Japan. The HSST uses electromagnetic suspension and a short-stator Linear Induction Motor (LIM). The performance of the LIM is degraded by end effects. The LIM has been analyzed using Fourier series expansion to throw light on this problem; however, obtaining high accuracy with this technique requires a large number of calculations. Since wavelet coefficients converge rapidly to zero, the wavelet transform technique has instead been applied to analyze the end effects of the LIM. In this paper, the authors investigated the method for determining the mother wavelet.

  19. Advanced NMR-based techniques for pore structure analysis of coal. Final project report

    SciTech Connect

    Smith, D.M.; Hua, D.W.

    1996-02-01

    During the 3 year term of the project, new methods were developed for characterizing the pore structure of porous materials such as coals, carbons, and amorphous silica gels. In general, these techniques revolve around: (1) combining multiple techniques, such as small-angle x-ray scattering (SAXS) with adsorption of contrast-matched adsorbates, or {sup 129}Xe NMR with thermoporometry (the change in freezing point with pore size); (2) combining adsorption isotherms over several pressure ranges to obtain a more complete description of pore filling; or (3) applying NMR ({sup 129}Xe, {sup 14}N{sub 2}, {sup 15}N{sub 2}) techniques to well-defined porous solids with pores in the large micropore size range (>1 nm).

  20. A new algorithm developed based on a mixture of spectral and nonlinear techniques for the analysis of heart rate variability.

    PubMed

    Chen, S-W

    2007-01-01

    In this paper, an algorithm based on the joint use of spectral and nonlinear techniques for heart rate variability (HRV) analysis is proposed. First, the measured RR data are passed through a trimmed moving average (TMA)-based filtering system to generate a lower frequency (LF) time series and a higher frequency (HF) one, which approximately reflect the sympathetic and vagal activities, respectively. Since the Lyapunov exponent can be used to characterize the level of chaos in complex physiological systems, the largest Lyapunov exponents corresponding to the complex sympathetic and vagal systems are then estimated from the LF and HF time series, respectively, using an existing algorithm. Numerical results from a postural maneuver experiment indicate that the two characteristic exponents, or combinations of them, might serve as a set of innovative and robust indicators for HRV analysis, even under contamination by sparse impulses due to aberrant beats in the RR data.
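
    A deliberately simplified stand-in for the two stages, with every filter and estimator parameter invented: a trimmed moving average (robust to sparse impulses from aberrant beats) splits a synthetic RR series into LF and HF parts, and a crude Rosenstein-style slope estimates the largest Lyapunov exponent of each.

        import numpy as np

        def trimmed_moving_average(x, win=9, trim=2):
            """Moving average that drops the `trim` smallest and largest samples
            in each window, which makes it robust to sparse impulses."""
            out = np.empty_like(x, dtype=float)
            half = win // 2
            for i in range(len(x)):
                w = np.sort(x[max(0, i - half): i + half + 1])
                if len(w) > 2 * trim:
                    w = w[trim: len(w) - trim]
                out[i] = w.mean()
            return out

        def largest_lyapunov(x, dim=4, tau=1, horizon=15, exclude=10):
            """Crude Rosenstein-type estimate: slope of the mean log divergence
            of initially nearest (non-adjacent) neighbours in a delay embedding."""
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
            for i in range(n):                      # forbid temporally close neighbours
                d[i, max(0, i - exclude): i + exclude + 1] = np.inf
            nn = d.argmin(axis=1)
            steps = np.arange(horizon)
            div = []
            for k in steps:
                pairs = [(i, j) for i, j in enumerate(nn) if i + k < n and j + k < n]
                dists = np.array([np.linalg.norm(emb[i + k] - emb[j + k])
                                  for i, j in pairs])
                div.append(np.log(dists[dists > 0]).mean())
            return np.polyfit(steps, div, 1)[0]     # slope ~ largest exponent

        rng = np.random.default_rng(7)
        t = np.arange(300)
        rr = 0.8 + 0.03 * np.sin(2 * np.pi * t / 12) + 0.01 * rng.standard_normal(300)
        rr[rng.integers(0, 300, 5)] += 0.3          # sparse impulses (aberrant beats)

        lf = trimmed_moving_average(rr)             # slow component ~ sympathetic
        hf = rr - lf                                # fast residual ~ vagal
        print("LLE(LF): %.3f  LLE(HF): %.3f"
              % (largest_lyapunov(lf), largest_lyapunov(hf)))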

  1. Femoral fixation strength following soft-tissue posterolateral corner reconstruction using fibular-based technique: Biomechanical analysis of four techniques in normal and low-density synthetic bone.

    PubMed

    Gallo, Robert A; Sathyendra, Vikram; Sharkey, Neil A; Lewis, Gregory S

    2015-12-01

    Optimal femoral fixation of soft-tissue grafts has been described for anterior cruciate ligament (ACL) reconstruction. Posterolateral corner reconstruction differs from ACL reconstruction in two ways: (a) soft-tissue fixation into the femur requires two graft tails and (b) the line of force is different. Our purpose was to determine the optimal femoral fixation of soft-tissue grafts during posterolateral corner reconstruction. We hypothesized that interference screw fixation is the strongest technique in a normal-density lateral femoral condyle, whereas cortically-based fixation techniques are stronger in a low-density lateral femoral condyle. We evaluated elongation during cyclic loading, yield load, peak load-to-failure, and stiffness of four soft-tissue graft femoral fixation methods during posterolateral corner reconstruction. Our model used bovine flexor tendons and contoured synthetic bones. Grafts were secured to the lateral epicondyle in normal- or low-density bone models using a spiked washer, a button, an interference screw, or a button and interference screw. Five specimens of each construct were tested in each bone density. Analysis of variance with the Tukey-Kramer adjustment for multiple hypothesis testing was used. Six cadaver bones, whose density was quantified by computerized tomography, were tested using interference screw fixation. No method produced significantly stronger yield load or peak load-to-failure in normal-density bone. In low-density bone, cortically-based methods produced significantly higher yield load and peak load-to-failure. Yield load and peak load-to-failure were significantly higher in normal-density bone when using spiked washer or interference screw fixation. No femoral fixation method tested produced superior yield load or peak load-to-failure. Spiked washer and interference screw fixation are inferior fixation methods in low-density bone. For fibular-based posterolateral corner reconstructions, all fixation methods tested are
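
    As a sketch of the statistical step only, the snippet below runs Tukey-adjusted pairwise comparisons (equivalent to Tukey-Kramer when group sizes differ) across four fixation methods with statsmodels; the load values are synthetic placeholders, not the study's measurements.

        import numpy as np
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        rng = np.random.default_rng(2)
        methods = ["spiked_washer", "button", "interference_screw", "button+screw"]
        loads, labels = [], []
        for m, mean in zip(methods, (420.0, 510.0, 380.0, 540.0)):  # hypothetical N
            loads.extend(rng.normal(mean, 60.0, size=5))            # 5 specimens each
            labels.extend([m] * 5)

        # All pairwise mean comparisons with family-wise error control.
        print(pairwise_tukeyhsd(np.array(loads), np.array(labels), alpha=0.05))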

  2. Models, Web-Based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance

    DTIC Science & Technology

    2001-11-01

    may be complex or discontinuous, and to manage multiple, conflicting system objectives (Lu et al, 1991). Statistical methods, such as designed...

  3. A model reduction technique based on the PGD for elastic-viscoplastic computational analysis

    NASA Astrophysics Data System (ADS)

    Relun, N.; Néron, D.; Boucard, P. A.

    2013-01-01

    In this paper a model reduction approach for elastic-viscoplastic evolution problems is considered. Enhancement of the PGD reduced model by a new iterative technique involving only elastic problems is investigated and is shown to reduce CPU cost. The accuracy of the solution and the convergence properties are tested on an academic example, and a computation time comparison with the commercial finite element code Abaqus is presented for an industrial structure.

  4. Novel Laser-Based Technique is Ideal for Real-Time Environmental Analysis

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 2005

    2005-01-01

    Ocean Optics offers laser-induced breakdown spectrometer systems (LIBS) that can be used to identify light to heavy metals in a variety of sample types and geometries in environmental analysis applications. LIBS are versatile, real-time, high-resolution analyzers for qualitative analysis, in less than one second, of every element in solids,…

  5. Novel Laser-Based Technique is Ideal for Real-Time Environmental Analysis

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 2005

    2005-01-01

    Ocean Optics offers laser-induced breakdown spectrometer systems (LIBS) that can be used to identify light to heavy metals in a variety of sample types and geometries in environmental analysis applications. LIBS are versatile, real-time, high-resolution analyzers for qualitative analysis, in less than one second, of every element in solids,…

  6. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    NASA Astrophysics Data System (ADS)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-01

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  7. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    SciTech Connect

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-14

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  8. Application of the windowed-Fourier-transform-based fringe analysis technique for investigating temperature and concentration fields in fluids.

    PubMed

    Mohanan, Sharika; Srivastava, Atul

    2014-04-10

    The present work is concerned with the development and application of a novel fringe analysis technique based on the principles of the windowed Fourier transform (WFT) for the determination of temperature and concentration fields from interferometric images in a range of heat and mass transfer applications. Depending on the noise level associated with the experimental data, the technique has been coupled with two different phase unwrapping methods for phase extraction: the Itoh algorithm and the quality-guided phase unwrapping technique. To generate the experimental data, a range of experiments was carried out, including cooling of a vertical flat plate under free convection conditions, combustion of mono-propellant flames, and growth of organic as well as inorganic crystals from their aqueous solutions. The flat plate and combustion experiments are modeled as heat transfer applications wherein the interest is in determining the whole-field temperature distribution. Aqueous-solution-based crystal growth experiments were performed to simulate mass transfer phenomena, where the interest is in determining the two-dimensional solute concentration field around the growing crystal. A Mach-Zehnder interferometer was employed to record the path-integrated quantity of interest (temperature and/or concentration) in the form of interferometric images. The potential of the WFT method has also been demonstrated on numerically simulated phase data for varying noise levels, and the accuracy of phase extraction has been quantified in terms of root mean square errors. Three noise levels, i.e., 0%, 10%, and 20%, were considered. Results of the present study show that the WFT technique allows accurate extraction of phase values that can subsequently be converted into two-dimensional temperature and/or concentration distribution fields. Moreover, since WFT is a local processing technique, speckle patterns and the inherent
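
    A hedged one-dimensional sketch of the WFT idea on a synthetic fringe row: at each sample a Gaussian-windowed Fourier transform is evaluated on a frequency grid, the ridge (dominant frequency) is kept, and its phase is unwrapped with numpy's implementation of Itoh's rule. The paper's two-dimensional WFT and quality-guided unwrapping are more elaborate.

        import numpy as np

        def wft_phase_1d(signal, win_half=24, win_sigma=8.0, n_freq=128):
            t = np.arange(-win_half, win_half + 1)
            window = np.exp(-0.5 * (t / win_sigma) ** 2)
            freqs = np.linspace(0.05, np.pi, n_freq)         # candidate carriers (rad/sample)
            kernels = window * np.exp(-1j * np.outer(freqs, t))  # centred analysis atoms
            padded = np.pad(signal, win_half, mode="reflect")
            phase = np.empty(len(signal))
            for i in range(len(signal)):
                coeffs = kernels @ padded[i: i + 2 * win_half + 1]
                phase[i] = np.angle(coeffs[np.abs(coeffs).argmax()])  # ridge phase
            return np.unwrap(phase)                          # Itoh-style 1D unwrapping

        n = np.arange(512)
        true_phase = 0.5 * n + 10.0 * np.sin(2 * np.pi * n / 512)  # synthetic phase
        rng = np.random.default_rng(11)
        fringes = 1.0 + np.cos(true_phase) + 0.05 * rng.standard_normal(512)

        est = wft_phase_1d(fringes)
        err = (est - est.mean()) - (true_phase - true_phase.mean())
        print("RMS phase error (rad): %.3f" % np.sqrt((err ** 2).mean()))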

  9. Model building techniques for analysis.

    SciTech Connect

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others who contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment in which analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model (DSM) and carries all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process, and the DSM is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased to have an impact on overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that, in essence, govern how features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  10. On the use of surrogate-based modeling for the numerical analysis of Low Impact Development techniques

    NASA Astrophysics Data System (ADS)

    Brunetti, Giuseppe; Šimůnek, Jirka; Turco, Michele; Piro, Patrizia

    2017-05-01

    Mechanistic models have proven to be accurate tools for the numerical analysis of the hydraulic behavior of Low Impact Development (LID) techniques. However, their widespread adoption has been limited by their computational cost. Surrogate modeling addresses this issue by developing and using a computationally inexpensive surrogate of the original model. While surrogate modeling has previously been applied to various water-related and environmental modeling problems, no studies have used surrogate models for the analysis of LIDs. The aim of this research was thus to investigate the benefit of surrogate-based modeling in the numerical analysis of LIDs. The kriging technique was used to approximate the deterministic response of the widely used mechanistic model HYDRUS-2D, which was employed to simulate the variably saturated hydraulic behavior of a contained stormwater filter. The Nash-Sutcliffe efficiency (NSE) index was used to compare the simulated and measured outflows and served as the variable of interest for the construction of the response surface. The validated kriging model was first used to carry out a global sensitivity analysis of the unknown soil hydraulic parameters of the filter layer, revealing that only the shape parameter α and the saturated hydraulic conductivity Ks significantly affected the model response. Next, the Particle Swarm Optimization algorithm was used to estimate their values. An NSE value of 0.85 indicated good accuracy of the estimated parameters. Finally, the calibrated model was validated against an independent set of measured outflows with an NSE value of 0.8, which again corroborated the reliability of the surrogate-based optimized parameters.
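
    A compact sketch of the surrogate workflow under stated assumptions: a cheap analytic function stands in for the HYDRUS-2D-derived NSE response, a Gaussian process (kriging) is fitted to a small sample, and a dense grid search stands in for Particle Swarm Optimization. Parameter ranges are assumptions, not taken from the paper.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        def nse(sim, obs):
            """Nash-Sutcliffe efficiency, the response variable used in the paper."""
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        # Cheap analytic stand-in for the expensive simulator's NSE response over
        # (alpha, log10 Ks); in the paper each point would cost a HYDRUS-2D run.
        def simulator_nse(alpha, log_ks):
            return 0.9 - 300.0 * (alpha - 0.05) ** 2 - 0.5 * (log_ks + 1.0) ** 2

        rng = np.random.default_rng(4)
        X = np.column_stack([rng.uniform(0.01, 0.10, 30),   # alpha, assumed range
                             rng.uniform(-2.0, 0.0, 30)])   # log10 Ks, assumed range
        y = simulator_nse(X[:, 0], X[:, 1])

        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)                                        # kriging response surface

        # Dense grid search stands in for the paper's Particle Swarm Optimization.
        aa, kk = np.meshgrid(np.linspace(0.01, 0.10, 100), np.linspace(-2.0, 0.0, 100))
        grid = np.column_stack([aa.ravel(), kk.ravel()])
        best = grid[gp.predict(grid).argmax()]
        print("surrogate optimum: alpha=%.3f, Ks=10^%.2f" % (best[0], best[1]))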

  11. Comparative Study of Various Normal Mode Analysis Techniques Based on Partial Hessians

    PubMed Central

    GHYSELS, AN; VAN SPEYBROECK, VERONIQUE; PAUWELS, EWALD; CATAK, SARON; BROOKS, BERNARD R.; VAN NECK, DIMITRI; WAROQUIER, MICHEL

    2014-01-01

    Standard normal mode analysis becomes problematic for complex molecular systems, as a result of both the high computational cost and the excessive amount of information when the full Hessian matrix is used. Several partial Hessian methods have been proposed in the literature, yielding approximate normal modes. These methods aim at reducing the computational load and/or calculating only the relevant normal modes of interest in a specific application. Each method has its own (dis)advantages and application field but guidelines for the most suitable choice are lacking. We have investigated several partial Hessian methods, including the Partial Hessian Vibrational Analysis (PHVA), the Mobile Block Hessian (MBH), and the Vibrational Subsystem Analysis (VSA). In this article, we focus on the benefits and drawbacks of these methods, in terms of the reproduction of localized modes, collective modes, and the performance in partially optimized structures. We find that the PHVA is suitable for describing localized modes, that the MBH not only reproduces localized and global modes but also serves as an analysis tool of the spectrum, and that the VSA is mostly useful for the reproduction of the low frequency spectrum. These guidelines are illustrated with the reproduction of the localized amine-stretch, the spectrum of quinine and a bis-cinchona derivative, and the low frequency modes of the LAO binding protein. PMID:19813181
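
    A compact numpy illustration of the core idea shared by these methods: restrict the Hessian to a chosen subsystem block, mass-weight it, and diagonalize. The matrix, masses, and atom selection are toy placeholders, and the real PHVA, MBH, and VSA schemes differ precisely in how they treat the environment.

        import numpy as np

        rng = np.random.default_rng(5)
        n_atoms = 6                                    # toy system
        H = rng.standard_normal((3 * n_atoms, 3 * n_atoms))
        H = H @ H.T                                    # symmetric PSD stand-in Hessian
        masses = rng.uniform(1.0, 16.0, n_atoms)       # amu, hypothetical

        subsystem = [0, 1, 2]                          # atoms allowed to move
        idx = np.concatenate([np.arange(3 * a, 3 * a + 3) for a in subsystem])
        H_sub = H[np.ix_(idx, idx)]                    # partial Hessian block

        m = np.repeat(masses[subsystem], 3)
        H_mw = H_sub / np.sqrt(np.outer(m, m))         # mass-weighted partial Hessian
        eigvals, modes = np.linalg.eigh(H_mw)
        freqs = np.sqrt(np.abs(eigvals))               # proportional to mode frequencies
        print("approximate mode frequencies (arb. units):", np.round(freqs, 3))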

  12. Comparative study of various normal mode analysis techniques based on partial Hessians.

    PubMed

    Ghysels, An; Van Speybroeck, Veronique; Pauwels, Ewald; Catak, Saron; Brooks, Bernard R; Van Neck, Dimitri; Waroquier, Michel

    2010-04-15

    Standard normal mode analysis becomes problematic for complex molecular systems, as a result of both the high computational cost and the excessive amount of information when the full Hessian matrix is used. Several partial Hessian methods have been proposed in the literature, yielding approximate normal modes. These methods aim at reducing the computational load and/or calculating only the relevant normal modes of interest in a specific application. Each method has its own (dis)advantages and application field but guidelines for the most suitable choice are lacking. We have investigated several partial Hessian methods, including the Partial Hessian Vibrational Analysis (PHVA), the Mobile Block Hessian (MBH), and the Vibrational Subsystem Analysis (VSA). In this article, we focus on the benefits and drawbacks of these methods, in terms of the reproduction of localized modes, collective modes, and the performance in partially optimized structures. We find that the PHVA is suitable for describing localized modes, that the MBH not only reproduces localized and global modes but also serves as an analysis tool of the spectrum, and that the VSA is mostly useful for the reproduction of the low frequency spectrum. These guidelines are illustrated with the reproduction of the localized amine-stretch, the spectrum of quinine and a bis-cinchona derivative, and the low frequency modes of the LAO binding protein. 2009 Wiley Periodicals, Inc.

  13. A new QMR-based technique for body composition analysis in infants

    USDA-ARS?s Scientific Manuscript database

    Accurate assessment and tracking of infant body composition is useful in evaluation of the amount and quality of weight gain, which can provide key information in both clinical and research settings. Thus, body composition analysis (BCA) results can be used to monitor and evaluate infant growth patt...

  14. 2D Flood Modelling Using Advanced Terrain Analysis Techniques And A Fully Continuous DEM-Based Rainfall-Runoff Algorithm

    NASA Astrophysics Data System (ADS)

    Nardi, F.; Grimaldi, S.; Petroselli, A.

    2012-12-01

    Remotely sensed Digital Elevation Models (DEMs), now largely available at high resolution, and the advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic model optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series, which are routed along the channel using a two-dimensional hydraulic model for a detailed representation of the inundation process. The main advantage of the proposed approach is that it characterizes the entire physical process during extreme hydrologic events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model removes the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjective analysis and uncertainty in standard flood mapping methods. Selected case studies show the results and performance of the proposed procedure with respect to standard event-based approaches.

  15. [Measurement Error Analysis and Calibration Technique of NTC - Based Body Temperature Sensor].

    PubMed

    Deng, Chi; Hu, Wei; Diao, Shengxi; Lin, Fujiang; Qian, Dahong

    2015-11-01

    An NTC thermistor-based wearable body temperature sensor was designed. This paper describes the design principles and realization of the NTC-based body temperature sensor and analyzes the temperature measurement error sources of the sensor in detail. An automatic measurement and calibration method for the ADC error is given. The results showed that the measurement accuracy of the calibrated body temperature sensor is better than ±0.04 °C. The temperature sensor offers high accuracy, small size, and low power consumption.
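
    A minimal sketch of the conversion chain such a sensor needs, assuming a 12-bit ADC, a 10 kΩ series resistor, and Beta-model datasheet constants (all hypothetical): ADC code to divider voltage, to NTC resistance, to temperature, with a hook for two-point offset/gain calibration.

        import math

        VREF, ADC_MAX = 3.3, 4095                      # 12-bit ADC, hypothetical
        R_FIXED = 10_000.0                             # series resistor (ohm), assumed
        R25, BETA, T25 = 10_000.0, 3950.0, 298.15      # NTC datasheet values, assumed

        def adc_to_temp_c(code):
            v = VREF * code / ADC_MAX                  # NTC on the low side of the divider
            r_ntc = R_FIXED * v / (VREF - v)
            inv_t = 1.0 / T25 + math.log(r_ntc / R25) / BETA   # Beta-parameter model
            return 1.0 / inv_t - 273.15

        # Two-point calibration against a reference thermometer removes most of
        # the residual ADC offset/gain error (the paper reports <= 0.04 deg C).
        def calibrated(code, gain=1.0, offset=0.0):
            return gain * adc_to_temp_c(code) + offset

        print("raw reading 2048 -> %.2f deg C" % adc_to_temp_c(2048))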

  16. Kinematical Comparison Analysis on the Discus Athletes Throwing Techniques Based on Data Project

    NASA Astrophysics Data System (ADS)

    Junming, Li; Jihe, Zhou; Ting, Long

    2017-09-01

    At the discus final of the throwing event series of Chinese track and field in April 2015, a three-dimensional camera analysis method, an application of kinematic data processing, was used to study the throwing technique of female discus athletes. The last-exertion actions of the top four discus throwers were analyzed and the related kinematic parameters obtained. The analysis shows that: first, Lu Xiaoxin achieves a better body-twist effect when the left foot lands and a better "beyond the implement" position, followed by Su Xinyue and Tan Jian, with Feng Bin relatively weaker; second, the release speed of these athletes needs to be improved compared with the world's elite female discus throwers; third, the discus is released slightly early, with Tan Jian throwing at a reasonable release angle, Feng Bin and Lu Xiaoxin at larger angles, and Su Xinyue at a smaller angle. Feng Bin has a greater release height, followed by Lu Xiaoxin and Tan Jian.

  17. Research on fault diagnosis technique on aerocamera communication based on fault tree analysis

    NASA Astrophysics Data System (ADS)

    Li, Lijuan; He, Binggao; Tian, Chengjun; Yang, Chengyu; Duan, Jie

    2008-12-01

    ARINC429 is the standard for digital data transmission among avionic devices. This paper used fault tree analysis to diagnose failures of the aerocamera's ARINC429 communication: a fault tree of the aerocamera 429 communication was built, the failures were analyzed and diagnosed, the detection flow was designed, and an aerocamera 429 communication detection system was finally implemented. This detection system can test the aerocamera 429 communication board quickly and effectively and shortens the time needed to clear faults. In addition, it provides guidance for maintenance and repair and improves the overall functioning of the aerocamera.

  18. The Role of Liquid Based Cytology and Ancillary Techniques in the Peritoneal Washing Analysis: Our Institutional Experience

    PubMed Central

    Rossi, Esther; Bizzarro, Tommaso; Martini, Maurizio; Longatto-Filho, Adhemar; Schmitt, Fernando; Fagotti, Anna; Scambia, Giovanni; Zannoni, Gian Franco

    2017-01-01

    Background The cytological analysis of peritoneal effusions serves as a diagnostic and prognostic aid for both primary and metastatic diseases. Among the different cytological preparations, liquid-based cytology (LBC) represents a feasible and reliable method that also enables the application of ancillary techniques (i.e., immunocytochemistry (ICC) and molecular testing). Methods We recorded 10,348 LBC peritoneal effusions between January 2000 and December 2014. They were classified as non-diagnostic (ND), negative for malignancy (NM), atypical/suspicious for malignancy (SM), and positive for malignancy (PM). Results The cytological diagnoses included 218 ND, 9,035 NM, 213 SM, and 882 PM. A total of 8,048 cases with histological follow-up (7,228 NM, 115 SM, 705 PM) were included. Our NM cases included 21 malignant and 7,207 benign histological diagnoses. Our 820 SMs+PMs were diagnosed as 107 malignancies of unknown origin (30 SM and 77 PM), 691 metastatic lesions (81 SM and 610 PM), 9 lymphomas (2 SM and 7 PM), 9 mesotheliomas (1 SM and 8 PM), and 4 sarcomas (1 SM and 3 PM). Primary gynecological cancers contributed 64% of the cases. We documented 97.4% sensitivity, 99.9% specificity, 98% diagnostic accuracy, 99.7% negative predictive value (NPV), and 99.7% positive predictive value (PPV). Furthermore, the morphological diagnoses were supported by either 173 conclusive ICC results or 50 molecular analyses. Specifically, molecular testing was performed for EGFR and KRAS mutational analysis based on previous or concurrent diagnoses of non-small cell lung cancer (NSCLC) and colon carcinoma. We identified 10 EGFR mutations in NSCLC and 7 KRAS mutations on LBC stored material. Conclusions Peritoneal cytology is an adjunctive tool in the surgical management of tumors, mostly gynecological cancers. LBC maximizes the application of ancillary techniques such as ICC and molecular analysis, with feasible diagnostic and predictive yields even in controversial cases. PMID:28099523
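
    The reported figures are standard 2x2 confusion-table statistics; the helper below shows how they are computed. The counts are placeholders, since the abstract does not give the full table.

        def diagnostic_metrics(tp, fp, fn, tn):
            """Standard screening-test statistics from a 2x2 confusion table."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
                "accuracy": (tp + tn) / (tp + fp + fn + tn),
            }

        # Hypothetical counts, not reconstructed from the study.
        print(diagnostic_metrics(tp=780, fp=2, fn=20, tn=7200))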

  19. High-precision technique for in-situ testing of the PZT scanner based on fringe analysis

    NASA Astrophysics Data System (ADS)

    Wang, Daodang; Yang, Yongying; Liu, Dong; Zhuo, Yongmo

    2010-08-01

    A technique based on fringe analysis is presented for in-situ testing of a PZT scanner, including end rotation analysis and displacement measurement. With interferograms acquired in a Twyman-Green interferometer, the testing can be carried out in real time. The end rotation of the PZT scanner and its spatial displacement deviation are analyzed by processing the fringe rotation and changes in fringe interval; the displacement of the PZT scanner is determined from the fringe shift by a template-matching algorithm, from which the relation between driving voltage and displacement is measured to calibrate the nonlinearity of the PZT scanner. Computer simulation and experiments show that the proposed technique for in-situ testing of the PZT scanner takes a short time and achieves precise measurement of displacement as well as of the end rotation angle and displacement deviation. The proposed method has high efficiency and precision, and is of great practicality for in-situ calibration of the PZT scanner.

  20. Scatterometer-based scanner fingerprinting technique(ScatterLith) and its applications in image field and ACLV analysis

    NASA Astrophysics Data System (ADS)

    Wang, Changan; Zhang, Gary; DeMoor, Stephen J.; Boehm, Mark A.; Littau, Michael E.; Raymond, Christopher J.

    2003-06-01

    The ability to fingerprint the lenses of advanced lithography scanners accurately, quickly, and automatically has always been a dream for lithographers. Such fingerprinting is truly necessary to understand the error sources of ACLV (across-chip linewidth variation), especially as optical lithography is pushed into the 130 nm regime and beyond. This dream has become a reality at Texas Instruments with the help of scatterometry. This paper describes the development and characterization of the scatterometer-based scanner lens testing technique (ScatterLith) and its application to 193 nm and 248 nm scanner lens fingerprinting. The entire procedure includes a full-field exposure through focus in a micro-stepping mode, scatterometer measurement of the focus matrix, image field analysis, and mapping of lens curvature, astigmatism, spherical aberration, line-through-pitch behavior, and ACLV. ACLV has been directly correlated with image field deviation, lens aberration, and illumination source errors. Examples are given to illustrate the technique's applications in accurate focus monitoring, with enhanced capability for dynamic image field and lens signature mapping, for the latest ArF and KrF scanners used in manufacturing environments at the 130 nm node and beyond. Analysis of CD variation across a full scanner field is done through a step-by-step image field correction procedure, so that the ACLV contribution of each image field error can be quantified separately. The final across-slit CD signature is further analyzed against possible errors from illumination uniformity, illumination pupil fill, and higher-order projection lens aberrations. High accuracy and short cycle time make this new technique a very effective tool for in-line real-time monitoring and scanner qualification. Its fingerprinting capability also provides lithography engineers a comprehensive understanding of scanner performance for CD control and tool matching. Its extendibility to 90 nm and beyond is particularly attractive for future

  1. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    DOE PAGES

    Miller, Mary A.; Tangyunyong, Paiboon; Edward I. Cole, Jr.

    2016-01-12

    In this study, laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  2. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    SciTech Connect

    Miller, Mary A.; Tangyunyong, Paiboon; Edward I. Cole, Jr.

    2016-01-12

    In this study, laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  3. Mechanism analysis on biofouling detection based on optical fiber sensing technique

    NASA Astrophysics Data System (ADS)

    Ma, Huiping; Yuan, Feng; Liu, Yongmeng; Jiang, Xiuzhen

    2010-08-01

    On-line monitoring of biofouling in industrial water process systems is receiving increasing attention. In this paper, a biofouling detection mechanism based on optical fiber sensing technology is put forward. With the optical characteristics of biofouling formation and the relation between light intensity and refractive index studied, a schematic diagram of the optical fiber self-referencing detection system and its technological flowchart are presented. Immunity to electromagnetic interference and other influencing factors, by which the precision is greatly improved, is another remarkable characteristic. The absorption spectrum of the fluid medium molecules is measured by infrared spectroscopy, and impurities are analyzed via the characteristic fingerprints of different liquids. Other pollutant sources can be identified by means of infrared spectra combined with artificial neural network (ANN) algorithms. The technique can also be used in other fields such as mining, environmental protection, medical treatment, and the transportation of oil, gas, and water.

  4. Analysis of protein coatings on gold nanoparticles by XPS and liquid-based particle sizing techniques.

    PubMed

    Belsey, Natalie A; Shard, Alex G; Minelli, Caterina

    2015-03-27

    The precise use of nanoparticles in technological applications requires control over their surface properties. This implies the ability to quantitatively describe, for example, molecular coatings in terms of their thickness, areal mass, or number of molecules. Here, the authors describe two different approaches to the measurement of these parameters by using gold nanoparticles ranging in diameter from 10 to 80 nm and coated with three different proteins: immunoglobulin G, bovine serum albumin, and a peptide. One approach utilizes ultraviolet-visible spectroscopy, dynamic light scattering, and differential centrifugal sedimentation to measure the protein shell refractive indices and thicknesses, from which the number of molecules in the protein shell can be derived. The other approach employs x-ray photoelectron spectroscopy to measure the thickness of the dry molecular coatings and also to derive the number of molecules in the protein shell. The authors demonstrate that the two approaches, although very different, produce consistent measurement results. This finding is important to extend the quantitative analysis of nanoparticle molecular coatings to a wide range of materials.

  5. Urban major road extraction from IKONOS imagery based on modified texture progressing analysis technique

    NASA Astrophysics Data System (ADS)

    Wu, Xuewen; Xu, Hanqiu; Wu, Pingli

    2010-11-01

    A method for urban major road extraction from IKONOS imagery was proposed. The texture features of the image were first analyzed at three different levels. The first level calculated the Mahalanobis distance between test pixels and training pixels. The second level calculated the Bhattacharyya distance between the distribution of the pixels in the training area and that of the pixels within a 3×3 window in the test area. The third level employed co-occurrence matrices over a texture cube built around each pixel, after which the Bhattacharyya distance was used again. The processed results were thresholded and thinned, respectively. With the assistance of the geometric characteristics of roads, the three resultant images corresponding to the three levels were scored using fuzzy mathematics for their likelihood of belonging to a road and then merged together. A knowledge-based algorithm was used to link the segmented roads. The result was finally optimized by polynomial fitting. The experiment shows that the proposed method can effectively extract urban major roads from high-resolution imagery such as IKONOS.
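
    For reference, minimal numpy implementations of the two distance measures used in the first two levels are sketched below, with invented three-band pixel statistics standing in for the IKONOS training and test data.

        import numpy as np

        def mahalanobis(x, mean, cov):
            """Distance of a test pixel from a class with the given statistics."""
            diff = x - mean
            return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

        def bhattacharyya_gaussian(m1, c1, m2, c2):
            """Bhattacharyya distance between two multivariate Gaussians."""
            c = 0.5 * (c1 + c2)
            diff = m1 - m2
            term1 = 0.125 * diff @ np.linalg.inv(c) @ diff
            term2 = 0.5 * np.log(np.linalg.det(c) /
                                 np.sqrt(np.linalg.det(c1) * np.linalg.det(c2)))
            return float(term1 + term2)

        rng = np.random.default_rng(6)
        road = rng.normal(0.60, 0.05, size=(200, 3))   # hypothetical road training pixels
        other = rng.normal(0.35, 0.08, size=(200, 3))  # hypothetical background pixels
        test_pixel = np.array([0.58, 0.61, 0.57])

        print("Mahalanobis to road class: %.2f"
              % mahalanobis(test_pixel, road.mean(0), np.cov(road.T)))
        print("Bhattacharyya road vs background: %.2f"
              % bhattacharyya_gaussian(road.mean(0), np.cov(road.T),
                                       other.mean(0), np.cov(other.T)))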

  6. Urban major road extraction from IKONOS imagery based on modified texture progressing analysis technique

    NASA Astrophysics Data System (ADS)

    Wu, Xuewen; Xu, Hanqiu; Wu, Pingli

    2009-09-01

    A method for urban major road extraction from IKONOS imagery was proposed. The texture features of the image were first analyzed at three different levels. The first level calculated the Mahalanobis distance between test pixels and training pixels. The second level calculated the Bhattacharyya distance between the distribution of the pixels in the training area and that of the pixels within a 3×3 window in the test area. The third level employed co-occurrence matrices over a texture cube built around each pixel, after which the Bhattacharyya distance was used again. The processed results were thresholded and thinned, respectively. With the assistance of the geometric characteristics of roads, the three resultant images corresponding to the three levels were scored using fuzzy mathematics for their likelihood of belonging to a road and then merged together. A knowledge-based algorithm was used to link the segmented roads. The result was finally optimized by polynomial fitting. The experiment shows that the proposed method can effectively extract urban major roads from high-resolution imagery such as IKONOS.

  7. Comparative analysis of whole genome sequencing-based telomere length measurement techniques.

    PubMed

    Lee, Michael; Napier, Christine E; Yang, Sile F; Arthur, Jonathan W; Reddel, Roger R; Pickett, Hilda A

    2017-02-01

    Telomeres are regions of repetitive DNA at the ends of human chromosomes that function to maintain the integrity of the genome. Telomere attrition is associated with cellular ageing, whilst telomere maintenance is a prerequisite for malignant transformation. Whole genome sequencing (WGS) captures sequence information from the entire genome, including the telomeres, and is increasingly being applied in research and in the clinic. Several bioinformatics tools have been designed to determine telomere content and length from WGS data, and include Motif_counter, TelSeq, Computel, qMotif, and Telomerecat. These tools utilise different approaches to identify, quantify and normalise telomeric reads; however, it is not known how they compare to one another. Here we describe the details and utility of each tool, and directly compare WGS telomere length output with laboratory-based telomere length measurements. In addition, we evaluate the accessibility, practicality, speed, and additional features of each tool. Each tool was tested using a range of telomere read extraction criteria, to determine the optimal parameters for the specific WGS read length. The aim of this article is to improve the accessibility of WGS telomere length measurement tools, which have the potential to be applied to WGS cohorts for clinical as well as research benefit. Copyright © 2016 Elsevier Inc. All rights reserved.
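
    The tools listed differ in detail, but all rest on identifying telomeric reads and normalising the count. A deliberately oversimplified Python sketch of that shared idea follows (motif counting only, with no BAM parsing, GC correction, or coverage normalisation); the repeat threshold and toy reads are arbitrary.

        import re

        TEL_RE = re.compile(r"(TTAGGG){4,}")           # >= 4 consecutive repeats

        def revcomp(seq):
            return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

        def telomeric_fraction(reads):
            """Fraction of reads carrying a telomeric repeat run on either strand."""
            hits = sum(1 for r in reads
                       if TEL_RE.search(r) or TEL_RE.search(revcomp(r)))
            return hits / len(reads)

        reads = ["TTAGGG" * 20, "CCCTAA" * 20, "ACGTACGT" * 15]   # toy reads
        print("telomeric read fraction: %.2f" % telomeric_fraction(reads))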

  8. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
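
    As a small illustration of the model classes this standard names, the snippet below fits linear, quadratic, and exponential trends to an invented monthly series with numpy and compares residual errors.

        import numpy as np

        t = np.arange(24, dtype=float)                 # e.g., months of data
        y = 5.0 * np.exp(0.08 * t) + np.random.default_rng(7).normal(0, 0.5, 24)

        fits = {
            "linear": np.polyval(np.polyfit(t, y, 1), t),
            "quadratic": np.polyval(np.polyfit(t, y, 2), t),
            # exponential fit via least squares on log(y), valid while y > 0
            "exponential": np.exp(np.polyval(np.polyfit(t, np.log(y), 1), t)),
        }
        for name, yhat in fits.items():
            print("%11s: RMS residual = %.3f"
                  % (name, np.sqrt(np.mean((y - yhat) ** 2))))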

  9. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidic-related techniques for the isolation, trapping, and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell. PMID:26213918

  10. 2D wavelet-analysis-based calibration technique for flat-panel imaging detectors: application in cone beam volume CT

    NASA Astrophysics Data System (ADS)

    Tang, Xiangyang; Ning, Ruola; Yu, Rongfeng; Conover, David L.

    1999-05-01

    The application of newly developed flat panel x-ray imaging detectors in cone beam volume CT has attracted increasing interest recently. Due to imperfect solid-state array manufacturing processes, however, defective elements, gain non-uniformity, and an offset image unavoidably exist in all kinds of flat panel x-ray imaging detectors, causing severe streak and ring artifacts in cone beam reconstruction images and severely degrading image quality. A calibration technique is presented in this paper in which the artifacts resulting from the defective elements, gain non-uniformity, and offset image can be reduced significantly. The detection of defective elements is distinctively based upon two-dimensional (2D) wavelet analysis. Because of its inherent localizability in recognizing singularities or discontinuities, wavelet analysis possesses the capability of detecting defective elements over a rather large x-ray exposure range, e.g., 20% to approximately 60% of the dynamic range of the detector used. Three-dimensional (3D) images of a low-contrast CT phantom have been reconstructed from projection images acquired by a flat panel x-ray imaging detector with and without the calibration process applied. The artifacts caused individually by defective elements, gain non-uniformity, and the offset image have been separated and investigated in detail, and their correlations with each other have also been exposed explicitly. The investigation is reinforced by quantitative analysis of the signal-to-noise ratio (SNR) and the image uniformity of the cone beam reconstruction image. It has been demonstrated that the ring and streak artifacts resulting from the imperfect performance of a flat panel x-ray imaging detector can be reduced dramatically, so that the image quality of a cone beam reconstruction image, such as contrast resolution and image uniformity, is improved significantly. Furthermore, with little modification, the calibration technique presented here is also applicable
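
    A toy PyWavelets sketch of the detection idea only: defective elements behave as point singularities in a flat-field frame, so they stand out in the detail sub-bands of a single-level 2D wavelet transform. The frame, defect positions, wavelet, and threshold are all illustrative.

        import numpy as np
        import pywt

        rng = np.random.default_rng(8)
        flat = 1000 + rng.normal(0, 5, (256, 256))     # synthetic flat-field exposure
        defects = [(40, 50), (130, 200), (220, 31)]    # hypothetical dead pixels
        for r, c in defects:
            flat[r, c] = 0.0

        _, (cH, cV, cD) = pywt.dwt2(flat, "haar")      # single-level 2D transform
        detail = np.abs(cH) + np.abs(cV) + np.abs(cD)
        thresh = detail.mean() + 8 * detail.std()      # illustrative outlier threshold
        rows, cols = np.nonzero(detail > thresh)
        # Each hit in the half-resolution detail maps back to a 2x2 pixel block.
        print("suspected defects near:",
              sorted({(2 * r, 2 * c) for r, c in zip(rows, cols)}))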

  11. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials

    PubMed Central

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-01

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839

  12. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    PubMed

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.

  13. Comparative analysis of DNA polymorphisms and phylogenetic relationships among Syzygium cumini Skeels based on phenotypic characters and RAPD technique.

    PubMed

    Singh, Jitendra P; Singh, Ak; Bajpai, Anju; Ahmad, Iffat Zareen

    2014-01-01

    The Indian black berry (Syzygium cumini Skeels) has great nutraceutical and medicinal properties. As in other fruit crops, fruit characteristics are important attributes for differentiation, and these were determined for different accessions of S. cumini. The fruit weight, length, breadth, length:breadth ratio, pulp weight, pulp content, seed weight, and pulp:seed ratio varied significantly among accessions. Molecular characterization was carried out using the PCR-based RAPD technique. Out of 80 RAPD primers, only 18 produced stable polymorphisms and were used to examine the phylogenetic relationships. A total of 207 loci were generated, of which 201 were found polymorphic. The average genetic dissimilarity among jamun accessions was 97 per cent. The phylogenetic relationships were also examined by principal coordinates analysis (PCoA), which explained 46.95 per cent cumulative variance. The two-dimensional PCoA analysis showed grouping of the different accessions into four sub-plots, representing clusters of accessions. The UPGMA (r = 0.967) and NJ (r = 0.987) dendrograms constructed from the dissimilarity matrix revealed a good degree of fit with the cophenetic correlation values. The dendrogram grouped the accessions into three main clusters according to their eco-geographical regions, which gives useful insight into their phylogenetic relationships.
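
    A brief scipy sketch of the clustering step under stated assumptions: a random binary band matrix stands in for the RAPD data, Jaccard dissimilarity for the band-sharing measure, and 'average' linkage gives the UPGMA tree together with its cophenetic correlation.

        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.cluster.hierarchy import linkage, cophenet

        rng = np.random.default_rng(9)
        bands = rng.integers(0, 2, size=(10, 207))     # 10 accessions x 207 loci, random

        dist = pdist(bands, metric="jaccard")          # band-sharing dissimilarity
        tree = linkage(dist, method="average")         # 'average' linkage = UPGMA
        r, _ = cophenet(tree, dist)                    # cophenetic correlation
        print("cophenetic correlation: %.3f" % r)
        # scipy.cluster.hierarchy.dendrogram(tree) would draw the tree.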

  14. Comparative analysis of DNA polymorphisms and phylogenetic relationships among Syzygium cumini Skeels based on phenotypic characters and RAPD technique

    PubMed Central

    Singh, Jitendra P; Singh, AK; Bajpai, Anju; Ahmad, Iffat Zareen

    2014-01-01

    The Indian black berry (Syzygium cumini Skeels) has great nutraceutical and medicinal properties. As in other fruit crops, fruit characteristics are important attributes for differentiation, and these were determined for different accessions of S. cumini. The fruit weight, length, breadth, length:breadth ratio, pulp weight, pulp content, seed weight, and pulp:seed ratio varied significantly among accessions. Molecular characterization was carried out using the PCR-based RAPD technique. Out of 80 RAPD primers, only 18 produced stable polymorphisms and were used to examine the phylogenetic relationships. A total of 207 loci were generated, of which 201 were found polymorphic. The average genetic dissimilarity among jamun accessions was 97 per cent. The phylogenetic relationships were also examined by principal coordinates analysis (PCoA), which explained 46.95 per cent cumulative variance. The two-dimensional PCoA analysis showed grouping of the different accessions into four sub-plots, representing clusters of accessions. The UPGMA (r = 0.967) and NJ (r = 0.987) dendrograms constructed from the dissimilarity matrix revealed a good degree of fit with the cophenetic correlation values. The dendrogram grouped the accessions into three main clusters according to their eco-geographical regions, which gives useful insight into their phylogenetic relationships. PMID:24966521

  15. Near-infrared spectral image analysis of pork marbling based on Gabor filter and wide line detector techniques.

    PubMed

    Huang, Hui; Liu, Li; Ngadi, Michael O; Gariépy, Claude; Prasher, Shiv O

    2014-01-01

    Marbling is an important quality attribute of pork. Detection of pork marbling usually involves subjective scoring, which raises efficiency costs for the processor. In this study, the ability to predict pork marbling using near-infrared (NIR) hyperspectral imaging (900-1700 nm) and appropriate image processing techniques was studied. Near-infrared images were collected from pork after marbling evaluation according to the current standard chart of the National Pork Producers Council. Image analysis techniques (Gabor filter, wide line detector, and spectral averaging) were applied to extract texture, line, and spectral features, respectively, from NIR images of pork. Samples were grouped into calibration and validation sets. Wavelength selection was performed on the calibration set by a stepwise regression procedure. Prediction models of pork marbling scores were built using multiple linear regression based on derivatives of mean spectra and line features at key wavelengths. The results showed that the derivatives of both texture and spectral features produced good results, with validation correlation coefficients of 0.90 and 0.86, respectively, using wavelengths of 961, 1186, and 1220 nm. The results revealed the great potential of the Gabor filter for analyzing NIR images of pork for effective and efficient objective evaluation of pork marbling.
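
    As a sketch of the texture-feature step only: the snippet below applies a small Gabor filter bank to a single NIR band image with scikit-image and summarizes each response by its mean magnitude. The filter frequency and orientation count are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np
    from skimage.filters import gabor

    def gabor_texture_features(band_image, frequency=0.1, n_orient=4):
        """Mean Gabor magnitude at n_orient orientations for one NIR band."""
        feats = []
        for k in range(n_orient):
            theta = k * np.pi / n_orient
            real, imag = gabor(band_image, frequency=frequency, theta=theta)
            feats.append(np.hypot(real, imag).mean())
        return np.array(feats)
    ```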

  16. Technique based on LED multispectral imaging and multivariate analysis for monitoring the conservation state of the Dead Sea Scrolls.

    PubMed

    Marengo, Emilio; Manfredi, Marcello; Zerbinati, Orfeo; Robotti, Elisa; Mazzucco, Eleonora; Gosetti, Fabio; Bearman, Greg; France, Fenella; Shor, Pnina

    2011-09-01

    The aim of this project is the development of a noninvasive technique based on LED multispectral imaging (MSI) for monitoring the conservation state of the Dead Sea Scrolls (DSS) collection. It is well known that changes in the parchment reflectance drive the transition of the scrolls from legible to illegible. Capitalizing on this fact, we will use spectral imaging to detect changes in the reflectance before they become visible to the human eye. The technique uses multivariate analysis and statistical process control theory. The present study was carried out on a "sample" parchment of calfskin. The monitoring of the surface of a commercial modern parchment, aged consecutively for 2 h and 6 h at 80 °C and 50% relative humidity (ASTM), was performed at the Imaging Lab of the Library of Congress (Washington, DC, U.S.A.). MSI is here carried out in the vis-NIR range limited to 1 μm, with 13 bands and bandwidths ranging from about 10 nm in the UV to 40 nm in the IR. Results showed that we could detect and locate changing pixels, on the basis of reflectance changes, after only a few "hours" of aging.
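
    One common way to combine multivariate analysis with control-chart theory, as the abstract describes, is a Hotelling T^2 chart on per-pixel spectra; whether the authors used exactly this statistic is an assumption. A minimal simulated sketch with the 13 bands mentioned above:

    ```python
    import numpy as np

    def hotelling_t2(pixels, mean, cov_inv):
        """T^2 statistic for an (n_pixels, n_bands) array of spectra."""
        d = pixels - mean
        return np.einsum("ij,jk,ik->i", d, cov_inv, d)

    rng = np.random.default_rng(0)
    baseline = rng.normal(0.5, 0.02, size=(5000, 13))   # unaged reference spectra
    aged = baseline.copy()
    aged[:200] += 0.1                                   # simulated reflectance change

    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))
    limit = np.percentile(hotelling_t2(baseline, mu, cov_inv), 99.9)
    t2 = hotelling_t2(aged, mu, cov_inv)
    print("changed pixels flagged:", int((t2 > limit).sum()))
    ```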

  17. An expert diagnostic system based on neural networks and image analysis techniques in the field of automated cytogenetics.

    PubMed

    Beksaç, M S; Eskiizmirliler, S; Cakar, A N; Erkmen, A M; Dağdeviren, A; Lundsteen, C

    1996-03-01

    In this study, we introduce an expert system for intelligent chromosome recognition and classification based on artificial neural networks (ANN) and features obtained by automated image analysis techniques. A microscope equipped with a CCTV camera, integrated with an IBM-PC compatible computer environment including a frame grabber, is used for image data acquisition. Features of the chromosomes are obtained directly from the digital chromosome images. Two new algorithms for automated object detection and object skeletonizing constitute the basis of the feature extraction phase, which constructs the components of the input vector to the ANN part of the system. This first version of our intelligent diagnostic system uses a trained unsupervised neural network structure and an original rule-based classification algorithm to produce a karyotyped form of randomly distributed chromosomes over a complete metaphase. We investigate the effects of network parameters on the classification performance and discuss the adaptability and flexibility of the neural system in order to reach a structure whose output includes information about both structural and numerical abnormalities. Moreover, the classification performances of the neural and rule-based systems are compared for each chromosome class.

  18. Compendium on Risk Analysis Techniques

    DTIC Science & Technology

    The evolution of risk analysis in the materiel acquisition process is traced from the Secretary Packard memorandum to current AMC guidance. Risk analysis is defined, and many of the existing techniques are described in light of this definition and their specific role in program management and

  19. Graph-based symbolic technique and its application in the frequency response bound analysis of analog integrated circuits.

    PubMed

    Tlelo-Cuautle, E; Rodriguez-Chavez, S; Palma-Rodriguez, A A

    2014-01-01

    A new graph-based symbolic technique (GBST) for deriving exact analytical expressions, such as the transfer function H(s) of an analog integrated circuit (IC), is introduced herein. The derived H(s) of a given analog IC is used to compute the frequency response bounds (maximum and minimum) associated with the magnitude and phase of H(s), subject to some ranges of process variational parameters, by performing nonlinear constrained optimization. Our simulations demonstrate the usefulness of the new GBST for deriving the exact symbolic expression for H(s), and the last section highlights the good agreement between the frequency response bounds computed by our variational analysis approach and traditional Monte Carlo simulations. In conclusion, performing variational analysis with the proposed GBST to compute the frequency response bounds of analog ICs shows a gain in computing time of 100x for a differential circuit topology and 50x for a 3-stage amplifier, compared to traditional Monte Carlo simulations.
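
    The bound-computation step can be illustrated independently of the GBST itself: once an analytical H(s) is available, SciPy's constrained optimizer can search the parameter box for the extreme response at each frequency. The one-pole H(s) = 1/(1 + sRC) and the +/-10% parameter ranges below are stand-ins, not the paper's circuits.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def mag_db(params, w):
        """Magnitude in dB of H(jw) = 1 / (1 + jw*R*C) for given R, C."""
        R, C = params
        return 20 * np.log10(abs(1.0 / (1.0 + 1j * w * R * C)))

    bounds = [(0.9e3, 1.1e3), (0.9e-9, 1.1e-9)]   # +/-10% process variation
    w = 2 * np.pi * 1e6                           # evaluate at 1 MHz
    lo = minimize(lambda p: mag_db(p, w), x0=[1e3, 1e-9], bounds=bounds)
    hi = minimize(lambda p: -mag_db(p, w), x0=[1e3, 1e-9], bounds=bounds)
    print(f"|H| at 1 MHz in [{lo.fun:.2f}, {-hi.fun:.2f}] dB")
    ```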

  20. Comprehensive theoretical analysis and experimental exploration of ultrafast microchip-based high-field asymmetric ion mobility spectrometry (FAIMS) technique.

    PubMed

    Li, Lingfeng; Wang, Yonghuan; Chen, Chilai; Wang, Xiaozhi; Luo, Jikui

    2015-06-01

    High-field asymmetric ion mobility spectrometry (FAIMS) has become an efficient technique for separation and characterization of gas-phase ions at ambient pressure, which utilizes the mobility differences of ions at high and low fields. Micro FAIMS devices made by micro-electromechanical system technology have small channel gaps, high electric fields and good installation precision, and they have therefore received great attention. However, the disadvantage of relatively low resolution limits their applications in some areas. In this study, theoretical analysis and experimental exploration were carried out to overcome this disadvantage. Multiple scans, characteristic decline curves of ion transmission and pattern recognition were proposed to improve the performance of the microchip-based FAIMS. The results showed that although micro FAIMS instruments as standalone chemical analyzers suffer from low resolution, by using one or more of the proposed methods they can identify chemicals precisely and provide quantitative analysis with low detection limits in some applications. Copyright © 2015 John Wiley & Sons, Ltd.

  1. Visualization and Analysis of Wireless Sensor Network Data for Smart Civil Structure Applications Based On Spatial Correlation Technique

    NASA Astrophysics Data System (ADS)

    Chowdhry, Bhawani Shankar; White, Neil M.; Jeswani, Jai Kumar; Dayo, Khalil; Rathi, Manorma

    2009-07-01

    Disasters affecting infrastructure, such as the 2001 earthquakes in India, 2005 in Pakistan, 2008 in China and the 2004 tsunami in Asia, provide a common need for intelligent buildings and smart civil structures. Now, imagine massive reductions in the time to get infrastructure working again, real-time information on damage to buildings, massive reductions in the cost and time to certify that structures are undamaged and can still be operated, and reductions in the number of structures to be rebuilt (if they are known not to be damaged). Realizing these ideas would lead to huge, quantifiable, long-term savings for government and industry. Wireless sensor networks (WSNs) can be deployed in buildings to make any civil structure both smart and intelligent. WSNs have recently gained much attention in both public and research communities because they are expected to bring a new paradigm to the interaction between humans, environment, and machines. This paper presents the deployment of WSN nodes in the Top Quality Centralized Instrumentation Centre (TQCIC). We created an ad hoc networking application to collect real-time data sensed from nodes that were randomly distributed throughout the building. If the sensors are relocated, the application automatically reconfigures itself in light of the new routing topology. WSNs are event-based systems that rely on the collective effort of several micro-sensor nodes continuously observing a physical phenomenon. WSN applications require spatially dense sensor deployment in order to achieve satisfactory coverage. The degree of spatial correlation increases with decreasing inter-node separation. Energy consumption is reduced dramatically by having only those sensor nodes with unique readings transmit their data. We report on an algorithm based on a spatial correlation technique that assures high QoS (in terms of SNR) of the network as well as proper utilization of energy, by suppressing redundant data transmission
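
    A toy sketch of the suppression idea (the paper's actual algorithm is not given in this record): each node transmits only when no already-reporting neighbor within a correlation radius has a similar reading. The radius and tolerance values are invented.

    ```python
    import numpy as np

    def select_transmitters(readings, positions, radius=1.5, tol=0.2):
        """Return indices of nodes whose readings are not spatially redundant."""
        reported = {}                          # node index -> reported value
        send = []
        for i, (x, val) in enumerate(zip(positions, readings)):
            neighbors = [j for j in reported
                         if abs(positions[j] - x) <= radius]
            if all(abs(readings[j] - val) > tol for j in neighbors):
                send.append(i)                 # reading is unique in its area
                reported[i] = val
        return send

    pos = np.linspace(0, 10, 21)
    vals = np.sin(pos)                         # smooth field -> high correlation
    print("transmitting nodes:", select_transmitters(vals, pos))
    ```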

  2. Comparison of two headspace sampling techniques for the analysis of off-flavour volatiles from oat based products.

    PubMed

    Cognat, Claudine; Shepherd, Tom; Verrall, Susan R; Stewart, Derek

    2012-10-01

    Two different headspace sampling techniques were compared for analysis of aroma volatiles from freshly produced and aged plain oatcakes. Solid phase microextraction (SPME) using a Carboxen-Polydimethylsiloxane (PDMS) fibre and entrainment on Tenax TA within an adsorbent tube were used for collection of volatiles. The effects of variation in the sampling method were also considered using SPME. The data obtained using both techniques were processed by multivariate statistical analysis (PCA). Both techniques showed similar capacities to discriminate between the samples at different ages. Discrimination between fresh and rancid samples could be made on the basis of changes in the relative abundances of 14-15 of the constituents in the volatile profiles. A significant effect on the detection level of volatile compounds was observed when samples were crushed and analysed by SPME-GC-MS, in comparison to undisturbed product. The applicability and cost effectiveness of both methods were considered. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. A new technique for calculating reentry base heating. [analysis of laminar base flow field of two dimensional reentry body

    NASA Technical Reports Server (NTRS)

    Meng, J. C. S.

    1973-01-01

    The laminar base flow field of a two-dimensional reentry body has been studied by Telenin's method. The flow domain was divided into strips along the x-axis, and the flow variations were represented by Lagrange interpolation polynomials in the transformed vertical coordinate. The complete Navier-Stokes equations were used in the near wake region, and the boundary layer equations were applied elsewhere. The boundary conditions consisted of the flat plate thermal boundary layer in the forebody region and the near wake profile in the downstream region. The resulting two-point boundary value problem of 33 ordinary differential equations was then solved by the multiple shooting method. The detailed flow field and thermal environment in the base region are presented in the form of temperature contours, Mach number contours, velocity vectors, pressure distributions, and heat transfer coefficients on the base surface. The maximum heating rate was found on the centerline, and the two-dimensional stagnation point flow solution was adequate to estimate the maximum heating rate so long as the local Reynolds number could be obtained.
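
    For readers unfamiliar with the shooting idea behind the multiple shooting method mentioned above, here is a minimal single-shooting sketch on a toy two-point boundary value problem (y'' = -y, y(0) = 0, y(1) = 1), not the 33-equation wake problem: guess the unknown initial slope, integrate, and root-find on the far-boundary mismatch.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import brentq

    def boundary_miss(slope):
        """Integrate y'' = -y with y(0)=0, y'(0)=slope; return y(1) - 1."""
        sol = solve_ivp(lambda x, y: [y[1], -y[0]], (0.0, 1.0), [0.0, slope])
        return sol.y[0, -1] - 1.0

    slope = brentq(boundary_miss, 0.1, 5.0)    # bracket the boundary mismatch
    print(f"shooting slope y'(0) = {slope:.4f} (exact: {1/np.sin(1.0):.4f})")
    ```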

  4. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
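
    A small sketch in the spirit of the Standard's model fitting: least-squares fits of linear, quadratic, and exponential trends to a synthetic time series, with the exponential handled through a log transform. The data and parameters are illustrative.

    ```python
    import numpy as np

    t = np.arange(20, dtype=float)
    y = 5.0 * np.exp(0.12 * t) + np.random.default_rng(1).normal(0, 0.5, t.size)

    lin = np.polyfit(t, y, 1)                      # linear trend
    quad = np.polyfit(t, y, 2)                     # quadratic trend
    loga, logb = np.polyfit(t, np.log(y), 1)       # exponential via log transform

    for name, fit in [("linear", np.polyval(lin, t)),
                      ("quadratic", np.polyval(quad, t)),
                      ("exponential", np.exp(logb) * np.exp(loga * t))]:
        print(f"{name:11s} RMS residual: {np.sqrt(np.mean((y - fit) ** 2)):.3f}")
    ```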

  5. Prefractionation techniques in proteome analysis.

    PubMed

    Righetti, Pier Giorgio; Castagna, Annalisa; Herbert, Ben; Reymond, Frederic; Rossier, Joël S

    2003-08-01

    The present review deals with a number of prefractionation protocols in preparation for two-dimensional map analysis, both in the field of chromatography and in the field of electrophoresis. In the first case, Fountoulaki's group has reported just about every chromatographic procedure useful as a prefractionation step, including affinity, ion-exchange, and reversed-phase resins. As a result of the various enrichment steps, several hundred new species, previously undetected in unfractionated samples, could be revealed for the first time. Electrophoretic prefractionation protocols include all those electrokinetic methodologies which are performed in free solution, essentially all relying on isoelectric focusing steps. The devices reviewed here include multichamber apparatus, such as the multicompartment electrolyzer with Immobiline membranes, Off-Gel electrophoresis in a multicup device and the Rotofor, an instrument also based on a multichamber system but exploiting the conventional technique of carrier-ampholyte focusing. Other instruments of interest are the Octopus, a continuous-flow device for isoelectric focusing in an upward-flowing liquid curtain, and the Gradiflow, where different pI cuts are obtained by a multistep passage through two compartments buffered at different pH values. It is felt that this panoply of methods could offer a strong step forward in "mining below the tip of the iceberg" for detecting the "unseen proteome".

  6. Searching the most appropriate sample pretreatment for the elemental analysis of wines by inductively coupled plasma-based techniques.

    PubMed

    Gonzálvez, A; Armenta, S; Pastor, A; de la Guardia, M

    2008-07-09

    Different sample preparation methods were evaluated for the simultaneous multielement analysis of wine samples by inductively coupled plasma optical emission spectrometry (ICP-OES) and inductively coupled plasma mass spectrometry (ICP-MS). Microwave-assisted digestion in a closed vessel, thermal digestion in an open reactor, and direct sample dilution were considered for the determination of Li, Be, Na, Mg, Al, K, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Sr, Y, Mo, Cd, Ba, La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, Lu, Tl, Pb, and Bi in 12 samples of red wine from the Valencia and Utiel-Requena protected designations of origin. ICP-MS allowed the determination of 17 elements in most of the samples, and using ICP-OES a maximum of 15 elements were determined. On comparing the sample pretreatment methodologies, it can be concluded that the three assayed procedures provide comparable results for the concentrations of Li, Na, Mg, Al, K, Ca, Mn, Fe, Zn, and Sr by ICP-OES. Furthermore, ICP-MS data found for Cu, Pb, and Ba were comparable. Digestion provides comparable values using both total decomposition in an open system and microwave-assisted treatment for Cu by ICP-OES and for Cr, Ni, and Zn by ICP-MS. Open vessel total digestion overestimates Cr, Mn, Fe, and Zn by ICP-OES and underestimates Se. However, direct measurement of diluted wine samples provided results that were not comparable with the digestion treatments for Mn, Cu, Pb, Zn, Ba, and Bi by ICP-OES and for Mg, Cr, Fe, Ni, and Zn by ICP-MS. Therefore, it can be concluded that microwave-assisted digestion is the pretreatment procedure of choice for elemental analysis of wine by ICP-based techniques.

  7. [Analysis on the application and evaluation of the case-based learning of Acupuncture-Moxibustion Techniques].

    PubMed

    Chen, Ji; Wu, Xi; Hu, You-Ping; Zheng, Hui; Liang, Fan-Rong

    2013-03-01

    The feasibility of case-based learning (CBL) for the course Acupuncture-Moxibustion Techniques is discussed with regard to the content, implementation process and effect evaluation of CBL in teaching this course. The results show that, compared with the traditional teaching model, CBL achieves the same effects on undergraduates' mastery of theoretical knowledge of the techniques of acupuncture, moxibustion and cupping. Moreover, CBL presents clear advantages in improving students' ability to select manipulations based on clinical symptoms and their practical manipulative skills. Hence, CBL deserves to be promoted on a trial basis in professional acupuncture-moxibustion technique courses.

  8. A preliminary structural analysis of space-base living quarters modules to verify a weight-estimating technique

    NASA Technical Reports Server (NTRS)

    Grissom, D. S.; Schneider, W. C.

    1971-01-01

    The determination of a baseline (minimum-weight) design for the primary structure of the living quarters modules in an earth-orbiting space base was investigated. Although the design is preliminary in nature, the supporting analysis is sufficiently thorough to provide a reasonably accurate weight estimate of the major components that are considered to comprise the structural weight of the space base.

  9. [Evidence-based TEP technique].

    PubMed

    Köckerling, F

    2017-01-13

    The guidelines of all international hernia societies recommend as procedures of choice the laparoendoscopic techniques total extraperitoneal patch plasty (TEP) and transabdominal preperitoneal patch plasty (TAPP) as well as the open Lichtenstein operation for elective inguinal hernia repair. The learning curve associated with the laparoendoscopic techniques, in particular TEP, is longer than that for the open Lichtenstein technique due to the complexity of the procedures. Accordingly, for laparoendoscopic techniques it is particularly important that the operations are conducted in a standardized manner in compliance with the evidence-based recommendations given for the technical details. When procedures are carried out in strict compliance with the guidelines of the international hernia societies, low rates of perioperative complications, complication-related reoperations, recurrences and chronic pain can be expected for TEP. Compliance with the guidelines can also positively impact mastery of the learning curve for TEP. The technical guidelines on TEP are based on study results and on the experiences of numerous experts; therefore, it is imperative that they are implemented in routine surgical practice.

  10. Novel Texture-based Probabilistic Object Recognition and Tracking Techniques for Food Intake Analysis and Traffic Monitoring

    DTIC Science & Technology

    2015-10-02

    Excerpts from the report include table-of-contents entries for machine-learning-based tracking and part-based models. The approaches described involve machine vision and machine learning, with preprocessing, segmentation, and classification steps [53]. Early food analysis relied on classification with Artificial Neural Networks (ANN) [53]; a few years later, ANN and Statistical Learning (SL) were the standard machine learning

  11. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  12. Proteomic data analysis of glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke

    2016-05-01

    Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells that may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches; traditional techniques are insufficient to interpret and visualize the resulting experimental data. The emphasis of this paper lies in the application of novel approaches for visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information is still insufficient for classifying GSCs and paving the way toward improved therapeutics for heterogeneous glioma.
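
    The record does not name the specific nonlinear reduction methods, so the sketch below substitutes a standard one (t-SNE from scikit-learn) purely to show the workflow shape: a samples-by-proteins matrix embedded into two dimensions for visual cluster inspection. The data are synthetic.

    ```python
    import numpy as np
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(2)
    # hypothetical data: 60 proteomic profiles, 500 protein abundances each,
    # drawn from two shifted populations to mimic two cell-line groups
    X = np.vstack([rng.normal(0, 1, (30, 500)),
                   rng.normal(1, 1, (30, 500))])
    emb = TSNE(n_components=2, perplexity=15, init="pca",
               random_state=0).fit_transform(X)
    print("embedding shape:", emb.shape)   # (60, 2), ready for scatter plotting
    ```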

  13. Balloon-based interferometric techniques

    NASA Technical Reports Server (NTRS)

    Rees, David

    1985-01-01

    A balloon-borne triple-etalon Fabry-Perot Interferometer, observing the Doppler shifts of absorption lines caused by molecular oxygen and water vapor in the far red/near infrared spectrum of backscattered sunlight, has been used to evaluate a passive spaceborne remote sensing technique for measuring winds in the troposphere and stratosphere. There have been two successful high altitude balloon flights of the prototype UCL instrument from the National Scientific Balloon Facility at Palestine, TX (May 1980, Oct. 1983). The results from these flights have demonstrated that an interferometer with adequate resolution, stability and sensitivity can be built. The wind data are of comparable quality to those obtained from operational techniques (balloon and rocket sonde, cloud-top drift analysis, and gradient wind analysis of satellite radiance measurements). However, the interferometric data can provide a regular global grid, over a height range from 5 to 50 km in regions of clear air. Between the middle troposphere (5 km) and the upper stratosphere (40 to 50 km), an optimized instrument can make wind measurements over the daylit hemisphere with an accuracy of about 3 to 5 m/sec (2 sigma). It is possible to obtain full height profiles between altitudes of 5 and 50 km, with 4 km height resolution and a spatial resolution of about 200 km along the orbit track. Below an altitude of about 10 km, Fraunhofer lines of solar origin are possible targets of the Doppler wind analysis. Above an altitude of 50 km, the weakness of the backscattered solar spectrum (decreasing air density) is coupled with the low absorption cross-section of all atmospheric species in the spectral region up to 800 nm (where imaging photon detectors can be used), causing the along-the-track resolution (or error) to increase beyond values useful for operational purposes. Within the region of optimum performance (5 to 50 km), however, the technique is a valuable potential complement to existing wind

  14. Global precipitation estimates based on a technique for combining satellite-based estimates, rain gauge analysis, and NWP model precipitation information

    NASA Technical Reports Server (NTRS)

    Huffman, George J.; Adler, Robert F.; Rudolf, Bruno; Schneider, Udo; Keehn, Peter R.

    1995-01-01

    The 'satellite-gauge model' (SGM) technique is described for combining precipitation estimates from microwave satellite data, infrared satellite data, rain gauge analyses, and numerical weather prediction models into improved estimates of global precipitation. Throughout, monthly estimates on a 2.5 degrees x 2.5 degrees lat-long grid are employed. First, a multisatellite product is developed using a combination of low-orbit microwave and geosynchronous-orbit infrared data in the latitude range 40 degrees N - 40 degrees S (the adjusted geosynchronous precipitation index) and low-orbit microwave data alone at higher latitudes. Then the rain gauge analysis is brought in, weighting each field by its inverse relative error variance to produce a nearly global, observationally based precipitation estimate. To produce a complete global estimate, the numerical model results are used to fill data voids in the combined satellite-gauge estimate. Our sequential approach to combining estimates allows a user to select the multisatellite estimate, the satellite-gauge estimate, or the full SGM estimate (observationally based estimates plus the model information). The primary limitation in the method is imperfections in the estimation of relative error for the individual fields. The SGM results for one year of data (July 1987 to June 1988) show important differences from the individual estimates, including model estimates as well as climatological estimates. In general, the SGM results are drier in the subtropics than the model and climatological results, reflecting the relatively dry microwave estimates that dominate the SGM in oceanic regions.
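
    A minimal sketch of the combination rule stated above, under assumptions: each observational field is weighted by its inverse (relative) error variance, and model values fill cells where no observation exists. The field values and variances are invented.

    ```python
    import numpy as np

    def combine(fields, error_vars, model):
        """fields: 2-D arrays with NaN voids; error_vars: matching variances."""
        w_sum = np.zeros_like(model)
        f_sum = np.zeros_like(model)
        for f, v in zip(fields, error_vars):
            w = np.where(np.isnan(f), 0.0, 1.0 / v)   # inverse-variance weight
            f_sum += w * np.nan_to_num(f)
            w_sum += w
        # observation-based estimate where available, model fill elsewhere
        return np.where(w_sum > 0, f_sum / np.maximum(w_sum, 1e-12), model)

    sat = np.array([[2.0, np.nan], [3.0, 4.0]])
    gauge = np.array([[2.4, np.nan], [np.nan, 3.6]])
    model = np.full((2, 2), 3.0)
    print(combine([sat, gauge],
                  [np.full((2, 2), 1.0), np.full((2, 2), 0.25)], model))
    ```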

  16. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    SciTech Connect

    Zhang, Yonghua

    2000-01-01

    Sample preparation has been one of the major bottlenecks for many high throughput analyses. The purpose of this research was to develop a new sample preparation and integration approach for DNA sequencing, PCR-based DNA analysis and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser-induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks into DNA capillary-array sequencers. Protocols for directly sequencing plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or a heating block. Upon heating, the plasmids were released while chromosomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR-based DNA analysis, they demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical samples without purification. After PCR using cheek cell, blood or HIV-1 gag DNA, the reaction mixture was injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of purification of the DNA sample before or after the PCR reaction, will make this approach an

  17. A novel fast and flexible technique of radical kinetic behaviour investigation based on pallet for plasma evaluation structure and numerical analysis

    NASA Astrophysics Data System (ADS)

    Malinowski, Arkadiusz; Takeuchi, Takuya; Chen, Shang; Suzuki, Toshiya; Ishikawa, Kenji; Sekine, Makoto; Hori, Masaru; Lukasiak, Lidia; Jakubowski, Andrzej

    2013-07-01

    This paper describes a new, fast, and case-independent technique for sticking coefficient (SC) estimation based on the pallet for plasma evaluation (PAPE) structure and numerical analysis. Our approach does not require a complicated structure, apparatus, or time-consuming measurements but offers high reliability of data and high flexibility. Thermal analysis is also possible. This technique has been successfully applied to the estimation of the very low SC of hydrogen radicals on chemically amplified ArF 193 nm photoresist (the main goal of this study). The upper bound of our technique has been determined by investigation of the SC of fluorine radicals on polysilicon (at elevated temperature). Sources of estimation error and ways of reducing it have also been discussed. The results of this study give insight into the process kinetics; not only are they helpful for better process understanding, but they may also serve as parameters in phenomenological model development for predictive modelling of etching in ultimate CMOS topography simulation.

  18. Coaching points for the technique of the eggbeater kick in synchronized swimming based on three-dimensional motion analysis.

    PubMed

    Homma, Miwako; Homma, Masanobu

    2005-01-01

    We investigated coaching points for the technique of the eggbeater kick in synchronized swimming. The movements of the eggbeater kick for six female synchronized swimmers, all the top-ranking members at the Olympics and World Championships, were analyzed by using three-dimensional motion analysis. By comparing the movements of higher eggbeater-skilled swimmers with lower eggbeater-skilled swimmers, the coaching points for techniques of the eggbeater kick in synchronized swimming are clarified as follows. First, to hold the knees as high and as near the water surface as possible and to keep the heels close to the hips. Secondly, to keep the knees as wide as possible. Thirdly, to kick sideways producing lift force, and not to kick downward. Fourthly, to pull up the heels to nearly the water surface at the out-kick, with strong internal rotation of the thighs. Lastly, to move the feet as if treading water, with strong dorsiflexion and plantar flexion and adduction of the foot; that is to move the feet like the motion of the hands when sculling. Moreover, the movements of higher eggbeater-skilled swimmers can be termed 'horizontal kick type' and the movements of lower eggbeater-skilled swimmers can be termed 'vertical kick type'.

  19. TEM based high resolution and low-dose scanning electron nanodiffraction technique for nanostructure imaging and analysis.

    PubMed

    Kim, Kyou-Hyun; Xing, Hui; Zuo, Jian-Min; Zhang, Peng; Wang, Haifeng

    2015-04-01

    We report a high resolution and low-dose scanning electron nanodiffraction (SEND) technique for nanostructure analysis. The SEND patterns are recorded in a transmission electron microscope (TEM) using a low-brightness ∼2 nm electron beam with a LaB6 thermionic source obtained by a large demagnification of the condenser 1 lens. The diffraction pattern is directly recorded using a CCD camera optimized for low-dose imaging. A custom script was developed for calibration and automated data acquisition. The performance of low-dose SEND is evaluated using nanostructured Au as a test sample for the quality of diffraction patterns, sample stability and probe size. We demonstrate that our method provides an effective and robust way for recording diffraction patterns from nanometer-sized grains. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Judging complex movement performances for excellence: a principal components analysis-based technique applied to competitive diving.

    PubMed

    Young, Cole; Reinkensmeyer, David J

    2014-08-01

    Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard half pipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring (gross body path, splash area, and board tip motion), to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than with the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization.
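
    A hedged sketch of the modeling chain (PCA for eigenpostures, then linear regression from eigendive features to scores), with synthetic stand-ins for the motion data, the temporal weighting, and the judges' scores:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    postures = rng.normal(size=(200, 30))      # 200 frames x 30 joint angles
    pca = PCA(n_components=5).fit(postures)    # components play the eigenposture role

    # one feature vector per dive: temporal mean of eigenposture weights
    # (a crude stand-in for the paper's temporal weighting coefficients)
    dives = np.array([pca.transform(rng.normal(size=(50, 30))).mean(axis=0)
                      for _ in range(40)])
    scores = rng.uniform(4, 9, size=40)        # hypothetical judges' scores

    model = LinearRegression().fit(dives, scores)
    print("R^2 on training dives:", round(model.score(dives, scores), 3))
    ```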

  1. Insights into the prominent effect of mahanimbine on Acanthamoeba castellanii: Cell profiling analysis based on microscopy techniques

    NASA Astrophysics Data System (ADS)

    Hashim, Fatimah; Amin, Nakisah Mat

    2017-02-01

    Mahanimbine (MH) has been shown to have anti-amoebic properties. Therefore, the aim of this study was to assess the growth inhibitory mechanisms of MH on Acanthamoeba castellanii, a causative agent of Acanthamoeba keratitis. The IC50 value obtained for MH against A. castellanii was 1.18 µg/ml. Light and scanning electron microscopy observation showed that most cells had a cystic appearance, while transmission electron microscopy revealed changes at the ultrastructural level, and fluorescence microscopy indicated the induction of apoptosis and autophagic activity in the amoeba cytoplasm. In conclusion, MH has very potent anti-amoebic properties against A. castellanii, as shown by cytotoxicity analyses based on microscopy techniques.

  2. Automatic control of a robot camera for broadcasting based on cameramen's techniques and subjective evaluation and analysis of reproduced images.

    PubMed

    Kato, D; Katsuura, T; Koyama, H

    2000-03-01

    With the goal of achieving an intelligent robot camera system that can take dynamic images automatically through humanlike, natural camera work, we analyzed how images were shot, subjectively evaluated reproduced images, and examined effects of camerawork, using camera control technique as a parameter. It was found that (1) A high evaluation is obtained when human-based data are used for the position adjusting velocity curve of the target; (2) Evaluation scores are relatively high for images taken with feedback-feedforward camera control method for target movement in one direction; (3) Keeping the target within the image area using the control method that imitates human camera handling becomes increasingly difficult when the target changes both direction and velocity and becomes bigger and faster, and (4) The mechanical feedback method can cope with rapid changes in the target's direction and velocity, constantly keeping the target within the image area, though the viewer finds the image rather mechanical as opposed to humanlike.

  3. Photogrammetric Techniques for Road Surface Analysis

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface are of great importance for convenience and safety of driving, so investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysing the characteristics of road texture and monitoring pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  4. Task-based strategy for optimized contrast enhanced breast imaging: Analysis of six imaging techniques for mammography and tomosynthesis

    SciTech Connect

    Ikejimba, Lynda C.; Kiarashi, Nooshin; Ghate, Sujata V.; Samei, Ehsan; Lo, Joseph Y.

    2014-06-15

    Purpose: The use of contrast agents in breast imaging has the capability of enhancing nodule detectability and providing physiological information. Accordingly, there has been a growing trend toward using iodine as a contrast medium in digital mammography (DM) and digital breast tomosynthesis (DBT). Widespread use raises concerns about the best way to use iodine in DM and DBT, and thus a comparison is necessary to evaluate typical iodine-enhanced imaging methods. This study used a task-based observer model to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: unsubtracted mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Methods: Imaging performance was characterized using a detectability index d′, derived from the system task transfer function (TTF), an imaging task, the iodine signal difference, and the noise power spectrum (NPS). The task modeled a 10 mm diameter lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. The TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d′ was generated as a function of dose and iodine concentration. Results: For all iodine concentrations and doses, temporal subtraction techniques for mammography and tomosynthesis yielded the highest d′, while dual energy techniques for both modalities demonstrated the next best performance. Unsubtracted imaging resulted in the lowest d′ values for both modalities, with unsubtracted mammography performing the worst out of all six paradigms. Conclusions: At any dose, temporal subtraction imaging provides the greatest detectability, with temporally subtracted DBT performing the highest. The authors attribute the successful performance to excellent cancellation of
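
    A common concrete form for such a detectability index is the non-prewhitening model observer, d′² = (∫ W²·TTF² df)² / ∫ W²·TTF²·NPS df, where W is the task function; whether the authors used exactly this variant is an assumption. A one-dimensional numerical sketch with invented curves:

    ```python
    import numpy as np

    f = np.linspace(0.01, 5.0, 256)            # spatial frequency (cycles/mm)
    task = np.exp(-(f * 2.0) ** 2)             # illustrative lesion task function W
    ttf = 1.0 / (1.0 + (f / 1.5) ** 2)         # illustrative system TTF
    nps = 1e-6 * (1.0 + f)                     # illustrative noise power spectrum

    num = np.trapz((task * ttf) ** 2, f) ** 2  # signal term, squared
    den = np.trapz((task * ttf) ** 2 * nps, f) # noise term
    print(f"d' = {np.sqrt(num / den):.1f}")
    ```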

  6. Characterization of Grain Size Distribution and Grain Shape Analysis of Tephra Deposits: a New Approach Based on Automated Microscopy and Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Volentik, A. C.; Bonadonna, C.; Connor, C. B.

    2006-12-01

    Grain size distribution (GSD) is a key parameter in physical volcanology, not only for the characterization of tephra deposits, but also for the application of numerical models and for the compilation of reliable tephra hazard assessments. In addition, grain shape analysis (GSA) has important implications for the determination of particle settling velocities, a crucial factor in tephra dispersal models. We present a new method for the determination of reliable GSD together with 2D GSA for volcanic particles ranging between 0.5 μm and 2 mm in size (from 13 to -1 on the Φ scale): the application of the Malvern PharmaVision 830 (PVS) automated optical device. PVS provides several morphological parameters that can be used to determine the GSD of volcanic ash. We have compared GSD results for 3 different morphological parameters (mean diameter, maximum length and width) of three samples collected along a dispersal axis of a Plinian eruption of Pululagua volcano (Ecuador). GSD based on particle width gives the best fit with GSD data resulting from sieving techniques and is thus recommended when the GSD of volcanic ash (< 2 mm) has to be combined with the GSD of volcanic lapilli (> 2 mm), which requires hand sieving. GSA data were investigated to characterize morphology variations with magma composition and with distance from the volcanic vent. GSA results were analyzed for 3 different tephra deposits with different magma compositions: (i) Cerro Negro (basaltic), (ii) Pululagua (dacitic) and (iii) Bishop Tuff (rhyolitic). In particular, we have found that particle intensity shows the same trend for all deposits, whereas trends of roundness and convexity differ for different magma compositions, suggesting that roundness and convexity are strongly dependent on magma fragmentation mechanisms. Preliminary results have also shown that mean roundness, mean convexity and mean intensity do not vary significantly with distance from vent for Pululagua. Finally, an attempt has been made
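
    The Φ-scale bookkeeping used above is Φ = -log2(d), with d in millimetres. A small sketch, using synthetic particle widths as a stand-in for PVS measurements:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    width_mm = rng.lognormal(mean=np.log(0.1), sigma=1.0, size=10000)
    width_mm = width_mm[(width_mm > 0.0005) & (width_mm < 2.0)]  # 0.5 um .. 2 mm

    phi = -np.log2(width_mm)                   # phi = -log2(diameter in mm)
    counts, edges = np.histogram(phi, bins=np.arange(-1, 12, 0.5))
    for c, lower in zip(counts[:4], edges[:4]):                  # preview coarsest bins
        print(f"phi [{lower:+.1f}, {lower + 0.5:+.1f}): {c}")
    ```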

  7. Task-based strategy for optimized contrast enhanced breast imaging: analysis of six imaging techniques for mammography and tomosynthesis

    NASA Astrophysics Data System (ADS)

    Ikejimba, Lynda; Kiarashi, Nooshin; Lin, Yuan; Chen, Baiyu; Ghate, Sujata V.; Zerhouni, Moustafa; Samei, Ehsan; Lo, Joseph Y.

    2012-03-01

    Digital breast tomosynthesis (DBT) is a novel x-ray imaging technique that provides 3D structural information of the breast. In contrast to 2D mammography, DBT minimizes tissue overlap potentially improving cancer detection and reducing number of unnecessary recalls. The addition of a contrast agent to DBT and mammography for lesion enhancement has the benefit of providing functional information of a lesion, as lesion contrast uptake and washout patterns may help differentiate between benign and malignant tumors. This study used a task-based method to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: contrast enhanced mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Imaging performance was characterized using a detectability index d', derived from the system task transfer function (TTF), an imaging task, iodine contrast, and the noise power spectrum (NPS). The task modeled a 5 mm lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d' was generated as a function of dose and iodine concentration. In general, higher dose gave higher d', but for the lowest iodine concentration and lowest dose, dual energy subtraction tomosynthesis and temporal subtraction tomosynthesis demonstrated the highest performance.

  8. A RT-based Technique for the Analysis and the Removal of Titan's Atmosphere by Cassini/VIMS-IR data

    NASA Astrophysics Data System (ADS)

    Sindoni, G.; Tosi, F.; Adriani, A.; Moriconi, M. L.; D'Aversa, E.; Grassi, D.; Oliva, F.; Dinelli, B. M.; Castelli, E.

    2015-12-01

    Since 2004, the Visual and Infrared Mapping Spectrometer (VIMS), together with the CIRS and UVIS spectrometers aboard the Cassini spacecraft, has provided insight into the atmospheres of Saturn and Titan through remote sensing observations. The presence of clouds and aerosols in Titan's dense atmosphere makes the analysis of the surface radiation a difficult task. For this purpose, an atmospheric radiative transfer (RT) model is required. The implementation of an RT code, which includes multiple scattering, in an inversion algorithm based on the Bayesian approach can provide strong constraints on both the surface albedo and the atmospheric composition. The application of the retrieval procedure we have developed to VIMS-IR spectra acquired in nadir or slant geometries allows us to retrieve the equivalent opacity of Titan's atmosphere in terms of variable aerosol and gaseous content. Thus, the separation of the atmospheric and surface contributions in the observed spectrum is possible. The atmospheric removal procedure was tested on the spectral range 1-2.2 μm of publicly available VIMS data covering the Ontario Lacus and Ligeia Mare regions. The retrieval of the accurate composition of Titan's atmosphere is a much more complex task. So far, information about the vertical structure of the atmosphere from limb spectra has mostly been derived under conditions where scattering could be neglected [1,2]. Indeed, since the very high aerosol load in the middle-low atmosphere produces strong scattering effects on the measured spectra, the analysis requires RT modeling taking into account multiple scattering in a spherical-shell geometry. Therefore the use of an innovative method we are developing based on the Monte-Carlo approach can provide important information about the vertical distribution of the aerosols and the gases composing Titan's atmosphere. [1] Bellucci et al. (2009), Icarus 201(1), 198-216. [2] de Kok et al. (2007), Icarus 191(1), 223-235.

  9. Integration of conventional GIS-based techniques and remote sensing analysis to landslide risk assessment at basin scale

    NASA Astrophysics Data System (ADS)

    Agili, F.; Bartolomei, A.; Casagli, N.; Catani, F.; Ermini, L.; Farina, P.; Kukavicic, M.; Mirannalti, M.; Moretti, S.; Righini, G.

    2003-04-01

    This note concerns the preliminary results of a research project aimed at landslide risk assessment in the Arno River basin (9000 km^2). The project, sponsored by the Basin Authority of the Arno River, started in 2002 and will finish in 2004. Its objective is the updating of the landslide risk cartography related to the PAI document (Piano Assetto Idrogeologico) with reference to the Italian Law 267/1998. Different types of products will be generated: the updating of the existing inventory maps and the definition and application of a methodology for landslide hazard and risk mapping. Conventional methods, such as aerial-photo interpretation and field surveys, are coupled with the use of different remote sensing methods, and all the data are integrated within a GIS environment. The analysis of remote sensing data regards both optical and radar images. In particular, for the analysis of optical data, panchromatic and multispectral Landsat images are used in order to update the Corine standard land cover maps. In addition, high resolution images (Ikonos and Quickbird), acquired in stereoscopic configuration, are analysed to complement the aerial-photo interpretation. Differential SAR interferometry, implemented using ERS and JERS data, is used to detect new mass movements not yet observed and to evaluate the state of activity of known phenomena. Such data represent the base needed to produce the final landslide risk cartography.

  10. Decision Analysis Using Extended Techniques

    PubMed Central

    Lau, Joseph; Pauker, Stephen G.

    1985-01-01

    Clinical problems are often complex, repetitive and time dependent. Modeling such details using only the classical decision tree formalism is often impractical if not impossible. A number of techniques are described here that can be used to reduce the complexity and improve the representation. A case illustration describes how such techniques may be used.

  11. Automatic system for brain MRI analysis using a novel combination of fuzzy rule-based and automatic clustering techniques

    NASA Astrophysics Data System (ADS)

    Hillman, Gilbert R.; Chang, Chih-Wei; Ying, Hao; Kent, T. A.; Yen, John

    1995-05-01

    Analysis of magnetic resonance images (MRI) of the brain permits the identification and measurement of brain compartments. These compartments include normal subdivisions of brain tissue, such as gray matter, white matter and specific structures, and also include pathologic lesions associated with stroke or viral infection. A fuzzy system has been developed to analyze images of animal and human brain, segmenting the images into physiologically meaningful regions for display and measurement. This image segmentation system consists of two stages which include a fuzzy rule-based system and fuzzy c-means algorithm (FCM). The first stage of this system is a fuzzy rule-based system which classifies most pixels in MR images into several known classes and one `unclassified' group, which fails to fit the predetermined rules. In the second stage, this system uses the result of the first stage as initial estimates for the properties of the compartments and applies FCM to classify all the previously unclassified pixels. The initial prototypes are estimated by using the averages of the previously classified pixels. The combined processes constitute a fast, accurate and robust image segmentation system. This method can be applied to many clinical image segmentation problems. While the rule-based portion of the system allows specialized knowledge about the images to be incorporated, the FCM allows the resolution of ambiguities that result from noise and artifacts in the image data. The volumes and locations of the compartments can easily be measured and reported quantitatively once they are identified. It is easy to adapt this approach to new imaging problems, by introducing a new set of fuzzy rules and adjusting the number of expected compartments. However, for the purpose of building a practical fully automatic system, a rule learning mechanism may be necessary to improve the efficiency of modification of the fuzzy rules.
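
    A compact sketch of the second-stage clustering only: plain fuzzy c-means (fuzzifier m = 2) on one-dimensional pixel intensities, seeded with prototypes that stand in for the rule-based first pass. This illustrates FCM itself, not the authors' full two-stage system.

    ```python
    import numpy as np

    def fcm(x, centers, n_iter=50, m=2.0):
        """Fuzzy c-means: x is (n,) intensities, centers (c,) initial prototypes."""
        for _ in range(n_iter):
            d = np.abs(x[:, None] - centers[None, :]) + 1e-9   # (n, c) distances
            u = 1.0 / d ** (2.0 / (m - 1.0))
            u /= u.sum(axis=1, keepdims=True)                  # membership degrees
            um = u ** m
            centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        return u, centers

    rng = np.random.default_rng(5)
    pixels = np.concatenate([rng.normal(mu, 5, 300)
                             for mu in (40, 110, 180)])        # 3 tissue classes
    u, c = fcm(pixels, centers=np.array([30.0, 100.0, 200.0]))
    print("final prototypes:", np.round(c, 1))
    ```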

  12. A PLL-based resampling technique for vibration analysis in variable-speed wind turbines with PMSG: A bearing fault case

    NASA Astrophysics Data System (ADS)

    Pezzani, Carlos M.; Bossio, José M.; Castellino, Ariel M.; Bossio, Guillermo R.; De Angelo, Cristian H.

    2017-02-01

    Condition monitoring of permanent magnet synchronous machines has gained interest due to their increasing use in applications such as electric traction and power generation. Particularly in wind power generation, non-invasive condition monitoring techniques are of great importance: access to the generator is usually complex and costly, while unexpected breakdowns result in high repair costs. This paper presents a technique which allows using vibration analysis for bearing fault detection in permanent magnet synchronous generators used in wind turbines. Given that in wind power applications the generator rotational speed may vary during normal operation, special sampling techniques are needed to apply spectral analysis to the mechanical vibrations. In this work, a resampling technique based on order tracking, without measuring the rotor position, is proposed. To synchronize sampling with rotor position, the rotor position is estimated from the angle of the voltage vector, obtained with a phase-locked loop synchronized with the generator voltages. The proposed strategy is validated by laboratory experiments on a permanent magnet synchronous generator. Results with single-point defects in the outer race of a bearing under variable speed and load conditions are presented.
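
    A sketch of the resampling step, assuming a PLL has already produced an angle estimate theta (the PLL itself is omitted; names are illustrative):

        import numpy as np

        def angular_resample(x, theta, pts_per_rev=256):
            # Map the time-sampled vibration signal x onto a uniform
            # shaft-angle grid so that bearing orders stay fixed even
            # when the rotational speed varies (order tracking).
            # For a PMSG, divide the electrical angle from the PLL by
            # the number of pole pairs to obtain the mechanical angle.
            theta = np.unwrap(theta)
            n_rev = (theta[-1] - theta[0]) / (2 * np.pi)
            grid = np.linspace(theta[0], theta[-1], int(n_rev * pts_per_rev))
            return np.interp(grid, theta, x)

    An FFT of the resampled signal then yields an order spectrum in which bearing defect orders can be compared against their theoretical values.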

  13. Real Time Monitoring and Prediction of the Monsoon Intraseasonal Oscillations: An index based on Nonlinear Laplacian Spectral Analysis Technique

    NASA Astrophysics Data System (ADS)

    Cherumadanakadan Thelliyil, S.; Ravindran, A. M.; Giannakis, D.; Majda, A.

    2016-12-01

    An improved index for real-time monitoring and forecast verification of monsoon intraseasonal oscillations (MISO) is introduced using the recently developed Nonlinear Laplacian Spectral Analysis (NLSA) algorithm. Previous studies have demonstrated the proficiency of NLSA in capturing the low-frequency variability and intermittency of a time series. Using NLSA, a hierarchy of Laplace-Beltrami (LB) eigenfunctions is extracted from the unfiltered daily GPCP rainfall data over the south Asian monsoon region. Two modes representing the full life cycle of the complex northeastward-propagating boreal summer MISO are identified from this hierarchy. These two MISO modes have a number of advantages over the conventionally used Extended Empirical Orthogonal Function (EEOF) MISO modes, including higher memory and better predictability, higher fractional variance over the western Pacific, Western Ghats and adjoining Arabian Sea regions, and a more realistic representation of the regional heat sources associated with the MISO. The skill of NLSA-based MISO indices in real-time prediction of MISO is demonstrated using hindcasts of CFSv2 extended-range prediction runs. These indices yield a higher prediction skill than the conventional indices, supporting the use of NLSA in real-time prediction of MISO. Real-time monitoring and prediction of MISO finds application in the agriculture, construction and hydroelectric power sectors and is hence an important component of monsoon prediction.
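
    NLSA combines time-lagged embedding with eigenfunctions of a kernel-based Laplace-Beltrami operator; a much-simplified sketch of that pipeline (plain Gaussian kernel, none of NLSA's dynamical normalizations) might look like:

        import numpy as np

        def lagged_embedding(x, q):
            # delay-embed a scalar rainfall series into q-dimensional vectors
            return np.stack([x[i:i + q] for i in range(len(x) - q + 1)])

        def laplacian_modes(Y, n_modes=2):
            # Leading non-trivial eigenvectors of a Gaussian-kernel graph
            # Laplacian stand in for the Laplace-Beltrami eigenfunctions;
            # two such modes would track the propagating MISO life cycle.
            d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
            K = np.exp(-d2 / np.median(d2))
            L = np.diag(K.sum(axis=1)) - K
            w, v = np.linalg.eigh(L)
            return v[:, 1:n_modes + 1]   # skip the constant mode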

  14. Terahertz Wide-Angle Imaging and Analysis on Plane-wave Criteria Based on Inverse Synthetic Aperture Techniques

    NASA Astrophysics Data System (ADS)

    Gao, Jing Kun; Qin, Yu Liang; Deng, Bin; Wang, Hong Qiang; Li, Jin; Li, Xiang

    2016-04-01

    This paper presents two parts of work on terahertz imaging applications. The first part aims at solving the problems that occur as the rotation angle increases. To compensate for the nonlinearity of terahertz radar systems, a calibration signal acquired from a bright target is usually used. This compensation, however, inserts an extra linear phase term in the intermediate frequency (IF) echo signal, which is not acceptable in large-rotation-angle imaging applications. We carried out a detailed theoretical analysis of this problem, and a minimum entropy criterion was employed to estimate and compensate for the linear-phase errors. In the second part, the effects of spherical waves on terahertz inverse synthetic aperture imaging are analyzed, and analytic criteria for the plane-wave approximation are derived for different rotation angles. Experimental results on corner reflectors and an aircraft model, based on a 330-GHz linear frequency-modulated continuous wave (LFMCW) radar system, validated the necessity and effectiveness of the proposed compensation. Comparison of the experimental images obtained under the plane-wave assumption and with spherical-wave correction also showed high consistency with the analytic criteria we derived.
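
    A sketch of the minimum-entropy search for the linear-phase error, assuming dechirped echoes arranged as pulses x frequency samples and simple FFT imaging (a grid search is shown for clarity; the paper's estimator may differ):

        import numpy as np

        def min_entropy_slope(echo, slopes):
            # Try candidate linear-phase slopes (rad/sample) and keep the
            # one whose compensated image has minimum entropy.
            n = np.arange(echo.shape[1])
            best_a, best_h = None, np.inf
            for a in slopes:
                img = np.abs(np.fft.fft2(echo * np.exp(-1j * a * n))) ** 2
                p = img / img.sum()
                h = -(p * np.log(p + 1e-12)).sum()
                if h < best_h:
                    best_a, best_h = a, h
            return best_a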

  15. Evaluation of culture-based techniques and 454 pyrosequencing for the analysis of fungal diversity in potting media and organic fertilizers.

    PubMed

    Al-Sadi, A M; Al-Mazroui, S S; Phillips, A J L

    2015-08-01

    Potting media and organic fertilizers (OFs) are commonly used in agricultural systems. However, there is a lack of studies on the efficiency of culture-based techniques in assessing the level of fungal diversity in these products. A study was conducted to investigate the efficiency of seven culture-based techniques and pyrosequencing for characterizing fungal diversity in potting media and OFs. Fungal diversity was evaluated using serial dilution, direct plating and baiting with carrot slices, potato slices, radish seeds, cucumber seeds and cucumber cotyledons. The identity of all isolates was confirmed on the basis of the internal transcribed spacer region of the ribosomal RNA (ITS rRNA) sequence data. The direct plating technique was found to be superior to the other culture-based techniques in the number of fungal species detected; it was also simple and the least time-consuming technique. Comparing the efficiency of direct plating with 454 pyrosequencing revealed that pyrosequencing detected 12 and 15 times more fungal species from potting media and OFs, respectively. Analysis revealed differences between potting media and OFs in the dominant phyla, classes, orders, families, genera and species detected. Zygomycota (52%) and Chytridiomycota (60%) were the predominant phyla in potting media and OFs, respectively. The superiority of pyrosequencing over cultural methods could be related to its ability to detect obligate fungi, slow-growing fungi and fungi that exist at low population densities. The methods evaluated in this study, especially direct plating and pyrosequencing, may be used as tools to help detect and reduce movement of unwanted fungi between countries and regions. © 2015 The Society for Applied Microbiology.

  16. 3D printed device including disk-based solid-phase extraction for the automated speciation of iron using the multisyringe flow injection analysis technique.

    PubMed

    Calderilla, Carlos; Maya, Fernando; Cerdà, Víctor; Leal, Luz O

    2017-12-01

    The development of advanced manufacturing techniques is crucial for the design of novel analytical tools with unprecedented features. Advanced manufacturing, also known as 3D printing, has been explored for the first time to fabricate modular devices with integrated features for disk-based automated solid-phase extraction (SPE). A modular device integrating analyte oxidation, disk-based SPE and analyte complexation has been fabricated using stereolithographic 3D printing. The 3D printed device is directly connected to flow-based analytical instrumentation, replacing typical flow networks based on discrete elements. As proof of concept, the 3D printed device was implemented in a multisyringe flow injection analysis (MSFIA) system, and applied to the fully automated speciation, SPE and spectrophotometric quantification of Fe in water samples. The obtained limit of detection for total Fe determination was 7 ng, with a dynamic linear range from 22 ng to 2400 ng Fe (3 mL sample). An intra-day RSD of 4% (n = 12) and an inter-day RSD of 4.3% (n = 5, 3 mL sample, different day with a different disk) were obtained. Incorporation of integrated 3D printed devices with automated flow-based techniques showed improved sensitivity (85% increase in the measured peak height for the determination of total Fe) in comparison with analogous flow manifolds built from conventional tubing and connectors. Our work represents a step forward towards improved reproducibility in the fabrication of manifolds for flow-based automated methods of analysis, which is especially relevant in the implementation of interlaboratory analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    PubMed Central

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant-value segments, the proposed method, Piecewise Vector Quantized Approximation, uses the closest codeword (based on a distance measure) from a codebook of key-sequences to represent each segment. The new representation is symbolic, which allows text-based retrieval techniques to be applied to time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587
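
    A sketch of the approach, with k-means standing in as one common way to learn the codebook of key-sequences (the paper's training procedure may differ):

        import numpy as np
        from sklearn.cluster import KMeans

        def train_codebook(series_list, seg_len=16, n_codewords=32):
            # pool fixed-length segments from the training series
            segs = np.concatenate(
                [np.asarray(s)[:len(s) // seg_len * seg_len].reshape(-1, seg_len)
                 for s in series_list])
            return KMeans(n_clusters=n_codewords, n_init=10).fit(segs)

        def encode(series, codebook, seg_len=16):
            # each segment is replaced by the index of its closest codeword,
            # giving a symbolic string suitable for text-style retrieval
            segs = np.asarray(series)
            segs = segs[:len(segs) // seg_len * seg_len].reshape(-1, seg_len)
            return codebook.predict(segs)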

  18. Visual exploratory analysis of DCE-MRI data in breast cancer based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Meyer-Bäse, Anke; Lespinats, Sylvain; Steinbrücker, Frank; Saalbach, Axel; Schlossbauer, Thomas; Barbu, Adrian

    2009-04-01

    Visualization of multi-dimensional data sets has become a critical and significant area in modern medical image processing. To analyze such high-dimensional data, nonlinear embedding approaches that expose dependencies among the data in a two- or three-dimensional space are increasingly important. This paper investigates the potential of novel nonlinear dimensional data reduction techniques and compares their results with proven nonlinear techniques when applied to the differentiation of malignant and benign lesions described by high-dimensional data sets arising from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). Two important visualization modalities in medical imaging are presented: the mapping onto a lower-dimensional data manifold and image fusion.
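
    As a sketch of the manifold-mapping modality, off-the-shelf nonlinear embeddings can project lesion feature vectors to two dimensions (the random matrix is a stand-in for real DCE-MRI features; the paper evaluates its own set of techniques):

        import numpy as np
        from sklearn.manifold import Isomap, LocallyLinearEmbedding

        X = np.random.rand(120, 64)   # stand-in: one feature vector per lesion
        xy_iso = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
        xy_lle = LocallyLinearEmbedding(n_neighbors=10,
                                        n_components=2).fit_transform(X)
        # scatter plots of xy_iso / xy_lle coloured by benign vs malignant
        # labels visualize how well each embedding separates the classes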

  19. LHC Olympics: Advanced Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Armour, Kyle; Larkoski, Andrew; Gray, Amanda; Ventura, Dan; Walsh, Jon; Schabinger, Rob

    2006-05-01

    The LHC Olympics is a series of workshops aimed at encouraging theorists and experimentalists to prepare for the soon-to-be-online Large Hadron Collider in Geneva, Switzerland. One aspect of the LHC Olympics program is the study of simulated data sets representing various possible new-physics signals as they would be seen in LHC detectors. Through this exercise, LHC Olympians learn the phenomenology of possible new physics models and gain experience in analyzing LHC data. Additionally, the LHC Olympics encourages discussion between theorists and experimentalists, through which new techniques can be developed. The University of Washington LHC Olympics group consists of several first-year graduate and senior undergraduate students in both theoretical and experimental particle physics. Presented here is a discussion of some of the more advanced techniques used and the recent results of one such LHC Olympics study.

  20. Analysis techniques for momentum transport

    SciTech Connect

    Scott, S.D.

    1991-08-01

    This report discusses the following topics on momentum analysis in tokamaks and stellarators: the momentum balance equation; deposition of torque by neutral beams; effects of toroidal rotation; and experimental observations. (LSP)

  1. Application of spectral analysis techniques to the intercomparison of aerosol data - Part 4: Synthesized analysis of multisensor satellite and ground-based AOD measurements using combined maximum covariance analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Carlson, B. E.; Lacis, A. A.

    2014-08-01

    In this paper, we introduce the use of a newly developed spectral decomposition technique - combined maximum covariance analysis (CMCA) - in the spatiotemporal comparison of four satellite data sets and ground-based observations of aerosol optical depth (AOD). This technique is based on the commonly used principal component analysis (PCA) and maximum covariance analysis (MCA). By decomposing the cross-covariance matrix between the joint satellite data field and Aerosol Robotic Network (AERONET) station data, both parallel comparison across the different satellite data sets and evaluation of the satellite data against the AERONET measurements are realized simultaneously. We show that this new method not only confirms the seasonal and interannual variability of aerosol optical depth, aerosol-source regions and events represented by the different satellite data sets, but also identifies the strengths and weaknesses of each data set in capturing the variability associated with sources, events or aerosol types. Furthermore, by examining the spread of the spatial modes of the different satellite fields, regions with the largest uncertainties in aerosol observation are identified. We also present two regional case studies that respectively demonstrate the capability of the CMCA technique in assessing the representation of an extreme event in different data sets, and in evaluating the performance of different data sets on seasonal and interannual timescales. Global results indicate that the different data sets agree qualitatively for major aerosol-source regions. Discrepancies are mostly found over the Sahel, India, and eastern and southeastern Asia. Results for eastern Europe suggest that the intense wildfire event in Russia during summer 2010 was less well represented by SeaWiFS (Sea-viewing Wide Field-of-view Sensor) and OMI (Ozone Monitoring Instrument), which might be due to misclassification of smoke plumes as clouds. Analysis for the Indian subcontinent shows that here SeaWiFS agrees
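
    The core of MCA-type decompositions is an SVD of the cross-covariance matrix between two anomaly fields; a minimal sketch (in CMCA the satellite field is simply the concatenation of the individual satellite data sets along the space axis):

        import numpy as np

        def mca(X, Y, n_modes=3):
            # X: (time, space_x) joint satellite AOD anomaly field
            # Y: (time, space_y) AERONET station AOD anomalies
            Xa, Ya = X - X.mean(axis=0), Y - Y.mean(axis=0)
            C = Xa.T @ Ya / (len(X) - 1)        # cross-covariance matrix
            U, s, Vt = np.linalg.svd(C, full_matrices=False)
            # spatial modes of each field and the shared covariance spectrum
            return U[:, :n_modes], Vt[:n_modes].T, s[:n_modes]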

  2. Analysis of maximum reach in WDM PON architecture based on distributed Raman amplification and pump recycling technique.

    PubMed

    Kim, Chul Han; Lee, Ju Han; Lee, Kwanil

    2007-10-29

    We analyze the performance of a bidirectional WDM PON architecture which utilizes distributed Raman amplification and a pump recycling technique. The maximum reach at data rates of 622 Mb/s and 1.25 Gb/s in the proposed WDM PON architecture is calculated by taking into account the effects of the power budget, the chromatic dispersion of the transmission fiber, and Raman amplification-induced noise for a given amount of Raman pump power. From this calculation, the maximum reach for 622 Mb/s and 1.25 Gb/s signal transmission is 65 km and 60 km, respectively, with a Raman pump power of 700 mW. We also find that the calculated results agree well with previously reported experimental results.
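
    A back-of-the-envelope version of the reach limit set by the power budget alone (all numbers below are hypothetical placeholders, not the paper's; the full model also accounts for chromatic dispersion and Raman-amplification noise):

        # hypothetical link parameters
        p_tx_dbm = 0.0         # launched signal power
        sens_dbm = -28.0       # receiver sensitivity at the chosen bit rate
        raman_gain_db = 15.0   # net distributed Raman gain from the pump
        alpha_db_km = 0.25     # fiber attenuation
        margin_db = 3.0        # system margin

        reach_km = (p_tx_dbm - sens_dbm + raman_gain_db - margin_db) / alpha_db_km
        print(f"power-budget-limited reach: {reach_km:.0f} km")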

  3. Application of Electromigration Techniques in Environmental Analysis

    NASA Astrophysics Data System (ADS)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentrations of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made, widening the applicability of these techniques - mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography - to the analysis of real-world environmental samples, and improving the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is placed on pre-capillary and on-capillary chromatography- and electrophoresis-based concentration of analytes and on detection improvement.

  4. Detailed fuel spray analysis techniques

    NASA Technical Reports Server (NTRS)

    Mularz, E. J.; Bosque, M. A.; Humenik, F. M.

    1983-01-01

    Detailed fuel spray analyses are a necessary input to the analytical modeling of the complex mixing and combustion processes which occur in advanced combustor systems. It is anticipated that by controlling fuel-air reaction conditions, combustor temperatures can be better controlled, leading to improved combustion system durability. Thus, a research program is underway to demonstrate the capability to measure liquid droplet size, velocity, and number density throughout a fuel spray and to utilize this measurement technique in laboratory benchmark experiments. The research activities from two contracts and one grant are described, with results to date. The experiment to characterize fuel sprays is also described. These experiments and data should be useful for the application and validation of turbulent flow modeling to improve the design of future advanced technology engines.

  5. Data analysis techniques: Spectral processing

    NASA Technical Reports Server (NTRS)

    Strauch, R. G.

    1983-01-01

    The individual steps in the data processing scheme applied to most radars used for wind sounding are analyzed. This processing method uses spectral analysis and assumes a pulse Doppler radar. Improvement in the signal to noise ratio of some radars is discussed.
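
    A sketch of the usual chain for one range gate: periodogram of the I/Q samples, crude noise-floor removal, then the first spectral moment as the radial-velocity estimate (details vary by radar; averaging several periodograms is one way the signal-to-noise ratio is improved):

        import numpy as np

        def mean_radial_velocity(iq, fs, wavelength):
            S = np.abs(np.fft.fftshift(np.fft.fft(iq))) ** 2   # Doppler spectrum
            f = np.fft.fftshift(np.fft.fftfreq(len(iq), 1.0 / fs))
            S = np.clip(S - np.median(S), 0.0, None)   # subtract noise floor
            f_mean = (f * S).sum() / S.sum()           # first spectral moment
            return f_mean * wavelength / 2.0           # v = f * lambda / 2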

  6. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analyses are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis; the differences among them lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data are presented.
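
    A sketch of the common starting point of all four techniques, the spectral matrix of the three magnetic components, together with the principal-axis directions used by the first three (single-segment estimate for brevity; practical analyses average over many segments):

        import numpy as np

        def spectral_matrix(b, fs, band):
            # b: (3, n) magnetometer components; returns the band-averaged
            # Hermitian spectral matrix and its principal axes
            B = np.fft.rfft(b, axis=1)
            f = np.fft.rfftfreq(b.shape[1], 1.0 / fs)
            sel = (f >= band[0]) & (f <= band[1])
            S = B[:, sel] @ B[:, sel].conj().T / sel.sum()
            w, v = np.linalg.eigh(S.real)   # principal axes of the power
            return S, v                     # v[:, -1]: maximum-variance axis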

  7. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    SciTech Connect

    Tang, A; Samost, A; Viswanathan, A; Cormack, R; Damato, A

    2015-06-15

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefits of STPA in the field of radiation oncology are demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in

  8. [Acupoints selecting and medication rules analysis based on data mining technique for bronchial asthma treated with acupoint application].

    PubMed

    Wang, Zhaohui; Han, Dongyue; Qie, Lili; Liu, Chang; Wang, Fuchun

    2015-06-01

    Clinical literature on bronchial asthma treated with acupoint application, published from January 2000 to March 2014 and indexed in modern periodical databases, was retrieved by computer. With the cluster analysis and frequency analysis methods of data mining, the acupoint selection and medication rules for bronchial asthma treated with acupoint application were analyzed. A total of 38 articles were eventually included, covering 25 acupoints and 42 medicines. The results indicate that, for acupoint selection, Feishu (BL 13) is used as the main acupoint, with three groups of bladder meridian and conception vessel acupoints applied alternately. As for medicines, Baijiezi (Brassica alba Boiss), Xixin (Radix et Rhizoma Asari), Gansui (Radix Kansui), Yanhusuo (Corydalis) and Mahuang (Radix et Rhizoma Ephedrae) are primarily adopted, with epispastic medicines as the main medicines; the medicines mostly belong to the lung meridian, the main medicines remaining largely unchanged, with Shengjiang as the guiding drug.

  9. Bone quality around bioactive silica-based coated stainless steel implants: analysis by micro-Raman, XRF and XAS techniques.

    PubMed

    Ballarre, Josefina; Desimone, Paula M; Chorro, Matthieu; Baca, Matías; Orellano, Juan Carlos; Ceré, Silvia M

    2013-11-01

    Surface modification of surgical stainless steel implants by sol-gel coatings has been proposed as a way to generate a surface that, besides being protective, creates a "bioactive" interface promoting natural bonding between the metal surface and the existing bone. The aim of this work is to analyze the quality of bone formed around hybrid bioactive coatings containing glass-ceramic particles, made by the sol-gel process on 316L stainless steel used as a permanent implant, in terms of mineralization, calcium content and bone maturity, using micro-Raman, X-ray microfluorescence and X-ray absorption techniques. Uncoated implants seem to generate a thin bone layer at the beginning of the osseointegration process, with this layer then separating from the surface over time. The hybrid coatings without glass-ceramic particles generate new bone around the implants, with a high concentration of Ca and P at the implant/tissue interface; this appears to be related to the presence of silica nanoparticles in the layer. The addition of bioactive particles promotes and enhances bone quality, with a homogeneous Ca and P content and a low rate of beta-carbonate substitution and crystallinity, similar to young, mechanically resistant bone.

  10. Analysis of fatty acids in 12 Mediterranean fish species: advantages and limitations of a new GC-FID/GC-MS based technique.

    PubMed

    Nevigato, Teresina; Masci, Maurizio; Orban, Elena; Di Lena, Gabriella; Casini, Irene; Caproni, Roberto

    2012-07-01

    When fatty acids in fish are analyzed, results are mostly reported in percentage form (profile analysis). However, results expressed as mg/100 g (absolute analysis) are much more useful and constitute the main information required. Absolute methods based on calibration curves have good accuracy but a high degree of complexity when applied to a great number of analytes. Procedures based on the sequence profile analysis - total FA determination - absolute analysis may be suitable for routine use, but suffer from a number of uncertainties that have never been fully resolved. These uncertainties are mainly related to the profile analysis: most profile analyses reported in the literature disagree about the number and type of fatty acids monitored, as well as about the total percentage to assign to their sum, leading to possible inaccuracies; in addition, the instrumental response factor for all FAME (fatty acid methyl esters) is often treated as a constant, which is not exactly true. In this work, a set of 24 fatty acids was selected and studied in 12 fish species from the Mediterranean area (variable in lipid content and month of sampling): in our results, and in these species, this set constitutes on average 90 ± 3% of the total fatty acid content. Moreover, the error arising from the assumption of a unique response factor was investigated. Two different detection techniques (GC-FID and GC-MS) together with two capillary columns (differing in length and polarity) were used in order to acquire complementary data on the same sample. With the protocol proposed here, absolute analyses of the 12 cited species are easily achievable by the total FA determination procedure. The accuracy of this approach is good in general, but in some cases (DHA, for example) it is lower than that of calibration-based methods.
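
    The arithmetic of the profile-to-absolute route is straightforward; a sketch that applies a per-FAME response-factor correction instead of assuming a single constant factor (variable names are illustrative):

        import numpy as np

        def absolute_fa(areas, response_factors, total_fa_mg_100g):
            # areas: GC peak areas of the monitored FAMEs
            # response_factors: per-FAME detector response corrections
            corrected = np.asarray(areas) / np.asarray(response_factors)
            profile = corrected / corrected.sum()   # fractional profile
            return profile * total_fa_mg_100g       # mg of each FA per 100 g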

  11. Objective Analysis and Prediction Techniques.

    DTIC Science & Technology

    1986-11-30

    ... aligned with the observed wind direction so that α = 90° is mean downwind, and the Fourier harmonics are calculated. The sum and difference of Eqs. (36) and ... the first three Fourier harmonics. Data suitable for Fourier analysis at the lower height of 800 m occurred during a 3.3 h period from 1059 to 1416 EST ... of the four derivatives. For each set a Doppler velocity function was synthesized and then analyzed for the Fourier harmonics, which were then ...

  12. Alternate Spectrometric Oil Analysis Techniques

    DTIC Science & Technology

    1992-04-01

    Contents include sections on spectrometric analysis, conclusions and recommendations, and appendices presenting microfiltration test rig data and membrane filtration test data, with descriptions of the microfiltration test rig fluids and of the samples used for 3-micron pore size membrane filtration. For microfiltration, 10-20 ml portions of the used oil samples were passed through a 3 µm Nucleopore membrane filter.

  13. Delamination detection in laminated composite using Virtual crack closure technique (VCCT) and modal flexibility based on dynamic analysis

    NASA Astrophysics Data System (ADS)

    Khatir, S.; Behtani, A.; Tiachacht, S.; Bouazouni, A.; Abdel Wahab, M.; Zhou, Y.-L.

    2017-05-01

    Delamination is a very important failure mechanism in certain types of composite structures. Detecting this type of damage using vibration data is currently a problem of interest to the structural health monitoring community. In this paper, we use a finite element method with an embedded interface for analysing damaged laminated composite structures. The modal flexibility method, in which the analysis data are related to the finite element model, is used to detect and localize delamination. Several numerical examples are presented in order to evaluate the accuracy of this approach.
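
    A sketch of the modal flexibility indicator used for localization, assuming mass-normalized mode shapes and natural frequencies from the healthy and delaminated finite element models:

        import numpy as np

        def modal_flexibility(phi, omega):
            # phi: (n_dof, n_modes) mass-normalized mode shapes
            # omega: (n_modes,) natural frequencies in rad/s
            return (phi / omega ** 2) @ phi.T   # F = sum_i phi_i phi_i^T / w_i^2

        def damage_index(phi_h, om_h, phi_d, om_d):
            # change in diagonal flexibility between healthy and damaged
            # states; peaks point to the delaminated region
            return np.abs(np.diag(modal_flexibility(phi_d, om_d)
                                  - modal_flexibility(phi_h, om_h)))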

  14. FDTD based SAR analysis in human head using irregular volume averaging techniques of different resolutions at GSM 900 band

    NASA Astrophysics Data System (ADS)

    Ali, Md Faruk; Ray, Sudhabindu

    2014-06-01

    The specific absorption rate (SAR) induced inside the human head in the near field of a mobile phone antenna has been investigated for three different SAR resolutions using the Finite Difference Time Domain (FDTD) method in the GSM 900 band. A voxel-based anthropomorphic human head model, consisting of different anatomical tissues, is used to calculate the peak SAR values averaged over 10-g, 1-g and 0.1-g mass. It is observed that the maximum local SAR increases significantly for smaller mass averages.

  15. Real Time Data Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Silberberg, George G.

    1983-03-01

    By the early 1970s, classical photo-optical range instrumentation technology (as a means of gathering weapons' system performance data) had become a costly and inefficient process. Film costs were increasing due to soaring silver prices. Time required to process, read, and produce optical data was becoming unacceptable as a means of supporting weapon system development programs. NWC investigated the feasibility of utilizing Closed Circuit Television (CCTV) technology as an alternative solution for providing optical data. In 1978 a program entitled Metric Video (measurements from video images) was formulated at the Naval Weapons Center, China Lake, California. The purpose of this program was to provide timely data, to reduce the number of operating personnel, and to lower data acquisition costs. Some of the task elements for this program included a near real-time vector miss-distance system, a weapons scoring system, a velocity measuring system, a time-space position system, and a system to replace film cameras for gathering real-time engineering sequential data. These task elements and the development of special hardware and techniques to achieve real-time data will be discussed briefly in this paper.

  16. A Simulation Based Analysis of Motor Unit Number Index (MUNIX) Technique Using Motoneuron Pool and Surface Electromyogram Models

    PubMed Central

    Li, Xiaoyan; Rymer, William Zev; Zhou, Ping

    2013-01-01

    Motor unit number index (MUNIX) measurement has recently attracted increasing attention as a tool to evaluate the progression of motoneuron diseases. In our current study, the sensitivity of the MUNIX technique to changes in motoneuron and muscle properties was explored by a simulation approach utilizing variations on published motoneuron pool and surface electromyogram (EMG) models. Our simulation results indicate that, when motoneuron pool and muscle parameters are kept unchanged and the input motor unit numbers to the model are varied, MUNIX estimates can appropriately characterize changes in motor unit numbers. Such MUNIX estimates are not sensitive to the different motor unit recruitment and rate coding strategies used in the model. Furthermore, alterations in motor unit control properties do not have a significant effect on the MUNIX estimates: neither adjustment of the motor unit recruitment range nor reduction of the motor unit firing rates jeopardizes the MUNIX estimates. The MUNIX estimates closely correlate with the maximum M wave amplitude. However, if we reduce the amplitude of each motor unit action potential rather than simply reduce motor unit number, then MUNIX estimates substantially underestimate the motor unit numbers in the muscle. These findings suggest that the current MUNIX definition is most suitable for motoneuron diseases that demonstrate secondary evidence of muscle fiber reinnervation. In this regard, when MUNIX is applied, it is of much importance to examine a parallel measurement of the motor unit size index (MUSIX), defined as the ratio of the maximum M wave amplitude to the MUNIX. However, there are potential limitations in the application of the MUNIX methods in atrophied muscle, where it is unclear whether the atrophy is accompanied by loss of motor units or loss of muscle fiber size. PMID:22514208

  17. Analysis of algebraic reconstruction technique for accurate imaging of gas temperature and concentration based on tunable diode laser absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Hui-Hui, Xia; Rui-Feng, Kan; Jian-Guo, Liu; Zhen-Yu, Xu; Ya-Bai, He

    2016-06-01

    An improved algebraic reconstruction technique (ART) combined with tunable diode laser absorption spectroscopy (TDLAS) is presented in this paper for determining the two-dimensional (2D) distribution of H2O concentration and temperature in a simulated combustion flame. This work simulates the reconstruction of spectroscopic measurements by a multi-view parallel-beam scanning geometry and analyzes the effect of the number of projection rays on reconstruction accuracy. Reconstruction quality improves dramatically as the number of projection rays increases, up to about 180 rays for a 20 × 20 grid; beyond that point, additional rays have little influence on accuracy. The temperature reconstructions are more accurate than the water vapor concentrations obtained by the traditional concentration calculation method. The present study also proposes an innovative way to reduce the error of the concentration reconstruction and thereby greatly improve reconstruction quality, and the capability of this new method is evaluated using appropriate assessment parameters. With this new approach, not only is the concentration reconstruction accuracy greatly improved, but a suitable parallel-beam arrangement is also put forward that combines high reconstruction accuracy with simplicity of experimental validation. Finally, a bimodal structure of the combustion region is assumed to demonstrate the robustness and universality of the proposed method. Numerical investigation indicates that the proposed TDLAS tomographic algorithm is capable of recovering accurate temperature and concentration profiles. This feasible reconstruction formula is expected to resolve several key issues in practical combustion devices. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61205151), the National Key Scientific Instrument and Equipment Development Project of China (Grant
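
    A sketch of a basic ART (Kaczmarz) loop for the absorbance tomography problem; the paper's improved variant adds its own corrections on top of this kind of iteration:

        import numpy as np

        def art(A, b, n_sweeps=50, relax=0.2):
            # A: (n_rays, n_cells) path length of each beam in each grid cell
            # b: (n_rays,) measured line-integrated absorbances
            x = np.zeros(A.shape[1])
            norms = (A ** 2).sum(axis=1)
            for _ in range(n_sweeps):
                for i in np.flatnonzero(norms):
                    x += relax * (b[i] - A[i] @ x) / norms[i] * A[i]
                np.clip(x, 0.0, None, out=x)   # absorbance is non-negative
            return x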

  18. Analysis of the nonlinear behavior of shear-Alfvén modes in tokamaks based on Hamiltonian mapping techniques

    SciTech Connect

    Briguglio, S. Vlad, G.; Fogaccia, G.; Di Troia, C.; Fusco, V.; Wang, X.; Zonca, F.

    2014-11-15

    We present a series of numerical simulation experiments set up to illustrate the fundamental physics processes underlying the nonlinear dynamics of Alfvénic modes resonantly excited by energetic particles in tokamak plasmas and of the ensuing energetic particle transports. These phenomena are investigated by following the evolution of a test particle population in the electromagnetic fields computed in self-consistent MHD-particle simulation performed by the HMGC code. Hamiltonian mapping techniques are used to extract and illustrate several features of wave-particle dynamics. The universal structure of resonant particle phase space near an isolated resonance is recovered and analyzed, showing that bounded orbits and untrapped trajectories, divided by the instantaneous separatrix, form phase space zonal structures, whose characteristic non-adiabatic evolution time is the same as the nonlinear time of the underlying fluctuations. Bounded orbits correspond to a net outward resonant particle flux, which produces a flattening and/or gradient inversion of the fast ion density profile around the peak of the linear wave-particle resonance. The connection of this phenomenon to the mode saturation is analyzed with reference to two different cases: a Toroidal Alfvén eigenmode in a low shear magnetic equilibrium and a weakly unstable energetic particle mode for stronger magnetic shear. It is shown that, in the former case, saturation is reached because of radial decoupling (resonant particle redistribution matching the mode radial width) and is characterized by a weak dependence of the mode amplitude on the growth rate. In the latter case, saturation is due to resonance detuning (resonant particle redistribution matching the resonance width) with a stronger dependence of the mode amplitude on the growth rate.

  19. Analysis of Different Classification Techniques for Two-Class Functional Near-Infrared Spectroscopy-Based Brain-Computer Interface

    PubMed Central

    Qureshi, Nauman Khalid; Noori, Farzan Majeed; Hong, Keum-Shik

    2016-01-01

    We analyse and compare the classification accuracies of six different classifiers for a two-class mental task (mental arithmetic and rest) using functional near-infrared spectroscopy (fNIRS) signals. The signals of the mental arithmetic and rest tasks from the prefrontal cortex region of the brain for seven healthy subjects were acquired using a multichannel continuous-wave imaging system. After removal of the physiological noises, six features were extracted from the oxygenated hemoglobin (HbO) signals. Two- and three-dimensional combinations of those features were used for classification of the mental tasks. Six different classifiers, linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), k-nearest neighbour (kNN), the Naïve Bayes approach, support vector machine (SVM), and artificial neural networks (ANN), were utilized. With these classifiers, the average classification accuracies among the seven subjects for the 2- and 3-dimensional combinations of features were 71.6, 90.0, 69.7, 89.8, 89.5, and 91.4% and 79.6, 95.2, 64.5, 94.8, 95.2, and 96.3%, respectively. ANN showed the maximum classification accuracies: 91.4 and 96.3%. In order to validate the results, a statistical significance test was performed, which confirmed that the p values were statistically significant relative to all of the other classifiers (p < 0.005) using HbO signals. PMID:27725827
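
    A sketch of the comparison using scikit-learn equivalents of the six classifiers (the random arrays stand in for the HbO feature combinations, and the hyperparameters are illustrative, not the paper's):

        import numpy as np
        from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                                   QuadraticDiscriminantAnalysis)
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.naive_bayes import GaussianNB
        from sklearn.svm import SVC
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import cross_val_score

        X = np.random.randn(140, 3)        # stand-in: 3-D HbO feature vectors
        y = np.random.randint(0, 2, 140)   # mental arithmetic vs rest labels

        classifiers = {
            "LDA": LinearDiscriminantAnalysis(),
            "QDA": QuadraticDiscriminantAnalysis(),
            "kNN": KNeighborsClassifier(n_neighbors=5),
            "NB": GaussianNB(),
            "SVM": SVC(kernel="rbf"),
            "ANN": MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000),
        }
        for name, clf in classifiers.items():
            acc = cross_val_score(clf, X, y, cv=5).mean()
            print(f"{name}: {acc:.3f}")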

  20. Innovative Techniques Simplify Vibration Analysis

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  1. Advanced Techniques for Scene Analysis

    DTIC Science & Technology

    2010-06-01

    Wang and Adelson described in [21] a method to represent moving objects using sets of overlapping layers, providing segmentation of the flow field. [21] J. Y. A. Wang and E. H. Adelson, "Representing moving images with layers," IEEE Transactions on Image Processing, Sep 1994, pp. 625-638. [22] B. Han, C. Paulson, J. Wang, and D. Wu, "Depth-based image registration," in Proceedings of SPIE Defense & Security Symposium, vol. 7699, Orlando

  2. Logistics Support Analysis Techniques Guide

    DTIC Science & Technology

    1985-03-15

    ... not considered a major problem to exportability. However, when a program is dependent on data files, processing programs, or unique hardware/software ... automated, what language is used, what ADP equipment is used, and what software/peripheral equipment support is required? e. What is the quantity and accuracy of data ... 881A. The R&D and production costs for contractor-furnished hardware and software are computed based on design and production activities recorded in

  3. A decision support system for fusion of hard and soft sensor information based on probabilistic latent semantic analysis technique

    NASA Astrophysics Data System (ADS)

    Shirkhodaie, Amir; Elangovan, Vinayak; Alkilani, Amjad; Habibi, Mohammad

    2013-05-01

    This paper presents an ongoing effort towards the development of an intelligent Decision-Support System (iDSS) for fusion of information from multiple sources consisting of data from hard (physical sensor) and soft (textual) sources. The paper first defines a taxonomy of decision support systems for latent semantic data mining from heterogeneous data sources. A Probabilistic Latent Semantic Analysis (PLSA) approach is proposed for searching latent semantic concepts across heterogeneous data sources. An architectural model for generating semantic annotations of multi-modality sensors in a modified Transducer Markup Language (TML) is described, and a method for TML message fusion is discussed for the alignment and integration of spatiotemporally correlated and associated physical sensory observations. Lastly, experimental results exploiting the fusion of soft/hard sensor sources with the support of the iDSS are discussed.
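
    A sketch of plain PLSA fitted by EM on a document-term count matrix, the latent-semantic core the iDSS builds on (dense arrays for clarity; real text collections need sparse handling):

        import numpy as np

        def plsa(N, n_topics, n_iter=100, seed=0):
            # N: (n_docs, n_words) count matrix; returns p(z|d) and p(w|z)
            rng = np.random.default_rng(seed)
            p_z_d = rng.random((N.shape[0], n_topics))
            p_z_d /= p_z_d.sum(axis=1, keepdims=True)
            p_w_z = rng.random((n_topics, N.shape[1]))
            p_w_z /= p_w_z.sum(axis=1, keepdims=True)
            for _ in range(n_iter):
                # E-step: responsibilities p(z|d,w), shape (docs, words, topics)
                joint = p_z_d[:, None, :] * p_w_z.T[None, :, :]
                joint /= joint.sum(axis=-1, keepdims=True) + 1e-12
                # M-step: re-estimate the two conditional distributions
                Nz = N[:, :, None] * joint
                p_w_z = Nz.sum(axis=0).T
                p_w_z /= p_w_z.sum(axis=1, keepdims=True)
                p_z_d = Nz.sum(axis=1)
                p_z_d /= p_z_d.sum(axis=1, keepdims=True)
            return p_z_d, p_w_z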

  4. Physicochemical bases of differences between the sedimentometric and laser-diffraction techniques of soil particle-size analysis

    NASA Astrophysics Data System (ADS)

    Fedotov, G. N.; Shein, E. V.; Putlyaev, V. I.; Arkhangel'Skaya, T. A.; Eliseev, A. V.; Milanovskii, E. Yu.

    2007-03-01

    Comparison of the particle-size distributions in different soils showed that the sedimentation method (Kachinskii pipette method) gives higher (by 1.5-5 times) values of the clay content than the laser diffraction method. This is related to the significant variation in the density of soil solids, which is taken to be constant in the sedimentation method; as a result, particles of significantly larger size and lower density fall into this fraction. Using optical, electron, and confocal microscopy, it was shown that the low density of silt-sized soil particles falling into the sedimentometric clay fraction is related to the organomineral shell (film) around the soil microparticles. This shell contributes to the linking of microparticles into aggregates with a lower average density. These aggregates thus have significantly larger size and lower density, yet settle with the same velocity as small particles of the average solid-phase density during sedimentation particle-size analysis.
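
    The discrepancy follows directly from Stokes' law; a short sketch with hypothetical densities shows how a low-density aggregate is mis-sized when the standard solid density is assumed:

        import numpy as np

        G, ETA, RHO_F = 9.81, 1.0e-3, 1000.0   # gravity, water viscosity/density

        def settling_velocity(d, rho_s):
            # Stokes' law for a sphere of diameter d and density rho_s
            return (rho_s - RHO_F) * G * d ** 2 / (18.0 * ETA)

        def stokes_diameter(v, rho_assumed=2650.0):
            # diameter the pipette method reports, assuming quartz-like density
            return np.sqrt(18.0 * ETA * v / ((rho_assumed - RHO_F) * G))

        # a 5-um aggregate of density 1250 kg/m^3 settles like a ~1.9-um quartz
        # grain, so sedimentation counts it as clay while laser diffraction
        # measures something close to its true size
        v = settling_velocity(5e-6, 1250.0)
        print(stokes_diameter(v) * 1e6, "um")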

  5. An inertial sensor-based system for spatio-temporal analysis in classic cross-country skiing diagonal technique.

    PubMed

    Fasel, Benedikt; Favre, Julien; Chardonnens, Julien; Gremion, Gérald; Aminian, Kamiar

    2015-09-18

    The present study proposes a method based on ski-fixed inertial sensors to automatically compute spatio-temporal parameters (phase durations, cycle speed and cycle length) for the diagonal stride in classical cross-country skiing. The proposed system was validated against a marker-based motion capture system during indoor treadmill skiing. The skiing movement of 10 junior to world-cup athletes was measured under four different conditions. The accuracy (i.e. median error) and precision (i.e. interquartile range of error) of the system were below 6 ms for cycle duration and ski thrust duration, and below 35 ms for pole push duration. Cycle speed precision (accuracy) was below 0.1 m/s (0.005 m/s) and cycle length precision (accuracy) was below 0.15 m (0.005 m). The system was sensitive to changes of conditions and was accurate enough to detect significant differences reported in previous studies. Since the capture volume is not limited and the setup is simple, the system is well suited for outdoor measurements on snow.

  6. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin’ it REAL curriculum

    PubMed Central

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin’ it REAL (kiR) substance use prevention curriculum. Each of the 10, 40–45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers’ delivery techniques (e.g. lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify a typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention. PMID:25274721

  7. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin' it REAL curriculum.

    PubMed

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L; Krieger, Janice L

    2014-12-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin' it REAL (kiR) substance use prevention curriculum. Each of the 10, 40-45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers' delivery techniques (e.g., lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify a typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention.

  8. Advanced techniques in current signature analysis

    NASA Astrophysics Data System (ADS)

    Smith, S. F.; Castleberry, K. N.

    1992-02-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (greater than 1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable 'smart' CSA instrumentation in the next several years.

  9. Advanced techniques in current signature analysis

    SciTech Connect

    Smith, S.F.; Castleberry, K.N.

    1992-03-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (>1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable "smart" CSA instrumentation in the next several years. 3 refs.
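
    A sketch of one common demodulation route for the AM/PM components discussed above, using the analytic signal of the sampled motor current (illustrative only; the abstract does not specify the instrumentation's internals):

        import numpy as np
        from scipy.signal import hilbert

        def demodulate_line(current, fs, f_line=60.0):
            # analytic signal -> instantaneous amplitude (AM) and the phase
            # remaining after the line-frequency carrier is removed (PM)
            z = hilbert(current)
            am = np.abs(z)
            carrier = 2 * np.pi * f_line * np.arange(len(z)) / fs
            pm = np.unwrap(np.angle(z)) - carrier
            return am, pm

        # spectra of am and pm expose modulation sidebands (e.g., ~20 Hz
        # running-speed components) without the dominant line carrier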

  10. A dual resolution measurement based Monte Carlo simulation technique for detailed dose analysis of small volume organs in the skull base region

    NASA Astrophysics Data System (ADS)

    Yeh, Chi-Yuan; Tung, Chuan-Jung; Chao, Tsi-Chain; Lin, Mu-Han; Lee, Chung-Chi

    2014-11-01

    The purpose of this study was to examine the dose distribution of a skull base tumor and surrounding critical structures in response to high-dose intensity-modulated radiosurgery (IMRS) with Monte Carlo (MC) simulation using a dual resolution sandwich phantom. The measurement-based Monte Carlo (MBMC) method (Lin et al., 2009) was adopted for the study. The major components of the MBMC technique involve (1) the BEAMnrc code for beam transport through the treatment head of a Varian 21EX linear accelerator, (2) the DOSXYZnrc code for patient dose simulation and (3) an EPID-measured efficiency map which describes the non-uniform fluence distribution of the IMRS treatment beam. For the simulated case, five isocentric 6 MV photon beams were designed to deliver a total dose of 1200 cGy in two fractions to the skull base tumor. A sandwich phantom for the MBMC simulation was created based on the patient's CT scan of a skull base tumor [gross tumor volume (GTV) = 8.4 cm3] near the right 8th cranial nerve. The phantom, consisting of a 1.2-cm thick skull base region with a voxel resolution of 0.05×0.05×0.1 cm3, was sandwiched between 0.05×0.05×0.3 cm3 slices of a head phantom. A coarser 0.2×0.2×0.3 cm3 single resolution (SR) phantom was also created for comparison with the sandwich phantom. A particle history of 3×108 for each beam was used for simulations of both the SR and the sandwich phantoms to achieve a statistical uncertainty of <2%. Our study showed that the planning target volume (PTV) receiving at least 95% of the prescribed dose (VPTV95) was 96.9%, 96.7% and 99.9% for the TPS, SR, and sandwich phantom, respectively. The maximum and mean doses to large organs such as the PTV, brain stem, and parotid gland for the TPS, SR and sandwich MC simulations did not show any significant difference; however, significant dose differences were observed for very small structures like the right 8th cranial nerve, right cochlea, right malleus and right semicircular canal. Dose

  11. Crude oil price forecasting based on hybridizing wavelet multiple linear regression model, particle swarm optimization techniques, and principal component analysis.

    PubMed

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices do play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelet transforms and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first used to decompose an original time series into several subseries with different scales. Then, principal component analysis (PCA) is used to process the subseries data in the MLR for crude oil price forecasting, and particle swarm optimization (PSO) is used to select the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily crude oil market, West Texas Intermediate (WTI), has been used as the case study. The time series prediction capability of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series.

  12. Crude Oil Price Forecasting Based on Hybridizing Wavelet Multiple Linear Regression Model, Particle Swarm Optimization Techniques, and Principal Component Analysis

    PubMed Central

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices do play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelet transforms and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first used to decompose an original time series into several subseries with different scales. Then, principal component analysis (PCA) is used to process the subseries data in the MLR for crude oil price forecasting, and particle swarm optimization (PSO) is used to select the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily crude oil market, West Texas Intermediate (WTI), has been used as the case study. The time series prediction capability of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series. PMID:24895666
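
    A sketch of the WMLR pipeline with PCA preprocessing (pywt and scikit-learn stand in for the paper's tooling, and the PSO parameter search is replaced by ordinary least squares for brevity):

        import numpy as np
        import pywt
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        def dwt_subseries(x, wavelet="db4", level=2):
            # split the price series into approximation/detail subseries
            coeffs = pywt.wavedec(x, wavelet, level=level)
            subs = []
            for i in range(len(coeffs)):
                kept = [c if j == i else np.zeros_like(c)
                        for j, c in enumerate(coeffs)]
                subs.append(pywt.waverec(kept, wavelet)[:len(x)])
            return np.stack(subs)                 # (level + 1, T)

        def wmlr_fit(prices, lags=4):
            prices = np.asarray(prices, dtype=float)
            S = dwt_subseries(prices)
            # feature row t: the previous `lags` values of every subseries
            X = np.hstack([np.stack([s[t - lags:t]
                                     for t in range(lags, len(prices))])
                           for s in S])
            y = prices[lags:]
            pca = PCA(n_components=0.95).fit(X)   # keep 95% of the variance
            model = LinearRegression().fit(pca.transform(X), y)
            return pca, model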

  13. Analysis Techniques in Deregulated Power Systems

    NASA Astrophysics Data System (ADS)

    Funabashi, Toshihisa

    In a deregulated power system, many distributed generation and power storage units will be introduced. This article surveys technical trends in power system analysis and summarizes the role of this technique for manufacturers. Some topics in today's power system analysis are then presented, including synchronous/induction generator modeling, Microgrid applications and surge analysis of distributed generators.

  14. Comparing Techniques for Certified Static Analysis

    NASA Technical Reports Server (NTRS)

    Cachera, David; Pichardie, David

    2009-01-01

    A certified static analysis is an analysis whose semantic validity has been formally proved correct with a proof assistant. The recent increasing interest in using proof assistants for mechanizing programming language metatheory has given rise to several approaches for certification of static analysis. We propose a panorama of these techniques and compare their respective strengths and weaknesses.

  15. AI-based technique for tracking chains of discontinuous symbols and its application to the analysis of topographic maps

    NASA Astrophysics Data System (ADS)

    Mecocci, Alessandro; Lilla, Massimiliano

    1994-12-01

    Automatic digitization of topographic maps is a very important task nowadays. Among the different elements of a topographic map, discontinuous lines represent important information. Generally they are difficult to track because they show very large gaps and abrupt direction changes. In this paper an architecture that automates the digitization of discontinuous lines (dot-dot lines, dash-dot-dash lines, dash-asterisk lines, etc.) is presented. The tracking process must detect the elementary symbols and then concatenate these symbols into a significant chain that represents the line. The proposed architecture is composed of a common kernel, based on a suitable modification of the A* algorithm, that starts different auxiliary processes depending on the particular line to be tracked. Three auxiliary processes are considered: search strategy generation (SSG), which is responsible for the strategy used to scan the image pixels; low level symbol detection (LSD), which decides if a certain image region around the pixel selected by the SSG is an elementary symbol; and cost evaluation (CE), which gives the quality of each symbol with respect to the global course of the line. The whole system has been tested on a 1:50,000 map furnished by the Istituto Geografico Militare Italiano (IGMI). The results were very good for different types of discontinuous lines. Over the whole map (i.e. about 80 Mbytes of digitized data), 95% of the elementary symbols of the lines were correctly chained. The operator time required to correct misclassifications is a small part of the time needed to manually digitize the discontinuous lines.

  16. Ion beam analysis techniques in interdisciplinary applications

    SciTech Connect

    Respaldiza, Miguel A.; Ager, Francisco J.

    1999-11-16

    Ion beam analysis (IBA) techniques have emerged in recent years as one of the main applications of electrostatic accelerators. A short summary of the most widely used IBA techniques is given, as well as some examples of applications in interdisciplinary sciences.

  18. A New Microcell Technique for NMR Analysis.

    ERIC Educational Resources Information Center

    Yu, Sophia J.

    1987-01-01

    Describes a new laboratory technique for working with small samples of compounds used in nuclear magnetic resonance (NMR) analysis. Demonstrates how microcells can be constructed for each experiment and samples can be recycled. (TW)

  20. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  1. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  2. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  3. A technique to reduce motion artifact for externally triggered cine-MRI(EC-MRI) based on detecting the onset of the articulated word with spectral analysis.

    PubMed

    Shimada, Yasuhiro; Nishimoto, Hironori; Kochiyama, Takanori; Fujimoto, Ichiro; Mano, Hiroaki; Masaki, Shinobu; Murase, Kenya

    2012-01-01

    One issue in externally triggered cine-magnetic resonance imaging (EC-MRI) for the dynamic observation of speech organs is motion artifact in the phase-encoding direction caused by unstable repetitions of speech during data acquisition. We propose a technique to reduce such artifact by rearranging the k-space data used to reconstruct MR images based on the analysis of recorded speech sounds. We recorded the subject's speech sounds during EC-MRI and used post hoc acoustical processing to reduce scanning noise and detect the onset of each utterance based on analysis of the recorded sounds. We selected each line of k-space from several data acquisition sessions and rearranged them to reconstruct a new series of dynamic MR images according to the analyzed time of utterance onset. Comparative evaluation showed significant reduction in motion artifact signal in the dynamic MR images reconstructed by the proposed method. The quality of the reconstructed images was sufficient to observe the dynamic aspects of speech production mechanisms.
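
    A schematic of the k-space rearrangement step, with synthetic arrays standing in for the EC-MRI acquisitions; the dimensions, line timestamps and detected onsets below are all hypothetical:

      import numpy as np

      n_sessions, n_pe, n_ro = 8, 64, 64
      rng = np.random.default_rng(2)
      kspace = (rng.normal(size=(n_sessions, n_pe, n_ro))
                + 1j * rng.normal(size=(n_sessions, n_pe, n_ro)))
      t_acq = np.tile(np.linspace(0.0, 0.63, n_pe), (n_sessions, 1))  # line times (s)
      onset = rng.normal(0.05, 0.01, n_sessions)       # detected utterance onsets

      def reconstruct_frame(frame_time):
          """Take each k-space line from the session whose acquisition time,
          measured relative to its utterance onset, is closest to frame_time."""
          rel = t_acq - onset[:, None]
          frame = np.empty((n_pe, n_ro), dtype=complex)
          for line in range(n_pe):
              frame[line] = kspace[np.argmin(np.abs(rel[:, line] - frame_time)), line]
          return np.abs(np.fft.ifft2(frame))           # magnitude image

      print(reconstruct_frame(0.20).shape)             # (64, 64)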

  4. Quantitative analysis of Li by PIGE technique

    NASA Astrophysics Data System (ADS)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the 7Li(p,p'γ)7Li reaction (Eγ = 478 keV) was measured over the proton energy range 2.0-4.2 MeV. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energies, with experimental yields for thick samples made of inorganic compounds containing lithium. To quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples based on the Emitted Radiation Yield Analysis (ERYA) code, which integrates the nuclear reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.

  5. Authentication of Galician (N.W. Spain) quality brand potatoes using metal analysis. Classical pattern recognition techniques versus a new vector quantization-based classification procedure.

    PubMed

    Peña, R M; García, S; Iglesias, R; Barro, S; Herrero, C

    2001-12-01

    The objective of this work was to develop a classification system in order to confirm the authenticity of Galician potatoes with a Certified Brand of Origin and Quality (CBOQ) and to differentiate them from other potatoes that did not have this quality brand. Elemental analysis (K, Na, Rb, Li, Zn, Fe, Mn, Cu, Mg and Ca) of potatoes was performed by atomic spectroscopy in 307 samples belonging to two categories, CBOQ and Non-CBOQ potatoes. The 307 × 10 data set was evaluated employing multivariate chemometric techniques, such as cluster analysis and principal component analysis, in order to perform a preliminary study of the data structure. Different classification systems for the two categories on the basis of the chemical data were obtained by applying several commonly used supervised pattern recognition procedures [such as linear discriminant analysis (LDA), K-nearest neighbours (KNN), soft independent modelling of class analogy (SIMCA) and multilayer feed-forward neural networks (MLF-ANN)]. Although some of these classification methods produced satisfactory results, the particular data distribution in the 10-dimensional space led to the proposal of a new vector quantization-based classification procedure (VQBCP). The results achieved with this new approach (recognition and prediction abilities > 97%) were better than those attained by KNN and compare favourably with those provided by LDA, SIMCA and MLF-ANN. The new VQBCP demonstrated good performance by carrying out adequate classifications in a data set in which the classes are subgrouped. The metal profiles of potatoes provided sufficient information to enable classification criteria to be developed for classifying samples on the basis of their origin and brand.
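
    A sketch of this kind of supervised comparison in scikit-learn, with random placeholder data in place of the 307 × 10 element-concentration matrix; the paper's VQBCP is not reproduced here, so only two of the reference methods are shown:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      X = rng.normal(size=(307, 10))          # stand-in for the metal profiles
      y = rng.integers(0, 2, 307)             # stand-in for CBOQ / Non-CBOQ labels

      for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                        ("KNN", KNeighborsClassifier(n_neighbors=5))]:
          scores = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
          print(f"{name}: {scores.mean():.2%} +/- {scores.std():.2%}")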

  6. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  7. Multiattribute Decision Modeling Techniques: A Comparative Analysis

    DTIC Science & Technology

    1988-08-01

    Rating Technique (SMART) as a direct response to Raiffa's (1969) article on multiattribute utility theory, which Edwards found extremely stimulating but...approaches such as multiattribute utility/value assessment and hierarchical analysis and have applied these techniques to a number of non-military... multiattributed) outcomes O(1)...O(k), and if the utility function is denoted by u and the probabilities of the k events are p(1)...p(k), then the

  8. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese-oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  9. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  10. Intrinsic biodegradation potential of aromatic hydrocarbons in an alluvial aquifer--potentials and limits of signature metabolite analysis and two stable isotope-based techniques.

    PubMed

    Morasch, Barbara; Hunkeler, Daniel; Zopfi, Jakob; Temime, Brice; Höhener, Patrick

    2011-10-01

    Three independent techniques were used to assess the biodegradation of monoaromatic hydrocarbons and low-molecular-weight polyaromatic hydrocarbons in the alluvial aquifer at the site of a former cokery (Flémalle, Belgium). Firstly, a stable carbon isotope-based field method allowed quantification of the biodegradation of monoaromatic compounds in situ and confirmed the degradation of naphthalene. No evidence could be deduced from stable isotope shifts for the intrinsic biodegradation of larger molecules such as methylnaphthalenes or acenaphthene. Secondly, using signature metabolite analysis, various intermediates of the anaerobic degradation of (poly-)aromatic and heterocyclic compounds were identified. The discovery of a novel metabolite of acenaphthene in groundwater samples permitted deeper insights into the anaerobic biodegradation of almost persistent environmental contaminants. A third method, microcosm incubations with 13C-labeled compounds under in situ-like conditions, complemented techniques one and two by providing quantitative information on contaminant biodegradation independent of molecule size and sorption properties. Thanks to the stable isotope labels, the sensitivity of this method was much higher than that of classical microcosm studies. The 13C-microcosm approach allowed the determination of first-order rate constants for 13C-labeled benzene, naphthalene, or acenaphthene even in cases where degradation activities were small. The plausibility of the third method was checked by comparing 13C-microcosm-derived rates to field-derived rates from the first approach. A further advantage of using 13C labels in microcosms is that novel metabolites can be linked more easily to specific mother compounds, even in complex systems. This was achieved using alluvial sediments, where 13C-acenaphthyl methylsuccinate was identified as a transformation product of the anaerobic degradation of acenaphthene.
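
    Estimating a first-order rate constant from a 13C-labeled substrate time course, as in the microcosm approach above, reduces to fitting an exponential decay; the data points below are invented for illustration:

      import numpy as np
      from scipy.optimize import curve_fit

      t = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])   # days
      c = np.array([10.0, 8.1, 6.6, 4.4, 1.9, 0.4])      # substrate (mg/L)

      def first_order(t, c0, k):
          return c0 * np.exp(-k * t)

      (c0, k), _ = curve_fit(first_order, t, c, p0=(10.0, 0.05))
      print(f"k = {k:.3f} per day, half-life = {np.log(2) / k:.1f} days")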

  11. A review of sensitivity analysis techniques

    SciTech Connect

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of sensitivity techniques utilizes partial differentiation, whereas the simplest approach requires varying parameter values one at a time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
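
    The simplest approach mentioned above, varying parameter values one at a time, can be sketched in a few lines; the model and its parameters are hypothetical:

      import numpy as np

      def model(params):
          """Hypothetical model: response = intake * transfer / dilution."""
          intake, transfer, dilution = params
          return intake * transfer / dilution

      baseline = np.array([100.0, 0.5, 20.0])
      y0 = model(baseline)

      # Perturb each parameter by +10% and record the normalized response,
      # a finite-difference approximation of the partial derivative.
      for i, name in enumerate(["intake", "transfer", "dilution"]):
          p = baseline.copy()
          p[i] *= 1.10
          print(f"{name}: normalized sensitivity = {(model(p) - y0) / y0 / 0.10:+.2f}")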

  12. Regional flood frequency analysis in eastern Australia: Bayesian GLS regression-based methods within fixed region and ROI framework - Quantile Regression vs. Parameter Regression Technique

    NASA Astrophysics Data System (ADS)

    Haddad, Khaled; Rahman, Ataur

    2012-04-01

    In this article, an approach using Bayesian Generalised Least Squares (BGLS) regression in a region-of-influence (ROI) framework is proposed for regional flood frequency analysis (RFFA) for ungauged catchments. Using data from 399 catchments in eastern Australia, the BGLS-ROI is constructed to regionalise the flood quantiles (Quantile Regression Technique (QRT)) and the first three moments of the log-Pearson type 3 (LP3) distribution (Parameter Regression Technique (PRT)). This scheme firstly develops a fixed-region model to select the best set of predictor variables for use in the subsequent regression analyses, using an approach that minimises the model error variance while also satisfying a number of statistical selection criteria. The identified optimal regression equation is then used in the ROI experiment, where the ROI is chosen for a site in question as the region that minimises the predictive uncertainty. To evaluate the overall performance of the quantiles estimated by the QRT and PRT, a one-at-a-time cross-validation procedure is applied. Results of the proposed method indicate that both the QRT and PRT in a BGLS-ROI framework lead to more accurate and reliable estimates of flood quantiles and moments of the LP3 distribution when compared to a fixed-region approach. The BGLS-ROI can also deal reasonably well with the heterogeneity in Australian catchments, as evidenced by the regression diagnostics. Based on the evaluation statistics it was found that both BGLS-QRT and PRT-ROI perform similarly well, which suggests that the PRT is a viable alternative to QRT in RFFA. The RFFA methods developed in this paper are based on the database available in eastern Australia. It is expected that the availability of a more comprehensive database (in terms of both quality and quantity) will further improve the predictive performance of both the fixed and ROI based RFFA methods presented in this study, which however needs to be investigated in future when such a
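
    The regression step can be illustrated with a plain generalised least squares fit under a known error covariance (statsmodels); the catchment descriptors, coefficients and covariance below are invented, and the Bayesian layer (priors on the coefficients) is omitted:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(13)
      n = 50
      X = sm.add_constant(rng.normal(size=(n, 2)))   # e.g. log(area), log(rainfall)
      beta = np.array([1.0, 0.8, 0.4])
      record_len = rng.integers(10, 60, n)
      sigma = np.diag(1.0 / record_len)              # larger error at short records
      y = X @ beta + rng.multivariate_normal(np.zeros(n), sigma)

      # GLS downweights sites with large sampling error in the flood quantile.
      fit = sm.GLS(y, X, sigma=sigma).fit()
      print(fit.params.round(3))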

  13. Techniques for the Analysis of Human Movement.

    ERIC Educational Resources Information Center

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photographic analysis of movement, with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  14. Sensitivity analysis of hybrid thermoelastic techniques

    Treesearch

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  16. Quantifying sources of black carbon in western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    DOE PAGES

    Zhang, Rudong; Wang, Hailong; Hegg, D. A.; ...

    2015-11-18

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source–receptor relationships for atmospheric BC and its deposition to snow over western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over northwestern USA and western Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based positive matrix factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. Furthermore, while CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.
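
    The PMF step can be approximated, for illustration, with an unweighted non-negative matrix factorization (scikit-learn); real PMF additionally weights each observation by its measurement uncertainty, and the data matrix here is synthetic:

      import numpy as np
      from sklearn.decomposition import NMF

      # Synthetic snow-impurity chemistry matrix: samples x species, non-negative.
      rng = np.random.default_rng(4)
      X = np.abs(rng.normal(1.0, 0.3, size=(60, 12)))

      nmf = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
      contributions = nmf.fit_transform(X)   # sample-by-factor contributions
      profiles = nmf.components_             # factor-by-species source profiles
      print(contributions.shape, profiles.shape)   # (60, 3) (3, 12)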

  17. Microextraction sample preparation techniques in biomedical analysis.

    PubMed

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in biological matrices at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the analysis, simplicity of the determination, and the possibility of directly combining these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis.

  18. Aerosol particle analysis by Raman scattering technique

    SciTech Connect

    Fung, K.H.; Tang, I.N.

    1992-10-01

    Laser Raman spectroscopy is a very versatile tool for chemical characterization of micron-sized particles. Such particles are abundant in nature, and in numerous energy-related processes. In order to elucidate the formation mechanisms and understand the subsequent chemical transformation under a variety of reaction conditions, it is imperative to develop analytical measurement techniques for in situ monitoring of these suspended particles. In this report, we outline our recent work on spontaneous Raman, resonance Raman and non-linear Raman scattering as a novel technique for chemical analysis of aerosol particles as well as supersaturated solution droplets.

  19. Wavelet-based technique for target segmentation

    NASA Astrophysics Data System (ADS)

    Sadjadi, Firooz A.

    1995-07-01

    Segmentation of targets embedded in clutter obtained by IR imaging sensors is one of the challenging problems in automatic target recognition (ATR). In this paper a new texture-based segmentation technique is presented that uses the statistics of 2D wavelet decomposition components of the local sections of the image. A measure of statistical similarity is then used to segment the image and separate the target from the background. This technique has been applied to a set of real sequential IR imagery and has been shown to produce a high degree of segmentation accuracy across varying ranges.
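
    A minimal version of the patch-wise wavelet statistics described above (PyWavelets); the patch size, wavelet and energy features are illustrative assumptions, and the image is synthetic:

      import numpy as np
      import pywt

      def wavelet_texture_features(image, patch=16, wavelet="db2"):
          """Per-patch energies of the 2D wavelet detail subbands,
          a simple texture signature for target/background separation."""
          h, w = image.shape
          feats = np.zeros((h // patch, w // patch, 3))
          for i in range(h // patch):
              for j in range(w // patch):
                  block = image[i*patch:(i+1)*patch, j*patch:(j+1)*patch]
                  _, details = pywt.dwt2(block, wavelet)
                  feats[i, j] = [np.mean(d**2) for d in details]
          return feats

      rng = np.random.default_rng(5)
      img = rng.normal(size=(128, 128))
      img[32:96, 32:96] += rng.normal(0, 3, size=(64, 64))   # "target" texture
      print(wavelet_texture_features(img).shape)             # (8, 8, 3)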

  20. Dissociation techniques in mass spectrometry-based proteomics.

    PubMed

    Jones, Andrew W; Cooper, Helen J

    2011-09-07

    The field of proteomics, the large-scale analysis of proteins, has undergone a huge expansion over the past decade. Mass spectrometry-based proteomics relies on the dissociation of peptide and/or protein ions to provide information on primary sequence and sites of post-translational modifications. Fragmentation techniques include collision-induced dissociation, electron capture dissociation and electron transfer dissociation. Here, we describe each of these techniques and their use in proteomics. The principles, advantages, limitations, and applications are discussed.

  1. Chromatographic Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  2. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  3. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
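
    One of the frequency-domain analyses listed above, power spectral density via Welch's method, takes a few lines with SciPy; the accelerometer record here is synthetic:

      import numpy as np
      from scipy import signal

      fs = 500.0                                   # sample rate (Hz)
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(6)
      accel = 1e-3 * np.sin(2 * np.pi * 100 * t) + 1e-4 * rng.normal(size=t.size)

      f, psd = signal.welch(accel, fs=fs, nperseg=4096)   # power spectral density
      rms = np.sqrt(np.mean(accel**2))                    # interval RMS acceleration
      print(f"peak at {f[np.argmax(psd)]:.1f} Hz, overall RMS = {rms:.2e}")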

  4. Analysis techniques for residual acceleration data

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.

    1990-01-01

    Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by the use of a transformation matrix. Application of such analysis techniques to residual acceleration data provides more information than a time history alone and increases the effectiveness of post-flight analysis of low-gravity experiments.
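
    Transforming acceleration data into a different coordinate frame, as described above, is a single matrix product; the rotation below is an arbitrary example:

      import numpy as np

      def rotation_z(theta):
          """Rotation matrix for a frame rotated by theta about the z axis."""
          c, s = np.cos(theta), np.sin(theta)
          return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

      rng = np.random.default_rng(7)
      accel_sensor = rng.normal(0, 1e-4, size=(1000, 3))   # synthetic samples (n x 3)

      R = rotation_z(np.radians(30.0))
      accel_experiment = accel_sensor @ R.T                # re-expressed time series
      print(accel_experiment.shape)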

  5. Multiview video codec based on KTA techniques

    NASA Astrophysics Data System (ADS)

    Seo, Jungdong; Kim, Donghyun; Ryu, Seungchul; Sohn, Kwanghoon

    2011-03-01

    Multi-view video coding (MVC) is a video coding standard developed by MPEG and VCEG for multi-view video. It showed an average PSNR gain of 1.5 dB compared with view-independent coding by H.264/AVC. However, because the resolutions of multi-view video are getting higher for a more realistic 3D effect, a high performance video codec is needed. MVC adopted the hierarchical B-picture structure and inter-view prediction as core techniques. The hierarchical B-picture structure removes temporal redundancy, and the inter-view prediction reduces inter-view redundancy by compensated prediction from the reconstructed neighboring views. Nevertheless, MVC has an inherent limitation in coding efficiency, because it is based on H.264/AVC. To overcome this limit, an enhanced video codec for multi-view video based on the Key Technology Area (KTA) is proposed. KTA is a high-efficiency video codec developed by the Video Coding Experts Group (VCEG) to push coding efficiency beyond H.264/AVC. The KTA software showed better coding gain than H.264/AVC by using additional coding techniques. These techniques and the inter-view prediction are implemented in the proposed codec, which showed high coding gain compared with the view-independent coding result by KTA. The results show that the inter-view prediction can achieve higher efficiency in a multi-view video codec based on a high performance video codec such as HEVC.

  6. Multi-Variable Analysis and Design Techniques.

    DTIC Science & Technology

    1981-09-01

    by A.G.J. MacFarlane 2 MULTIVARIABLE DESIGN TECHNIQUES BASED ON SINGULAR VALUE GENERALIZATIONS OF CLASSICAL CONTROL by J.C. Doyle 3 LIMITATIONS ON...prototypes to complex mathematical representations. All of these assemblages of information or information generators can loosely be termed "models"...nonlinearities (e.g., control saturation), neglect of high-frequency dynamics. These approximations are well understood and in general their impact

  7. Respiratory monitoring system based on the nasal pressure technique for the analysis of sleep breathing disorders: Reduction of static and dynamic errors, and comparisons with thermistors and pneumotachographs

    NASA Astrophysics Data System (ADS)

    Alves de Mesquita, Jayme; Lopes de Melo, Pedro

    2004-03-01

    Thermally sensitive devices (thermistors) have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices are not able to provide a good characterization of fast events, like hypopneas. The nasal pressure recording technique (NPR) has recently been suggested as a way to quantify airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) to describe the development of a flexible NPR device and (2) to evaluate the performance of this device compared to pneumotachographs (PNTs) and thermistors. After the design details are described, the system static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p<0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations, and the results showed that the developed device's response was as good as that of the PNT and around 100 times faster (τ = 5.3 ms) than thermistors (τ = 512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples, and confirmed the good features achieved in engineering tests. These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to a more accurate implementation of medical research projects and to improve the

  8. Forensic Analysis using Geological and Geochemical Techniques

    NASA Astrophysics Data System (ADS)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late 1800s, the last decade has seen a marked increase in geoscientists initiating concept studies using the latest analytical techniques, including studying natural abundance isotope variations, micro analysis with laser ablation ICPMS, and geochemical mapping. Most of the concept studies have shown good potential, but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the EGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project and other projects and literature to illustrate the important issues at hand.

  9. A Comparative Analysis of Biomarker Selection Techniques

    PubMed Central

    Dessì, Nicoletta

    2013-01-01

    Feature selection has become the essential step in biomarker discovery from high-dimensional genomics data. It is recognized that different feature selection techniques may result in different set of biomarkers, that is, different groups of genes highly correlated to a given pathological condition, but few direct comparisons exist which quantify these differences in a systematic way. In this paper, we propose a general methodology for comparing the outcomes of different selection techniques in the context of biomarker discovery. The comparison is carried out along two dimensions: (i) measuring the similarity/dissimilarity of selected gene sets; (ii) evaluating the implications of these differences in terms of both predictive performance and stability of selected gene sets. As a case study, we considered three benchmarks deriving from DNA microarray experiments and conducted a comparative analysis among eight selection methods, representatives of different classes of feature selection techniques. Our results show that the proposed approach can provide useful insight about the pattern of agreement of biomarker discovery techniques. PMID:24324960
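
    The first dimension of the comparison above, similarity of selected gene sets, can be measured with a set-overlap index such as Jaccard; the gene indices here are invented:

      # Hypothetical outputs of two feature-selection methods.
      genes_a = {12, 45, 101, 230, 377, 512}
      genes_b = {45, 101, 230, 399, 512, 640}

      def jaccard(a, b):
          """|intersection| / |union| of two selected gene sets."""
          return len(a & b) / len(a | b)

      print(f"Jaccard similarity = {jaccard(genes_a, genes_b):.2f}")   # 0.50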

  10. Electrical Load Profile Analysis Using Clustering Techniques

    NASA Astrophysics Data System (ADS)

    Damayanti, R.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.

    2017-03-01

    Data mining is one of the data processing techniques used to extract information from stored data. Electricity load consumption is recorded every day by the electrical company, usually at intervals of 15 or 30 minutes. This paper uses clustering, one of the data mining techniques, to analyse the electrical load profiles during 2014. Three clustering methods were compared, namely K-Means (KM), Fuzzy C-Means (FCM), and K-Harmonic Means (KHM). The results show that KHM is the most appropriate method to classify the electrical load profiles. The optimum number of clusters is determined using the Davies-Bouldin index. By grouping the load profiles, analysis of demand variation and estimation of energy losses for groups of profiles with similar patterns can be carried out. From the groups of load profiles, the cluster load factor and a range of cluster loss factors can be obtained, which help to estimate energy losses without performing load flow studies.
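
    A sketch of load-profile clustering with the Davies-Bouldin index used to choose the cluster count; scikit-learn's K-Means stands in for the paper's KHM, and the profiles are synthetic:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import davies_bouldin_score

      # Synthetic daily load profiles: 365 days x 48 half-hourly readings.
      rng = np.random.default_rng(8)
      base = np.linspace(0, 2 * np.pi, 48)
      profiles = np.vstack([np.sin(base) + rng.normal(0, 0.1, (180, 48)),
                            np.cos(base) + rng.normal(0, 0.1, (185, 48))])

      for k in range(2, 6):      # lower Davies-Bouldin index = better clustering
          labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(profiles)
          print(k, round(davies_bouldin_score(profiles, labels), 3))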

  11. UPLC: a preeminent technique in pharmaceutical analysis.

    PubMed

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

    Pharmaceutical companies today are driven to create novel and more efficient tools to discover, develop, deliver and monitor drugs. In this context, the development of rapid chromatographic methods is crucial for analytical laboratories. In the past decade, substantial technological advances have been made in enhancing particle chemistry performance, improving detector design, and optimizing the system, data processors and various controls of chromatographic techniques. Blended together, these advances resulted in the outstanding performance of ultra-high performance liquid chromatography (UPLC), which builds on the principles of HPLC. UPLC shows a dramatic enhancement in the speed, resolution and sensitivity of analysis by using particles smaller than 2 μm and operating the system at higher pressure, which allows the mobile phase to run at greater linear velocities than in HPLC. This technique is considered a new focal point in the field of liquid chromatography. This review focuses on the basic principle and instrumentation of UPLC and its advantages over HPLC; furthermore, the article emphasizes various pharmaceutical applications of this technique.

  12. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  13. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (pUC19 plasmid from E. coli). DN...

  15. Application of spectral analysis techniques to the intercomparison of aerosol data - Part 4: Combined maximum covariance analysis to bridge the gap between multi-sensor satellite retrievals and ground-based measurements

    NASA Astrophysics Data System (ADS)

    Li, J.; Carlson, B. E.; Lacis, A. A.

    2014-04-01

    The development of remote sensing techniques has greatly advanced our knowledge of atmospheric aerosols. Various satellite sensors and the associated retrieval algorithms all add to the information of global aerosol variability, while well-designed surface networks provide time series of highly accurate measurements at specific locations. In studying the variability of aerosol properties, aerosol climate effects, and constraining aerosol fields in climate models, it is essential to make the best use of all of the available information. In the previous three parts of this series, we demonstrated the usefulness of several spectral decomposition techniques in the analysis and comparison of temporal and spatial variability of aerosol optical depth using satellite and ground-based measurements. Specifically, Principal Component Analysis (PCA) successfully captures and isolates seasonal and interannual variability from different aerosol source regions, Maximum Covariance Analysis (MCA) provides a means to verify the variability in one satellite dataset against Aerosol Robotic Network (AERONET) data, and Combined Principal Component Analysis (CPCA) realized parallel comparison among multi-satellite, multi-sensor datasets. As the final part of the study, this paper introduces a novel technique that integrates both multi-sensor datasets and ground observations, and thus effectively bridges the gap between these two types of measurements. The Combined Maximum Covariance Analysis (CMCA) decomposes the cross covariance matrix between the combined multi-sensor satellite data field and AERONET station data. We show that this new method not only confirms the seasonal and interannual variability of aerosol optical depth, aerosol source regions and events represented by different satellite datasets, but also identifies the strengths and weaknesses of each dataset in capturing the variability associated with sources, events or aerosol types. Furthermore, by examining the spread of
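
    The core of an MCA-style decomposition is a singular value decomposition of the cross-covariance matrix between two anomaly fields; the dimensions and data below are synthetic stand-ins for the satellite field and the station records:

      import numpy as np

      rng = np.random.default_rng(9)
      n_t = 120                                   # months
      sat = rng.normal(size=(n_t, 400))           # satellite AOD, time x grid cell
      aeronet = rng.normal(size=(n_t, 30))        # AERONET AOD, time x station
      sat -= sat.mean(axis=0)                     # remove means -> anomalies
      aeronet -= aeronet.mean(axis=0)

      C = sat.T @ aeronet / (n_t - 1)             # cross-covariance matrix
      U, s, Vt = np.linalg.svd(C, full_matrices=False)
      expansion_sat = sat @ U[:, 0]               # mode-1 expansion coefficients
      expansion_aer = aeronet @ Vt[0]
      print(f"mode 1: {s[0]**2 / np.sum(s**2):.1%} of squared covariance")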

  16. Analysis automation with paving: A new quadrilateral meshing technique

    SciTech Connect

    Blacker, T.D. ); Stephenson, M.B.; Canann, S. )

    1990-01-01

    This paper describes the impact of paving, a new automatic mesh generation algorithm, on the analysis portion of the design process. Paving generates an all-quadrilateral mesh in arbitrary 2D geometries. The paving technique significantly impacts the analysis process by drastically reducing the time and expertise requirements of traditional mesh generation. Paving produces a high quality mesh based on geometric boundary definitions and user specified element sizing constraints. In this paper we describe the paving algorithm, discuss varying aspects of the impact of the technique on design automation, and elaborate on current research into 3D all-hexahedral mesh generation. 11 refs., 10 figs.

  17. Technique analysis in elite athletes using principal component analysis.

    PubMed

    Gløersen, Øyvind; Myklebust, Håvard; Hallén, Jostein; Federolf, Peter

    2017-03-13

    The aim of this study was to advance current movement analysis methodology to enable a technique analysis in sports facilitating (1) concurrent comparison of the techniques between several athletes; (2) identification of potentially beneficial technique modifications and (3) a visual representation of the findings for feedback to the athletes. Six elite cross-country skiers, three world cup winners and three national elite, roller ski skated using the V2 technique on a treadmill while their movement patterns were recorded using 41 reflective markers. A principal component analysis performed on the marker positions resulted in multi-segmental "principal" movement components (PMs). A novel normalisation facilitated comparability of the PMs between athletes. Additionally, centre of mass (COM) trajectories were modelled. We found correlations between the athletes' performance levels (judged from race points) and specific features in the PMs and in the COM trajectories. Plausible links between COM trajectories and PMs were observed, suggesting that better performing skiers exhibited a different, possibly more efficient use of their body mass for propulsion. The analysis presented in the current study revealed specific technique features that appeared to relate to the skiers' performance levels. How changing these features would affect an individual athlete's technique was visualised with animated stick figures.
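
    A skeletal version of the principal-movement extraction: PCA on mean-centred marker coordinates, with synthetic postures standing in for the 41-marker recordings (the paper's between-athlete normalisation is omitted):

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(10)
      n_frames, n_markers = 2000, 41
      postures = rng.normal(size=(n_frames, n_markers * 3))   # x, y, z per marker

      # Each principal component is a multi-segmental "principal movement" (PM);
      # its score time series describes how that movement evolves over the cycle.
      pca = PCA(n_components=5).fit(postures - postures.mean(axis=0))
      pm_scores = pca.transform(postures)
      print(np.round(pca.explained_variance_ratio_, 3))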

  18. Advanced automated char image analysis techniques

    SciTech Connect

    Tao Wu; Edward Lester; Michael Cloke

    2006-05-15

    Char morphology is an important characteristic when attempting to understand coal behavior and coal burnout. In this study, an augmented algorithm has been proposed to identify char types using image analysis. On the basis of a series of image processing steps, a char image is singled out from the whole image, which then allows the important major features of the char particle to be measured, including size, porosity, and wall thickness. The techniques for automated char image analysis have been tested against char images taken from the ICCP Char Atlas as well as actual char particles derived from pyrolyzed char samples. Thirty different chars were prepared in a drop tube furnace operating at 1300°C, 1% oxygen, and 100 ms from 15 different world coals sieved into two size fractions (53-75 and 106-125 μm). The results from this automated technique are comparable with those from manual analysis, and the additional detail from the automated system has potential use in applications such as combustion modeling systems. Obtaining highly detailed char information with automated methods has traditionally been hampered by the difficulty of automatic recognition of individual char particles. 20 refs., 10 figs., 3 tabs.

  19. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
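
    A single-pass order-statistic noise power estimate of the kind discussed above: for exponentially distributed spectral powers, the sample median equals the noise power times ln 2, so one division recovers the estimate; the spectrum is simulated:

      import numpy as np

      rng = np.random.default_rng(11)
      spectrum = rng.exponential(1.0, 2**20)        # noise-only power bins
      spectrum[rng.integers(0, spectrum.size, 50)] += 100.0   # embedded signals

      # The median is insensitive to the few strong signal bins, giving a
      # constant-false-alarm-rate noise power estimate in a single pass.
      noise_power = np.median(spectrum) / np.log(2.0)
      print(f"estimated noise power = {noise_power:.3f} (true value 1.0)")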

  20. COMBINING A NEW 3-D SEISMIC S-WAVE PROPAGATION ANALYSIS FOR REMOTE FRACTURE DETECTION WITH A ROBUST SUBSURFACE MICROFRACTURE-BASED VERIFICATION TECHNIQUE

    SciTech Connect

    Bob Hardage; M.M. Backus; M.V. DeAngelo; R.J. Graebner; S.E. Laubach; Paul Murray

    2004-02-01

    Fractures within the producing reservoirs at McElroy Field could not be studied with the industry-provided 3C3D seismic data used as a cost-sharing contribution in this study. The signal-to-noise character of the converted-SV data across the targeted reservoirs in these contributed data was not adequate for interpreting azimuth-dependent data effects. After illustrating the low signal quality of the converted-SV data at McElroy Field, the seismic portion of this report abandons the McElroy study site and defers to 3C3D seismic data acquired across a different fractured carbonate reservoir system to illustrate how 3C3D seismic data can provide useful information about fracture systems. Using these latter data, we illustrate how fast-S and slow-S data effects can be analyzed in the prestack domain to recognize fracture azimuth, and then demonstrate how fast-S and slow-S data volumes can be analyzed in the poststack domain to estimate fracture intensity. In the geologic portion of the report, we analyze published regional stress data near McElroy Field and numerous formation micro-imager (FMI) logs acquired across McElroy to develop possible fracture models for the McElroy system. Regional stress data imply a fracture orientation different from the orientations observed in most of the FMI logs. This report culminates Phase 2 of the study, "Combining a New 3-D Seismic S-Wave Propagation Analysis for Remote Fracture Detection with a Robust Subsurface Microfracture-Based Verification Technique". Phase 3 will not be initiated because wells were to be drilled in Phase 3 of the project to verify the validity of fracture-orientation maps and fracture-intensity maps produced in Phase 2. Such maps cannot be made across McElroy Field because of the limitations of the available 3C3D seismic data at the depth level of the reservoir target.

  1. Latent practice profiles of substance abuse treatment counselors: do evidence-based techniques displace traditional techniques?

    PubMed

    Smith, Brenda D; Liu, Junqing

    2014-04-01

    As more substance abuse treatment counselors begin to use evidence-based treatment techniques, questions arise regarding the continued use of traditional techniques. This study aims to (1) assess whether there are meaningful practice profiles among practitioners reflecting distinct combinations of cognitive-behavioral and traditional treatment techniques; and (2) if so, identify practitioner characteristics associated with the distinct practice profiles. Survey data from 278 frontline counselors working in community substance abuse treatment organizations were used to conduct latent profile analysis. The emergent practice profiles illustrate that practitioners vary most in the use of traditional techniques. Multinomial regression models suggest that practitioners with less experience, more education, and less traditional beliefs about treatment and substance abuse are least likely to mix traditional techniques with cognitive-behavioral techniques. Findings add to the understanding of how evidence-based practices are implemented in routine settings and have implications for training and support of substance abuse treatment counselors. Copyright © 2014 Elsevier Inc. All rights reserved.
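
    Latent profile analysis is often approximated with a Gaussian mixture model; the sketch below compares profile counts by BIC on synthetic counselor responses (six hypothetical technique-use scores):

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(12)
      X = np.vstack([rng.normal([4, 4, 4, 1, 1, 1], 0.5, (140, 6)),   # CBT-leaning
                     rng.normal([4, 4, 4, 4, 4, 4], 0.5, (138, 6))])  # mixed use

      for k in range(1, 5):      # lower BIC = preferred number of profiles
          gm = GaussianMixture(n_components=k, random_state=0).fit(X)
          print(k, round(gm.bic(X), 1))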

  2. A comparison of geostatistically based inverse techniques for use in performance assessment analysis at the Waste Isolation Pilot Plant Site: Results from Test Case No. 1

    SciTech Connect

    Zimmerman, D.A.; Gallegos, D.P.

    1993-10-01

    The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A "Geostatistics Test Problem" is being developed to evaluate a number of inverse techniques that may be used for flow calculations in the WIPP performance assessment (PA). The Test Problem is actually a series of test cases, each being developed as a highly complex synthetic data set; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. The Test Problem analysis approach is to use a comparison of the probabilistic groundwater travel time (GWTT) estimates produced by each technique as the basis for the evaluation. Participants are given observations of head and transmissivity (possibly including measurement error) or other information such as drawdowns from pumping wells, and are asked to develop stochastic models of groundwater flow for the synthetic system. Cumulative distribution functions (CDFs) of groundwater flow (computed via particle tracking) are constructed using the head and transmissivity data generated through the application of each technique; one semi-analytical method generates the CDFs of groundwater flow directly. This paper describes the results from Test Case No. 1.

  3. Visual exploratory analysis of integrated chromosome 19 proteomic data derived from glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Meyer-Bäse, Uwe; Meyer-Bäse, Anke

    2015-05-01

    Chromosome 19 is known to be linked to neurodegeneration and many cancers. Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the presentation of novel approaches for the visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the expression patterns for chromosome 19 proteins.
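
    As a generic stand-in for the paper's nonlinear projection methods (which are not spelled out in this record), the sketch below embeds a high-dimensional matrix into two dimensions with t-SNE; the data are random with a planted two-group structure.

      # Generic nonlinear dimensionality reduction for visual exploration
      # of a proteins-by-features matrix; t-SNE is used purely as an
      # example projection, not necessarily the authors' method.
      import numpy as np
      from sklearn.manifold import TSNE

      rng = np.random.default_rng(2)
      X = rng.normal(size=(120, 40))    # 120 proteins x 40 features (toy)
      X[:60] += 2.0                     # plant a hidden two-group structure

      emb = TSNE(n_components=2, perplexity=20,
                 random_state=0).fit_transform(X)
      print(emb.shape)                  # (120, 2) coordinates for plotting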

  4. Further development of ultrasonic techniques for non-destructive evaluation based on Fourier analysis of signals from irregular and inhomogeneous structures

    NASA Technical Reports Server (NTRS)

    Miller, J. G.

    1979-01-01

    To investigate the use of Fourier analysis techniques, model systems were designed to test some of the general properties of the interaction of sound with an inhomogeneity. The first models investigated were suspensions of solid spheres in water. These systems allowed comparison between theoretical computation of the frequency dependence of the attenuation coefficient and measurement of the attenuation coefficient over a range of frequencies. Ultrasonic scattering processes in both suspensions of hard spheres in water and suspensions of hard spheres in polyester resin were investigated. The second model system was constructed to test the applicability of partial wave analysis to the description of an inhomogeneity in a solid, and to test the range of material properties over which the measurement systems were valid.

  5. Information fusion based techniques for HEVC

    NASA Astrophysics Data System (ADS)

    Fernández, D. G.; Del Barrio, A. A.; Botella, Guillermo; Meyer-Baese, Uwe; Meyer-Baese, Anke; Grecos, Christos

    2017-05-01

    To address the conflicting objectives in a multi-parameter H.265/HEVC encoder system, this paper analyzes a set of optimizations intended to improve the trade-off between quality, performance, and power consumption for applications with differing reliability and accuracy requirements. The method is based on Pareto optimization and has been tested at different resolutions on real-time encoders.
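
    Pareto optimization over encoder settings can be sketched in a few lines; the three objectives below (quality, encode time, power) and the random configurations are purely illustrative.

      # Minimal Pareto-front filter over encoder configurations, in the
      # spirit of the abstract's quality/performance/power trade-off.
      import numpy as np

      def pareto_front(points):
          """points[i] = (quality, time, power); quality is maximized,
          the other two are minimized."""
          keep = []
          for i, p in enumerate(points):
              dominated = any(
                  q[0] >= p[0] and q[1] <= p[1] and q[2] <= p[2] and
                  (q[0] > p[0] or q[1] < p[1] or q[2] < p[2])
                  for j, q in enumerate(points) if j != i)
              if not dominated:
                  keep.append(i)
          return keep

      rng = np.random.default_rng(3)
      configs = rng.random((50, 3))     # 50 hypothetical encoder settings
      print("non-dominated configs:", pareto_front(configs))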

  6. Flood alert system based on Bayesian techniques

    NASA Astrophysics Data System (ADS)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium-sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring, driven by cold-front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash-flood events with catastrophic effects on the city of Malaga (600,000 inhabitants). From this fact arises the need for specific alert tools that can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision-making process. The joint use of Artificial Neural Networks (ANN) and BN has allowed us to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. A recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit and additionally attach uncertainty estimates to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also analyzed.
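
    A minimal sketch of the lagged-input layout described above (daily precipitation plus the two previous hourly values of reservoir discharge and water level); the series are synthetic and the network size is an assumption, not the study's configuration.

      # Lagged-input model for hourly water level: predictors are daily
      # precipitation and the two previous hourly values of reservoir
      # discharge and water level.  Data are synthetic; only the feature
      # layout follows the abstract.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(4)
      n = 2000
      precip = rng.gamma(0.3, 2.0, n)                  # daily precip proxy
      discharge = np.convolve(precip, np.ones(5) / 5, "same")
      level = 0.6 * discharge + rng.normal(0, 0.1, n)  # synthetic level

      # Build (precip_t, Q_{t-1}, Q_{t-2}, h_{t-1}, h_{t-2}) -> h_t
      X = np.column_stack([precip[2:], discharge[1:-1], discharge[:-2],
                           level[1:-1], level[:-2]])
      y = level[2:]

      model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                           random_state=0).fit(X[:1500], y[:1500])
      print("R^2 on held-out hours: %.2f" % model.score(X[1500:], y[1500:]))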

  7. Stratifying land use/land cover for spatial analysis of disease ecology and risk: an example using object-based classification techniques.

    PubMed

    Koch, David E; Mohler, Rhett L; Goodin, Douglas G

    2007-11-01

    Landscape epidemiology has made significant strides recently, driven in part by increasing availability of land cover data derived from remotely-sensed imagery. Using an example from a study of land cover effects on hantavirus dynamics at an Atlantic Forest site in eastern Paraguay, we demonstrate how automated classification methods can be used to stratify remotely-sensed land cover for studies of infectious disease dynamics. For this application, it was necessary to develop a scheme that could yield both land cover and land use data from the same classification. Hypothesizing that automated discrimination between classes would be more accurate using an object-based method compared to a per-pixel method, we used a single Landsat Enhanced Thematic Mapper+ (ETM+) image to classify land cover into eight classes using both per-pixel and object-based classification algorithms. Our results show that the object-based method achieves 84% overall accuracy, compared to only 43% using the per-pixel method. Producer's and user's accuracies for the object-based map were higher for every class compared to the per-pixel classification. The Kappa statistic was also significantly higher for the object-based classification. These results show the importance of using image information from domains beyond the spectral domain, and also illustrate the importance of object-based techniques for remote sensing applications in epidemiological studies.
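
    A per-pixel baseline of the kind compared above can be reproduced in a few lines, reporting the same figures of merit (overall accuracy and the Kappa statistic); the bands and labels below are simulated stand-ins, not ETM+ data.

      # Per-pixel land-cover classification baseline with overall
      # accuracy and Cohen's kappa, the two metrics quoted above.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import accuracy_score, cohen_kappa_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(5)
      n_pixels, n_bands, n_classes = 5000, 6, 8
      X = rng.random((n_pixels, n_bands))              # surrogate bands
      y = (X[:, :3].sum(axis=1) * n_classes / 3).astype(int)
      y = y.clip(0, n_classes - 1)                     # 8 cover classes

      Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                            random_state=0)
      pred = RandomForestClassifier(random_state=0).fit(Xtr, ytr).predict(Xte)
      print("overall accuracy:", accuracy_score(yte, pred))
      print("kappa:", cohen_kappa_score(yte, pred))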

  8. Artificial Intelligence based technique for BTS placement

    NASA Astrophysics Data System (ADS)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase of base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof with respect to regulatory requirements. In this paper, an artificial-intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique objectively takes neighbourhood and regulatory constraints into consideration while determining cell sites, leading to a quantitatively unbiased decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm; results show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a genetic algorithm (GA) with the neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.

  9. The German Passive: Analysis and Teaching Technique.

    ERIC Educational Resources Information Center

    Griffen, T. D.

    1981-01-01

    Proposes an analysis of German passive based upon internal structure rather than translation conventions from Latin and Greek. Claims that this approach leads to a description of the perfect participle as an adjectival complement, which eliminates the classification of a passive voice for German and simplifies the learning task. (MES)

  10. Multiclass pesticide analysis in fruit-based baby food: A comparative study of sample preparation techniques previous to gas chromatography-mass spectrometry.

    PubMed

    Petrarca, Mateus H; Fernandes, José O; Godoy, Helena T; Cunha, Sara C

    2016-12-01

    With the aim of developing a new gas chromatography-mass spectrometry method to analyze 24 pesticide residues in baby foods at the levels imposed by established regulation, two simple, rapid, and environmentally friendly sample preparation techniques based on QuEChERS (quick, easy, cheap, effective, robust and safe) were compared: QuEChERS with dispersive liquid-liquid microextraction (DLLME) and QuEChERS with dispersive solid-phase extraction (d-SPE). Both sample preparation techniques achieved suitable performance criteria, including selectivity, linearity, acceptable recovery (70-120%) and precision (⩽20%). A higher enrichment factor was observed for DLLME, and consequently better limits of detection and quantification were obtained. Nevertheless, d-SPE provided a more effective removal of matrix co-extractives from extracts than DLLME, which contributed to lower matrix effects. Twenty-two commercial fruit-based baby food samples were analyzed by the developed method, with procymidone detected in one sample at a level above the legal limit established by the EU.

  11. High-accuracy and long-range Brillouin optical time-domain analysis sensor based on the combination of pulse prepump technique and complementary coding

    NASA Astrophysics Data System (ADS)

    Sun, Qiao; Tu, Xiaobo; Lu, Yang; Sun, Shilin; Meng, Zhou

    2016-06-01

    A Brillouin optical time-domain analysis (BOTDA) sensor that combines the conventional complementary coding with the pulse prepump technique for high-accuracy and long-range distributed sensing is implemented and analyzed. The employment of the complementary coding provides an enhanced signal-to-noise ratio (SNR) of the sensing system and an extended sensing distance, and the measurement time is also reduced compared with a BOTDA sensor using linear coding. The combination of pulse prepump technique enables the establishment of a preactivated acoustic field in each pump pulse of the complementary codeword, which ensures measurements of high spatial resolution and high frequency accuracy. The feasibility of the prepumped complementary coding is analyzed theoretically and experimentally. The experiments are carried out beyond 50-km single-mode fiber, and experimental results show the capabilities of the proposed scheme to achieve 1-m spatial resolution with temperature and strain resolutions equal to ˜1.6°C and ˜32 μɛ, and 2-m spatial resolution with temperature and strain resolutions equal to ˜0.3°C and ˜6 μɛ, respectively. A longer sensing distance with the same spatial resolution and measurement accuracy can be achieved through increasing the code length of the prepumped complementary code.
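
    The complementary-coding idea rests on a classical property: the autocorrelations of a Golay pair sum to a delta function, so coded pulse trains can be decoded without range sidelobes. The sketch below verifies this numerically and is independent of the BOTDA hardware details.

      # Complementary (Golay) code pair: the sum of the two
      # autocorrelations is a delta function, which is what lets coded
      # pulses be decoded without range sidelobes.
      import numpy as np

      def golay_pair(n_bits):
          a, b = np.array([1]), np.array([1])
          for _ in range(n_bits):            # standard recursive construction
              a, b = np.concatenate([a, b]), np.concatenate([a, -b])
          return a, b

      a, b = golay_pair(6)                   # 64-bit complementary pair
      acorr = np.correlate(a, a, "full") + np.correlate(b, b, "full")
      # All sidelobes cancel; the peak equals 2 * len(a)
      assert abs(acorr[len(a) - 1] - 2 * len(a)) < 1e-9
      assert np.allclose(np.delete(acorr, len(a) - 1), 0)
      print("sidelobe-free correlation peak:", acorr[len(a) - 1])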

  12. Oil species identification technique developed by Gabor wavelet analysis and support vector machine based on concentration-synchronous-matrix-fluorescence spectroscopy.

    PubMed

    Wang, Chunyan; Shi, Xiaofeng; Li, Wendong; Wang, Lin; Zhang, Jinliang; Yang, Chun; Wang, Zhendi

    2016-03-15

    Concentration-synchronous-matrix-fluorescence (CSMF) spectroscopy was applied to discriminate oil species by characterizing the concentration-dependent fluorescence properties of petroleum-related samples. A seven-day weathering experiment on three crude oil samples from the Bohai Sea platforms of China was carried out under controlled laboratory conditions and showed that weathering had no significant effect on the CSMF spectra. Different feature extraction methods, such as PCA, PLS, and Gabor wavelet analysis, were applied to extract discriminative patterns from CSMF spectra, and classifications were made via SVM to compare their respective performance in oil species recognition. Ideal correct recognition rates of 100% for the different types of oil spill samples and 92% for the closely related source oil samples were achieved by combining Gabor wavelets with SVM, which indicates the method's potential to be developed into a rapid, cost-effective, and accurate forensic oil spill identification technique.
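
    A pipeline of the same shape (Gabor-filter features fed to an SVM) can be sketched as follows; the 2-D spectra are random surrogates, and the filter frequency and orientations are assumptions rather than the paper's settings.

      # Gabor features from a 2-D CSMF-style matrix, classified with an
      # SVM.  The spectra are random surrogates; only the pipeline
      # mirrors the abstract.
      import numpy as np
      from skimage.filters import gabor
      from sklearn.svm import SVC

      rng = np.random.default_rng(6)

      def gabor_features(img):
          feats = []
          for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
              real, imag = gabor(img, frequency=0.2, theta=theta)
              feats += [real.mean(), real.var(), imag.var()]
          return feats

      # 40 surrogate CSMF matrices from two "oil species"
      X = [gabor_features(rng.random((32, 32)) + k * 0.3)
           for k in (0, 1) for _ in range(20)]
      y = [k for k in (0, 1) for _ in range(20)]

      clf = SVC(kernel="rbf").fit(X[::2], y[::2])   # train on half
      print("held-out accuracy:", clf.score(X[1::2], y[1::2]))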

  13. Image Analysis Technique for Material Behavior Evaluation in Civil Structures.

    PubMed

    Speranzini, Emanuela; Marsili, Roberto; Moretti, Michele; Rossi, Gianluca

    2017-07-08

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures; these can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface of the structure must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the procedure by comparing the results with those derived from traditional measuring techniques.

  14. Alternative Analysis Techniques for Needs and Needs Documentation Techniques,

    DTIC Science & Technology

    1980-06-20

    Forecasting in the seventies, if it is to be effective, must include economic considerations in business planning. The strategic business planning and related perspective-tree approach addresses the identification of threats from either business or technical planning and supports research and development planning. (Cited: The Seventies: A Trend Analysis for Business Planning. New York: McGraw-Hill Book Company, 1970.)

  15. Laser Scanning–Based Tissue Autofluorescence/Fluorescence Imaging (LS-TAFI), a New Technique for Analysis of Microanatomy in Whole-Mount Tissues

    PubMed Central

    Mori, Hidetoshi; Borowsky, Alexander D.; Bhat, Ramray; Ghajar, Cyrus M.; Seiki, Motoharu; Bissell, Mina J.

    2012-01-01

    Intact organ structure is essential in maintaining tissue specificity and cellular differentiation. Small physiological or genetic variations lead to changes in microanatomy that, if persistent, could have functional consequences and may easily be masked by the heterogeneity of tissue anatomy. Current imaging techniques rely on histological sections, which require sample manipulation and are essentially two-dimensional. We have developed a method for three-dimensional imaging of whole-mount, unsectioned mammalian tissues to elucidate subtle and detailed micro- and macroanatomies in adult organs and embryos. We analyzed intact or dissected organ whole mounts with laser scanning–based tissue autofluorescence/fluorescence imaging (LS-TAFI). We obtained clear visualization of microstructures within murine mammary glands and mammary tumors and other organs without the use of immunostaining and without probes or fluorescent reporter genes. Combining autofluorescence with reflected light signals from chromophore-stained tissues allowed identification of individual cells within three-dimensional structures of whole-mounted organs. This technique could be useful for rapid diagnosis of human clinical samples and possibly for assessing the effects of subtle variations such as low-dose radiation. PMID:22542846

  16. Hybrid analytical technique for the nonlinear analysis of curved beams

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.

    1992-01-01

    The application of a two-step hybrid technique to the geometrically nonlinear analysis of curved beams is used to demonstrate the potential of hybrid analytical techniques in nonlinear structural mechanics. The hybrid technique is based on successive use of the perturbation method and a classical direct variational procedure. The functions associated with the various-order terms in the perturbation expansion of the fundamental unknowns, and their sensitivity derivatives with respect to material and geometric parameters of the beam, are first obtained by using the perturbation method. These functions are selected as coordinate functions (or modes) and the classical direct variational technique is then used to compute their amplitudes. The potential of the proposed hybrid technique for nonlinear analysis of structures is discussed. The effectiveness of the hybrid technique is demonstrated by means of numerical examples. The symbolic computation system Mathematica is used in the present study. The tasks performed on Mathematica include: (1) generation of algebraic expressions for the perturbation functions of the different response quantities and their sensitivity derivatives; and (2) determination of the radius of convergence of the perturbation series.
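
    The symbolic bookkeeping of the perturbation step, delegated to Mathematica in the record above, can be illustrated in sympy on a toy weakly nonlinear equation; the equation itself is an example, not the beam problem.

      # Perturbation expansion with a computer algebra system: solve
      # u = 1 + eps*u**2 for u(eps) as a series in the small parameter,
      # order by order.
      import sympy as sp

      eps = sp.symbols("eps", positive=True)
      u0, u1, u2 = sp.symbols("u0 u1 u2")

      u = u0 + u1 * eps + u2 * eps**2            # perturbation ansatz
      residual = sp.expand(u - 1 - eps * u**2)

      # Collect powers of eps and solve order by order
      eqs = [residual.coeff(eps, k) for k in range(3)]
      sol = sp.solve(eqs, [u0, u1, u2], dict=True)[0]
      print(sol)   # u0 = 1, u1 = 1, u2 = 2  (the regular root)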

  17. Monsoon Forecasting based on Imbalanced Classification Techniques

    NASA Astrophysics Data System (ADS)

    Ribera, Pedro; Troncoso, Alicia; Asencio-Cortes, Gualberto; Vega, Inmaculada; Gallego, David

    2017-04-01

    Monsoonal systems are quasiperiodic processes of the climatic system that control seasonal precipitation over different regions of the world. The Western North Pacific Summer Monsoon (WNPSM) is one of those monsoons, and it is known to have a great impact both on the global climate and on the total precipitation of very densely populated areas. The interannual variability of the WNPSM over the last 50-60 years has been related to different climatic indices such as El Niño, El Niño Modoki, the Indian Ocean Dipole, and the Pacific Decadal Oscillation. Recently, a new and longer series characterizing the monthly evolution of the WNPSM, the WNP Directional Index (WNPDI), has been developed, extending the record from about 50 years to more than 100 years (1900-2007). Imbalanced classification techniques have been applied to the WNPDI in order to check the capability of traditional climate indices to capture and forecast the evolution of the WNPSM. The forecasting problem has been transformed into a binary classification problem, in which the positive class represents the occurrence of an extreme monsoon event. Given that the number of extreme monsoons is much lower than the number of non-extreme monsoons, the resulting classification problem is highly imbalanced. The complete dataset is composed of 1296 instances, of which only 71 (5.47%) correspond to extreme monsoons. Twenty predictor variables based on the cited climatic indices have been proposed; tree-based models, black-box models such as neural networks, support vector machines, and nearest neighbors, and ensemble-based techniques such as random forests have been used to forecast the occurrence of extreme monsoons. It can be concluded that the methodology proposed here reports promising results according to the quality parameters evaluated and predicts extreme monsoons over a temporal horizon of a month with high accuracy. From a climatological point of view …
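
    The imbalanced-classification setup can be sketched as follows; the 1296-instance size and roughly 5% positive rate mirror the abstract, but the predictors and labels are synthetic, and class-weight rebalancing is just one of several possible remedies.

      # Imbalanced binary classification: only a few percent of months
      # are "extreme monsoon" events.  class_weight rebalancing is one
      # standard remedy; predictors stand in for the climate indices.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import classification_report
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(7)
      X = rng.normal(size=(1296, 20))          # 20 index-based predictors
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 1296) > 2.6).astype(int)

      Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)
      clf = RandomForestClassifier(class_weight="balanced",
                                   random_state=0).fit(Xtr, ytr)
      print(classification_report(yte, clf.predict(Xte), digits=2))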

  18. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    SciTech Connect

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  19. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    NASA Astrophysics Data System (ADS)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI, from launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established techniques and more speculative approaches; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  20. Which Combinations of Techniques and Modes of Delivery in Internet-Based Interventions Effectively Change Health Behavior? A Meta-Analysis

    PubMed Central

    van Genugten, Lenneke; Webb, Thomas Llewelyn; van Empelen, Pepijn

    2016-01-01

    Background Many online interventions designed to promote health behaviors combine multiple behavior change techniques (BCTs), adopt different modes of delivery (MoD) (eg, text messages), and range in how usable they are. Research is therefore needed to examine the impact of these features on the effectiveness of online interventions. Objective This study applies Classification and Regression Trees (CART) analysis to meta-analytic data, in order to identify synergistic effects of BCTs, MoDs, and usability factors. Methods We analyzed data from Webb et al. This review included effect sizes from 52 online interventions targeting a variety of health behaviors and coded the use of 40 BCTs and 11 MoDs. Our research also developed a taxonomy for coding the usability of interventions. Meta-CART analyses were performed using the BCTs and MoDs as predictors and using treatment success (ie, effect size) as the outcome. Results Factors related to the usability of the interventions influenced their efficacy. Specifically, subgroup analyses indicated that more efficient interventions (interventions that take little time to understand and use) are more likely to be effective than less efficient interventions. Meta-CART identified one synergistic effect: interventions that included barrier identification/problem solving and provided rewards for behavior change reported an average effect size that was smaller (ḡ=0.23, 95% CI 0.08-0.44) than interventions that used other combinations of techniques (ḡ=0.43, 95% CI 0.27-0.59). No synergistic effects were found for MoDs or for MoDs combined with BCTs. Conclusions Interventions that take little time to understand and use were more effective than those that require more time. Few specific combinations of BCTs that contribute to the effectiveness of online interventions were found. Furthermore, no synergistic effects between BCTs and MoDs were found, even though MoDs had strong effects when analyzed univariately in the original study.
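
    Meta-CART in miniature: a regression tree over binary moderator codes with the study effect size as the outcome, so that splits expose combinations of techniques associated with larger or smaller effects. The data below are simulated; a real meta-CART analysis would also weight studies by their precision.

      # Regression tree over study-level moderator codes (BCT indicators)
      # with effect size g as the outcome; splits reveal interactions.
      import numpy as np
      from sklearn.tree import DecisionTreeRegressor, export_text

      rng = np.random.default_rng(8)
      n_studies = 52
      X = rng.integers(0, 2, size=(n_studies, 6))  # 6 binary moderators
      g = 0.4 - 0.2 * (X[:, 0] & X[:, 1]) + rng.normal(0, 0.1, n_studies)

      tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=8,
                                   random_state=0).fit(X, g)
      print(export_text(tree, feature_names=[f"BCT_{i}" for i in range(6)]))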

  1. A Video Monitoring Technique for Investigating Computer-Based Learning Programs.

    ERIC Educational Resources Information Center

    Bigum, C. J.; Gilding, A.

    1985-01-01

    Describes a video monitoring technique for detailed analysis of student use of computer-based learning programs which produces a synchronized record of computer output and student use; a study concerned with developing computer-based learning programs in chemistry which utilizes the technique; and advantages and disadvantages of the technique.…

  2. Automated analysis of non-mass-enhancing lesions in breast MRI based on morphological, kinetic, and spatio-temporal moments and joint segmentation-motion compensation technique

    NASA Astrophysics Data System (ADS)

    Hoffmann, Sebastian; Shutler, Jamie D.; Lobbes, Marc; Burgeth, Bernhard; Meyer-Bäse, Anke

    2013-12-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) represents an established method for the detection and diagnosis of breast lesions. While mass-like enhancing lesions can be easily categorized according to the Breast Imaging Reporting and Data System (BI-RADS) MRI lexicon, a majority of diagnostically challenging lesions, the so-called non-mass-like enhancing lesions, remain difficult to analyze both qualitatively and quantitatively. Thus, the evaluation of kinetic and/or morphological characteristics of non-masses represents a challenging task for an automated analysis and is of crucial importance for advancing current computer-aided diagnosis (CAD) systems. Compared to the well-characterized mass-enhancing lesions, non-masses have ill-defined, blurred tumor borders and a kinetic behavior that is not easily generalizable and thus not readily discriminative between malignant and benign non-masses. To overcome these difficulties and pave the way for novel CAD systems for non-masses, we evaluate several kinetic and morphological descriptors separately, together with a novel technique, the Zernike velocity moments, to capture the joint spatio-temporal behavior of these lesions, and additionally consider the impact of non-rigid motion compensation on a correct diagnosis.

  3. Developing techniques for cause-responsibility analysis of occupational accidents.

    PubMed

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and the role of the groups involved in work-related accidents. The study develops an occupational accident causes tree, an occupational accident responsibility tree, and an occupational accident component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, for testing them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study comprises two techniques for CRA: occupational accident tree analysis (OATA) and occupational accident components analysis (OACA), used in parallel for the determination of responsible groups and responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are well suited to occupational accident investigation/analysis, especially for determining a detailed list of tasks, responsibilities, and their rates, and are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  4. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    NASA Astrophysics Data System (ADS)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of information can result in several software problems related to the huge amount of data available. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potentialities of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered, 10 different regions of interest were selected, and two different approaches were followed. The first one (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier. The second approach (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Two strategies were followed: four classes were considered using Spring and thereafter seven classes were considered for OTB/Monteverdi. The SVM classifier produces slightly better results and presents a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.

  5. Numerical modeling techniques for flood analysis

    NASA Astrophysics Data System (ADS)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to identify the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models, and the possible improvements over these models through 3D modeling, are also discussed. The HEC-RAS and FLO 2D models were found to be best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO 2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness within grid cells, were found; these can be improved through a 3D model. The 3D approach was therefore found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models have recently been developed for open-channel flows but not for floodplains. Hence, it is suggested that a 3D floodplain model be developed, considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the findings on the causes and effects of flooding.

  6. Function Analysis and Decomposition using Function Analysis Systems Technique

    SciTech Connect

    J. R. Wixson

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Mile's function analysis concepts and introduced the methodology called Function Analysis Systems Techniques (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  7. Function Analysis and Decomposition using Function Analysis Systems Technique

    SciTech Connect

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Mile's function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  8. Analysis of diagnostic calorimeter data by the transfer function technique

    SciTech Connect

    Delogu, R. S.; Pimazzoni, A.; Serianni, G.; Poggi, C.; Rossi, G.

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
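
    The transfer-function idea can be sketched as follows: if the rear-side temperature is the energy flux convolved with the calorimeter's impulse response, the flux is recovered by division in the frequency domain, with a small regularization term to control noise. The impulse response and pulse below are assumed, not measured values.

      # Transfer-function reconstruction sketch: recover the front-side
      # energy flux from a rear-side temperature trace by FFT division,
      # given the impulse response h(t).  A Tikhonov-style epsilon keeps
      # the division stable where |H| is small.
      import numpy as np

      n, dt = 1024, 1e-3
      t = np.arange(n) * dt
      h = np.exp(-t / 0.05); h /= h.sum()          # assumed impulse response
      flux = np.where((t > 0.1) & (t < 0.2), 1.0, 0.0)  # true beam pulse

      noise = 1e-3 * np.random.default_rng(9).normal(size=n)
      temp = np.convolve(flux, h)[:n] + noise      # measured rear-side trace

      H, T = np.fft.rfft(h), np.fft.rfft(temp)
      eps = 1e-2 * np.abs(H).max()
      flux_rec = np.fft.irfft(T * np.conj(H) / (np.abs(H)**2 + eps**2), n)
      print("peak of reconstructed flux: %.2f" % flux_rec.max())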

  9. Evaluation of Mercury in Environmental Samples by a Supramolecular Solvent-Based Dispersive Liquid-Liquid Microextraction Method Before Analysis by a Cold Vapor Generation Technique.

    PubMed

    Ali, Jamshed; Tuzen, Mustafa; Kazi, Tasneem G

    2017-02-01

    Supramolecular solvent–based dispersive liquid–liquid microextraction was used as a preconcentration method for the determination of trace levels of Hg. This simple method accurately measured oxidized HgII content in claystone and sandstone samples obtained from the Thar Coalfield in Pakistan. Cold vapor atomic absorption spectrometry was used as the detection technique because it is reliable and accurate. The HgII in acidic media forms a complex with dithizone (DTz) in the presence of supramolecular solvent (tetrahydrofuran and 1-undecanol), forming reverse micelles. Formation of the Hg-DTz complex was achieved to increase the interactions with the supramolecular solvent phase at pH 2.5 under the optimized experimental conditions. After addition of the supramolecular solvent to the aqueous solution, the micelles were uniformly mixed using a vortex mixer. The cloudy solution was centrifuged, and the Hg-DTz complex was extracted into the supramolecular solvent phase. Under optimized experimental conditions, the LOD and enrichment factor were found to be 5.61 ng/L and 77.8, respectively. Accuracy of the developed method was checked with Certified Reference Materials. The developed method was successfully applied for the determination of HgII in claystone and sandstone samples from the Block VII and Block VIII areas of the Thar Coalfield on the basis of depth.

  10. Nuclear based techniques for detection of contraband

    SciTech Connect

    Gozani, T.

    1993-12-31

    The detection of contraband such as explosives and drugs concealed in luggage or other containers can be quite difficult. Nuclear techniques offer capabilities which are essential to effective detection devices. This report describes the features of various nuclear techniques and instrumentation.

  11. Analysis of Jordanian phosphate using nuclear techniques

    SciTech Connect

    Saleh, N.S.; Al-Saleh, K.A.

    1987-09-01

    The major, minor, and trace element content of Jordanian phosphate ores was determined using different complementary nuclear techniques: Gamma-Ray Spectrometry (GRS), X-Ray Fluorescence (XRF), and Proton Induced X-ray Emission (PIXE). Special emphasis was given to the determination of uranium and rare earth element concentrations.

  12. Comparative Analysis of Bracket Slot Dimensions Evaluating Different Manufacturing Techniques

    DTIC Science & Technology

    2015-04-24

    A thesis entitled "Comparative Analysis of Bracket Slot Dimensions Evaluating Different Manufacturing Techniques," presented by S. A. DeMeo with the approval of the supervising professor.

  13. Investigation of electroforming techniques, literature analysis report

    NASA Technical Reports Server (NTRS)

    Malone, G. A.

    1975-01-01

    A literature analysis is presented of reports, specifications, and documented experiences with the use of electroforming to produce copper and nickel structures for aerospace and other engineering applications. The literature period covered is from 1948 to 1974. Specific effort was made to correlate mechanical property data for the electrodeposited material with known electroforming solution compositions and operating conditions. From this survey, electrolytes are suggested for selection to electroform copper and nickel outer shells on regeneratively cooled thrust chamber liners, and other devices subject to thermal and pressure exposure, based on mechanical properties obtainable, performance under various thermal environments, and ease of process control for product reproducibility. Processes of potential value in obtaining sound bonds between electrodeposited copper and nickel and copper alloy substrates are also discussed.

  14. Near Real Time Quantitative Gas Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Herget, William F.; Tromp, Marianne L.; Anderson, Charles R.

    1985-12-01

    A Fourier transform infrared (FT-IR)-based system has been developed and is undergoing evaluation for near real time multicomponent quantitative analysis of undiluted gaseous automotive exhaust emissions. The total system includes: (1) a gas conditioning system (GCS) for tracer gas injection, gas mixing, and temperature stabilization; and (2) an exhaust gas analyzer (EGA) consisting of a sample cell, an FT-IR system, and a computerized data processing system. Tests have shown that the system can monitor about 20 individual species (concentrations down to the 1-20 ppm range) with a time resolution of one second. Tests have been conducted on a chassis dynamometer system utilizing different autos, different fuels, and different driving cycles. Results were compared with those obtained using a standard constant volume sampling (CVS) system.

  15. A technique for human error analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  16. Risk-based maintenance--techniques and applications.

    PubMed

    Arunraj, N S; Maiti, J

    2007-04-11

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to make use of the knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more rigorously, in order to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce the probability of equipment failure and the consequences of failure. In this paper, risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data, and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions.

  17. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than...

  18. DCT-based cyber defense techniques

    NASA Astrophysics Data System (ADS)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content, in order to attack the end user. Most of the attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
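
    A DCT-domain thresholding operation of the kind described can be sketched in a few lines; the 8x8 block size follows common JPEG/H.264 practice, while the threshold value is illustrative rather than the paper's tuned setting.

      # DCT-domain "smart threshold" sketch: transform 8x8 image blocks,
      # zero coefficients below a magnitude threshold (where embedded
      # payloads typically hide), and invert.
      import numpy as np
      from scipy.fft import dctn, idctn

      rng = np.random.default_rng(10)
      img = rng.random((64, 64))                 # stand-in I-frame luma

      out = np.empty_like(img)
      for i in range(0, 64, 8):
          for j in range(0, 64, 8):
              c = dctn(img[i:i+8, j:j+8], norm="ortho")
              c[np.abs(c) < 0.05] = 0.0          # suppress small coeffs
              out[i:i+8, j:j+8] = idctn(c, norm="ortho")

      psnr = 10 * np.log10(1.0 / np.mean((img - out) ** 2))
      print("PSNR after filtering: %.2f dB" % psnr)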

  19. Nasal base narrowing: the combined alar base excision technique.

    PubMed

    Foda, Hossam M T

    2007-01-01

    To evaluate the role of the combined alar base excision technique in narrowing the nasal base and correcting excessive alar flare. The study included 60 cases presenting with a wide nasal base and excessive alar flaring. The surgical procedure combined an external alar wedge resection with an internal vestibular floor excision. All cases were followed up for a mean of 32 (range, 12-144) months. Nasal tip modification and correction of any preexisting caudal septal deformities were always completed before the nasal base narrowing. The mean width of the external alar wedge excised was 7.2 (range, 4-11) mm, whereas the mean width of the sill excision was 3.1 (range, 2-7) mm. Completing the internal excision first resulted in a more conservative external resection, thus avoiding any blunting of the alar-facial crease. No cases of postoperative bleeding, infection, or keloid formation were encountered, and the external alar wedge excision healed with an inconspicuous scar that was well hidden in the depth of the alar-facial crease. Finally, the risk of notching of the alar rim, which can occur at the junction of the external and internal excisions, was significantly reduced by adopting a 2-layered closure of the vestibular floor (P = .01). The combined alar base excision resulted in effective narrowing of the nasal base with elimination of excessive alar flare. Commonly feared complications, such as blunting of the alar-facial crease or notching of the alar rim, were avoided by using simple modifications in the technique of excision and closure.

  20. Modular techniques for dynamic fault-tree analysis

    NASA Astrophysics Data System (ADS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
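
    For the combinatoric half of such a hybrid scheme, independent basic events reduce to simple gate algebra, as sketched below; sequence-dependent (dynamic) gates are the part that must be handed to a Markov solver instead. The example tree and probabilities are hypothetical.

      # Minimal static fault-tree evaluation for independent basic
      # events: AND multiplies probabilities, OR combines as
      # 1 - prod(1 - p).  Dynamic gates would go to a Markov solver.
      def p_and(*ps):
          out = 1.0
          for p in ps:
              out *= p
          return out

      def p_or(*ps):
          out = 1.0
          for p in ps:
              out *= (1.0 - p)
          return 1.0 - out

      # TOP = (A AND B) OR C, with independent failure probabilities
      pA, pB, pC = 1e-3, 2e-3, 5e-4
      print("P(top event) = %.3e" % p_or(p_and(pA, pB), pC))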

  1. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  2. Cochlear implant simulator for surgical technique analysis

    NASA Astrophysics Data System (ADS)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  3. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
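
    One common formalization of uncertain evidence is Jeffrey's rule, sketched below on a two-node network; whether this matches the paper's exact update is an assumption, since soft and virtual evidence differ in their details. All numbers are illustrative.

      # Uncertain evidence via Jeffrey's rule on a two-node network
      # H -> E: the user asserts a new distribution q(E) instead of a
      # hard state, and P'(H) = sum_e q(e) * P(H | E=e).
      import numpy as np

      pH = np.array([0.3, 0.7])                # prior over H = {h0, h1}
      pE_given_H = np.array([[0.9, 0.1],       # rows: H state, cols: E state
                             [0.2, 0.8]])

      joint = pH[:, None] * pE_given_H         # P(H, E)
      pH_given_E = joint / joint.sum(axis=0)   # columns: P(H | E=e)

      q = np.array([0.6, 0.4])                 # soft evidence on E
      posterior = pH_given_E @ q               # Jeffrey's rule update
      print("P'(H) =", posterior)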

  4. Using remote sensing techniques and field-based structural analysis to explore new gold and associated mineral sites around Al-Hajar mine, Asir terrane, Arabian Shield

    NASA Astrophysics Data System (ADS)

    Sonbul, Abdullah R.; El-Shafei, Mohamed K.; Bishta, Adel Z.

    2016-05-01

    Modern earth resource satellites provide huge amounts of digital imagery at different resolutions. These satellite imageries are considered one of the most significant sources of data for mineral exploration. Image processing techniques were applied to the exposed rocks around the Al-Aqiq area of the Asir terrane in the southern part of the Arabian Shield. The area under study has two sub-parallel, N-S-trending metamorphic belts of greenschist facies. The first belt is located southeast of Al-Aqiq, where the Al-Hajar Gold Mine is situated. It is essentially composed of metavolcanic and metasedimentary rocks, and it is intruded by different plutonic rocks, primarily diorite, syenite, and porphyritic granite. The second belt is located northwest of Al-Aqiq, and it is composed of metavolcanic and metasedimentary rocks intruded by granite bodies. The current study aimed to distinguish the lithological units, detect and map the alteration zones, and extract the major fault lineaments around the Al-Hajar gold prospect. Digital satellite imageries, including Landsat 7 ETM+ multispectral and panchromatic and SPOT-5 data, were used in addition to field verification. Areas with spectral signatures similar to the prospect were identified in the nearby metamorphic belt; this target area was inspected in the field. The relationships between the alteration zones, the mineral deposits, and the structural elements were used to locate the ore-bearing zones in the subsurface. The metasedimentary units of the target area showed dextral-ductile shearing, top-to-the-north, and the presence of a dominant mineralized quartz-vein system. The area to the north of the Al-Hajar prospect also showed sub-parallel shear zones along which different types of alteration were detected. Field-based criteria such as hydrothermal breccia, jasper, iron gossans, and porphyritic granite strongly indicate the presence of porphyry-type ore deposits in the Al-Hajar metamorphic belt.

  5. A quantitative study of gully erosion based on object-oriented analysis techniques: a case study in Beiyanzikou catchment of Qixia, Shandong, China.

    PubMed

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion of a small watershed gully system at Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average range difference between field-measured points along the edges of gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m², 5074.1790 m³ and 1316.1250 m², 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion.
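
    The volume-estimation step can be sketched as follows: fit the pre-erosion surface as a plane through shoulder-line elevations by linear regression, then integrate the depth over the gully cells. The DEM and rim samples below are synthetic.

      # Erosion-volume sketch: fit z = a*x + b*y + c through rim
      # (shoulder-line) elevations, then integrate (plane - DEM) over
      # the gully cells.
      import numpy as np

      rng = np.random.default_rng(11)
      cell = 0.5                                  # DEM cell size [m]
      x, y = np.meshgrid(np.arange(80) * cell, np.arange(60) * cell)
      dem = 100 - 0.05 * x + rng.normal(0, 0.02, x.shape)
      gully = (x - 20)**2 / 40 + (y - 15)**2 / 10 < 4
      dem[gully] -= 1.5                           # carve a synthetic gully

      rim = ~gully                                # plane fit on rim cells
      A = np.column_stack([x[rim], y[rim], np.ones(rim.sum())])
      a, b, c = np.linalg.lstsq(A, dem[rim], rcond=None)[0]
      surface = a * x + b * y + c

      depth = np.clip(surface - dem, 0, None)
      print("erosion volume: %.1f m^3" % (depth[gully].sum() * cell**2))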

  6. Applications of external cavity diode laser-based technique to noninvasive clinical diagnosis using expired breath ammonia analysis: chronic kidney disease, epilepsy

    NASA Astrophysics Data System (ADS)

    Bayrakli, Ismail; Turkmen, Aysenur; Akman, Hatice; Sezer, M. Tugrul; Kutluhan, Suleyman

    2016-08-01

    An external cavity laser (ECL)-based off-axis cavity-enhanced absorption spectroscopy was applied to noninvasive clinical diagnosis using expired breath ammonia analysis: (1) the correlation between breath ammonia levels and blood parameters related to chronic kidney disease (CKD) was investigated and (2) the relationship between breath ammonia levels and blood concentrations of valproic acid (VAP) was studied. The concentrations of breath ammonia in 15 healthy volunteers, 10 epilepsy patients (before and after taking VAP), and 27 patients with different stages of CKD were examined. The range of breath ammonia levels was 120 to 530 ppb for healthy subjects and 710 to 10,400 ppb for patients with CKD. There was a statistically significant positive correlation between breath ammonia concentrations and urea, blood urea nitrogen, creatinine, or estimated glomerular filtration rate in 27 patients. It was demonstrated that taking VAP gave rise to increasing breath ammonia levels. A statistically significant difference was found between the levels of exhaled ammonia (NH3) in healthy subjects and in patients with epilepsy before and after taking VAP. The results suggest that our breath ammonia measurement system has great potential as an easy, noninvasive, real-time, and continuous monitor of the clinical parameters related to epilepsy and CKD.

  7. A Quantitative Study of Gully Erosion Based on Object-Oriented Analysis Techniques: A Case Study in Beiyanzikou Catchment of Qixia, Shandong, China

    PubMed Central

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion of a small watershed gully system in the Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between the field-measured points along the gully edges and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m², 5074.1790 m³ and 1316.1250 m², 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626

  8. Calculation of the elastic properties of prosthetic knee components with an iterative finite element-based modal analysis: quantitative comparison of different measuring techniques.

    PubMed

    Woiczinski, Matthias; Tollrian, Christopher; Schröder, Christian; Steinbrück, Arnd; Müller, Peter E; Jansson, Volkmar

    2013-08-01

    With the aging but still active population, research on total joint replacements relies increasingly on numerical methods, such as finite element analysis, to improve the wear resistance of components. However, the validity of finite element models largely depends on the accuracy of their material behavior and geometrical representation. In particular, material properties are often based on manufacturer data or literature reports, but can alternatively be estimated by matching experimental measurements and structural predictions through modal analyses and identification of eigenfrequencies. The aim of the present study was to compare the accuracy of common setups used for estimating the eigenfrequencies of typical components often used in prosthetic joints. Eigenfrequencies of cobalt-chrome and ultra-high-molecular-weight polyethylene components were therefore measured with four different setups and used in modal analyses of corresponding finite element models for an iterative adjustment of their material properties. The results show that for the low-damped cobalt-chromium endoprosthesis components, all common measuring setups provided accurate measurements. In the case of highly damped structures, measurements were only possible with setups that include a continuous excitation system, such as an electrodynamic shaker. This study demonstrates that the iterative back-calculation of eigenfrequencies can be a reliable method to estimate the elastic properties for finite element models.
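
    A minimal sketch of the iterative back-calculation idea, assuming, as holds for a linear elastic structure, that natural frequencies scale with the square root of the elastic modulus; model_eigenfrequency is a toy stand-in for the finite element modal analysis.

```python
import numpy as np

def model_eigenfrequency(E):
    """Stand-in for an FE modal analysis: first eigenfrequency of a toy
    structure, proportional to sqrt(E) as for any linear elastic model."""
    return 0.56 * np.sqrt(E * 1e-9)  # toy constants, Hz

def back_calculate_modulus(f_measured, E0=1.0e9, tol=1e-6, max_iter=50):
    """Rescale E until the model eigenfrequency matches the measured one."""
    E = E0
    for _ in range(max_iter):
        f = model_eigenfrequency(E)
        if abs(f - f_measured) / f_measured < tol:
            break
        E *= (f_measured / f) ** 2  # exact update when f ~ sqrt(E)
    return E

print(back_calculate_modulus(f_measured=17.7))  # adjusted modulus (Pa)
```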

  9. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    PubMed Central

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the amount of retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them using a retention apparatus. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction and Wilcoxon matched-pairs signed-ranks tests. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique, and the least retention was seen with the conventional molding technique. PMID:27382542
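
    The statistical workflow named above, a Friedman test followed by Bonferroni-corrected Wilcoxon pairwise comparisons, looks roughly like this in SciPy; the force readings are invented placeholders, not the study's data.

```python
from scipy import stats

# Hypothetical dislodgement forces (g) for ten patients per technique.
conventional = [2300, 2500, 2400, 2600, 2450, 2550, 2350, 2500, 2480, 2540]
anchorized   = [2800, 2950, 2900, 3000, 2880, 2920, 2850, 2960, 2910, 2960]
injection    = [3600, 3750, 3700, 3800, 3720, 3780, 3650, 3760, 3740, 3900]

stat, p = stats.friedmanchisquare(conventional, anchorized, injection)
print(f"Friedman chi2 = {stat:.2f}, p = {p:.4g}")

alpha = 0.05 / 3  # Bonferroni-adjusted threshold for three comparisons
for name, a, b in [("conventional vs anchorized", conventional, anchorized),
                   ("conventional vs injection",  conventional, injection),
                   ("anchorized vs injection",    anchorized,   injection)]:
    w, pw = stats.wilcoxon(a, b)
    print(name, f"p = {pw:.4g}", "significant" if pw < alpha else "n.s.")
```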

  10. Analysis and calibration techniques for superconducting resonators.

    PubMed

    Cataldo, Giuseppe; Wollack, Edward J; Barrentine, Emily M; Brown, Ari D; Moseley, S Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.
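
    The paper's rational-function and ABCD-matrix models are not reproduced here, but the basic step of extracting a resonance frequency and quality factor from transmission data can be sketched with a simple symmetric Lorentzian dip fitted to synthetic data.

```python
import numpy as np
from scipy.optimize import curve_fit

def s21_mag(f, f0, Q, depth):
    """Magnitude of a notch-type resonance (symmetric Lorentzian dip);
    the paper's models generalize this to coupled, asymmetric cases."""
    x = 2.0 * Q * (f - f0) / f0
    return 1.0 - depth / np.sqrt(1.0 + x**2)

f = np.linspace(4.9995e9, 5.0005e9, 400)               # Hz
rng = np.random.default_rng(0)
data = s21_mag(f, 5e9, 2e4, 0.8) + rng.normal(0, 0.005, f.size)

popt, _ = curve_fit(s21_mag, f, data, p0=[5.0001e9, 1e4, 0.5])
print("f0 = %.6g Hz, Q = %.4g, depth = %.3f" % tuple(popt))
```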

  11. Techniques in micromagnetic simulation and analysis

    NASA Astrophysics Data System (ADS)

    Kumar, D.; Adeyeye, A. O.

    2017-08-01

    Advances in nanofabrication now allow us to manipulate magnetic material at micro- and nanoscales. As the steps of design, modelling and simulation typically precede that of fabrication, these improvements have also granted a significant boost to the methods of micromagnetic simulations (MSs) and analyses. The increased availability of massive computational resources has been another major contributing factor. Magnetization dynamics at the micro- and nanoscale is described by the Landau-Lifshitz-Gilbert (LLG) equation, which is an ordinary differential equation (ODE) in time. Several finite difference method (FDM) and finite element method (FEM) based LLG solvers are now widely used to solve different kinds of micromagnetic problems. In this review, we present a few patterns in the ways MSs are being used in the pursuit of new physics. An important objective of this review is to allow one to make a well-informed decision on the details of the simulation and analysis procedures needed to accomplish a given task using computational micromagnetics. We also examine the effect of different simulation parameters to underscore and extend some best practices. Lastly, we examine different methods of micromagnetic analysis which are used to process simulation results in order to extract physically meaningful and valuable information.
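
    As a minimal example of the LLG time integration that such solvers perform, the sketch below integrates the equation for a single macrospin in a static field, with exchange, anisotropy and demagnetization terms omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

gamma = 1.76e11                 # gyromagnetic ratio (rad s^-1 T^-1)
alpha = 0.02                    # Gilbert damping constant
H = np.array([0.0, 0.0, 0.1])   # static applied field along z (T)

def llg(t, m):
    """Landau-Lifshitz-Gilbert equation for a single macrospin:
    dm/dt = -gamma/(1+alpha^2) * (m x H + alpha * m x (m x H))."""
    mxH = np.cross(m, H)
    return -gamma / (1.0 + alpha**2) * (mxH + alpha * np.cross(m, mxH))

m0 = np.array([1.0, 0.0, 0.0])                     # start along x
sol = solve_ivp(llg, (0.0, 5e-9), m0, max_step=1e-12)
print("m after 5 ns:", sol.y[:, -1])               # precessing toward z
```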

  12. A procedural analysis of correspondence training techniques

    PubMed Central

    Paniagua, Freddy A.

    1990-01-01

    A variety of names have been given to procedures used in correspondence training, some more descriptive than others. In this article I argue that a terminology more accurately describing actual procedures, rather than the conceptual function that those procedures are assumed to serve, would benefit the area of correspondence training. I identify two documented procedures during the reinforcement of verbalization phase and five procedures during the reinforcement of correspondence phase and suggest that those procedures can be classified, or grouped into nonoverlapping categories, by specifying the critical dimensions of those procedures belonging to a single category. I suggest that the names of such nonoverlapping categories should clearly specify the dimensions on which the classification is based in order to facilitate experimental comparison of procedures, and to be able to recognize when a new procedure (as opposed to a variant of one already in existence) is developed. Future research involving comparative analysis across and within procedures is discussed within the framework of the proposed classification. PMID:22478059

  13. Long base line interferometry: a new technique.

    PubMed

    Broten, N W; Legg, T H; Locke, J L; McLeish, C W; Richards, R S; Chisholm, R M; Gush, H P; Yen, J L; Galt, J A

    1967-06-23

    The technique of using magnetic-tape recorders and atomic frequency standards to operate two widely separated radio telescopes as a phase-coherent interferometer when the stations have no radio-frequency connecting link has been successfully tested at the National Research Council of Canada's Algonquin Radio Observatory.

  14. Shrinkage of selected southcentral Alaskan glaciers AD 1900-2010 - a spatio-temporal analysis using photogrammetric, GIS-based and historical techniques

    NASA Astrophysics Data System (ADS)

    Kienholz, Christian; Prakash, Anupma; Nussbaumer, Samuel; Zumbühl, Heinz

    2010-05-01

    Knowledge about recent glacier change in the Chugach Mountains of southcentral Alaska is still scarce. In an effort to fill this gap, we took an interdisciplinary approach and reconstructed the history of ten selected glaciers in the vicinity of Valdez (e.g., Valdez Glacier) and Cordova (e.g., Sheridan, Childs and Allen Glacier). Historical data such as early maps and photographs allowed for refining the glacier outlines of the early 20th century. Based upon photogrammetric methods, we further derived elevation models and orthomosaics from various airborne images. The Alaska High Altitude Program (AHAP) imagery, taken during the late 1970s, was the primary dataset of interest and provided a valuable source of information, primarily because it had not been quantitatively evaluated before. Together with the first USGS maps from the 1950s and the most recent data (airborne LiDAR, as well as air- and space-borne optical data), it allowed for determining the volume and area changes that have occurred within the last 60 years. A GIS analysis revealed that the recent decades have been characterized by rising equilibrium lines and thus retreating and thinning glaciers. The glaciers did not show a consistent recession pattern, which might partly be attributed to their varying area-altitude distributions. Simple hypsographic modeling indicated that the glaciers are generally far from a state of equilibrium. Given the current climate scenarios and the unfavorable hypsography of most glaciers, the hitherto prevailing trend of glacier melt and recession is likely to continue or accelerate in the upcoming years. Reliably predicting the extents and characteristics of these glaciers at the end of the century remains an important yet poorly answered research question.

  15. Laser image denoising technique based on multi-fractal theory

    NASA Astrophysics Data System (ADS)

    Du, Lin; Sun, Huayan; Tian, Weiqing; Wang, Shuai

    2014-02-01

    The noise in laser images is complex, including both additive and multiplicative components. Considering the features of laser images and the capabilities and shortcomings of common algorithms, this paper introduces fractal theory into laser image denoising research. Denoising is implemented mainly through analysis of the singularity exponent of each pixel in fractal space and of the features of the multi-fractal spectrum. According to quantitative and qualitative evaluation of the processed images, the laser image processing technique based on fractal theory not only effectively removes the complicated noise of laser images obtained by a range-gated laser active imaging system, but also maintains detail information during denoising. For different laser images, the multi-fractal denoising technique can increase the SNR of the laser image by at least 1-2 dB compared with other denoising techniques, which basically meets the needs of laser image denoising.

  16. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O' Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. It also provides a means of comparing two or more grading systems, or a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  17. EEG source analysis using space mapping techniques

    NASA Astrophysics Data System (ADS)

    Crevecoeur, G.; Hallez, H.; van Hese, P.; D'Asseler, Y.; Dupre, L.; van de Walle, R.

    2008-06-01

    The electroencephalogram (EEG) measures potential differences, generated by electrical activity in brain tissue, between scalp electrodes. The EEG potentials can be calculated by the quasi-static Poisson equation in a certain head model. It is well known that the electrical dipole (source) which best fits the measured EEG potentials is obtained by solving an inverse problem. The dipole parameters are obtained by finding the global minimum of the relative residual energy (RRE). For the first time, the space mapping (SM) technique is used for minimizing the RRE. The SM technique aims at aligning two different simulation models: a fine model, accurate but CPU-time expensive, and a coarse model, computationally fast but less accurate than the fine one. The coarse model is a semi-analytical model, the so-called three-shell concentric sphere model. The fine model numerically solves the Poisson equation in a realistic head model. If we use the aggressive space mapping (ASM) algorithm, the errors on the dipole location are too large. The hybrid aggressive space mapping (HASM), on the other hand, has better convergence properties, yielding a reduction in dipole location errors. The computational effort of HASM is greater than that of ASM but smaller than that of direct optimization techniques.

  18. Recent trends in particle size analysis techniques

    NASA Technical Reports Server (NTRS)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed in terms of three operating principles, including particle size and shape descriptions. Significant trends in recently developed particle size analysis equipment show that compact electronic circuitry and rapid data processing systems have mainly been adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  19. On Bitstream Based Edge Detection Techniques

    DTIC Science & Technology

    2009-01-01

    IEEE Transactions on, vol. 38, no. 1, pp. xviii–iv, Feb 1992. [5] Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, Addison-Wesley...Carmona-Poyato, R. Medina-Carnicer, and F. J. Madrid-Cuevas, “Automatic generation of consensus ground truth for the comparison of edge detection techniques,” Image Vision Comput., vol. 26, no. 4, pp. 496–511, 2008.

  20. Noninvasive in vivo glucose sensing using an iris based technique

    NASA Astrophysics Data System (ADS)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, a developed ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations are within 20% of the reference values.
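
    A rough sketch of the image-based PLS calibration and the within-20% validation screen, using random stand-in features rather than real iris images.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Stand-ins: each row is a feature vector from one NIR iris image, the
# target is the reference glucometer reading (mg/dL). All invented.
X = rng.normal(size=(120, 50))
y = 90 + 15 * X[:, :5].sum(axis=1) + rng.normal(0, 5, 120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_pred = pls.predict(X_te).ravel()

# Simple screen in the spirit of the reported validation: fraction of
# predictions within 20% of the reference values.
within20 = np.mean(np.abs(y_pred - y_te) / y_te < 0.20)
print(f"{100 * within20:.1f}% of predictions within 20% of reference")
```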

  1. Classification Techniques for Multivariate Data Analysis.

    DTIC Science & Technology

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern...the determinantal equation: |B - λW| = 0. The solutions λᵢ are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non...Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor
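
    The determinantal equation quoted in the snippet, |B - λW| = 0, is a generalized eigenvalue problem and can be solved directly; B and W below are hypothetical between-group and within-group scatter matrices.

```python
import numpy as np
from scipy.linalg import eig

B = np.array([[4.0, 1.0], [1.0, 2.0]])   # between-group scatter (invented)
W = np.array([[2.0, 0.5], [0.5, 1.0]])   # within-group scatter (invented)

# Solve |B - lambda*W| = 0, i.e. find the eigenvalues of W^-1 B.
lam, vec = eig(B, W)
print(np.real(lam))   # the discriminant-analysis eigenvalues
```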

  2. Application of unsupervised analysis techniques to lung cancer patient data

    PubMed Central

    Lynch, Chip M.; van Berkel, Victor H.

    2017-01-01

    This study applies unsupervised machine learning techniques for classification and clustering to a collection of descriptive variables from 10,442 lung cancer patient records in the Surveillance, Epidemiology, and End Results (SEER) program database. The goal is to automatically classify lung cancer patients into groups based on clinically measurable disease-specific variables in order to estimate survival. Variables selected as inputs for machine learning include Number of Primaries, Age, Grade, Tumor Size, Stage, and TNM, which are numeric or can readily be converted to numeric type. Minimal up-front processing of the data enables exploring the out-of-the-box capabilities of established unsupervised learning techniques, with little human intervention through the entire process. The output of the techniques is used to predict survival time, with the efficacy of the prediction representing a proxy for the usefulness of the classification. A basic single-variable linear regression against each unsupervised output is applied, and the associated Root Mean Squared Error (RMSE) value is calculated as a metric to compare between the outputs. The results show that self-organizing maps exhibit the best performance, while k-Means performs the best of the simpler classification techniques. Predicting against the full data set, it is found that their respective RMSE values (15.591 for self-organizing maps and 16.193 for k-Means) are comparable to supervised regression techniques, such as Gradient Boosting Machine (RMSE of 15.048). We conclude that unsupervised data analysis techniques may be of use to classify patients by defining the classes as effective proxies for survival prediction. PMID:28910336

  3. Application of unsupervised analysis techniques to lung cancer patient data.

    PubMed

    Lynch, Chip M; van Berkel, Victor H; Frieboes, Hermann B

    2017-01-01

    This study applies unsupervised machine learning techniques for classification and clustering to a collection of descriptive variables from 10,442 lung cancer patient records in the Surveillance, Epidemiology, and End Results (SEER) program database. The goal is to automatically classify lung cancer patients into groups based on clinically measurable disease-specific variables in order to estimate survival. Variables selected as inputs for machine learning include Number of Primaries, Age, Grade, Tumor Size, Stage, and TNM, which are numeric or can readily be converted to numeric type. Minimal up-front processing of the data enables exploring the out-of-the-box capabilities of established unsupervised learning techniques, with little human intervention through the entire process. The output of the techniques is used to predict survival time, with the efficacy of the prediction representing a proxy for the usefulness of the classification. A basic single-variable linear regression against each unsupervised output is applied, and the associated Root Mean Squared Error (RMSE) value is calculated as a metric to compare between the outputs. The results show that self-organizing maps exhibit the best performance, while k-Means performs the best of the simpler classification techniques. Predicting against the full data set, it is found that their respective RMSE values (15.591 for self-organizing maps and 16.193 for k-Means) are comparable to supervised regression techniques, such as Gradient Boosting Machine (RMSE of 15.048). We conclude that unsupervised data analysis techniques may be of use to classify patients by defining the classes as effective proxies for survival prediction.
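
    A compressed sketch of the pipeline described above, clustering the records, regressing survival on the single cluster-label variable, and scoring by RMSE, using synthetic stand-ins for the SEER variables.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))   # stand-ins for age, grade, size, ...
survival = 40 + 10 * X[:, 0] - 8 * X[:, 2] + rng.normal(0, 5, 1000)

# Unsupervised step, then a single-variable regression on the labels.
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)
reg = LinearRegression().fit(labels.reshape(-1, 1), survival)
pred = reg.predict(labels.reshape(-1, 1))
rmse = np.sqrt(mean_squared_error(survival, pred))
print(f"RMSE of survival predicted from cluster labels: {rmse:.3f}")
```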

  4. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 9, October 1, 1993--December 30, 1993

    SciTech Connect

    Smith, D.M.

    1993-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited because of this broad pore size range, the microporosity, the reactive nature of coal, the requirement that samples be completely dried, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas-phase and adsorbed-phase NMR-active probes can provide pore structure information. We will investigate the dependence of common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules (¹²⁹Xe, ³He, ¹⁴N₂, ¹⁴NH₃, ¹⁵N₂, ¹³CH₄, ¹³CO₂) and the pore surface. Our current work may be divided into three areas: small-angle X-ray scattering (SAXS), adsorption, and NMR.

  5. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    PubMed Central

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches such as Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA), for clustering categorical data using rough set indiscernibility relations, is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344

  6. An Empirical Analysis of Rough Set Categorical Clustering Techniques.

    PubMed

    Uddin, Jamal; Ghazali, Rozaida; Deris, Mustafa Mat

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches such as Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA), for clustering categorical data using rough set indiscernibility relations, is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy.
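
    The MIA criterion itself is not reproduced here, but the indiscernibility partition on which these rough set approaches are built is easy to state in code; the information table below is hypothetical.

```python
from collections import defaultdict

def indiscernibility_classes(table, attribute):
    """Partition object indices into the indiscernibility classes
    induced by one categorical attribute (U/IND({a}) in rough set
    notation)."""
    classes = defaultdict(list)
    for i, row in enumerate(table):
        classes[row[attribute]].append(i)
    return list(classes.values())

# Hypothetical categorical information table: (colour, shape, size).
table = [("red", "round", "big"), ("red", "square", "big"),
         ("blue", "round", "small"), ("blue", "round", "big")]

for a, name in enumerate(["colour", "shape", "size"]):
    print(name, indiscernibility_classes(table, a))
```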

  7. Image analysis techniques for automated IVUS contour detection.

    PubMed

    Papadogiorgaki, Maria; Mezaris, Vasileios; Chatzizisis, Yiannis S; Giannoglou, George D; Kompatsiaris, Ioannis

    2008-09-01

    Intravascular ultrasound (IVUS) constitutes a valuable technique for the diagnosis of coronary atherosclerosis. The detection of lumen and media-adventitia borders in IVUS images represents a necessary step towards the reliable quantitative assessment of atherosclerosis. In this work, a fully automated technique for the detection of lumen and media-adventitia borders in IVUS images is presented. It comprises two different steps for contour initialization, one for each contour of interest, and a procedure for the refinement of the detected contours. Intensity information, as well as the result of texture analysis generated by means of a multilevel discrete wavelet frames decomposition, is used in two different techniques for contour initialization. For subsequently producing smooth contours, three techniques based on low-pass filtering and radial basis functions are introduced. The different combinations of the proposed methods are experimentally evaluated on large datasets of IVUS images derived from human coronary arteries. It is demonstrated that the proposed segmentation approaches can quickly and reliably perform automated segmentation of IVUS images.
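
    One simple way to realize the smooth-contour step described above, where the paper uses low-pass filtering and radial basis functions, is Fourier truncation of the radial profile of a closed contour.

```python
import numpy as np

def smooth_closed_contour(radii, n_harmonics=8):
    """Low-pass smoothing of a closed contour given as radius samples
    r(theta) around the catheter centre: keep only the lowest Fourier
    harmonics of the periodic radial profile."""
    spec = np.fft.rfft(radii)
    spec[n_harmonics + 1:] = 0.0
    return np.fft.irfft(spec, n=len(radii))

theta = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
rng = np.random.default_rng(2)
noisy_r = 3.0 + 0.4 * np.sin(2 * theta) + rng.normal(0, 0.15, theta.size)
smooth_r = smooth_closed_contour(noisy_r)
print(noisy_r.std(), smooth_r.std())   # the smoothed profile varies less
```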

  8. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same...

  9. Three Techniques for Task Analysis: Examples from the Nuclear Utilities.

    ERIC Educational Resources Information Center

    Carlisle, Kenneth E.

    1984-01-01

    Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)

  10. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  11. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
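
    The log-Pearson Type III fit by the method of moments can be sketched as follows; the peak-discharge series is invented, and scipy.stats.pearson3 supplies the quantile.

```python
import numpy as np
from scipy import stats

# Hypothetical annual peak discharges (m^3/s).
peaks = np.array([120, 340, 210, 560, 180, 420, 290, 610, 150, 380,
                  240, 510, 330, 270, 450, 190, 700, 260, 310, 410])

logq = np.log10(peaks)
mean, std = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)

# Method of moments: Pearson Type III fitted to the logarithms;
# the 100-year flood is the 0.99 quantile.
q100 = 10 ** stats.pearson3.ppf(0.99, skew, loc=mean, scale=std)
print(f"estimated 100-year flood: {q100:.0f} m^3/s")
```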

  12. Objective Analysis and Prediction Techniques - 1985

    DTIC Science & Technology

    1985-11-30

    Contents fragments: Automatic Mesocyclone Detection; Mesocyclone Discriminators. Reference fragment: 1975: High order accurate difference solutions of fluid-mechanics problems by a compact differencing technique. J. Comput. Physics, 19, 383-390.

  13. Liquid Tunable Microlenses based on MEMS techniques

    PubMed Central

    Zeng, Xuefeng; Jiang, Hongrui

    2013-01-01

    The recent rapid development of microlens technology has provided many opportunities for miniaturized optical systems and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest, as their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlenses generally relies on microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses is first discussed, followed by a description of three main categories of tunable microlenses involving MEMS techniques: mechanically driven, electrically driven, and those integrated within microfluidic systems. PMID:24163480

  14. PIE: A Dynamic Failure-Based Technique

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.

    1990-01-01

    This paper presents a dynamic technique for statistically estimating three program characteristics that affect a program's computational behavior: (1) the probability that a particular section of a program is executed, (2) the probability that the particular section affects the data state, and (3) the probability that a data state produced by that section has an effect on program output. These three characteristics can be used to predict whether faults are likely to be uncovered by software testing.

  15. Bayesian Analysis of the Pattern Informatics Technique

    NASA Astrophysics Data System (ADS)

    Cho, N.; Tiampo, K.; Klein, W.; Rundle, J.

    2007-12-01

    The pattern informatics (PI) [Rundle et al., 2000; Tiampo et al., 2002; Holliday et al., 2005] is a technique that uses phase dynamics in order to quantify temporal variations in seismicity patterns. This technique has shown interesting results for forecasting earthquakes with magnitude greater than or equal to 5 in southern California from 2000 to 2010 [Rundle et al., 2002]. In this work, a Bayesian approach is used to obtain a modified updated version of the PI called Bayesian pattern informatics (BPI). This alternative method uses the PI result as a prior probability and models such as ETAS [Ogata, 1988, 2004; Helmstetter and Sornette, 2002] or BASS [Turcotte et al., 2007] in order to obtain the likelihood. Its result is similar to the one obtained by the PI: the determination of regions, known as hotspots, that are most susceptible to the occurrence of events with M=5 and larger during the forecast period. As an initial test, retrospective forecasts for the southern California region from 1990 to 2000 were made with both the BPI and the PI techniques, and the results are discussed in this work.
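
    The Bayesian update at the heart of the BPI idea, posterior proportional to prior times likelihood evaluated cell by cell on a hotspot grid, reduces to a few lines; the maps below are random placeholders rather than actual PI or ETAS outputs.

```python
import numpy as np

rng = np.random.default_rng(3)
prior = rng.random((20, 20))          # stand-in for the PI hotspot map
prior /= prior.sum()
likelihood = rng.random((20, 20))     # stand-in for an ETAS/BASS rate map

posterior = prior * likelihood
posterior /= posterior.sum()          # Bayes' rule, cell by cell

# Hotspots: cells in the top 10% of posterior probability.
hotspots = posterior >= np.quantile(posterior, 0.9)
print(hotspots.sum(), "hotspot cells")
```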

  16. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks that is noninvasive, nondestructive, and applicable in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can be used to complement the results of high-energy-based techniques.

  17. Overview on techniques in cluster analysis.

    PubMed

    Frades, Itziar; Matthiesen, Rune

    2010-01-01

    Clustering is the unsupervised, semisupervised, and supervised classification of patterns into groups. The clustering problem has been addressed in many contexts and disciplines. Cluster analysis encompasses different methods and algorithms for grouping objects of similar kinds into respective categories. In this chapter, we describe a number of methods and algorithms for cluster analysis in a stepwise framework. The steps of a typical clustering analysis process include, sequentially: pattern representation, the choice of the similarity measure, the choice of the clustering algorithm, the assessment of the output, and the representation of the clusters.
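
    Those steps map directly onto a few lines of scikit-learn, shown here on synthetic two-cluster data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(5, 1, (50, 3))])

# (1) pattern representation and similarity measure: standardized
#     features with Euclidean distance
Xs = StandardScaler().fit_transform(X)
# (2) choice of clustering algorithm
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
# (3) assessment of the output
print("silhouette score:", silhouette_score(Xs, labels))
```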

  18. GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES

    SciTech Connect

    J. R. KAMM; ET AL

    2001-01-01

    The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i.e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. [13], which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.

  19. GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES.

    SciTech Connect

    Kamm, J. R.; Rider, William; Rightley, P. M.; Prestridge, K. P.; Benjamin, R. F.; Vorobieff, P. V.

    2001-01-01

    The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i.e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. [13], which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.
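
    Of the four spectral measures named above, the structure function is the easiest to state compactly: for a 1-D signal, the p-th order structure function is S_p(r) = <|f(x+r) - f(x)|^p>, and scaling exponents follow from a log-log fit against the separation r.

```python
import numpy as np

def structure_function(signal, p, separations):
    """p-th order structure function S_p(r) = <|f(x+r) - f(x)|^p>
    for a 1-D signal on a uniform grid."""
    return np.array([np.mean(np.abs(signal[r:] - signal[:-r]) ** p)
                     for r in separations])

x = np.linspace(0.0, 10.0 * np.pi, 4096)
rng = np.random.default_rng(5)
f = np.sin(x) + 0.3 * np.sin(7 * x) + rng.normal(0, 0.01, x.size)

seps = np.arange(1, 200, 5)
S2 = structure_function(f, 2, seps)
slope = np.polyfit(np.log(seps[:10]), np.log(S2[:10]), 1)[0]
print("small-separation scaling exponent:", slope)  # ~2 for a smooth signal
```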

  20. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The problem of small sample sizes encountered when dealing with the analysis of space-flight data is examined. Because of the small amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on the considerations needed to choose the most appropriate test for a given type of analysis.
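
    A small-sample comparison in this spirit, checking the normality assumption before choosing between a parametric and a nonparametric test, might look like this with invented six-point samples.

```python
import numpy as np
from scipy import stats

a = np.array([4.1, 3.8, 4.4, 4.0, 4.3, 3.9])   # invented condition A
b = np.array([4.6, 4.9, 4.4, 4.8, 5.0, 4.7])   # invented condition B

# Check the normality assumption first ...
print("Shapiro p-values:", stats.shapiro(a).pvalue, stats.shapiro(b).pvalue)

# ... then use Welch's t-test (no equal-variance assumption),
t, p = stats.ttest_ind(a, b, equal_var=False)
# or a nonparametric alternative if normality is doubtful.
u, p_u = stats.mannwhitneyu(a, b)
print(f"Welch t = {t:.2f} (p = {p:.4f}); Mann-Whitney p = {p_u:.4f}")
```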

  1. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify and give a precise reference in the use of a price analysis technique in order to establish a fair and reasonable price. DATES: Effective Date: July 22, 2013. FOR FURTHER INFORMATION CONTACT: Mr...

  2. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ...)(1), which sets forth the requirements of adequate price competition. However, only FAR 15.403-1(c)(1... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price....

  3. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  4. Assembly, checkout, and operation optimization analysis technique for complex systems

    NASA Technical Reports Server (NTRS)

    1968-01-01

    Computerized simulation model of a launch vehicle/ground support equipment system optimizes assembly, checkout, and operation of the system. The model is used to determine performance parameters in three phases or modes: (1) systems optimization techniques, (2) operation analysis methodology, and (3) systems effectiveness analysis techniques.

  5. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for...

  6. Soil Analysis using the semi-parametric NAA technique

    SciTech Connect

    Zamboni, C. B.; Silveira, M. A. G.; Medina, N. H.

    2007-10-26

    The semi-parametric Neutron Activation Analysis technique, using Au as a flux monitor, was applied to measure the elemental concentrations of Br, Ca, Cl, K, Mn and Na for soil characterization. The results were compared with those obtained using the Instrumental Neutron Activation Analysis technique and were found to be compatible. The viability, advantages, and limitations of using these two analytical methodologies are discussed.

  7. Typology of Delivery Quality: Latent Profile Analysis of Teacher Engagement and Delivery Techniques in a School-Based Prevention Intervention, "Keepin' It REAL" Curriculum

    ERIC Educational Resources Information Center

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may…

  8. Typology of Delivery Quality: Latent Profile Analysis of Teacher Engagement and Delivery Techniques in a School-Based Prevention Intervention, "Keepin' It REAL" Curriculum

    ERIC Educational Resources Information Center

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may…

  9. Development of Single-Nucleotide Polymorphism- Based Phylum-Specific PCR Amplification Technique: Application to the Community Analysis Using Ciliates as a Reference Organism

    PubMed Central

    Jung, Jae-Ho; Kim, Sanghee; Ryu, Seongho; Kim, Min-Seok; Baek, Ye-Seul; Kim, Se-Joo; Choi, Joong-Ki; Park, Joong-Ki; Min, Gi-Sik

    2012-01-01

    Despite recent advances in mass sequencing technologies such as pyrosequencing, assessment of culture-independent microbial eukaryote community structures using universal primers remains very difficult due to the tremendous richness and complexity of organisms in these communities. Use of a specific PCR marker targeting a particular group would provide enhanced sensitivity and more in-depth evaluation of microbial eukaryote communities compared to what can be achieved with universal primers. We discovered that many phylum- or group-specific single-nucleotide polymorphisms (SNPs) exist in small subunit ribosomal RNA (SSU rRNA) genes from diverse eukaryote groups. By applying this discovery to a known simple allele-discriminating (SAP) PCR method, we developed a technique that enables the identification of organisms belonging to a specific higher taxonomic group (or phylum) among diverse types of eukaryotes. We performed an assay using two complementary methods, pyrosequencing and clone library screening. In doing this, specificities for the group (ciliates) targeted in this study in bulked environmental samples were 94.6% for the clone library and 99.2% for pyrosequencing, respectively. In particular, our novel technique showed high selectivity for rare species, a feature that may be more important than the ability to identify quantitatively predominant species in community structure analyses. Additionally, our data revealed that a target-specific library (or ciliate-specific one for the present study) can better explain the ecological features of a sampling locality than a universal library. PMID:22965748

  10. Apprenticeship Learning Techniques for Knowledge Based Systems

    DTIC Science & Technology

    1988-12-01

    domain, such as medicine. The Odysseus explanation-based learning program constructs explanations of problem-solving actions in the domain of medical...theories and empirical methods so as to allow construction of an explanation. The Odysseus learning program provides the first demonstration of using the... Odysseus explanation-based learning program is presented, which constructs explanations of human problem-solving actions in the domain of medical diagnosis.

  11. Visualization techniques for malware behavior analysis

    NASA Astrophysics Data System (ADS)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via the Internet is a major security threat, so studying malware behavior is important for identifying and classifying it. Using SSDT hooking, we can capture malware behavior by running the malware in a controlled environment and recording its interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be compared with those of other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.
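
    Converting a captured event chain into an activity graph is straightforward with networkx; the events below are hypothetical examples of the file, process, registry and network interactions mentioned above.

```python
import networkx as nx

# Hypothetical event chain captured from a sandboxed run.
events = [
    ("malware.exe", "created",   r"C:\temp\payload.dll"),
    ("malware.exe", "wrote",     r"HKLM\Software\Run"),
    ("malware.exe", "spawned",   "svchost.exe"),
    ("svchost.exe", "connected", "10.0.0.5:8080"),
]

G = nx.DiGraph()
for src, action, dst in events:
    G.add_edge(src, dst, label=action)   # behavior as a directed graph

print(G.number_of_nodes(), "nodes;", G.number_of_edges(), "edges")
# Two samples can then be compared, e.g. by their shared edges:
# shared = set(G.edges()) & set(G_other.edges())
```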

  12. Digital imaging techniques in experimental stress analysis

    NASA Technical Reports Server (NTRS)

    Peters, W. H.; Ranson, W. F.

    1982-01-01

    Digital imaging techniques are utilized to measure surface displacement components in laser speckle metrology. An image scanner interfaced to a computer records and stores in memory the laser speckle patterns of an object in a reference and a deformed configuration. Subsets of the deformed images are numerically correlated with the reference as a measure of surface displacements. Discrete values are determined around a closed contour for plane problems, which then become input to a boundary integral equation method in order to calculate the surface tractions on the contour. Stresses are then calculated within this boundary. The solution procedure is illustrated by a numerical example of a case of uniform tension.
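
    The core correlation step can be sketched with an FFT-based cross-correlation that recovers the integer-pixel displacement of a speckle subset; subpixel refinement and the boundary integral step are omitted.

```python
import numpy as np

def subset_displacement(reference, deformed):
    """Integer-pixel shift of `deformed` relative to `reference`,
    located at the peak of the FFT-based cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(deformed) *
                        np.conj(np.fft.fft2(reference))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = reference.shape
    # Map wrap-around indices to signed shifts.
    return (dy - ny if dy > ny // 2 else dy,
            dx - nx if dx > nx // 2 else dx)

rng = np.random.default_rng(6)
speckle = rng.random((64, 64))                    # reference subset
moved = np.roll(speckle, (3, -2), axis=(0, 1))    # "deformed" subset
print(subset_displacement(speckle, moved))        # -> (3, -2)
```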

  13. Trends and Techniques for Space Base Electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1979-01-01

    Simulations of various phosphorus and boron diffusions in SOS were completed, and a sputtering system, furnaces, and photolithography-related equipment were set up. Double-layer metal experiments initially utilized wet chemistry techniques. By incorporating ultrasonic etching of the vias, premetal cleaning with a modified buffered HF, phosphorus-doped vapox, and extended sintering, yields of 98% were obtained using the standard test pattern. A two-dimensional modeling program was written for simulating short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption used is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. Although the program is incomplete, a solution of the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2-D MOSFET simulation programs is summarized.

  14. Morphometric techniques for orientation analysis of karst in northern Florida

    SciTech Connect

    Jenkins, D.T.; Beck, B.F.

    1985-01-01

    Morphometric techniques for the analysis of karst landscape orientation data based on swallet catchment areas can be highly inadequate. The long axes of catchment areas may not coincide with structural control, especially in regions having very low relief. Better structural correlation was observed using multiple linear trend measurements of closed depressions rather than drainage basins. Trend analysis was performed on four areas, approximately 25 km² each, forming a sequence from the Suwannee River to the Cody Escarpment in northern Florida. This area is a karst plain, mantled by 12 to 25 meters of unconsolidated sands and clays. Structural control was examined by tabulating the azimuths of distinct linear trends as determined from depression shape based on 1:24,000 topographic maps. The topography was characterized by 1872 individual swallet catchment areas or 1457 closed depressions. The common geomorphic technique of analyzing orientation data in 10° increments beginning at 0° may yield incorrect peak width and placement. To correctly detect all significant orientation peaks, all possible combinations of peak width and placement must be tested. Fifty-five different plots were reviewed and tested for each area.
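
    The exhaustive peak search described above, testing every combination of bin width and bin placement rather than fixed 10° bins starting at 0°, can be sketched as follows on hypothetical azimuth data.

```python
import numpy as np

# Hypothetical depression long-axis azimuths in degrees (0-180).
azimuths = np.random.default_rng(7).uniform(0, 180, 300)

best = None
for width in (5, 10, 15, 20, 30):              # candidate peak widths (deg)
    edges = np.arange(0, 180 + width, width)
    for offset in range(width):                # every bin placement
        shifted = (azimuths - offset) % 180
        counts, _ = np.histogram(shifted, bins=edges)
        peak = int(counts.max())
        if best is None or peak > best[0]:
            best = (peak, width, offset)
print("strongest peak: count=%d, width=%d deg, offset=%d deg" % best)
```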

  15. Uncertainty analysis technique for OMEGA Dante measurements.

    PubMed

    May, M J; Widmann, K; Sorce, C; Park, H-S; Schneider, M

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
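
    The Monte Carlo parameter variation itself is simple to sketch; unfold below is only a placeholder for the actual Dante unfold algorithm, and the voltages and error levels are invented.

```python
import numpy as np

rng = np.random.default_rng(8)
n_channels, n_trials = 18, 1000

v_measured = rng.uniform(0.5, 2.0, n_channels)   # nominal channel voltages
sigma_cal = 0.05 * v_measured                    # 1-sigma calibration errors

def unfold(voltages):
    """Placeholder for the unfold algorithm that maps the 18 filtered
    diode voltages to a spectrum and a total flux."""
    return voltages.sum()

# Perturb each channel within its Gaussian error, re-unfold, and take
# statistics over the trials.
fluxes = np.array([unfold(rng.normal(v_measured, sigma_cal))
                   for _ in range(n_trials)])
print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std():.3f} (arbitrary units)")
```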

  16. Advanced Imaging Techniques for Multiphase Flows Analysis

    NASA Astrophysics Data System (ADS)

    Amoresano, A.; Langella, G.; Di Santo, M.; Iodice, P.

    2017-08-01

    Advanced numerical techniques, such as fuzzy logic and neural networks, have been applied in this work to digital images acquired in two applications, a centrifugal pump and a stationary spray, in order to define, in a stochastic way, the evolution of the gas-liquid interface. Starting from the numeric matrix representing the image, it is possible to characterize geometrical parameters and the time evolution of the jet. The algorithm uses fuzzy logic concepts to binarize the pixel chromaticity, exploiting the difference in light scattering between the gas and the liquid phase. Starting from a primary fixed threshold, the applied technique can separate the 'gas' pixels from the 'liquid' pixels, so the first most probable boundary lines of the spray can be defined. By acquiring the images continuously at a fixed frame rate, a finer threshold can be selected and, in the limit, the most probable geometrical parameters of the jet can be detected.
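
    A minimal sketch of the fuzzy binarization step, assuming a sigmoid membership function on pixel grey level; the actual membership functions and threshold-refinement schedule of the work are not reproduced.

```python
import numpy as np

def fuzzy_binarize(image, threshold, softness=10.0):
    """Sigmoid membership of each pixel to the 'liquid' class based on
    grey level; pixels are labelled by the larger membership."""
    mu_liquid = 1.0 / (1.0 + np.exp(-(image - threshold) / softness))
    return mu_liquid > 0.5, mu_liquid

rng = np.random.default_rng(9)
frame = rng.normal(100, 30, (128, 128)).clip(0, 255)   # stand-in frame

liquid_mask, mu = fuzzy_binarize(frame, threshold=110.0)
# A first boundary estimate: places where the phase label changes.
edges = liquid_mask[1:, :] != liquid_mask[:-1, :]
print("liquid fraction:", liquid_mask.mean(), "| edge pixels:", int(edges.sum()))
```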

  17. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis

    PubMed Central

    Lambalk, Joep; Ottiger, Marcel

    2016-01-01

    Introduction An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a common method for cellular characterisation in microbiology and medicine during the last decade. The aim of this study is to demonstrate the potential of IFC in plant cell analysis with the focus on pollen. Method Developing and mature pollen grains were analysed during their passage through a microfluidic chip to which radio frequencies of 0.5 to 12 MHz were applied. The acquired data provided information about the developmental stage, viability, and germination capacity. The biological relevance of the acquired IFC data was confirmed by classical staining methods, inactivation controls, as well as pollen germination assays. Results Different stages of developing pollen, dead, viable and germinating pollen populations could be detected and quantified by IFC. Pollen viability analysis by classical FDA staining showed a high correlation with IFC data. In parallel, pollen with active germination potential could be discriminated from the dead and the viable but non-germinating population. Conclusion The presented data demonstrate that IFC is an efficient, label-free, reliable and non-destructive technique to analyse pollen quality in a species-independent manner. PMID:27832091

  18. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis.

    PubMed

    Heidmann, Iris; Schade-Kampmann, Grit; Lambalk, Joep; Ottiger, Marcel; Di Berardino, Marco

    2016-01-01

    An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a common method for cellular characterisation in microbiology and medicine during the last decade. The aim of this study is to demonstrate the potential of IFC in plant cell analysis with the focus on pollen. Developing and mature pollen grains were analysed during their passage through a microfluidic chip to which radio frequencies of 0.5 to 12 MHz were applied. The acquired data provided information about the developmental stage, viability, and germination capacity. The biological relevance of the acquired IFC data was confirmed by classical staining methods, inactivation controls, as well as pollen germination assays. Different stages of developing pollen, dead, viable and germinating pollen populations could be detected and quantified by IFC. Pollen viability analysis by classical FDA staining showed a high correlation with IFC data. In parallel, pollen with active germination potential could be discriminated from the dead and the viable but non-germinating population. The presented data demonstrate that IFC is an efficient, label-free, reliable and non-destructive technique to analyse pollen quality in a species-independent manner.

  19. BaTMAn: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
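
    The core of the tessellation step can be sketched as follows: adjacent elements are merged whenever their signals agree within the combined errors. The fragment below is a greedy 1D toy version assuming Gaussian errors, not the published implementation (which iterates over 2D/3D data).

        import numpy as np

        def merge_consistent(values, errors, n_sigma=1.0):
            # Join two adjacent elements when their signals agree within
            # the quadrature-summed errors, i.e. they are statistically
            # consistent with carrying the same information.
            labels = np.arange(len(values))
            for i in range(len(values) - 1):
                same = abs(values[i] - values[i + 1]) <= n_sigma * np.hypot(errors[i], errors[i + 1])
                if same:
                    labels[labels == labels[i + 1]] = labels[i]
            return labels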

  20. Structural analysis of box beams using symbolic manipulation technique

    NASA Astrophysics Data System (ADS)

    Sathyamoorthy, M.; Sirigiri, Ravindra

    1993-04-01

    The aeroelastic analysis of aircraft wings requires an accurate determination of the influence coefficients. In the past, energy methods have been commonly used to analyze box-type structures and the results have been found to agree well with the experiments. However, when analysis of large wing-type structures is desired, it becomes necessary to automate the energy method. In this article, a method has been developed based on symbolic manipulation as an automated technique to find solutions to box-type structures. Various manipulations required for the energy method have been automatically implemented in a computer program with solutions available at each stage in a symbolic form. The numerical results for several example problems have been compared with alternate theoretical as well as experimental results. Good agreement has been noted in all the cases considered in this article.

  1. A methodological comparison of customer service analysis techniques

    Treesearch

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  2. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...

  3. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...

  4. Injection Locking Techniques for Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Gathma, Timothy D.; Buckwalter, James F.

    2011-04-01

    Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks high-Q passives and wideband resonator tunability that is necessary for heterodyne implementations of spectrum analyzers. As an alternative to the heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.

  5. Injection Locking Techniques for Spectrum Analysis

    SciTech Connect

    Gathma, Timothy D.; Buckwalter, James F.

    2011-04-19

    Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks high-Q passives and wideband resonator tunability that is necessary for heterodyne implementations of spectrum analyzers. As an alternative to the heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.

  6. Automated fluid analysis apparatus and techniques

    DOEpatents

    Szecsody, James E.

    2004-03-16

    An automated device couples a pair of differently sized sample loops to a syringe pump and a source of degassed water. A fluid sample is mounted at an inlet port and delivered to the sample loops. A selected sample from the sample loops is diluted in the syringe pump with the degassed water and fed to a flow-through detector for analysis. The sample inlet is also directly connected to the syringe pump to selectively perform analysis without dilution. The device is airtight and is used to detect oxygen-sensitive species, such as dithionite in groundwater following a remedial injection to treat soil contamination.

  7. Accelerator based techniques for contraband detection

    NASA Astrophysics Data System (ADS)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays have the ability to penetrate various materials to large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and have the ability to image the object with an appreciable degree of reliability. Neutron-induced reactions such as (n, γ), (n, n'), and (n, p), or proton-induced γ-resonance absorption, are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed and their advantages, characteristics, and current progress are shown. Areas where the use of these methods is currently under evaluation include detection of hidden explosives, illicit drug interdiction, identification of chemical warfare agents, nuclear waste assay, nuclear weapons destruction, and others.

  8. Comparison of Hydrogen Sulfide Analysis Techniques

    ERIC Educational Resources Information Center

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  9. Solution Techniques in Finite Element Analysis.

    DTIC Science & Technology

    1983-05-01

    We show a plane strain rubber block subjected to large deformation. We employ a 4-node element and a Mooney-Rivlin material as described in ... (Figure 7: Large Deformation Analysis of the Rubber Block with the Mooney-Rivlin Material Model; the accompanying geometry and node listing is not recoverable.)

  10. Comparison of Hydrogen Sulfide Analysis Techniques

    ERIC Educational Resources Information Center

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  11. Permethylation Linkage Analysis Techniques for Residual Carbohydrates

    USDA-ARS?s Scientific Manuscript database

    Permethylation analysis is the classic approach to establishing the position of glycosidic linkages between sugar residues. Typically, the carbohydrate is derivatized to form acid-stable methyl ethers, hydrolyzed, peracetylated, and analyzed by gas chromatography-mass spectrometry (GC-MS). The pos...

  12. Gearbox diagnostics using wavelet-based windowing technique

    NASA Astrophysics Data System (ADS)

    Omar, F. K.; Gaouda, A. M.

    2009-08-01

    In extracting gearbox acoustic signals embedded in excessive noise, an online and automated tool becomes a crucial necessity. One of the recent approaches that has gained some acceptance within the research arena is wavelet multi-resolution analysis (WMRA). However, selecting an accurate mother wavelet, defining dynamic threshold values, and identifying the resolution levels to be considered in gearbox fault detection and diagnosis are still challenging tasks. This paper proposes a novel wavelet-based technique for detecting, locating, and estimating the severity of defects in gear tooth fracture. The proposed technique enhances the WMRA by decomposing the noisy data into different resolution levels while sliding the data through a Kaiser window. Only the maximum expansion coefficients at each resolution level are used in de-noising, detecting, and measuring the severity of the defects. A small set of coefficients is used in the monitoring process without assigning threshold values or performing signal reconstruction. The proposed monitoring technique has been applied to laboratory data corrupted with a high noise level.
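
    A minimal sketch of the enhanced WMRA step described above, using PyWavelets: the sliding data block is tapered with a Kaiser window, decomposed, and only the maximum coefficient magnitude per detail level is kept. The mother wavelet ('db4'), the number of levels, and the Kaiser beta are assumptions; the paper itself notes that selecting the mother wavelet is a challenge.

        import numpy as np
        import pywt
        from scipy.signal.windows import kaiser

        def level_maxima(block, wavelet="db4", levels=4, beta=8.6):
            # Taper the current sliding data block with a Kaiser window,
            # then keep only the largest-magnitude expansion coefficient
            # at each detail level; these few numbers drive detection and
            # severity estimation without thresholds or reconstruction.
            tapered = block * kaiser(len(block), beta)
            coeffs = pywt.wavedec(tapered, wavelet, level=levels)
            return [float(np.max(np.abs(c))) for c in coeffs[1:]]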

  13. Analysis techniques for background rejection at the Majorana Demonstrator

    SciTech Connect

    Cuesta, Clara; Rielage, Keith Robert; Elliott, Steven Ray; Xu, Wenqin; Goett, John Jerome III

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.
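
    Structurally, the listed background cuts amount to a logical AND of per-event selections. The sketch below shows that structure only; every field name and cut value is a placeholder, not a collaboration parameter.

        import numpy as np

        def passes_cuts(events):
            # events: NumPy structured array; all field names and cut
            # values here are placeholders for illustration.
            ok_clean = events["dq_flag"] == 0      # data-reduction cuts
            ok_psa = events["avse"] > -1.0         # pulse shape analysis
            ok_mult = events["n_hits"] == 1        # event coincidences
            ok_time = events["dt_prev_s"] > 1.0    # time correlations
            return ok_clean & ok_psa & ok_mult & ok_time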

  14. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    SciTech Connect

    Cuesta, C; Abgrall, N.; Arnquist, I. J.; Avignone, III, F. T.; Barabash, A.S.; Bertrand, F. E.; Bradley, A. W.; Brudanin, V.; Busch, M.; Byram, D.; Caldwell, A. S.; Chan, Y-D; Christofferson, C. D.; Detwiler, J. A.; Efremenko, Yu.; Ejiri, H.; Elliott, S. R.; Galindo-Uribarri, A.; Gilliss, T.; Green, M. P.; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R.; Howard, S.; Howe, M. A.; Keeter, K.J.; Kidd, M. F.; Konovalov, S.I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J.; MacMullin, J.; Meijer, S. J.; Orrell, J. L.; O'Shaughnessy, C.; Radford, D. C.; Rager, J.; Robertson, R.G.H.; Romero-Romero, E.; Snyder, N; Suriano, A. M.; Tedeschi, D; Trimble, J. E.; Vasilyev, S.; Vetter, K.; et al.

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  15. Visualization techniques for tongue analysis in traditional Chinese medicine

    NASA Astrophysics Data System (ADS)

    Pham, Binh L.; Cai, Yang

    2004-05-01

    Visual inspection of the tongue has been an important diagnostic method in Traditional Chinese Medicine (TCM). Clinical data have shown significant connections between various visceral cancers and abnormalities in the tongue and the tongue coating. Visual inspection of the tongue is simple and inexpensive, but the current practice in TCM is mainly experience-based and the quality of the visual inspection varies between individuals. The computerized inspection method provides quantitative models to evaluate color, texture and surface features on the tongue. In this paper, we investigate visualization techniques and processes to allow interactive data analysis, with the aim of merging computerized measurements with human experts' diagnostic variables based on a five-category diagnostic scale: Healthy (H), History of Cancers (HC), History of Polyps (HP), Polyps (P) and Colon Cancer (C).

  16. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    SciTech Connect

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H.; Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P.; Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L.; Avignone, F. T.; Baldenegro-Barrera, C. X.; Bertrand, F. E.; and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  17. Application of principal component analysis for improvement of X-ray fluorescence images obtained by polycapillary-based micro-XRF technique

    NASA Astrophysics Data System (ADS)

    Aida, S.; Matsuno, T.; Hasegawa, T.; Tsuji, K.

    2017-07-01

    Micro X-ray fluorescence (micro-XRF) analysis is performed repeatedly as a means of producing elemental maps. In some cases, however, the XRF images obtained for trace elements are not clear due to high background intensity. To solve this problem, we applied principal component analysis (PCA) to the XRF spectra, focusing on improving the quality of the XRF images. XRF images of the dried residue of a standard solution on a glass substrate were taken, and the XRF intensities for the dried residue were analyzed before and after PCA. Standard deviations of XRF intensities in the PCA-filtered images were reduced, leading to clearer contrast in the images. This improvement of the XRF images was effective in cases where the XRF intensity was weak.
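
    The PCA filtering described here can be sketched with scikit-learn: stack the per-pixel spectra, keep the leading components, and invert the transform. The number of components retained (5) is an assumption for illustration.

        import numpy as np
        from sklearn.decomposition import PCA

        def pca_filter(cube, n_components=5):
            # cube: (ny, nx, n_channels) micro-XRF spectrum per map pixel.
            ny, nx, nch = cube.shape
            pca = PCA(n_components=n_components)
            scores = pca.fit_transform(cube.reshape(-1, nch))
            # Reconstructing from the leading components suppresses the
            # high background that hides weak trace-element lines.
            return pca.inverse_transform(scores).reshape(ny, nx, nch)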

  18. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes, and rising requirements for quality and service have forced a change in the design and application of maintenance, as well as in the way it is considered within managerial strategy. There are numerous maintenance activities that must be carried out in a service company; as a result, the maintenance functions as a whole often have to be outsourced. Nevertheless, delegating this work to specialized personnel does not exempt the company from responsibility, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on expert systems. The expert system uses rules together with the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, into rule-based expert systems.
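
    The SMART weighting used by the expert system reduces, in its simplest additive form, to normalized direct scores combined with ratings. A minimal sketch, with invented criteria, scores, and ratings:

        def smart_weights(scores):
            # SMART: each criterion receives a direct importance score;
            # the weights are the scores normalised to sum to one.
            total = sum(scores.values())
            return {name: s / total for name, s in scores.items()}

        # Hypothetical audit: decision functions and 0-10 ratings invented.
        criteria = {"preventive": 40.0, "corrective": 25.0, "documentation": 20.0, "spares": 15.0}
        ratings = {"preventive": 7.0, "corrective": 5.0, "documentation": 8.0, "spares": 6.0}
        weights = smart_weights(criteria)
        maintenance_state = sum(weights[k] * ratings[k] for k in weights)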

  19. [Applicability of laser-based geological techniques in bone research: analysis of calcium oxide distribution in thin-cut animal bones].

    PubMed

    Andrássy, László; Maros, Gyula; Kovács, István János; Horváth, Ágnes; Gulyás, Katalin; Bertalan, Éva; Besnyi, Anikó; Füri, Judit; Fancsik, Tamás; Szekanecz, Zoltán; Bhattoa, Harjit Pal

    2014-11-09

    The structural similarities between the inorganic component of bone tissue and geological formations make it possible to use mathematical models to determine the weight percentage composition of the different mineral element oxides constituting the inorganic component of bone tissue. The calculated weight percentage composition can be verified by determining element oxide concentration values with laser-induced plasma spectroscopy and inductively coupled plasma optical emission spectrometry. From the calculated weight percentage composition of the inorganic component of bone tissue and the laboratory analyses, it can be concluded that the properties of bone tissue are determined primarily by hydroxylapatite. The inorganic bone structure can be studied well by determining the calcium oxide concentration distribution using the laser-induced plasma spectroscopy technique. In the present study, thin polished bone slides prepared from male bovine tibia were examined with laser-induced plasma spectroscopy in a regular network and combined sampling system to derive the calculated calcium oxide concentration distribution. The superficial calcium oxide concentration distribution, as supported by frequency-distribution curves, can be categorized into a number of groups; this helps in clearly demarcating the cortical and trabecular bone structures. Following analyses of bovine tibial bone, the authors found a positive association between the attenuation value determined by quantitative computed tomography and the "ρ" density as used in geology. Furthermore, the calculated "ρ" density and the measured average calcium oxide concentration values showed an inverse correlation.

  20. Autoregressive and bispectral analysis techniques: EEG applications.

    PubMed

    Ning, T; Bronzino, J D

    1990-01-01

    Some basic properties of autoregressive (AR) modeling and bispectral analysis are reviewed, and examples of their application in electroencephalography (EEG) research are provided. A second-order AR model was used to score cortical EEGs in order to distinguish between different vigilance states such as quiet-waking (QW), rapid-eye-movement (REM) sleep, and slow-wave sleep (SWS). In tests performed on five adult rats, SWS activity was correctly identified over 96% of the time, and a 95% agreement rate was achieved in recognizing the REM sleep stage. In a bispectral analysis of the rat EEG, third-order cumulant (TOC) sequences of 32 epochs belonging to the same vigilance state were estimated and then averaged. Preliminary results have shown that bispectra of hippocampal EEGs during REM sleep exhibit significant quadratic phase couplings between frequencies in the 6-8-Hz range, associated with the theta rhythm.
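
    A second-order AR fit of an EEG epoch can be obtained from the Yule-Walker equations; the two coefficients then serve as features for scoring vigilance states. A minimal sketch under that assumption (the abstract does not specify the scoring rule itself):

        import numpy as np

        def ar2_features(epoch):
            # Yule-Walker estimate of a second-order AR model: solve
            # R a = r for (a1, a2) from the sample autocorrelations.
            x = epoch - epoch.mean()
            r = np.array([x[:len(x) - k] @ x[k:] for k in range(3)]) / len(x)
            R = np.array([[r[0], r[1]],
                          [r[1], r[0]]])
            return np.linalg.solve(R, r[1:])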

  1. A comparison of wavelet analysis techniques in digital holograms

    NASA Astrophysics Data System (ADS)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis, and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques: mean filtering, median filtering, and Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation, and resolution preservation.

  2. Envelopment technique and topographic overlays in bite mark analysis

    PubMed Central

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    Aims and Objectives: The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Materials and Methods: Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. Results: The scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare the four sequential overlays, and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Conclusion: Through our study, we conclude that the third and fourth cuts were the best among the four cuts, and that inter- and intraoperator reliability were statistically significant at the 5% level, that is, at a 95% confidence interval (P < 0.05). PMID:26816458

  3. In situ hybridization freeze-assisted punches (IFAP): technique for liquid-based tissue extraction from thin slide-mounted sections for DNA methylation analysis.

    PubMed

    Own, Lawrence S; Patel, Paresh D

    2014-01-01

    In situ hybridization freeze-assisted punches (IFAP) are a low-cost method for extracting tissue from frozen slide-mounted sections as thin as 12 μm. The method synergizes well with standard histological workflows and uses in situ hybridization to target corresponding slide-mounted cryosections that contain the region of interest. Liquid beads of M-1 embedding matrix are applied and snap frozen, binding the matrix to the underlying tissue. Bead-tissue complexes are removed and DNA is extracted using a high-salt method. IFAP-extracted DNA is suitable for downstream DNA methylation analysis.

  4. Multiwavelet-transform-based image compression techniques

    NASA Astrophysics Data System (ADS)

    Rao, Sathyanarayana S.; Yoon, Sung H.; Shenoy, Deepak

    1996-10-01

    Multiwavelet transforms are a new class of wavelet transforms that use more than one prototype scaling function and wavelet in the multiresolution analysis/synthesis. The popular Geronimo-Hardin-Massopust multiwavelet basis functions have properties of compact support, orthogonality, and symmetry which cannot be obtained simultaneously in scalar wavelets. The performance of multiwavelets in still image compression is studied using vector quantization of multiwavelet subbands with a multiresolution codebook. The coding gain of multiwavelets is compared with that of other well-known wavelet families using performance measures such as unified coding gain. Implementation aspects of multiwavelet transforms such as pre-filtering/post-filtering and symmetric extension are also considered in the context of image compression.

  5. Analysis of aerosol transport over southern Poland in August 2015 based on a synergy of remote sensing and backward trajectory techniques

    NASA Astrophysics Data System (ADS)

    Szkop, Artur; Pietruczuk, Aleksander

    2017-01-01

    Lufft's CHM 15k "Nimbus" ceilometer and a collocated Cimel sunphotometer were used, in tandem with satellite data, to observe the transport of atmospheric aerosols over Raciborz, Poland, during the exceptionally warm month of August 2015. Two distinct periods are identified: increased aerosol optical thickness (AOT) values, exceeding 0.65, during August 6 to 13, concurrent with a planetary boundary layer elevated up to ~2.5 km; and thin aerosol layers up to 7 km above ground visible from August 25 to 27. A newly developed scheme for backward air mass trajectory analysis is employed. The scheme utilizes satellite data on thermal anomalies as well as multiangle imaging of aerosol clouds. The obtained retrievals provide evidence that aerosols of the biomass burning type were present during the first period, originating from a strong episode of wildfires in Ukraine. Moreover, satellite AOT data from the MODIS instrument show that a significant part of the observed aerosol was accumulated during transport between the region of intense biomass burning and the receptor, Raciborz. The same analysis scheme suggests that long-range transport of biomass burning products from the United States was the source of the layers observed during the second period.

  6. Techniques for geothermal liquid sampling and analysis

    SciTech Connect

    Kindle, C.H.; Woodruff, E.M.

    1981-07-01

    A methodology has been developed that is particularly suited to liquid-dominated resources and adaptable to a variety of situations. It is intended to be a base methodology upon which variations can be made to meet specific needs or situations. The approach consists of recording flow conditions at the time of sampling, a specific insertable probe sampling system, a sample stabilization procedure, commercially available laboratory instruments, and data quality check procedures.

  7. Diesel Combustion Analysis Using Rapid Sampling Techniques.

    DTIC Science & Technology

    1982-08-01

    fuel injection is stopped and the engine valves are deactivated. The quench tank, after a dump, contains 80 to 90 percent of the cylinder gas. This...and was based on an I.H.C. four-valve prototype head. This head was operated with only two valves, using specially designed valve gear. The other two... valve locations were modified to become instrumentation ports. A number of factors caused this first modified head to fall short of the research needs

  8. Advanced Experimental Techniques in Crack Tip Analysis.

    DTIC Science & Technology

    1983-06-01

    The shear stress in the xy plane is then given by σ_xy = (B/2) sin 2θ (Eq. 3). Clark, Mignogna and Sanford [18] used the above relations to measure the ... directly using crack tip measurements, in contrast to the ASTM-designated far-field procedure, which is based on many simplifying assumptions. ... "Effect of Crack Front Curvature in an ASTM Compact Tension Specimen", Proc. of the Fourth Brazilian Congress of Mechanical Engineering, 1977, pp. 13-26.

  9. FDI and Accommodation Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by means of a trained backpropagation NN, which is then applied to residual generation. Process supervision is applied to discriminate faults in process sensors and in process plant parameters. A rule-based expert system implements the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system that could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA when applied to a heat exchanger.
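
    The residual-generation and rule-based decision structure described above can be sketched in a few lines; the signal names, thresholds, and rules below are invented to show the shape of the computation, not the paper's actual supervision logic.

        def fdi_step(measured, predicted, thresholds):
            # Residual generation: compare each plant output with the
            # value the trained backpropagation NN predicts for it.
            flags = {k: abs(measured[k] - predicted[k]) > thresholds[k] for k in measured}
            # Rule-based decision making; rules invented for a
            # heat-exchanger-like plant, only to show the structure.
            if flags.get("outlet_temp") and not flags.get("flow"):
                return "sensor fault: outlet temperature"
            if flags.get("outlet_temp") and flags.get("flow"):
                return "plant parameter fault: fouling suspected"
            return "healthy"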

  10. Speckle-adaptive VISAR fringe analysis technique

    NASA Astrophysics Data System (ADS)

    Erskine, David

    2017-01-01

    A line-VISAR (velocity interferometer) is an important diagnostic in shock physics, simultaneously measuring many fringe histories of adjacent portions of a target splayed along a line, with fringes recorded versus time and space by a streak camera. Due to laser speckle, the reflected intensity may be uneven spatially, and due to irregularities in the streak camera electron optics, the phase along the slit may be slightly nonlinear. Conventional fringe analysis algorithms which do not properly model these variations can suffer from inferred velocity errors. A speckle-adaptive algorithm has been developed which senses the interferometer and illumination parameters for each individual row (spatial position Y) of the 2D interferogram, so that the interferogram can be compensated for Y-dependent nonfringing intensity, fringe visibility, and nonlinear phase distribution. In numerical simulations and on actual data, we have found that this individual row-by-row modeling improves the accuracy of the result compared to a conventional column-by-column analysis approach.

  11. [The test system to identify mucin MUC1 in human blood serum using the technique of immune-enzyme analysis based on monoclonal antibody ICO25].

    PubMed

    Karmakova, T A; Vorontsova, M S; Skripnik, V V; Bezborodova, O A; Iakubovskaia, R I

    2012-02-01

    On the basis of the original mouse monoclonal antibody ICO25, the test system IEA ICO25 was developed and standardized to quantitatively detect the tumor-associated antigen mucin 1 (MUC1) in human blood serum in an inhibitory immune-enzyme analysis format. The analytical characteristics of the test system correspond to the standards applied to immune-enzyme diagnostic kits. The results of MUC1 detection in the blood serum of healthy donors and female patients with breast pathology using IEA ICO25 correlate fully with data on the detection of the antigen CA15-3 obtained using certified commercial kits. The test system IEA ICO25 can be used to detect MUC1 in human blood serum for research purposes.

  12. BATMAN: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.

  13. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    PubMed

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded speckle intensity distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their cores' structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture in the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography over an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.
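
    One common way to build the pseudophase map from a single intensity pattern is a spiral-phase (Riesz-type) transform in the Fourier domain; vortices appear as phase singularities of the result. A sketch under that assumption (the authors' exact variant of the Riesz integral transform may differ):

        import numpy as np

        def pseudophase(intensity):
            # Weight the 2D spectrum by exp(i*atan2(ky, kx)) and take the
            # argument of the back-transform; the singularities of the
            # resulting pseudophase map mark the optical vortices.
            I = intensity - intensity.mean()
            ky = np.fft.fftfreq(I.shape[0])[:, None]
            kx = np.fft.fftfreq(I.shape[1])[None, :]
            spiral = np.exp(1j * np.arctan2(ky, kx))
            return np.angle(np.fft.ifft2(spiral * np.fft.fft2(I)))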

  14. Arts-based data collection techniques used in child research.

    PubMed

    Driessnack, Martha; Furukawa, Ryoko

    2012-01-01

    The purpose of this study was to identify the different arts-based techniques being used in health-related research with children. A systematic survey of literature was conducted. Two hundred and ten articles were initially identified and reviewed. Of these, 116 met inclusion criteria of arts-based techniques in research with children 7-12 years of age. The different categories of techniques identified included (a) drawings, (b) photographs, (c) graphics, and (d) artifacts. Only 19% of the studies were health related. Further, 79% were conducted outside the United States, revealing that arts-based techniques appear to be underused by nurses and other healthcare researchers, especially in the United States. To ensure that children actively engage in research involving them, nurses can familiarize themselves with and advocate for the use of arts-based techniques. © 2011, Wiley Periodicals, Inc.

  15. Techniques of DNA methylation analysis with nutritional applications.

    PubMed

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from single nucleotide to global measurement depending on the study goal and scope. In addition, this study highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments concerning epigenetic technologies are showing promising results of DNA methylation levels at a single-base resolution and provide the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profile from microarray or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the best method fitting for their nutritional research interests.

  16. Partitioning Strategy Using Static Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Seo, Yongjin; Soo Kim, Hyeon

    2016-08-01

    Flight software is the software used in a satellite's on-board computer. It has requirements such as real-time operation and reliability, and the IMA (Integrated Modular Avionics) architecture is used to satisfy them. The IMA architecture introduces the concept of partitions, and this has affected the configuration of flight software: situations arise in which software that had been loaded on one system must be divided into many partitions when it is loaded. For this new issue, existing studies use experience-based partitioning methods; however, these methods have the problem that they cannot be reused. In this respect, this paper proposes a partitioning method that is reusable and consistent.

  17. Improvement of Rocket Engine Plume Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Smith, S. D.

    1982-01-01

    A nozzle plume flow field code was developed. The RAMP code, which was chosen as the basic code, is of modular construction and has the following capabilities: a two-phase transonic solution; a two-phase, reacting-gas (chemical equilibrium and reaction kinetics), supersonic inviscid nozzle/plume solution; and operation for inviscid solutions at both high and low altitudes. The following capabilities were added to the code: a direct interface with the JANNAF SPF code; a shock-capturing finite difference numerical operator; a two-phase, equilibrium/frozen boundary layer analysis; a variable oxidizer-to-fuel-ratio transonic solution; an improved two-phase transonic solution; and a two-phase, real-gas, semiempirical nozzle boundary layer expansion.

  18. Comparison Of Four FFT-Based Frequency-Acquisition Techniques

    NASA Technical Reports Server (NTRS)

    Shah, Biren N.; Hinedi, Sami M.; Holmes, Jack K.

    1993-01-01

    This report presents a comparative theoretical analysis of four conceptual techniques for initial estimation of the carrier frequency of a suppressed-carrier, binary-phase-shift-keyed radio signal. Each technique is effected by an open-loop analog/digital signal-processing subsystem that forms part of a Costas-loop phase-error detector functioning in a closed-loop manner overall.
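
    As an illustration of open-loop FFT-based frequency acquisition for a suppressed-carrier BPSK signal: squaring the signal removes the ±1 data modulation and produces a spectral line at twice the carrier, which an FFT peak search can locate. This is a generic estimator, not necessarily one of the four techniques compared in the report.

        import numpy as np

        def coarse_carrier_estimate(x, fs):
            # Square the signal to strip the BPSK modulation, then find
            # the strongest spectral line, which sits at 2*fc.
            spectrum = np.abs(np.fft.rfft(x * x))
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            return freqs[np.argmax(spectrum[1:]) + 1] / 2.0  # skip DC bin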

  19. Validation of Design and Analysis Techniques of Tailored Composite Structures

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed-form analysis based on a theoretical model of a single-cell tailored box beam, and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed-form code is consistently able to predict the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed-form analysis. Differences between the closed-form code representation and the wing box specimen caused large errors in the twist prediction. The closed-form analysis prediction of twist has not been validated by this test.

  20. Energy minimization versus pseudo force technique for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Hayduk, R. J.

    1980-01-01

    The effectiveness of using minimization techniques for the solution of nonlinear structural analysis problems is discussed and demonstrated by comparison with the conventional pseudo force technique. The comparison involves nonlinear problems with relatively few degrees of freedom. A survey of the state of the art of algorithms for unconstrained minimization reveals that extension of the technique to large-scale nonlinear systems is possible.
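
    The idea of replacing equilibrium iteration with unconstrained minimization can be shown on a one-degree-of-freedom toy problem: the equilibrium state is the minimizer of the total potential energy. The stiffnesses and load below are invented for illustration.

        import numpy as np
        from scipy.optimize import minimize

        # Hardening spring under a dead load: Pi(u) = U(u) - P*u.
        K1, K3, P = 1000.0, 400.0, 50.0

        def total_potential(u):
            # The cubic spring term supplies the (mild) nonlinearity.
            return 0.5 * K1 * u[0] ** 2 + 0.25 * K3 * u[0] ** 4 - P * u[0]

        result = minimize(total_potential, x0=np.zeros(1), method="BFGS")
        print("equilibrium displacement:", result.x[0])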

  1. Evaluation of energy system analysis techniques for identifying underground facilities

    SciTech Connect

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C.

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  2. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 10, January 1, 1994--March 31, 1994

    SciTech Connect

    Smith, D.M.

    1994-06-01

    In the present quarter, results from 129Xe NMR experiments were made available that allowed the determination of the mean free path of a xenon molecule within the pores of the material. The chemical shift at various loadings of xenon was determined, and the shift at zero loading was obtained by extrapolating the data to zero pressure. At zero loading, the collisions suffered by a xenon molecule can be regarded as being entirely with the pore walls, since the concentration of xenon molecules in the system is very low. Thus, the mean free path λ is a measure of the distance travelled by a xenon molecule before colliding with a wall, and hence is also a measure of the pore dimension. SAXS data reported in previous quarters gave the average radius of gyration Rg, which is also a measure of the average dimension of the pores of the material. In addition, application of the potential theory to the CO2 (274 K) adsorption data allowed the determination of a characteristic adsorption potential E, which is inversely proportional to the width of the pore. Thus, E should correlate inversely with the mean free path λ as determined using 129Xe NMR. Also, E should correlate inversely with the radius of gyration Rg from SAXS experiments. Another parameter obtained by analysis of the CO2 (274 K) adsorption data is the exponent n in the Dubinin-Astakhov equation. We had shown in previous quarters that this is a measure of the heterogeneity of the material.
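
    The Dubinin-Astakhov analysis mentioned above fits W = W0 exp[-(A/E)^n] with adsorption potential A = RT ln(p0/p), from which E and n are read off. A sketch with placeholder isotherm points (not the project's data):

        import numpy as np
        from scipy.optimize import curve_fit

        R, T = 8.314, 274.0  # gas constant (J/mol/K), CO2 isotherm temperature

        def dubinin_astakhov(p_rel, W0, E, n):
            # W = W0 exp(-(A/E)^n) with A = R*T*ln(p0/p); E is inversely
            # related to pore width, n reflects heterogeneity.
            A = R * T * np.log(1.0 / p_rel)
            return W0 * np.exp(-((A / E) ** n))

        # p/p0 and uptake values below are invented placeholders.
        p_rel = np.array([0.001, 0.005, 0.02, 0.08, 0.2])
        uptake = np.array([0.05, 0.09, 0.14, 0.19, 0.23])
        (W0, E, n), _ = curve_fit(dubinin_astakhov, p_rel, uptake, p0=[0.3, 8000.0, 2.0])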

  3. A dried blood spots technique based LC-MS/MS method for the analysis of posaconazole in human whole blood samples.

    PubMed

    Reddy, Todime M; Tama, Cristina I; Hayes, Roger N

    2011-11-15

    A rugged and robust liquid chromatographic tandem mass spectrometric (LC-MS/MS) method utilizing dried blood spots (DBS) was developed and validated for the analysis of posaconazole in human whole blood. Posaconazole-fortified blood samples were spotted (15 μL) onto Ahlstrom Alh-226 DBS cards and dried for at least 2 h. Punched spots were then extracted using a mixture of acetonitrile and water containing a stable-labeled internal standard (IS). Posaconazole and its IS were separated from endogenous matrix components on a Kinetex™ C18 column under gradient conditions with a mobile phase A consisting of 0.1% formic acid and a mobile phase B consisting of 0.1% formic acid in acetonitrile/methanol (70/30, v/v). The analyte and IS were detected using a Sciex API 4000 triple quadrupole LC-MS/MS system equipped with a TurboIonSpray™ source operated in the positive ion mode. The assay was linear over the concentration range of 5-5000 ng/mL. The inter-run accuracy and precision of the assay were -1.8% to 0.8% and 4.0% to 10.4%, respectively. Additional assessments unique to DBS were investigated, including sample spot homogeneity, spot volume, and hematocrit. Blood spot homogeneity was maintained, and accurate and precise quantitation results were obtained when using a blood spot volume of between 15 and 35 μL. Human blood samples with hematocrit values ranging between 25% and 41% gave acceptable quantitation results. The validation results indicate that the method is accurate, precise, sensitive, selective and reproducible.

  4. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  5. Contact pin-printing of albumin-fungicide conjugate for silicon nitride-based sensors biofunctionalization: Multi-technique surface analysis for optimum immunoassay performance

    NASA Astrophysics Data System (ADS)

    Gajos, Katarzyna; Budkowski, Andrzej; Tsialla, Zoi; Petrou, Panagiota; Awsiuk, Kamil; Dąbczyński, Paweł; Bernasik, Andrzej; Rysz, Jakub; Misiakos, Konstantinos; Raptis, Ioannis; Kakabakos, Sotirios

    2017-07-01

    Mass fabrication of integrated biosensors on silicon chips is facilitated by contact pin-printing, applied for biofunctionalization of individual Si3N4-based transducers at wafer scale. To optimize the biofunctionalization for immunochemical (competitive) detection of the fungicide thiabendazole (TBZ), Si3N4 surfaces are modified with (3-aminopropyl)triethoxysilane and examined after: immobilization of BSA-TBZ conjugate (probe) from solutions with different concentrations, blocking with bovine serum albumin (BSA), and immunoreaction with a mouse monoclonal antibody against TBZ. Nanostructure, surface density, probe composition and coverage uniformity of protein layers are evaluated with Atomic Force Microscopy, Spectroscopic Ellipsometry, Time-of-Flight Secondary Ion Mass Spectrometry and X-ray Photoelectron Spectroscopy. Contact pin-printing of overlapping probe spots is compared with hand-spotted areas. Contact pin-printing resulted in a two-fold increase of immobilized probe surface density as compared to hand spotting. Regarding BSA-TBZ immobilization, an incomplete monolayer develops into a bilayer as the concentration of BSA-TBZ molecules in the printing solution increases from 25 to 100 μg/mL. Upon blocking, however, a complete protein monolayer is formed for all the BSA-TBZ concentrations used. Free surface sites are filled with BSA for low surface coverage with BSA-TBZ, whereas loosely bound BSA-TBZ molecules are removed from the BSA-TBZ bilayer. As a consequence, immunoreaction efficiency increases with the probe concentration used for printing.

  6. On the factors governing water vapor turbulence mixing in the convective boundary layer over land: Concept and data analysis technique using ground-based lidar measurements.

    PubMed

    Pal, Sandip

    2016-06-01

    The convective boundary layer (CBL) turbulence is the key process for exchanging heat, momentum, moisture, and trace gases between the earth's surface and the lower part of the troposphere. Turbulence parameterization of the CBL is a challenging but important component in numerical models. In particular, correct estimation of CBL turbulence features, their parameterization, and the determination of the contribution of eddy diffusivity are important for simulating convection initiation and the dispersion of health-hazardous air pollutants and greenhouse gases. In general, measurements of higher-order moments of water vapor mixing ratio (q) variability yield unique estimates of turbulence in the CBL. Using high-resolution lidar-derived profiles of q variance, third-order moment, and skewness, and analyzing concurrent profiles of vertical velocity, potential temperature, and horizontal wind together with time series of near-surface measurements of surface flux and meteorological parameters, a conceptual framework based on a bottom-up approach is proposed here for the first time for a robust characterization of the turbulent structure of the CBL over land, so that our understanding of the processes governing CBL q turbulence can be improved. Finally, principal component analyses will be applied to the lidar-derived long-term data sets of q turbulence statistics to identify the meteorological factors and the dominant physical mechanisms governing CBL turbulence features.
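
    The higher-order moment profiles referred to above follow directly from the time series at each range gate. A minimal sketch, omitting the detrending and instrument-noise corrections a real analysis would need:

        import numpy as np

        def q_moment_profiles(q):
            # q: (n_time, n_altitude) lidar water vapor mixing ratio.
            # Fluctuations about the temporal mean give, per range bin,
            # the variance, third-order moment and skewness profiles.
            qp = q - q.mean(axis=0)
            var = np.mean(qp ** 2, axis=0)
            m3 = np.mean(qp ** 3, axis=0)
            return var, m3, m3 / var ** 1.5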

  7. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    SciTech Connect

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  8. Retrieval techniques and information content analysis to improve remote sensing of atmospheric water vapor, liquid water and temperature from ground-based microwave radiometer measurements

    NASA Astrophysics Data System (ADS)

    Sahoo, Swaroop

    Observations of profiles of temperature, humidity, and winds with sufficient accuracy and fine vertical and temporal resolution are needed to improve mesoscale weather prediction, track conditions in the lower to mid-troposphere, predict winds for renewable energy, inform the public of severe weather, and improve transportation safety. Of these thermodynamic variables, the absolute atmospheric temperature varies only by 15%; in contrast, total water vapor may change by up to 50% over several hours. In addition, numerical weather prediction (NWP) models are initialized using water vapor profile information, so improvements in its accuracy and resolution tend to improve the accuracy of NWP. Current water vapor profile observation systems are expensive and have insufficient spatial coverage to observe humidity in the lower to mid-troposphere. To address this important scientific need, the principal objective of this dissertation is to improve the accuracy, vertical resolution, and revisit time of tropospheric water vapor profiles retrieved from microwave and millimeter-wave brightness temperature measurements. This dissertation advances the state of knowledge of the retrieval of atmospheric water vapor from microwave brightness temperature measurements. It focuses on optimizing two information sources of interest for water vapor profile retrieval, i.e., independent measurements and background data set size. From a theoretical perspective, it determines sets of frequencies in the ranges of 20-23, 85-90, and 165-200 GHz that are optimal for water vapor retrieval from each of ground-based and airborne radiometers. The maximum number of degrees of freedom for the selected frequencies for ground-based radiometers is 5-6, while the optimum vertical resolution is 0.5 to 1.5 km. On the other hand, the maximum number of degrees of freedom for airborne radiometers is 8-9, while the optimum vertical resolution is 0.2 to 0.5 km. From an experimental perspective, brightness
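
    The degrees of freedom quoted above are, in the standard optimal-estimation framework, the trace of the averaging kernel. A sketch under that assumption (the dissertation's exact formulation is not given in the abstract):

        import numpy as np

        def dof_for_signal(K, Sa, Se):
            # Optimal-estimation diagnostic: the averaging kernel is
            # A = G K with gain G = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1;
            # the degrees of freedom for signal is trace(A).
            Sei, Sai = np.linalg.inv(Se), np.linalg.inv(Sa)
            G = np.linalg.inv(K.T @ Sei @ K + Sai) @ K.T @ Sei
            return np.trace(G @ K)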

  9. Tools and techniques for failure analysis and qualification of MEMS.

    SciTech Connect

    Walraven, Jeremy Allen

    2003-07-01

    Many of the tools and techniques used to evaluate and characterize ICs can be applied to MEMS technology. In this paper we discuss various tools and techniques used to provide structural, chemical, and electrical analysis and how these data aid in qualifying MEMS technologies.

  10. Comparative analysis of 4 impression techniques for implants.

    PubMed

    Cabral, Leonardo Moreira; Guedes, Carlos Gramani

    2007-06-01

    This in vitro study investigated 4 impression techniques to determine their dimensional accuracy in comparison with a standard technique. A master metal framework with 2 inner hex implants (SIN; Sistema de Implante Nacional Ltda., Sao Paulo, Brazil) was used as a standard for the comparisons. Sixty master casts were prepared to evaluate 4 impression techniques: (1) indirect impression technique with tapered transfer copings, (2) direct impression technique with unsplinted squared transfer copings, (3) direct impression technique with squared transfer copings splinted with acrylic resin, and (4) direct impression technique with squared transfer copings with acrylic resin splints sectioned 17 minutes after setting and welded with the same resin. A profile projector was used to measure the distance between the copings attached to the analogs. Mean distances (mm) were calculated from 3 measurements for each sample in the master casts and in the master metal framework. Analysis of variance and the Tukey HSD test were used for statistical analysis of data (alpha = 0.05). The results for the direct technique with squared transfer copings with acrylic resin splints sectioned and welded after setting were not significantly different from results for the master metal framework. Considering the methodology used and the results obtained, the direct impression technique with squared transfer copings with acrylic resin splints sectioned and welded after setting had better results than the other techniques studied.
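
    The statistical comparison reported above (ANOVA followed by Tukey's HSD at alpha = 0.05) can be reproduced on any such distance measurements. The sketch below uses scipy and statsmodels on fabricated inter-coping distances; the group means, spreads, and sample sizes are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Fabricated inter-coping distances (mm); "master" is the metal framework.
rng = np.random.default_rng(1)
groups = {
    "master":     rng.normal(13.500, 0.005, 15),
    "indirect":   rng.normal(13.530, 0.010, 15),
    "unsplinted": rng.normal(13.520, 0.010, 15),
    "splinted":   rng.normal(13.515, 0.008, 15),
    "sectioned":  rng.normal(13.502, 0.006, 15),
}

# One-way ANOVA across all five groups
f_stat, p_val = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4g}")

# Tukey HSD post hoc test: which techniques differ from the master?
values = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```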

  11. Rugosimetric analysis of amalgam restorations polished using different techniques.

    PubMed

    Pereira, M A; Centola, A L; do Nascimento, T N; Turbino, M L

    1998-01-01

    The objective of the present study was to submit specimens with amalgam restorations to 4 different polishing techniques, with one unpolished control group. The specimens were then submitted to rugosimetric analysis and the differences compared.

  12. Assessment of DIAL data collection and analysis techniques

    NASA Technical Reports Server (NTRS)

    Browell, E. V.; Woods, P. T.

    1986-01-01

    The key issues in all areas of Differential Absorption Lidar (DIAL) data collection and analysis techniques were examined. This included consideration of the practical and theoretical limitations of DIAL and the range of possible DIAL measurements.

  13. Chaos-based analytical techniques for daily extreme hydrological observations

    NASA Astrophysics Data System (ADS)

    Ng, W. W.; Panu, U. S.; Lennox, W. C.

    2007-08-01

    The existence of outliers in data sets affects the decision-making process related to design, operation, and management of water resources. Insufficient information on outliers limits our understanding and predictive ability of such extreme hydrologic phenomena. Hydrologic systems are complex and dynamic in nature, where the current state and future evolution depend on numerous physical variables and their interactions. Such systems can be represented in a simplified form through a chaotic approach, which can determine the level of complexity of a system and provide the information and parameters required for subsequent predictive analyses. This research focuses on the application of chaotic analytical techniques to daily hydrologic series containing outliers. Different techniques and concepts of chaos theory are adopted to enhance our understanding of the phenomena of outliers. Employing the streamflow data of the Saugeen River in Ontario, Canada, this paper illustrates the use of autocorrelation functions, mutual information, power spectrum analysis, phase space reconstruction, correlation dimension, surrogate tests, and Hurst coefficients for the analysis of chaotic systems. Based on the results of the analyses, one can arrive at the following conclusions: (1) The analyzed series exhibited random-like fluctuations; however, the hypothesis of a random process was rejected, so the analyzed series were found to be non-random. (2) The existence of outliers was found to increase the complexity of the analyzed series; the high embedding dimensionalities obtained from the correlation analysis of the series support this conclusion. (3) The differentiation of a highly complex system from a random process, and the impact of outliers on the complexity of a system, were presented quantitatively as well as visually from a chaotic perspective.
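
    Two of the steps listed above, phase space reconstruction and the correlation dimension, can be sketched compactly. The snippet below applies a Takens time-delay embedding and the Grassberger-Procaccia correlation sum to an illustrative random-walk series standing in for daily streamflow; the embedding dimension, delay, and radii are assumptions chosen for demonstration, not values fitted to the Saugeen data.

```python
import numpy as np
from scipy.spatial.distance import pdist

def embed(x, m, tau):
    """Takens time-delay embedding: rows are m-dimensional state vectors."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

# Correlated toy series standing in for a daily hydrologic record
rng = np.random.default_rng(2)
x = np.cumsum(rng.normal(size=1500))

X = embed(x, m=5, tau=3)
d = pdist(X)  # all pairwise distances between reconstructed states

# Grassberger-Procaccia correlation sum C(r) over a range of radii
radii = np.logspace(-0.5, 1.5, 12)
C = np.array([(d < r).mean() for r in radii])

# Correlation dimension ~ slope of log C(r) versus log r
mask = C > 0
slope = np.polyfit(np.log(radii[mask]), np.log(C[mask]), 1)[0]
print(f"estimated correlation dimension: {slope:.2f}")
```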

  14. Comparison and improvement of color-based image retrieval techniques

    NASA Astrophysics Data System (ADS)

    Zhang, Yujin; Liu, Zhong W.; He, Yun

    1997-12-01

    With the increasing popularity of content-based image manipulation, many color-based image retrieval techniques have been proposed in the literature. A systematic and comparative study of 8 representative techniques is first presented in this paper, using a database of 200 images of flags and trademarks. These techniques are chosen to cover the variations in the color models used, the characteristic color features employed, and the distance measures calculated for judging the similarity of color images. The results of this comparative study are presented both as lists of retrieved images for subjective visual inspection and as retrieval ratios computed for objective judgement. Both show that the cumulative-histogram-based techniques using Euclidean distance measures in two perception-related color spaces give the best results among the 8 techniques under consideration. Starting from the best-performing techniques, further work to improve their retrieval capability was carried out, resulting in 2 new techniques that use local cumulative histograms. The new techniques have been tested using a database of 400 images of real flowers, which are quite complicated in color content. Satisfactory results, compared to those obtained using existing cumulative-histogram-based techniques, are obtained and presented.
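
    A minimal sketch of the best-performing approach, cumulative histograms compared with a Euclidean distance, is given below. The image database is synthetic and the bin count is an assumption; the paper's perception-related color spaces are not reproduced here.

```python
import numpy as np

def cumulative_histogram(img, bins=32):
    """Concatenated normalized cumulative histograms, one per channel."""
    feats = []
    for c in range(img.shape[-1]):
        h, _ = np.histogram(img[..., c], bins=bins, range=(0, 256))
        feats.append(np.cumsum(h / h.sum()))
    return np.concatenate(feats)

# Synthetic 3-channel images standing in for a flag/trademark database
rng = np.random.default_rng(3)
db = [rng.integers(0, 256, (64, 64, 3)) for _ in range(10)]
query = np.clip(db[4] + rng.integers(-5, 6, (64, 64, 3)), 0, 255)

q = cumulative_histogram(query)
dists = [np.linalg.norm(q - cumulative_histogram(im)) for im in db]
print("best match:", int(np.argmin(dists)))  # expected: image 4
```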

  15. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, collected along a pre-defined alignment on a suitably illuminated sample surface. The study investigated the possibility of applying HSI techniques for classification of different types of wheat kernels: vitreous, yellow berry, and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near-infrared range (1000-1700 nm). The hypercubes were analyzed by applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only when considering the entire investigated wavelength range, but also when selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed HSI-based procedures can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.
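
    The PCA-then-PLS-DA pipeline described above can be sketched with scikit-learn; PLS-DA is commonly implemented as PLS regression on one-hot class labels followed by an argmax. The spectra below are synthetic stand-ins for the 121-wavelength kernel reflectance data, and the component counts are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic "spectra": 121 NIR wavelengths, three kernel classes
# (0 = vitreous, 1 = yellow berry, 2 = fusarium-damaged).
rng = np.random.default_rng(4)
n_per_class, n_wl = 60, 121
X = np.vstack([rng.normal(mu, 0.05, (n_per_class, n_wl))
               for mu in (0.30, 0.45, 0.60)])
y = np.repeat([0, 1, 2], n_per_class)

Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# PCA to reduce the spectral dimensionality of the hypercube data
pca = PCA(n_components=10).fit(Xtr)
Ztr, Zte = pca.transform(Xtr), pca.transform(Xte)

# PLS-DA: PLS regression on one-hot labels, argmax for the predicted class
pls = PLSRegression(n_components=5).fit(Ztr, np.eye(3)[ytr])
pred = pls.predict(Zte).argmax(axis=1)
print(f"classification accuracy: {(pred == yte).mean():.2f}")
```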

  16. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation) to a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  17. Comparative analysis of NDE techniques with image processing

    NASA Astrophysics Data System (ADS)

    Rathod, Vijay R.; Anand, R. S.; Ashok, Alaknanda

    2012-12-01

    The paper reports comparative results of nondestructive testing (NDT) experiments performed on artificially created flaws in castings at the Central Foundry Forge Plant (CFFP) of Bharat Heavy Electricals Ltd., India (BHEL). The present experimental study compares image processing methods applied to radiographic images of welding defects such as slag inclusion, porosity, lack of root penetration, and cracks with other NDT methods. Different image segmentation techniques are proposed here for identifying these welding defects. Currently, a large amount of research is under way in the field of automated systems for inspection, analysis, and detection of flaws in weldments. The comparison of NDT methods and the application of image processing to radiographic images of weld defects aim to detect defects reliably and to make accept/reject decisions per international standards.

  18. Bifurcation techniques for nonlinear dynamic analysis of compressor stall phenomena

    NASA Technical Reports Server (NTRS)

    Razavi, H. C.; Mehra, R. K.

    1985-01-01

    Compressor stall phenomena are analyzed from a nonlinear control theory viewpoint, based on bifurcation-catastrophe techniques. This new approach appears promising and offers insight into such well-known compressor instability problems as surge and rotating stall; furthermore, it suggests strategies for recovery from stall. Three interlocking dynamic nonlinear state space models are developed. It is shown that the problem of rotating stall can be viewed as an (induced) bifurcation of solutions of the unstalled model. A hysteresis effect is shown to exist in the stall/recovery process. Surge cycles are observed to develop for some critical parameter values. It is shown that the oscillatory behavior is due to the development of limit cycles generated by a Hopf bifurcation of solutions; both stable and unstable limit cycles are observed. To further illustrate the usefulness of the methodology, some partial computation of the domains of attraction of equilibria is carried out, and a parameter sensitivity analysis is performed.
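
    The limit-cycle mechanism invoked above can be illustrated with the Hopf normal form rather than the paper's compressor models (which are not reproduced here): below the critical parameter value the equilibrium is stable, and above it a stable limit cycle of radius sqrt(mu) appears, the analogue of a sustained surge oscillation.

```python
import numpy as np
from scipy.integrate import solve_ivp

def hopf(t, z, mu, omega=1.0):
    """Hopf normal form: a stable limit cycle of radius sqrt(mu) for mu > 0."""
    x, y = z
    r2 = x * x + y * y
    return [mu * x - omega * y - x * r2,
            omega * x + mu * y - y * r2]

for mu in (-0.2, 0.2):  # sweep the bifurcation parameter through zero
    sol = solve_ivp(hopf, (0, 200), [0.1, 0.0], args=(mu,))
    r_final = np.linalg.norm(sol.y[:, -1])
    print(f"mu = {mu:+.1f}: final radius = {r_final:.3f}, "
          f"predicted cycle radius = {np.sqrt(max(mu, 0.0)):.3f}")
```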

  19. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  20. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George

    2013-01-01

    Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
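
    The descriptor computation at the heart of both records can be sketched with the mahotas library (assumed available; its zernike_moments function returns rotation-invariant moment magnitudes). The segment mask, radius, degree, and acceptance threshold below are all illustrative assumptions, not the patented pipeline.

```python
import numpy as np
import mahotas  # assumed available; provides features.zernike_moments
from scipy.ndimage import rotate

# Hypothetical binary "finger segment" mask: an off-center ellipse.
yy, xx = np.mgrid[:128, :128]
seg = ((((yy - 64) / 40.0) ** 2 + ((xx - 64) / 18.0) ** 2) < 1.0).astype(float)

# Zernike moment magnitudes up to degree 8 within a radius-60 disk.
z1 = mahotas.features.zernike_moments(seg, radius=60, degree=8)
rot = (rotate(seg, 30, reshape=False) > 0.5).astype(float)
z2 = mahotas.features.zernike_moments(rot, radius=60, degree=8)

# Rotation invariance: the descriptors barely move under a 30-deg rotation.
print("max |z1 - z2| =", float(np.abs(z1 - z2).max()))

# Toy identity decision against an enrolled template (threshold assumed).
accept = np.linalg.norm(z2 - z1) < 0.05
print("match accepted:", accept)
```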

  1. A Word-Based Compression Technique for Text Files.

    ERIC Educational Resources Information Center

    Vernor, Russel L., III; Weiss, Stephen F.

    1978-01-01

    Presents a word-based technique for storing natural language text in compact form. The compressed text consists of a dictionary and a text that is a combination of actual running text and pointers to the dictionary. This technique has shown itself to be effective for both text storage and retrieval. (VT)
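
    A minimal sketch of such a scheme is shown below: the "compressed" form is a dictionary of distinct tokens plus a stream of integer pointers. Unlike the 1978 technique, which mixes running text with pointers, this toy version replaces every token with a pointer; it is meant only to make the data structure concrete.

```python
import re

def compress(text):
    """Split text into word/separator tokens; emit dictionary + pointers."""
    tokens = re.findall(r"\w+|\W+", text)
    dictionary, pointers, index = [], [], {}
    for tok in tokens:
        if tok not in index:
            index[tok] = len(dictionary)
            dictionary.append(tok)
        pointers.append(index[tok])
    return dictionary, pointers

def decompress(dictionary, pointers):
    return "".join(dictionary[p] for p in pointers)

text = "the cat sat on the mat and the cat slept"
d, p = compress(text)
assert decompress(d, p) == text
print(f"{len(d)} dictionary entries, {len(p)} pointers")
```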

  2. Water quality analysis of Godavari river basin using multivariate analysis techniques.

    PubMed

    Gupta, Indrani; Salunkhe, Abhaysinh; Rohra, Nanda; Kumar, Rakesh

    2013-01-01

    Multivariate statistical techniques, including cluster analysis (CA), principal component analysis (PCA), factor analysis, and discriminant analysis (DA), have been used to evaluate spatial variations and to interpret a large and complex water quality data set collected from the Godavari river basin. The data set, containing 7 parameters, was generated over 3 years (2007-2009) at 78 different sites along the river and its tributaries. Water quality indices based on four parameters (pH, DO, BOD and FC), calculated for all the sites using the modified NSF index, ranged from bad through medium and good to excellent. Three significant groups (cleaner, slightly polluted, and moderately polluted sites) were detected by the CA method, and three latent factors were identified by the PCA method. The results of DA revealed that only two parameters (pH and BOD) were necessary for the analysis of spatial variation; 83.3% of the original sites were correctly classified using the discriminant function developed from the analysis.

  3. Comparison of laser transit anemometry data analysis techniques

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Gartrell, Luther R.

    1991-01-01

    Two techniques for the extraction of two-dimensional flow information from laser transit anemometry (LTA) data sets are presented and compared via a simulation study and experimental investigation. The methods are a probability density function (PDF) estimation technique and a marginal distribution analysis technique. The simulation study builds on the results of previous work and provides a quantification of the accuracy of both techniques for various LTA data acquisition scenarios. The experimental comparison consists of using an LTA system to survey the flow downstream of a turbulence generator in a small low-speed wind tunnel. The collected data sets are analyzed and compared.

  4. Model-checking techniques based on cumulative residuals.

    PubMed

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
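
    The core idea, comparing an observed cumulative-residual process against simulated zero-mean Gaussian realizations, can be sketched as follows. This toy version checks the functional form of a single covariate in a least-squares fit and, for brevity, omits the correction for estimated regression parameters that the full method includes; the data and model are fabricated.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 300
x = rng.uniform(0, 3, n)
y = 1.0 + 0.5 * x**2 + rng.normal(0, 0.3, n)  # truth is quadratic

# Deliberately misspecified linear fit: y ~ a + b*x
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Observed cumulative-residual process W(t) = sum_{x_i <= t} e_i / sqrt(n)
order = np.argsort(x)
W = np.cumsum(resid[order]) / np.sqrt(n)

# Simulated zero-mean analogues: residuals times independent N(0,1) draws
sup_sim = np.array([
    np.abs(np.cumsum(resid[order] * rng.normal(size=n)) / np.sqrt(n)).max()
    for _ in range(1000)
])
p_value = (sup_sim >= np.abs(W).max()).mean()
print(f"sup|W| = {np.abs(W).max():.3f}, p = {p_value:.3f}")  # small p: misfit
```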

  5. Investigations on landmine detection by neutron-based techniques.

    PubMed

    Csikai, J; Dóczi, R; Király, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering, and reaction techniques. Some conclusions are as follows: the neutron hand-held detector is suitable for observing the anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m²/1.5 min; the reflection cross section of thermal neutrons made it possible to determine the equivalent thickness of different soil components; a simple method was developed for determining the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; knowledge of the leakage spectra of different source neutrons is indispensable for determining the differential and integrated reaction rates and, through them, the dimension of the interrogated volume; and precise determination of the C/O atom fraction requires investigation of the angular distribution of the 6.13 MeV gamma ray emitted in the ¹⁶O(n,n'γ) reaction. These results, in addition to the identification of landmines, make possible the improvement of non-intrusive neutron methods.

  6. A human visual based binarization technique for histological images

    NASA Astrophysics Data System (ADS)

    Shreyas, Kamath K. M.; Rajendran, Rahul; Panetta, Karen; Agaian, Sos

    2017-05-01

    In the field of vision-based systems for object detection and classification, thresholding is a key pre-processing step. Thresholding is a well-known technique for image segmentation. Segmentation of medical images, such as Computed Axial Tomography (CAT), Magnetic Resonance Imaging (MRI), X-ray, phase contrast microscopy, and histological images, presents problems such as high variability in human anatomy and variation in modalities. Recent advances in computer-aided diagnosis of histological images help facilitate detection and classification of diseases. Since most pathology diagnosis depends on the expertise and ability of the pathologist, there is clearly a need for an automated assessment system. Histological images are stained to a specific color to differentiate each component in the tissue. Segmentation and analysis of such images is problematic, as they present high variability in terms of color and cell clusters. This paper presents an adaptive thresholding technique that aims at segmenting cell structures from Haematoxylin and Eosin stained images. The thresholded result can further be used by pathologists to perform effective diagnosis. The effectiveness of the proposed method is analyzed by visually comparing the results to those of state-of-the-art thresholding methods such as Otsu, Niblack, Sauvola, Bernsen, and Wolf. Computer simulations demonstrate the efficiency of the proposed method in segmenting critical information.
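
    For context, two of the baseline methods named above are available in scikit-image: Otsu's global threshold and Sauvola's local, window-based threshold. The sketch below applies both to a synthetic stand-in for a stained image (dark blobs on an unevenly lit background); the window size is an assumption, and the paper's own adaptive method is not reproduced here.

```python
import numpy as np
from skimage.filters import threshold_otsu, threshold_sauvola

# Synthetic grayscale stand-in: dark "nuclei" on an uneven bright background
rng = np.random.default_rng(5)
img = np.full((256, 256), 0.8) + np.linspace(0, 0.15, 256)[None, :]
yy, xx = np.ogrid[:256, :256]
for _ in range(40):  # stamp dark circular blobs
    r, c = rng.integers(20, 236, 2)
    img[((yy - r) ** 2 + (xx - c) ** 2) < 36] = 0.25
img += rng.normal(0, 0.03, img.shape)

# Global Otsu threshold versus local Sauvola threshold
binary_otsu = img < threshold_otsu(img)
binary_sauvola = img < threshold_sauvola(img, window_size=25)

print("foreground fraction (Otsu):   ", round(binary_otsu.mean(), 4))
print("foreground fraction (Sauvola):", round(binary_sauvola.mean(), 4))
```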

  7. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.; Mahmood, Q.

    1978-01-01

    A sputtering system was developed to deposit aluminum and aluminum alloys by the dc sputtering technique. This system is designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation, and is now ready for studying the effects of deposition and annealing parameters on double-level metal preparation. A technique recently applied to semiconductor analysis, the finite element method, was studied for use in computer modeling of two-dimensional MOS transistor structures. It was concluded that the method has not been sufficiently well developed for confident use at this time. An algorithm was therefore developed for implementing a computer study based upon the finite difference method. The resulting program was modified and used to calculate redistribution data for boron and phosphorus which had been predeposited by ion implantation with given range and straggle conditions. Data were generated for (111)-oriented SOS films with redistribution in N2, dry O2, and steam ambients.

  8. The role of technological analysis. Electronic technology: Means and techniques

    NASA Astrophysics Data System (ADS)

    Geraud-Liria, Nadine

    1988-08-01

    Technological analysis concepts are reviewed, including technical assistance, quality support, decision aids, failure analysis, and technical arbitration. A procedure for failure analysis of electronic components is presented, as well as the techniques used. The destructive operations start with an internal visual inspection. Physical defect localization is obtained by microprobes, liquid crystals, or dynamic voltage contrast. It is shown that failure analysis makes it possible to pursue corrective actions at both the manufacturing and utilization levels.

  9. Techniques for the Analysis of Extracellular Vesicles Using Flow Cytometry

    PubMed Central

    Inglis, Heather; Norris, Philip; Danesh, Ali

    2015-01-01

    Extracellular Vesicles (EVs) are small, membrane-derived vesicles found in bodily fluids that are highly involved in cell-cell communication and help regulate a diverse range of biological processes. Analysis of EVs using flow cytometry (FCM) has been notoriously difficult due to their small size and lack of discrete populations positive for markers of interest. Methods for EV analysis, while considerably improved over the last decade, are still a work in progress. Unfortunately, there is no one-size-fits-all protocol, and several aspects must be considered when determining the most appropriate method to use. Presented here are several different techniques for processing EVs and two protocols for analyzing EVs using either individual detection or a bead-based approach. The methods described here will assist with eliminating the antibody aggregates commonly found in commercial preparations, increasing signal-to-noise ratio, and setting gates in a rational fashion that minimizes detection of background fluorescence. The first protocol uses an individual detection method that is especially well suited for analyzing a high volume of clinical samples, while the second protocol uses a bead-based approach to capture and detect smaller EVs and exosomes. PMID:25867010

  10. Key-space analysis of double random phase encryption technique

    NASA Astrophysics Data System (ADS)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
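
    The encryption scheme under analysis is compact enough to state in code. The numpy sketch below implements textbook double random phase encoding (one random phase mask in the image plane, one in the Fourier plane) and shows the decryption error when the second key is wrong; the image, mask seeds, and sizes are illustrative assumptions.

```python
import numpy as np

N = 64
rng = np.random.default_rng(6)
f = rng.random((N, N))  # plaintext "image" (hypothetical)

def phase_mask(seed):
    r = np.random.default_rng(seed)
    return np.exp(2j * np.pi * r.random((N, N)))

def encrypt(img, m1, m2):
    # Image-plane mask m1, then Fourier-plane mask m2
    return np.fft.ifft2(np.fft.fft2(img * m1) * m2)

def decrypt(enc, m1, m2):
    # Undo each unit-modulus mask with its complex conjugate
    return np.abs(np.fft.ifft2(np.fft.fft2(enc) * np.conj(m2)) * np.conj(m1))

m1, m2 = phase_mask(1), phase_mask(2)
enc = encrypt(f, m1, m2)

mse = lambda a, b: float(np.mean((a - b) ** 2))
print(f"correct key MSE: {mse(decrypt(enc, m1, m2), f):.2e}")
print(f"wrong key   MSE: {mse(decrypt(enc, m1, phase_mask(3)), f):.2e}")
```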

  11. High-level power analysis and optimization techniques

    NASA Astrophysics Data System (ADS)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  12. A New MRI-Based Pediatric Subcortical Segmentation Technique (PSST).

    PubMed

    Loh, Wai Yen; Connelly, Alan; Cheong, Jeanie L Y; Spittle, Alicia J; Chen, Jian; Adamson, Christopher; Ahmadzai, Zohra M; Fam, Lillian Gabra; Rees, Sandra; Lee, Katherine J; Doyle, Lex W; Anderson, Peter J; Thompson, Deanne K

    2016-01-01

    Volumetric and morphometric neuroimaging studies of the basal ganglia and thalamus in pediatric populations have utilized existing automated segmentation tools, including FIRST (Functional Magnetic Resonance Imaging of the Brain's Integrated Registration and Segmentation Tool) and FreeSurfer. These segmentation packages, however, are mostly based on adult training data. Given that there are marked differences between the pediatric and adult brain, an age-specific segmentation technique is likely to produce more accurate segmentation results. In this study, we describe a new automated segmentation technique for analysis of the basal ganglia and thalamus in 7-year-olds, called the Pediatric Subcortical Segmentation Technique (PSST). PSST consists of a probabilistic 7-year-old subcortical gray matter atlas (accumbens, caudate, pallidum, putamen and thalamus) combined with a customized segmentation pipeline using existing tools: ANTs (Advanced Normalization Tools) and SPM (Statistical Parametric Mapping). The segmentation accuracy of PSST in 7-year-old data was compared against FIRST and FreeSurfer, relative to manual segmentation as the ground truth, utilizing spatial overlap (Dice's coefficient), volume correlation (intraclass correlation coefficient, ICC) and limits of agreement (Bland-Altman plots). PSST achieved spatial overlap scores ≥90% and ICC scores ≥0.77 when compared with manual segmentation, for all structures except the accumbens. Compared with FIRST and FreeSurfer, PSST showed higher spatial overlap (FDR-corrected p < 0.05) and ICC scores, with less volumetric bias according to Bland-Altman plots. PSST is a customized segmentation pipeline with an age-specific atlas that accurately segments typical and atypical basal ganglia and thalami at age 7 years, and has the potential to be applied to other pediatric datasets.
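
    The two agreement metrics used above are easy to compute directly. The sketch below evaluates Dice's spatial-overlap coefficient on a pair of toy 3-D masks and the Bland-Altman bias and limits of agreement on fabricated volumes; all numbers are illustrative, not PSST results.

```python
import numpy as np

def dice(a, b):
    """Dice's coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Toy manual vs. automated segmentations of a subcortical structure
manual = np.zeros((64, 64, 64), dtype=bool)
manual[20:40, 22:42, 25:45] = True
auto = np.zeros_like(manual)
auto[21:41, 22:42, 24:44] = True  # slightly shifted segmentation

print(f"Dice overlap: {dice(manual, auto):.3f}")

# Bland-Altman summary for volume agreement across fabricated subjects
vol_manual = np.array([8.1, 7.9, 8.4, 8.0, 7.7])
vol_auto = np.array([8.0, 8.1, 8.3, 8.2, 7.8])
diff = vol_auto - vol_manual
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"bias = {bias:.3f}, limits of agreement = "
      f"({bias - 1.96 * sd:.3f}, {bias + 1.96 * sd:.3f})")
```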

  13. A microhistological technique for analysis of food habits of mycophagous rodents.

    Treesearch

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  14. Specimen preparation and image processing and analysis techniques for automated quantification of concrete microcracks and voids

    SciTech Connect

    Soroushian, Parviz; Elzafraney, Mohamed; Nossoni, Ali

    2003-12-01

    Specimen preparation and image processing/analysis techniques were developed for use in automated quantitative microstructural investigation of concrete, focusing on concrete microcracks and voids. Different specimen preparation techniques were developed for use in fluorescent and scanning electron microscopy (SEM) of concrete; these techniques produce a sharp contrast between microcracks/voids and the body of concrete. The image processing/analysis techniques developed specifically for use with concrete address the following: automatic thresholding; delineation of intersecting microcracks/voids and connected voids; distinction of microcracks from voids based on geometric attributes; and noise filtration.

  15. The detection of bulk explosives using nuclear-based techniques

    SciTech Connect

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  16. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    PubMed

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provide an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.

  17. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    PubMed Central

    Al-Kadi, Mahmoud I.; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-01-01

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provide an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device. PMID:23686141

  18. Separation/Preconcentration Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Hu, Bin; He, Man; Chen, Beibei; Jiang, Zucheng

    2016-10-01

    The title exactly characterizes the contribution of this chapter. The analytical chemistry of the rare earth elements (REEs) is often highly complicated, and the determination of a specific element is impossible without sample pre-concentration. Sample preparation can be carried out either by separating the REEs from the matrix or by concentrating the REEs. The separation of REEs from each other is mainly done by chromatography. In the early days of REE analysis, precipitation/coprecipitation was applied for the treatment of REE mixtures; that method is not applicable to the separation of trace amounts of REEs. The majority of the methods in use are based on the distribution of REEs in a two-phase system, either liquid-liquid or liquid-solid. Various techniques have been developed for liquid-liquid extraction (LLE), in particular liquid phase micro-extraction, in which extraction is combined with pre-concentration of the REEs in a single drop of extractant or in a hollow fiber filled with the extractant. Further modified techniques for special applications and for difficult REE separations have been developed. Compared to LLE, solid phase micro-extraction is preferred: the method is robust and easy to handle, and the solid phase loaded with the REEs can be used directly in subsequent determination methods. At present, new solid materials, such as nanotubes, are being developed and tested for solid phase extraction.

  19. Wood lens design philosophy based on a binary additive manufacturing technique

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  20. Metrology optical power budgeting in SIM using statistical analysis techniques

    NASA Astrophysics Data System (ADS)

    Kuan, Gary M.

    2008-07-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arc second resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be tracked. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.

  1. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arc second resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be tracked. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
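
    The statistical budgeting idea in both records lends itself to a simple Monte Carlo sketch: draw each loss mechanism from a distribution, sum the losses in decibels, and report the probability that the delivered power meets the requirement. Every number below (loss means, sigmas, source power, requirement) is a made-up placeholder, not a SIM value.

```python
import numpy as np

# Hypothetical per-mechanism losses for one metrology beam path:
# (nominal dB, 1-sigma dB); values are placeholders only.
losses_db = {
    "design efficiency":    (3.0, 0.2),
    "material attenuation": (1.5, 0.3),
    "misalignment":         (0.8, 0.4),
    "diffraction":          (0.5, 0.1),
    "coupling efficiency":  (2.0, 0.5),
}
source_dbm = 0.0       # fiber-delivered power (assumed)
required_dbm = -10.0   # minimum power for gauge performance (assumed)

rng = np.random.default_rng(7)
n_trials = 100_000
total_loss = sum(rng.normal(nom, sig, n_trials)
                 for nom, sig in losses_db.values())
received = source_dbm - total_loss

print(f"P(requirement met)    = {(received >= required_dbm).mean():.4f}")
print(f"5th-percentile margin = {np.percentile(received, 5) - required_dbm:.2f} dB")
```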

  2. Techniques for a structural analysis of dermatoscopic imagery.

    PubMed

    Fleming, M G; Steger, C; Zhang, J; Gao, J; Cognetta, A B; Pollak, I; Dyer, C R

    1998-01-01

    Techniques were developed for automated detection and characterization of dermatoscopic structures, including the pigment network and brown globules. These techniques incorporate algorithms for grayscale shape extraction based on differential geometry developed by Steger, a snake algorithm, and a modification of the region competition strategy of Zhu and Yuille. A novel approach was developed for global segmentation of pigmented lesions, based on stabilized inverse diffusion equations. Procedures for detection of air bubbles and hairs in dermatoscopic images are also reported.

  3. Basic Sequence Analysis Techniques for Use with Audit Trail Data

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2008-01-01

    Audit trail analysis can provide valuable insights to researchers and evaluators interested in comparing and contrasting designers' expectations of use and students' actual patterns of use of educational technology environments (ETEs). Sequence analysis techniques are particularly effective but have been neglected to some extent because of real…

  4. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  5. Multi-elemental Analysis Of Steel By Combined Nuclear Techniques

    SciTech Connect

    Ene, A.; Popescu, I. V.; Badica, T.

    2007-04-23

    In this work the nuclear techniques PIGE (Particle-Induced Gamma-ray Emission), PIXE (Particle-Induced X-ray Emission) and NAA (Neutron Activation Analysis) used for the multi-elemental analysis of steels have been compared in terms of detection limits, advantages and limitations.

  6. Considerations and Techniques for the Analysis of NAEP Data.

    ERIC Educational Resources Information Center

    Johnson, Eugene

    The special characteristics of the data from the National Assessment of Educational Progress (NAEP) that affect the validity of conventional techniques of statistical analysis are considered. In contrast to the assumptions underlying standard methods of statistical analysis, the NAEP samples are obtained via a stratified multi-stage probability…

  7. Considerations and Techniques for the Analysis of NAEP Data.

    ERIC Educational Resources Information Center

    Johnson, Eugene G.

    1989-01-01

    The effects of certain characteristics (e.g., sample design) of National Assessment of Educational Progress (NAEP) data on statistical analysis techniques are considered. Ignoring special features of NAEP data and proceeding with a standard analysis can produce inferences that underestimate the true variability and overestimate the true degrees of…

  8. TU-EF-BRD-02: Indicators and Technique Analysis

    SciTech Connect

    Carlone, M.

    2015-06-15

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  9. Determination of Volatile Organic Compounds in the Atmosphere Using Two Complementary Analysis Techniques.

    PubMed

    Alonso, L; Durana, N; Navazo, M; García, J A; Ilardia, J L

    1999-08-01

    During a preliminary field campaign of volatile organic compound (VOC) measurements carried out in an urban area, two complementary analysis techniques were applied to establish the technical and scientific bases for a strategy to monitor and control VOCs and photochemical oxidants in the Autonomous Community of the Basque Country. Integrated sampling was conducted using Tenax sorbent tubes and laboratory analysis by gas chromatography, and grab sampling and in situ analysis also were conducted using a portable gas chromatograph. With the first technique, monocyclic aromatic hydrocarbons appeared as the compounds with the higher mean concentrations. The second technique allowed the systematic analysis of eight chlorinated and aromatic hydrocarbons. Results of comparing both techniques, as well as the additional information obtained with the second technique, are included.

  10. Initial planetary base construction techniques and machine implementation

    NASA Technical Reports Server (NTRS)

    Crockford, William W.

    1987-01-01

    Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.

  11. Geotechnical Analysis of Paleoseismic Shaking Using Liquefaction Features: Part I. Major Updating of Analysis Techniques

    USGS Publications Warehouse

    Olson, Scott M.; Green, Russell A.; Obermeier, Stephen F.

    2003-01-01

    A new methodology is proposed for the geotechnical analysis of strength of paleoseismic shaking using liquefaction effects. The proposed method provides recommendations for selection of both individual and regionally located test sites, techniques for validation of field data for use in back-analysis, and use of a recently developed energy-based solution to back-calculate paleoearthquake magnitude and strength of shaking. The proposed method allows investigators to assess the influence of post-earthquake density change and aging. The proposed method also describes how the back-calculations from individual sites should be integrated into a regional assessment of paleoseismic parameters.

  12. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-02-28

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.
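
    As a toy illustration of the hierarchical Bayesian idea (not the paper's model), the Gibbs sampler below pools sparse precursor-event counts from several data sources through a Gamma-Poisson hierarchy, so source-to-source variability is represented explicitly; all counts, exposures, and hyperparameters are fabricated.

```python
import numpy as np

rng = np.random.default_rng(8)

# Fabricated precursor counts n over exposure times T (years), one per source
n = np.array([2, 0, 5, 1, 3])
T = np.array([10.0, 4.0, 12.0, 6.0, 9.0])
K = len(n)

alpha = 1.0        # shape of the Gamma population distribution (assumed)
a0, b0 = 1.0, 1.0  # vague hyperprior on the population rate parameter

# Gibbs sampler with conjugate full conditionals:
#   lambda_i | beta ~ Gamma(alpha + n_i, rate = beta + T_i)
#   beta | lambda   ~ Gamma(a0 + K*alpha, rate = b0 + sum(lambda))
n_iter, burn, beta = 6000, 1000, 1.0
draws = []
for it in range(n_iter):
    lam = rng.gamma(alpha + n, 1.0 / (beta + T))           # numpy uses scale
    beta = rng.gamma(a0 + K * alpha, 1.0 / (b0 + lam.sum()))
    if it >= burn:
        draws.append(lam)
draws = np.array(draws)

lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
for i, m in enumerate(draws.mean(axis=0)):
    print(f"source {i}: rate {m:.3f}/yr, 95% CI ({lo[i]:.3f}, {hi[i]:.3f})")
```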

  13. Damage detection technique by measuring laser-based mechanical impedance

    SciTech Connect

    Lee, Hyeonseok; Sohn, Hoon

    2014-02-18

    This study proposes a method for measurement of mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small-sized structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, which show deteriorated performance induced by the effects of a) Curie temperature limitations, b) electromagnetic interference, and c) bonding layers. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by shooting the pulse laser beam at the target structure, and is acquired by measuring the out-of-plane velocity using a laser vibrometer. The formation of the LMI response is observed through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen under a high-temperature environment.

  14. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special unresolved-object processing techniques were disappointing due, at least in part, to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.

  15. Crystallographic texture analysis of archaeological metals: interpretation of manufacturing techniques

    NASA Astrophysics Data System (ADS)

    Artioli, G.

    2007-12-01

    Neutron probes and high energy X-rays are sources of primary importance for the non-invasive characterization of materials related to cultural heritage. Their employment in the characterization of archaeological metal objects, combined with the recent instrumental and computational developments in the field of crystallographic texture analysis (CTA) from diffraction data proves to be a powerful tool for the interpretation of ancient metal working techniques. Diffraction based CTA, when performed using penetrating probes and adequate detector coverage of reciprocal space, for example using large detector arrays and/or ToF mode, allows simultaneous identification and quantification of crystalline phases, besides the microstructural and textural characterization of the object, and it can be effectively used as a totally non-invasive tool for metallographic analysis. Furthermore, the chemical composition of the object may also be obtained by the simultaneous detection of prompt gamma rays induced by neutron activation, or by the fluorescence signal from high energy X-rays, in order to obtain a large amount of complementary information in a single experiment. The specific application of neutron CTA to the characterization of the manufacturing processes of prehistoric copper axes is discussed in detail.

  16. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.
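
    The filtering approach named above (estimating slowly varying troposphere and clock parameters as stochastic states) can be illustrated with a one-state scalar Kalman filter tracking a random-walk zenith delay; this is a generic textbook filter, not GEOSAT's UD-factorized implementation, and the noise variances are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 200
q, r = 1e-4, 1e-2  # process / measurement noise variances (assumed)

truth = 2.3 + np.cumsum(rng.normal(0, np.sqrt(q), n))  # zenith delay (m)
obs = truth + rng.normal(0, np.sqrt(r), n)

x, P, est = obs[0], 1.0, []
for z in obs:
    P += q                 # predict: random-walk state model
    K = P / (P + r)        # Kalman gain
    x += K * (z - x)       # update with the innovation
    P *= 1.0 - K
    est.append(x)

rmse = np.sqrt(np.mean((np.asarray(est) - truth) ** 2))
print(f"filtered RMSE: {rmse:.4f} m (raw observation sigma: {np.sqrt(r):.4f} m)")
```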

  17. Efficient Plant Supervision Strategy Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; Rolle, Jose Luis Calvo; Castelo, Francisco Javier Perez

    Most nonlinear type-one and type-two control systems suffer from a lack of detectability when model-based techniques are applied to FDI (fault detection and isolation) tasks. In general, all types of processes suffer from a lack of detectability, owing also to the ambiguity in discriminating among the process, sensors, and actuators when isolating any given fault. This work deals with a strategy to detect and isolate faults that combines massive neural-network-based functional approximation procedures with recursive rule-based techniques applied to a parity-space approach.
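
    The parity-space step can be illustrated in isolation, as in the sketch below; the 3-sensor/2-state measurement matrix is a hypothetical stand-in, and the neural-network functional approximation that would supply such a model for a nonlinear process is not reproduced here.

        import numpy as np

        # With more sensors than states (y = C x), the rows of a parity
        # matrix W span the left null space of C, so r = W y vanishes for
        # any fault-free state x.
        C = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [1.0, 1.0]])          # assumed 3-sensor, 2-state model
        u, s, vt = np.linalg.svd(C)
        W = u[:, 2:].T                      # left null space of C

        x = np.array([0.5, -0.2])
        y = C @ x
        y_faulty = y + np.array([0.0, 0.3, 0.0])   # bias fault on sensor 2

        print(np.abs(W @ y).max())          # ~0: consistent, no fault
        print(np.abs(W @ y_faulty).max())   # exceeds threshold: fault flagged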

  18. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments where access is restrictive and, in one case, submerged. Visual-based inspections at stations are typically qualitative in nature. For example, a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water of the primary heat transport system flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform them. The first inspection procedure is a method to remotely measure the gap between a FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components, with compensation for image perspective and viewing elevation. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is

  19. Analysis techniques used on field degraded photovoltaic modules

    SciTech Connect

    Hund, T.D.; King, D.L.

    1995-09-01

    Sandia National Laboratories' PV System Components Department performs comprehensive failure analysis of photovoltaic modules after extended field exposure at various sites around the world. A full spectrum of analytical techniques is used to help identify the causes of degradation. The techniques are used to make solder fatigue life predictions for PV concentrator modules, identify cell damage or current mismatch, and measure the adhesive strength of the module encapsulant.

  20. Dynamic analysis of large structures by modal synthesis techniques.

    NASA Technical Reports Server (NTRS)

    Hurty, W. C.; Hart, G. C.; Collins, J. D.

    1971-01-01

    Several criteria that may be used to evaluate the merits of some of the existing techniques for the dynamic analysis of large structures which involve division into substructures or components are examined. These techniques make use of component displacement modes to synthesize global systems of generalized coordinates and, for that reason, they have come to be known as modal synthesis or component mode methods. Two techniques have been found to be particularly useful: the modal synthesis method with fixed attachment modes, and the modal synthesis method with free attachment modes. These two methods are treated in detail, and general flow charts are presented for guidance in computer programming.
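
    The fixed-attachment-mode variant is commonly realized today as a Craig-Bampton fixed-interface reduction; a minimal sketch is given below, with the matrices, boundary partition, and retained mode count chosen purely for illustration.

        import numpy as np
        from scipy.linalg import eigh

        def craig_bampton(K, M, boundary, n_modes):
            """Fixed-interface reduction: boundary DOFs are kept physically,
            interior DOFs are replaced by a few fixed-interface normal modes
            plus static constraint modes."""
            n = K.shape[0]
            b = np.asarray(boundary)
            i = np.setdiff1d(np.arange(n), b)
            Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
            psi = -np.linalg.solve(Kii, Kib)          # constraint modes
            w2, phi = eigh(Kii, M[np.ix_(i, i)])      # fixed-interface modes
            T = np.zeros((n, n_modes + b.size))
            T[np.ix_(i, np.arange(n_modes))] = phi[:, :n_modes]
            T[np.ix_(i, n_modes + np.arange(b.size))] = psi
            T[np.ix_(b, n_modes + np.arange(b.size))] = np.eye(b.size)
            return T.T @ K @ T, T.T @ M @ T           # reduced K, M

        # toy 5-DOF spring-mass chain; last DOF is the attachment boundary
        K = 2 * np.eye(5) - np.eye(5, k=1) - np.eye(5, k=-1)
        Kr, Mr = craig_bampton(K, np.eye(5), boundary=[4], n_modes=2)
        print(Kr.shape)   # (3, 3): two modal + one boundary coordinate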

  1. Diode laser based water vapor DIAL using modulated pulse technique

    NASA Astrophysics Data System (ADS)

    Pham, Phong Le Hoai; Abo, Makoto

    2014-11-01

    In this paper, we propose a diode-laser-based differential absorption lidar (DIAL) for measuring the lower-tropospheric water vapor profile using the modulated pulse technique. The transmitter is based on a single-mode diode laser and a tapered semiconductor optical amplifier with a peak power of 10 W in the 800 nm absorption band, and the receiver telescope diameter is 35 cm. The selected wavelengths are compared to reference wavelengths in terms of random and systematic errors. The key component of the modulated pulse technique, a macropulse, is generated with a repetition rate of 10 kHz, and the modulation within the macropulse is coded according to a pseudorandom sequence with a 100 ns chip width. We evaluate both the single pulse modulation and the pseudorandom coded pulse modulation techniques. The water vapor profiles derived from these modulation techniques are compared with real observation data taken in summer in Japan.
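
    The decoding step of the pseudorandom coded technique can be sketched as follows, assuming a 255-chip m-sequence as the PN code and a toy two-scatterer range profile; correlating the on/off-coded return against the ±1 version of an m-sequence recovers the profile exactly in the noise-free case.

        import numpy as np

        def mseq8():
            # LFSR for the primitive polynomial x^8+x^6+x^5+x^4+1 (255 chips)
            reg, out = [1] * 8, []
            for _ in range(255):
                out.append(reg[7])
                fb = reg[7] ^ reg[5] ^ reg[4] ^ reg[3]
                reg = [fb] + reg[:7]
            return np.array(out)

        code = mseq8()                                   # on/off chip sequence
        profile = np.zeros(255)
        profile[[40, 90]] = [1.0, 0.4]                   # toy scatterers

        # received signal: circular convolution of chips with range profile
        rx = np.fft.ifft(np.fft.fft(code) * np.fft.fft(profile)).real

        # decode by correlating against the +/-1 version of the code
        ref = 2 * code - 1
        rec = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(ref))).real
        print(np.argsort(rec)[-2:])                      # -> bins 90 and 40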

  2. Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers

    DTIC Science & Technology

    1975-03-26

    sources of data - There are many additional sources of data which may produce facts useful during a root cause analysis. Each fact can be applied to the ... supporting or refuting data columns of the root cause analysis chart, to help establish the root cause mode of failure. Some of the sources of data are as ... facts when performing a root cause analysis. Use verified data from all available sources. Take steps to verify all data used as rapidly as possible.

  3. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-mass compounds such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also discussed, as are other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and contaminants including pesticides and antibiotics. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, CE is successfully used in routine food analysis.

  4. Image analysis techniques for the study of turbulent flows

    NASA Astrophysics Data System (ADS)

    Ferrari, Simone

    In this paper, a brief review of digital image analysis techniques employed in fluid mechanics for the study of turbulent flows is given, with a particular focus on the techniques developed by the research teams the author worked in, which can be considered relatively "low cost" techniques. Digital image analysis techniques have the advantage, compared with traditional techniques employing physical point probes, of being non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they make it possible to obtain two-dimensional or three-dimensional fields of the measured quantity in less time. Traditionally, the disadvantages have been related to the acquisition frequency, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 kHz to the order of 1 MHz. Digital image analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration, and pressure fields with similar equipment and setups, and can consequently be considered a flexible and powerful tool for measurements on turbulent flows.
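
    One widely used example of such techniques is particle image velocimetry, whose core step, FFT-based cross-correlation of two interrogation windows to find the local displacement, is sketched below on synthetic frames with a known shift.

        import numpy as np

        def window_displacement(win_a, win_b):
            # circular cross-correlation via FFT; the peak gives the most
            # probable particle displacement between the two exposures
            a = win_a - win_a.mean()
            b = win_b - win_b.mean()
            corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
            corr = np.fft.fftshift(corr)
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            dy, dx = np.array(peak) - np.array(corr.shape) // 2
            return dx, dy

        rng = np.random.default_rng(0)
        frame = rng.random((64, 64))
        shifted = np.roll(frame, (3, -2), axis=(0, 1))   # known displacement
        print(window_displacement(frame, shifted))       # -> (-2, 3)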

  5. Impact of Knowledge-Based Techniques on Emerging Technologies

    DTIC Science & Technology

    2006-09-01

    coherent location (PCL), tracking in multistatic radar, and ‘spatial denial’ as a waveform diversity technique to prevent the exploitation by an enemy...performing a variety of surveillance and tracking tasks. Knowledge-based processing may be used to control the scheduling of tasks in such a radar, showing...techniques to bistatic and multistatic radar, including the use of information on waveform properties in passive coherent location (PCL), tracking

  6. Bond strength with custom base indirect bonding techniques.

    PubMed

    Klocke, Arndt; Shi, Jianmin; Kahl-Nieke, Bärbel; Bismayer, Ulrich

    2003-04-01

    Different types of adhesives for indirect bonding techniques have been introduced recently. But there is limited information regarding bond strength with these new materials. In this in vitro investigation, stainless steel brackets were bonded to 100 permanent bovine incisors using the Thomas technique, the modified Thomas technique, and light-cured direct bonding for a control group. The following five groups of 20 teeth each were formed: (1) modified Thomas technique with thermally cured base composite (Therma Cure) and chemically cured sealant (Maximum Cure), (2) Thomas technique with thermally cured base composite (Therma Cure) and chemically cured sealant (Custom I Q), (3) Thomas technique with light-cured base composite (Transbond XT) and chemically cured sealant (Sondhi Rapid Set), (4) modified Thomas technique with chemically cured base adhesive (Phase II) and chemically cured sealant (Maximum Cure), and (5) control group directly bonded with light-cured adhesive (Transbond XT). Mean bond strengths in groups 3, 4, and 5 were 14.99 +/- 2.85, 15.41 +/- 3.21, and 13.88 +/- 2.33 MPa, respectively, and these groups were not significantly different from each other. Groups 1 (mean bond strength 7.28 +/- 4.88 MPa) and 2 (mean bond strength 7.07 +/- 4.11 MPa) showed significantly lower bond strengths than groups 3, 4, and 5 and a higher probability of bond failure. Both the original (group 2) and the modified (group 1) Thomas technique were able to achieve bond strengths comparable to the light-cured direct bonded control group.

  7. Review of surface profile measurement techniques based on optical interferometry

    NASA Astrophysics Data System (ADS)

    Wang, Yunzhi; Xie, Fang; Ma, Sen; Dong, Lianlian

    2017-06-01

    With the fast development of modern science and technology, two- or three-dimensional surface profile measurement techniques with high resolution and large dynamic range are urgently required. Among them, techniques based on optical interferometry are widely used for their non-contact operation, high resolution, large dynamic measurement range, and well-defined traceability route to the definition of the meter. This paper reviews surface profile measurement techniques based on optical interferometry, with a detailed classification sorted by operating principle. Examples in each category are discussed and analyzed for better understanding.
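
    One classic interferometric recipe, four-step phase shifting, is easy to sketch end to end; the wavelength and toy surface below are assumed values, and the example stays within one fringe so phase unwrapping is trivial.

        import numpy as np

        lam = 0.633                                   # HeNe wavelength, um (assumed)
        x = np.linspace(0, 1, 512)
        h = 0.05 * np.sin(2 * np.pi * 3 * x)          # toy surface profile, um
        phi = 4 * np.pi * h / lam                     # double-pass phase

        # four interferograms with pi/2 phase steps
        I = [1 + np.cos(phi + k * np.pi / 2) for k in range(4)]
        phi_w = np.arctan2(I[3] - I[1], I[0] - I[2])  # wrapped phase
        h_rec = np.unwrap(phi_w) * lam / (4 * np.pi)  # recovered height
        print(np.allclose(h_rec, h, atol=1e-9))       # -> True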

  8. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357
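
    To make the proposed domains concrete, the sketch below computes one illustrative measure from each of three domains (statistical, geometric, informational) on a synthetic R-R interval series; it is an example of the genre, not the paper's toolkit.

        import numpy as np

        def variability_measures(rr):
            rr = np.asarray(rr, dtype=float)
            sdnn = rr.std(ddof=1)                      # statistical: SDNN
            d = np.diff(rr)
            sd1 = np.sqrt(np.var(d, ddof=1) / 2)       # geometric: Poincare SD1
            counts, _ = np.histogram(rr, bins=16)
            p = counts[counts > 0] / counts.sum()
            shannon = -(p * np.log2(p)).sum()          # informational: entropy
            return sdnn, sd1, shannon

        rng = np.random.default_rng(4)
        rr = 800 + 40 * np.sin(np.linspace(0, 6, 300)) + rng.normal(0, 15, 300)
        print(variability_measures(rr))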

  9. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.

  10. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
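
    The two-color step reduces to an intensity ratio mapped through a calibration curve; in the sketch below the calibration points are hypothetical stand-ins for a measured phosphor calibration.

        import numpy as np

        cal_ratio = np.array([0.2, 0.5, 1.0, 1.8, 3.0])      # assumed I1/I2
        cal_temp = np.array([300., 350., 400., 450., 500.])  # kelvin

        def temperature_map(img_1, img_2):
            # ratio of the two wavelength-filtered images -> temperature
            ratio = img_1 / np.clip(img_2, 1e-6, None)
            return np.interp(ratio, cal_ratio, cal_temp)

        rng = np.random.default_rng(6)
        i1, i2 = rng.uniform(0.5, 2.0, (2, 128, 128))
        print(temperature_map(i1, i2).mean())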

  11. Toner and paper-based fabrication techniques for microfluidic applications.

    PubMed

    Coltro, Wendell Karlos Tomazelli; de Jesus, Dosil Pereira; da Silva, José Alberto Fracassi; do Lago, Claudimir Lucio; Carrilho, Emanuel

    2010-08-01

    The interest in low-cost microfluidic platforms as well as emerging microfabrication techniques has increased considerably over the last years. Toner- and paper-based techniques have appeared as two of the most promising platforms for the production of disposable devices for on-chip applications. This review focuses on recent advances in the fabrication techniques and in the analytical/bioanalytical applications of toner and paper-based devices. The discussion is divided in two parts dealing with (i) toner and (ii) paper devices. Examples of miniaturized devices fabricated by using direct-printing or toner transfer masking in polyester-toner, glass, PDMS as well as conductive platforms as recordable compact disks and printed circuit board are presented. The construction and the use of paper-based devices for off-site diagnosis and bioassays are also described to cover this emerging platform for low-cost diagnostics.

  12. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    PubMed Central

    Almeida, Vânia G.; Vieira, João; Santos, Pedro; Pereira, Tânia; Pereira, H. Catarina; Correia, Carlos; Pego, Mariano; Cardoso, João

    2013-01-01

    The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks process each hemodynamic parameter individually and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from the APW. To this end, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used to collect data from 50 subjects; (2) the acquired position and amplitude of the onset, Systolic Peak (SP), Point of Inflection (Pi), and Dicrotic Wave (DW) were used to compute a set of morphological attributes; (3) the datasets were pre-processed to reduce the number of input features and increase model accuracy by selecting the most relevant ones; (4) the dataset was classified using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree), and RIPPER (rule-based induction); and (5) the trained models were evaluated, using a majority-voting system, against the respective calculated Augmentation Index (AIx). The classification algorithms proved efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high-risk labels retrieved from the multi-parametric approach and positive AIx values was verified. This approach allows the design of new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving arterial pulse understanding, especially compared with traditional single-parameter analysis, where failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation. PMID
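
    Step (4) can be illustrated with one of the named classifiers; the sketch below cross-validates a Random Forest on stand-in feature vectors (the study's real inputs were morphological attributes from 50 subjects, and J48, RIPPER, and BayesNet were run as well).

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # hypothetical stand-ins for vectorized APW features and risk labels
        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 8))       # 50 subjects, 8 morphological features
        y = rng.integers(0, 2, size=50)    # low/high-risk labels

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print(cross_val_score(clf, X, y, cv=5).mean())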

  13. Emerging techniques for soil analysis via mid-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, in particular: 1. Attenuated total reflectance (ATR). Attenuated total reflectance is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range, as well as the absorbance of some soil constituents (e.g., calcium carbonate), interferes with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species. 2. Photo-acoustic spectroscopy. Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are
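
    Since ATR sensitivity hinges on the few-micron penetration depth mentioned above, the standard evanescent-wave formula is worth a sketch; the crystal and sample refractive indices below are assumed illustrative values.

        import numpy as np

        def penetration_depth(wavelength_um, n_crystal, n_sample, theta_deg):
            # d_p = lambda / (2*pi*n1*sqrt(sin(theta)^2 - (n2/n1)^2))
            theta = np.radians(theta_deg)
            return wavelength_um / (
                2 * np.pi * n_crystal
                * np.sqrt(np.sin(theta) ** 2 - (n_sample / n_crystal) ** 2))

        # e.g., ZnSe crystal (n1 ~ 2.4), wet soil paste (n2 ~ 1.4), 45 degrees
        print(penetration_depth(10.0, 2.4, 1.4, 45.0))   # a couple of microns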

  14. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. Work to date has focused less on applications of the technique and more on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. Particular attention is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  15. Laser-based direct-write techniques for cell printing.

    PubMed

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2010-09-01

    Fabrication of cellular constructs with spatial control of cell location (+/-5 microm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. Work to date has focused less on applications of the technique and more on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. Particular attention is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing.

  16. Surface plasmon resonance based biosensor technique: a review.

    PubMed

    Guo, Xiaowei

    2012-07-01

    Optical surface plasmon resonance (SPR) biosensors represent the most advanced and developed optical label-free biosensor technology. Optical SPR biosensors are a powerful detection and analysis tool with vast applications in environmental protection, biotechnology, medical diagnostics, drug screening, food safety, and security. This article reviews recent developments in SPR biosensor techniques, including bulk SPR and localized SPR (LSPR) biosensors, for detecting interactions between an analyte of interest in solution and a biomolecular recognition element. The concepts of bulk and localized SPs and the working principles of both sensing techniques are introduced. Major sensing advances in biorecognition elements, measurement formats, and sensing platforms are presented. Finally, both SPR sensing techniques are discussed and compared.

  17. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    SciTech Connect

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
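
    A dimension sweep of this kind is easy to sketch; the fragment below uses PCA with a simple Gaussian naive Bayes classifier on random stand-in features (the paper's sorted-covariance and correlation-based variants, and its actual object-code features, are not reproduced).

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import cross_val_score

        # hypothetical features computed from object-code windows + labels
        rng = np.random.default_rng(1)
        X = rng.normal(size=(400, 64))
        y = rng.integers(0, 2, size=400)

        for k in (2, 8, 32):    # accuracy as the number of dimensions grows
            Xk = PCA(n_components=k).fit_transform(X)
            acc = cross_val_score(GaussianNB(), Xk, y, cv=5).mean()
            print(k, round(acc, 3))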

  18. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  19. Ivory species identification using electrophoresis-based techniques.

    PubMed

    Kitpipit, Thitika; Thanakiatkrai, Phuvadol; Penchart, Kitichaya; Ouithavon, Kanita; Satasook, Chutamas; Linacre, Adrian

    2016-12-01

    Despite continuous conservation efforts by national and international organizations, the populations of the three extant elephant species are still declining dramatically due to the illegal ivory trade and the associated killing of elephants. A requirement to aid investigations and prosecutions is the accurate identification of the elephant species from which the ivory was removed. We report on the development of the first fully validated multiplex PCR-electrophoresis assay for ivory DNA analysis that can be used as a screening or confirmatory test. SNPs from the NADH dehydrogenase 5 and cytochrome b gene loci were identified and used in the development of the assay. The three extant elephant species could be identified based on three peaks/bands: Elephas maximus exhibited two distinct PCR fragments at approximately 129 and 381 bp; Loxodonta cyclotis showed two PCR fragments at 89 and 129 bp; and Loxodonta africana showed a single fragment of 129 bp. The assay correctly identified the elephant species in all 113 ivory and blood samples used in this report. We also report on the high sensitivity and specificity of the assay. All single-blinded samples were correctly classified, which demonstrates the assay's suitability for real casework. In addition, the assay can be used in conjunction with the technique of direct amplification. We propose that the test will benefit wildlife forensic laboratories and aid in the transition to the criminal justice system.

  20. [THE COMPARATIVE ANALYSIS OF TECHNIQUES OF IDENTIFICATION OF CORYNEBACTERIUM NON DIPHTHERIAE].

    PubMed

    Kharseeva, G G; Voronina, N A; Mironov, A Yu; Alutina, E L

    2015-12-01

    A comparative analysis was carried out of the effectiveness of three techniques for the identification of Corynebacterium non diphtheriae: bacteriological, molecular genetic (16S rRNA sequencing), and mass-spectrometric (MALDI-ToF MS). The analysis covered 49 strains of Corynebacterium non diphtheriae (C. pseudodiphtheriticum, C. amycolatum, C. propinquum, C. falsenii) and 2 strains of Corynebacterium diphtheriae isolated from various pathologies of the urogenital tract and upper respiratory tract. The corynebacteria were identified using the bacteriological technique, 16S rRNA sequencing, and the mass-spectrometric technique (MALDI-ToF MS). Full concordance of species identification across the three techniques was observed for 26 (51%) strains of Corynebacterium non diphtheriae; for 43 (84.3%) strains when the bacteriological technique was compared with 16S rRNA sequencing; and for 29 (57%) when mass-spectrometric analysis was compared with 16S rRNA sequencing. The bacteriological technique is effective for identification of Corynebacterium diphtheriae. For precise establishment of the species of corynebacteria with variable biochemical characteristics, the molecular genetic technique should be applied. The mass-spectrometric technique (MALDI-ToF MS) requires further updating of its databases to identify a larger spectrum of representatives of the genus Corynebacterium.

  1. A comparative analysis of soft computing techniques for gene prediction.

    PubMed

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, and especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described, followed by different soft computing techniques and their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of current research activities and future research directions are provided.

  2. A comparative analysis of the accuracy of implant transfer techniques.

    PubMed

    Hsu, C C; Millstein, P L; Stein, R S

    1993-06-01

    Four different implant transfer techniques using two master cast systems (solid cast and Zeiser system) were evaluated and compared with respect to the accuracy with which abutment positions were reproduced. A stainless steel experimental analogue with two anterior and two posterior fixtures and abutments was fabricated. Polyether impressions (14 each) were made by use of four techniques, (I) nonsplinted, (II) splinted with dental floss and acrylic resin, (III) splinted with orthodontic wire and acrylic resin, and (IV) splinted with acrylic resin alone. The fourteen impressions of each technique were divided into two equal groups: group 1, solid cast system, and group 2, Zeiser system. The abutments of each master cast were measured vertically and horizontally with a profile projector. Statistical analysis indicated no significant difference between the splinted and nonsplinted techniques. The Zeiser system provided more accurate interabutment relationships for the posterior region than the solid cast system.

  3. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.
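
    The essence of PRF hopping, choosing the absolute Doppler centroid most consistent with the ambiguous measurements from several PRFs, can be sketched as a brute-force search; the PRF pair, search bound, and grid below are assumed values.

        import numpy as np

        prfs = np.array([1500.0, 1700.0])     # assumed PRF pair, Hz
        true_fd = 3210.0                      # hidden Doppler centroid, Hz
        measured = true_fd % prfs             # ambiguous estimate per PRF

        def wrap_dist(f, m, p):
            # distance between f and measurement m on the PRF circle
            d = (f - m) % p
            return min(d, p - d)

        candidates = np.arange(-5000.0, 5000.0, 1.0)
        cost = [sum(wrap_dist(f, m, p) ** 2 for m, p in zip(measured, prfs))
                for f in candidates]
        print(candidates[int(np.argmin(cost))])   # -> 3210.0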

  4. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  5. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  6. Projectile Base Flow Analysis

    DTIC Science & Technology

    2007-11-02

    Report documentation fragments only; recoverable details: performing organization DCW Industries, Inc., La Canada, CA 91011; report number DCW-38-R-05; sponsoring/monitoring agency U.S. Army Research Office. Cited references include Wilcox, D. C., Turbulence Modeling for CFD, Second Edition, DCW Industries, Inc., La Cañada, CA, and Wilcox, D. C. (2001), "Projectile Base Flow Analysis," DCW ...

  7. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  8. Magnetic separation techniques in sample preparation for biological analysis: a review.

    PubMed

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed.

  9. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  10. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
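
    For two predictors the commonality partition reduces to three numbers derived from the R-squared values of the full and single-predictor models, as in this sketch on synthetic data.

        import numpy as np

        def r2(X, y):
            # coefficient of determination from OLS with an intercept
            X1 = np.column_stack([np.ones(len(y)), X])
            beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
            return 1.0 - (y - X1 @ beta).var() / y.var()

        rng = np.random.default_rng(5)
        x1 = rng.normal(size=200)
        x2 = 0.6 * x1 + 0.8 * rng.normal(size=200)   # correlated predictors
        y = x1 + x2 + rng.normal(size=200)

        r_full = r2(np.column_stack([x1, x2]), y)
        u1 = r_full - r2(x2[:, None], y)             # unique to x1
        u2 = r_full - r2(x1[:, None], y)             # unique to x2
        common = r_full - u1 - u2                    # shared variance
        print(round(u1, 3), round(u2, 3), round(common, 3))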

  11. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  12. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  13. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  14. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.

  15. Nitrous oxide-based techniques versus nitrous oxide-free techniques for general anaesthesia.

    PubMed

    Sun, Rao; Jia, Wen Qin; Zhang, Peng; Yang, KeHu; Tian, Jin Hui; Ma, Bin; Liu, Yali; Jia, Run H; Luo, Xiao F; Kuriyama, Akira

    2015-11-06

    anaesthesia (or both) with any general anaesthesia using a volatile anaesthetic or propofol-based maintenance of anaesthesia but no nitrous oxide for adults undergoing surgery. Our primary outcome was inhospital case fatality rate. Secondary outcomes were complications and length of stay. Two review authors independently assessed trial quality and extracted the outcome data. We used meta-analysis for data synthesis. Heterogeneity was examined with the Chi² test and by calculating the I² statistic. We used a fixed-effect model if the measure of inconsistency was low for all comparisons (I² statistic < 50%); otherwise we used a random-effects model for measures with high inconsistency. We undertook subgroup analyses to explore inconsistency and sensitivity analyses to evaluate whether the results were robust. We assessed the quality of evidence of the main outcomes using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) system. We included 35 trials (13,872 adult participants). Seven included studies were at low risk of bias. We identified eight studies as awaiting classification since we could not obtain the full texts, and had insufficient information to include or exclude them. We included data from 24 trials for quantitative synthesis. The results of meta-analyses showed that nitrous oxide-based techniques increased the incidence of pulmonary atelectasis (odds ratio (OR) 1.57, 95% confidence interval (CI) 1.18 to 2.10, P = 0.002), but had no effects on the inhospital case fatality rate, the incidence of pneumonia, myocardial infarction, stroke, severe nausea and vomiting, venous thromboembolism, wound infection, or the length of hospital stay. The sensitivity analyses suggested that the results of the meta-analyses were all robust except for the outcomes of pneumonia, and severe nausea and vomiting. Two trials reported length of intensive care unit (ICU) stay but the data were skewed so were not pooled. Both trials reported that nitrous oxide-based
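
    The heterogeneity rule quoted above (fixed-effect model when the I² statistic is below 50%) rests on Cochran's Q; a minimal computation with made-up study effects and variances is sketched below.

        import numpy as np

        effects = np.array([0.12, 0.30, 0.25, 0.05])      # made-up effects
        variances = np.array([0.010, 0.020, 0.015, 0.010])

        w = 1.0 / variances
        pooled = np.sum(w * effects) / np.sum(w)   # fixed-effect estimate
        q = np.sum(w * (effects - pooled) ** 2)    # Cochran's Q
        df = len(effects) - 1
        i2 = max(0.0, (q - df) / q) * 100.0        # inconsistency, percent
        print(round(i2, 1), "fixed-effect" if i2 < 50 else "random-effects")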

  16. Canonical Analysis as a Generalized Regression Technique for Multivariate Analysis.

    ERIC Educational Resources Information Center

    Williams, John D.

    Characteristic coding (dummy coding) is used to show solutions to four multivariate problems using canonical analysis. The canonical variates can themselves be analyzed by the use of multiple linear regression. When the canonical variates are used as criteria in a multiple linear regression, the R2 values are equal to 0, where 0 is…

  17. A Novel Nanofabrication Technique of Silicon-Based Nanostructures

    NASA Astrophysics Data System (ADS)

    Meng, Lingkuan; He, Xiaobin; Gao, Jianfeng; Li, Junjie; Wei, Yayi; Yan, Jiang

    2016-11-01

    A novel nanofabrication technique which can produce highly controlled silicon-based nanostructures at wafer scale has been proposed, using a simple amorphous silicon (α-Si) material as an etch mask. Directly fabricated SiO2 nanostructures can serve as nanotemplates for transfer into underlying substrates such as silicon, germanium, transistor gate, or other dielectric materials to form electrically functional nanostructures and devices. In this paper, two typical silicon-based nanostructures, a nanoline and a nanofin, have been successfully fabricated by this technique, demonstrating excellent etch performance. In addition, the silicon nanostructures fabricated above can be further trimmed to less than 10 nm by combining the technique with assisted post-treatment methods. The novel nanofabrication technique is expected to become a new emerging technology with low process complexity and good compatibility with existing silicon integrated circuits, and is an important step towards the easy fabrication of a wide variety of nanoelectronics, biosensors, and optoelectronic devices.

  18. Membrane-based microextraction techniques in analytical chemistry: A review.

    PubMed

    Carasek, Eduardo; Merib, Josias

    2015-06-23

    The use of membrane-based sample preparation techniques in analytical chemistry has gained growing attention from the scientific community since the development of miniaturized sample preparation procedures in the 1990s. The use of membranes makes microextraction procedures more stable, allowing the determination of analytes in complex and "dirty" samples. This review describes some characteristics of classical membrane-based microextraction techniques (membrane-protected solid-phase microextraction, hollow-fiber liquid-phase microextraction, and hollow-fiber renewal liquid membrane) as well as some alternative configurations (thin film and electromembrane extraction) used successfully for the determination of different analytes in a large variety of matrices; some critical points regarding each technique are highlighted.

  19. A Novel Nanofabrication Technique of Silicon-Based Nanostructures.

    PubMed

    Meng, Lingkuan; He, Xiaobin; Gao, Jianfeng; Li, Junjie; Wei, Yayi; Yan, Jiang

    2016-12-01

    A novel nanofabrication technique which can produce highly controlled silicon-based nanostructures at wafer scale has been proposed, using a simple amorphous silicon (α-Si) material as an etch mask. Directly fabricated SiO2 nanostructures can serve as nanotemplates for transfer into underlying substrates such as silicon, germanium, transistor gate, or other dielectric materials to form electrically functional nanostructures and devices. In this paper, two typical silicon-based nanostructures, a nanoline and a nanofin, have been successfully fabricated by this technique, demonstrating excellent etch performance. In addition, the silicon nanostructures fabricated above can be further trimmed to less than 10 nm by combining the technique with assisted post-treatment methods. The novel nanofabrication technique is expected to become a new emerging technology with low process complexity and good compatibility with existing silicon integrated circuits, and is an important step towards the easy fabrication of a wide variety of nanoelectronics, biosensors, and optoelectronic devices.

  20. Chemiresistors based on conducting polymers: a review on measurement techniques.

    PubMed

    Lange, Ulrich; Mirsky, Vladimir M

    2011-02-21

    This review covers the development of measurement configurations for chemiresistors based on conducting polymers. The simplest chemiresistors are based on the application of a two-electrode technique. Artifacts caused by contact resistance can be overcome by applying a four-electrode technique, and simultaneous application of the two- and four-electrode measurement configurations provides an internal control of sensor integrity. Incorporating two additional electrodes, which control the redox state of the chemosensitive polymer and are connected to the measurement electrodes through a liquid or (quasi)solid electrolyte, results in a six-electrode technique; the electrically driven regeneration of such sensors allows fast and completely reversible measurements.

  1. Recent Electrochemical and Optical Sensors in Flow-Based Analysis

    PubMed Central

    Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn

    2006-01-01

    Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques are covered in this overview, with a brief description of the fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA).

  2. Advanced alloy design technique: High temperature cobalt base superalloy

    NASA Technical Reports Server (NTRS)

    Dreshfield, R. L.; Freche, J. C.; Sandrock, G. D.

    1972-01-01

    An advanced alloy design technique was developed for treating alloys so that they have extended life in service at high and intermediate temperatures. The process stabilizes the microstructure of the alloy by designing it so that the compound identified with embrittlement is eliminated or minimized. The design process is being used to develop both nickel- and cobalt-base superalloys.

  3. Image encryption techniques based on the fractional Fourier transform

    NASA Astrophysics Data System (ADS)

    Hennelly, B. M.; Sheridan, J. T.

    2003-11-01

    The fractional Fourier transform (FRT) is a generalisation of the Fourier transform which allows domains of mixed spatial frequency and spatial information to be examined. A number of methods have recently been proposed in the literature for the encryption of two-dimensional information using optical systems based on the FRT. Typically, these methods require random phase screen keys to decrypt the data, which must be stored at the receiver and carefully aligned with the received encrypted data. We have proposed a new technique based on a random shifting, or jigsaw, transformation. This method does not require the use of phase keys: the image is encrypted by juxtaposition of sections of the image in various FRT domains. The new method has been compared numerically with existing methods and shows comparable or superior robustness to blind decryption. An optical implementation is also proposed, and the sensitivity of the various encryption keys to blind decryption is quantified. We also present a second image encryption technique, based on a recently proposed method of optical phase retrieval using the optical FRT and one of its discrete counterparts. Numerical simulations of the new algorithm indicate that the sensitivity of the keys is much greater than in any of the techniques currently available. In fact, the sensitivity appears to be so high that optical implementation based on existing optical signal processing technology may be impossible. However, the technique has been shown to be a powerful method of 2-D image data encryption.
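
    The jigsaw step can be illustrated in isolation: the sketch below shuffles image blocks under a seeded permutation that serves as the key, in the plain spatial domain; the proposed method applies such shuffles between successive FRT domains, which is not reproduced here.

        import numpy as np

        def jigsaw(img, block=8, seed=42):
            # shuffle non-overlapping blocks under a key-derived permutation
            h, w = img.shape
            bw = w // block
            n = (h // block) * bw
            blocks = [img[r * block:(r + 1) * block, c * block:(c + 1) * block]
                      for r in range(h // block) for c in range(bw)]
            perm = np.random.default_rng(seed).permutation(n)
            out = np.empty_like(img)
            for dst, src in enumerate(perm):
                r, c = divmod(dst, bw)
                out[r * block:(r + 1) * block,
                    c * block:(c + 1) * block] = blocks[src]
            return out, perm

        def unjigsaw(enc, perm, block=8):
            # invert the permutation: encrypted block dst came from slot src
            h, w = enc.shape
            bw = w // block
            out = np.empty_like(enc)
            for dst, src in enumerate(perm):
                rd, cd = divmod(dst, bw)
                rs, cs = divmod(src, bw)
                out[rs * block:(rs + 1) * block, cs * block:(cs + 1) * block] = \
                    enc[rd * block:(rd + 1) * block, cd * block:(cd + 1) * block]
            return out

        img = np.arange(64 * 64, dtype=float).reshape(64, 64)
        enc, key = jigsaw(img)
        print(np.array_equal(unjigsaw(enc, key), img))   # -> True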

  4. What Child Analysis Can Teach Us about Psychoanalytic Technique.

    PubMed

    Ablon, Steven Luria

    2014-01-01

    Child analysis has much to teach us about analytic technique. Children have an innate, developmentally driven sense of analytic process. Children in analysis underscore the importance of an understanding and belief in the therapeutic action of play, the provisional aspects of play, and that not all play will be understood. Each analysis requires learning a new play signature that is constantly reorganized. Child analysis emphasizes the emergence and integration of dissociated states, the negotiation of self-other relationships, the importance of co-creation, and the child's awareness of the analyst's sensibility. Child analysis highlights the robust nature of transference and how working through and repairing is related to the initiation of coordinated patterns of high predictability in the context of deep attachments. I will illustrate these and other ideas in the description of the analysis of a nine-year-old boy.

  5. Knowledge-based GIS techniques applied to geological engineering

    USGS Publications Warehouse

    Usery, E. Lynn; Altheide, Phyllis; Deister, Robin R.P.; Barr, David J.

    1988-01-01

    A knowledge-based geographic information system (KBGIS) approach which requires development of a rule base for both GIS processing and for the geological engineering application has been implemented. The rule bases are implemented in the Goldworks expert system development shell interfaced to the Earth Resources Data Analysis System (ERDAS) raster-based GIS for input and output. GIS analysis procedures including recoding, intersection, and union are controlled by the rule base, and the geological engineering map product is generated by the expert system. The KBGIS has been used to generate a geological engineering map of Creve Coeur, Missouri.

  6. Comparative analysis of infrared images degraded by lossy compression techniques

    NASA Astrophysics Data System (ADS)

    Toussaint, W. A.; Weber, Reed A.

    2015-09-01

    This work addresses image degradation introduced by lossy compression techniques and the effects of such degradation on signal detection statistics for applications in fast-framing (<100 Hz) IR image analysis. As future space systems make use of increasingly higher pixel count IR focal plane arrays, data generation rates are anticipated to become too copious for continuous download. The prevailing solution to this issue has been to compress image data prior to downlink. While this solution is application independent for lossless compression, the expected benefits of lossy compression, including higher compression ratio, necessitate several application specific trades in order to characterize preservation of critical information within the data. Current analyses via standard statistical image processing techniques following tunably lossy compression algorithms (JPEG2000, JPEG-LS) allow for detection statistics nearly identical to analyses following standard lossless compression techniques, such as Rice and PNG, even at degradation levels offering a greater than twofold increase in compression ratio. Ongoing efforts focus on repeating the analysis for other tunably lossy compression techniques while also assessing the relative computational burden of each algorithm. Current results suggest that lossy compression techniques can preserve critical information in fast-framing IR data while either significantly reducing downlink bandwidth requirements or significantly increasing the usable focal plane array window size.

  7. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    NASA Astrophysics Data System (ADS)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation of protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. Analysis of these phosphorylation cascades can provide new insights into their physiological roles in many biological processes. Unfortunately, existing methods are limited in their ability to analyze cascade activity. We therefore suggest a microfluidic isoelectric focusing (μIEF) technique for the analysis of cascade activity. Using this technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. In addition, an inhibition assay for kinase activity and an analysis of a real sample have also been conducted. The results indicate that μIEF is an excellent means for studying phosphorylation cascade activity.
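
    IEF separates species by isoelectric point (pI), and each added phosphate group lowers a peptide's pI, which is what lets sequential phosphorylation appear as distinct focused bands. The sketch below illustrates only that principle, not the paper's chip or assay: it computes net charge versus pH from the Henderson-Hasselbalch relation, with textbook-style pKa values that are illustrative assumptions, and bisects for the pH of zero net charge.

```python
def net_charge(pH, basic_pkas, acidic_pkas):
    """Net charge via Henderson-Hasselbalch: basic groups contribute
    +1/(1 + 10**(pH - pKa)), acidic groups -1/(1 + 10**(pKa - pH))."""
    pos = sum(1.0 / (1.0 + 10.0 ** (pH - pka)) for pka in basic_pkas)
    neg = sum(-1.0 / (1.0 + 10.0 ** (pka - pH)) for pka in acidic_pkas)
    return pos + neg

def isoelectric_point(basic_pkas, acidic_pkas, lo=0.0, hi=14.0):
    """Bisect for the pH of zero net charge (charge decreases monotonically with pH)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if net_charge(mid, basic_pkas, acidic_pkas) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative substrate: N-terminus and one Lys (basic); C-terminus and one Asp (acidic).
basic = [9.0, 10.5]
acidic = [2.0, 3.9]

# Each phosphorylation adds a phosphate with two acidic ionizations (illustrative pKas),
# so the pI drops step by step along the kinase cascade.
for n_phos in range(3):
    pI = isoelectric_point(basic, acidic + n_phos * [1.2, 6.5])
    print(f"{n_phos} phosphate group(s): pI = {pI:.2f}")
```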

  8. Intestinal Preparation Techniques for Histological Analysis in the Mouse.

    PubMed

    Williams, Jonathan M; Duckworth, Carrie A; Vowell, Kate; Burkitt, Michael D; Pritchard, D Mark

    2016-06-01

    The murine intestinal tract represents a difficult organ system to study due to its long convoluted tubular structure, narrow diameter, and delicate mucosa, which undergoes rapid changes after sampling prior to fixation. These features do not make for easy histological analysis, as rapid fixation in situ, or after simple removal without careful dissection, results in poor postfixation tissue handling and limited options for high quality histological sections. Collecting meaningful quantitative data by analysis of this tissue is further complicated by the anatomical changes in structure along its length. This article describes two methods of intestinal sampling at necropsy that allow systematic histological analysis of the entire intestinal tract, either through examination of cross sections (circumferences) by the gut bundling technique or longitudinal sections by the adapted Swiss roll technique, together with basic methods for data collection. Copyright © 2016 John Wiley & Sons, Inc.

  9. Environmental Immunoassays: Alternative Techniques for Soil and Water Analysis

    USGS Publications Warehouse

    Aga, D.S.; Thurman, E.M.

    1996-01-01

    Analysis of soil and water samples for environmental studies and compliance testing can be formidable, time consuming, and costly. As a consequence, immunochemical techniques have become popular for environmental analysis because they are reliable, rapid, and cost effective. During the past 5 years, the use of immunoassays for environmental monitoring has increased substantially, and their use as an integral analytical tool in many environmental laboratories is now commonplace. This chapter will present the basic concept of immunoassays, recent advances in the development of immunochemical methods, and examples of successful applications of immunoassays in environmental analysis.

  10. Design, data analysis and sampling techniques for clinical research.

    PubMed

    Suresh, Karthik; Thomas, Sanjeev V; Suresh, Geetha

    2011-10-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inferences from their data. Improper choice of study design or data analysis may yield insufficient or misleading results and conclusions. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design, and then back-translating the statistical results into relevant medical knowledge, is a real challenge. This article explains various sampling methods that can be appropriately used in medical research under different scenarios and challenges.
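
    As a minimal illustration of two of the sampling methods such an article typically covers, the sketch below contrasts simple random sampling with proportionate stratified sampling on a hypothetical patient roster; the population, strata, and sample size are invented, and the sketch is not drawn from the article itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical roster: 1000 patients, each labeled with a disease-severity stratum.
patient_ids = np.arange(1000)
strata = rng.choice(["mild", "moderate", "severe"], size=1000, p=[0.6, 0.3, 0.1])

n = 100  # total sample size

# Simple random sampling: every patient equally likely; strata hit only by chance.
srs = rng.choice(patient_ids, size=n, replace=False)

# Stratified proportionate sampling: sample within each stratum in proportion
# to its share of the population, guaranteeing representation of small strata.
stratified = []
for stratum in ("mild", "moderate", "severe"):
    members = patient_ids[strata == stratum]
    k = round(n * len(members) / len(patient_ids))
    stratified.extend(rng.choice(members, size=k, replace=False))

for name, sample in (("SRS", srs), ("stratified", np.array(stratified))):
    counts = {s: int(np.sum(strata[sample] == s)) for s in ("mild", "moderate", "severe")}
    print(name, counts)
```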

  11. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    PubMed Central

    Ibrahim, Mohamed M.; Abdel Kader, Neamat S.; Zorkany, M.

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file during watermark recovery imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique based on image interlacing is proposed to solve this problem. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570
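
    One self-contained piece of the pipeline above is the Arnold transform used for watermark encryption: a repeated, area-preserving shuffle of an N × N image that is exactly invertible, so the scrambled watermark can be recovered after extraction. A minimal NumPy sketch of that step alone (the DWT embedding is omitted, and the iteration count used as a key is an illustrative choice):

```python
import numpy as np

def arnold(img, iterations=1):
    """Arnold cat map on an N x N array: (x, y) -> ((x + y) mod N, (x + 2y) mod N)."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "Arnold transform needs a square image"
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        scrambled[(x + y) % n, (x + 2 * y) % n] = out
        out = scrambled
    return out

def arnold_inverse(img, iterations=1):
    """Inverse map (x, y) -> ((2x - y) mod N, (y - x) mod N);
    [[2, -1], [-1, 1]] is the inverse of the cat-map matrix [[1, 1], [1, 2]]."""
    n = img.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img
    for _ in range(iterations):
        unscrambled = np.empty_like(out)
        unscrambled[(2 * x - y) % n, (y - x) % n] = out
        out = unscrambled
    return out

# Scramble a toy watermark; the iteration count serves as the decryption key.
watermark = np.arange(64 * 64, dtype=np.uint8).reshape(64, 64)
scrambled = arnold(watermark, iterations=7)
assert np.array_equal(arnold_inverse(scrambled, iterations=7), watermark)
print("Arnold scrambling round-trips exactly")
```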

  12. Optical supervised filtering technique based on Hopfield neural network

    NASA Astrophysics Data System (ADS)

    Bal, Abdullah

    2004-11-01

    The Hopfield neural network is commonly preferred for optimization problems. In image segmentation, conventional Hopfield neural networks (HNN) are formulated as a cost-function-minimization problem to perform gray-level thresholding on the image histogram or on the pixels' gray levels arranged in a one-dimensional array [R. Sammouda, N. Niki, H. Nishitani, Pattern Rec. 30 (1997) 921-927; K.S. Cheng, J.S. Lin, C.W. Mao, IEEE Trans. Med. Imag. 15 (1996) 560-567; C. Chang, P. Chung, Image and Vision Comp. 19 (2001) 669-678]. In this paper, a new high-speed supervised filtering technique is proposed for image feature extraction and enhancement problems by modifying the conventional HNN. The essential improvement is the use of a 2D convolution operation instead of weight-matrix multiplication, yielding a neural-network-based filtering technique that requires only a 3 × 3 filter mask instead of a large weight-coefficient matrix. Optical implementation of the proposed technique is straightforward using the joint transform correlator. The non-negative data required for optical implementation are obtained by a bias technique that converts the bipolar data to non-negative form. Simulation results of the proposed optical supervised filtering technique are reported for various feature extraction problems such as edge detection, corner detection, horizontal and vertical line extraction, and fingerprint enhancement.
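
    The core substitution described above, a small convolution mask in place of a full weight matrix, is easy to sketch. The fragment below applies a 3 × 3 edge-extraction mask and then adds a bias to make the bipolar response non-negative, mirroring the bias step mentioned for the optical implementation; the mask values and test image are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(1)

# Test image: dark background with a bright square (edges to extract).
image = np.zeros((64, 64))
image[20:44, 20:44] = 1.0
image += rng.normal(0.0, 0.02, image.shape)

# 3x3 filter mask (Laplacian-style edge mask) replacing an N^2 x N^2 weight matrix.
mask = np.array([[-1.0, -1.0, -1.0],
                 [-1.0,  8.0, -1.0],
                 [-1.0, -1.0, -1.0]])

response = convolve2d(image, mask, mode="same", boundary="symm")

# Bias technique: shift the bipolar response so all values are non-negative,
# as required for an intensity-based optical implementation.
biased = response - response.min()

print("response range:", response.min(), response.max())
print("biased range  :", biased.min(), biased.max())   # now >= 0
```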

  13. Nondestructive analysis of oil shales with PGNAA technique

    SciTech Connect

    Maly, J.; Bozorgmanesh, H.

    1984-02-01

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source, with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by an LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of this processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K generally show good linear proportionality of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used, where the response curve was not linear but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improved neutron self-moderation in sample boxes of rich oil shales, compared with the poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by a microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed with progressively larger amounts of water.
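
    The elemental response curves described above amount to linear calibrations of gamma-line peak area against weight percent. A minimal sketch of that step, with invented peak areas and concentrations purely for illustration, fits the line by least squares and inverts it to assay an unknown sample:

```python
import numpy as np

# Illustrative calibration set: known weight percent of an element (from chemical
# analysis) versus measured gamma-line peak area (arbitrary units).
weight_pct = np.array([0.5, 1.0, 1.5, 2.5, 3.4])
peak_area = np.array([210.0, 405.0, 620.0, 1010.0, 1390.0])

# Least-squares response curve: peak_area ~= slope * weight_pct + intercept.
slope, intercept = np.polyfit(weight_pct, peak_area, deg=1)

# Invert the calibration to estimate concentration from a new measurement.
unknown_area = 800.0
estimate = (unknown_area - intercept) / slope
print(f"response: area = {slope:.1f} * wt% + {intercept:.1f}")
print(f"estimated concentration: {estimate:.2f} wt%")
```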

  14. Machinery Diagnostics Via Mechanical Vibration Analysis using Spectral Analysis Techniques

    DTIC Science & Technology

    1988-09-01

    based on the economics of the situation, it is more advantageous to opt for a continuous monitoring system and/or there are a very large number of...etc. 3 formats, to systems where permanently installed sensors feed into a computer equipped with diagnostic software. 3. Application to Machinery...the intervals will only be optimal for those units which degrade exactly as does the average unit of the class. Those which perform below average may
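
    The abstract survives only in fragments, but the titled technique, spectral analysis of machinery vibration, is standard: faults appear as peaks at characteristic frequencies in the vibration spectrum. The sketch below is a generic illustration of that idea, not the report's specific method; the sampling rate, shaft frequency, and bearing-fault frequency are all invented.

```python
import numpy as np

fs = 2048.0                       # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)   # 2 seconds of vibration data

# Simulated accelerometer signal: shaft rotation at 30 Hz plus a weaker
# component at a hypothetical bearing-fault frequency of 147 Hz, plus noise.
rng = np.random.default_rng(7)
signal = (1.0 * np.sin(2 * np.pi * 30.0 * t)
          + 0.3 * np.sin(2 * np.pi * 147.0 * t)
          + rng.normal(0.0, 0.2, t.size))

# Amplitude spectrum via the FFT (rfft for a real-valued signal).
spectrum = np.abs(np.fft.rfft(signal)) / t.size * 2.0
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

# Report the strongest spectral lines; a line at 147 Hz would flag the fault.
for idx in np.argsort(spectrum)[-2:][::-1]:
    print(f"peak at {freqs[idx]:6.1f} Hz, amplitude {spectrum[idx]:.2f}")
```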

  15. Rain Attenuation Analysis using Synthetic Storm Technique in Malaysia

    NASA Astrophysics Data System (ADS)

    Lwas, A. K.; Islam, Md R.; Chebil, J.; Habaebi, M. H.; Ismail, A. F.; Zyoud, A.; Dao, H.

    2013-12-01

    A generated rain attenuation time series plays an important role in investigating rain fade characteristics in the absence of real fade measurements. A suitable conversion technique can be applied to a measured rain rate time series to produce rain attenuation data that can be used to understand rain fade characteristics. This paper focuses on the applicability of the synthetic storm technique (SST) for converting measured rain rate data into a rain attenuation time series. Its performance is assessed for time series generation over a tropical location, Kuala Lumpur, in Malaysia. Preliminary analysis finds that the SST gives satisfactory estimates of the rain attenuation time series from rain rate measurements over this region.
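
    In the synthetic storm technique, the measured rain-rate time series is mapped to a spatial rain profile using an assumed storm advection speed, and the specific attenuation, a power law gamma = k * R^alpha in the rain rate R, is integrated along the link path. The sketch below follows that recipe with assumed parameters; the advection speed, path length, and power-law coefficients are illustrative stand-ins, not the paper's values.

```python
import numpy as np

# Measured rain-rate time series R(t) in mm/h, sampled every 10 s (illustrative values).
dt = 10.0                                   # sampling interval, s
rain_rate = np.array([0., 2., 10., 35., 60., 45., 20., 8., 3., 0.])

v = 10.0                                    # assumed storm advection speed, m/s
path_len = 600.0                            # link path length through rain, m
k, alpha = 0.05, 1.1                        # illustrative power-law coefficients
                                            # (frequency dependent; cf. ITU-R P.838)

# Frozen-storm hypothesis: each time sample corresponds to a rain cell of length
# v*dt advecting along the path, so the path spans n_cells consecutive samples.
cell_len = v * dt                           # m
n_cells = int(round(path_len / cell_len))   # rain cells covering the path

# Specific attenuation per cell, gamma = k * R^alpha (dB/km), integrated over the
# cells currently occupying the path: a sliding sum implemented as a convolution.
gamma = k * rain_rate ** alpha
atten_db = np.convolve(gamma, np.ones(n_cells))[:rain_rate.size] * (cell_len / 1000.0)

for t_s, a in zip(np.arange(rain_rate.size) * dt, atten_db):
    print(f"t = {t_s:4.0f} s   A = {a:5.2f} dB")
```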

  16. Practical applications of activation analysis and other nuclear techniques

    SciTech Connect

    Lyon, W S

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor and removed, and, in the nondestructive technique, the induced radioactivity is measured. This measurement of gamma rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, its advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced X-ray emission and synchrotron-produced X-ray fluorescence, are also briefly discussed.
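
    The quantitative basis of NAA is the activation equation: the induced activity is A = N * sigma * phi * (1 - e^(-lambda * t_irr)) * e^(-lambda * t_decay), where N is the number of target atoms, sigma the capture cross section, phi the neutron flux, and lambda the decay constant of the product nuclide. A small sketch with illustrative values (the nuclide parameters below are stand-ins, not vetted nuclear data):

```python
import numpy as np

N_A = 6.022e23          # Avogadro's number, atoms/mol

# Illustrative target: 1 mg of an element, single isotope, molar mass 55 g/mol.
mass_g = 1e-3
molar_mass = 55.0
n_atoms = mass_g / molar_mass * N_A

sigma = 2.0e-24         # capture cross section, cm^2 (2 barns, illustrative)
flux = 1.0e13           # reactor neutron flux, n / (cm^2 s)
half_life = 2.6 * 3600  # product half-life, s (illustrative)
lam = np.log(2) / half_life

t_irr = 1800.0          # irradiation time, s
t_dec = 600.0           # decay (cooling) time before counting, s

# Activation equation: saturation buildup during irradiation, then free decay.
activity_bq = n_atoms * sigma * flux * (1 - np.exp(-lam * t_irr)) * np.exp(-lam * t_dec)
print(f"induced activity after cooling: {activity_bq:.3e} Bq")
```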

  17. Bispectrum-based feature extraction technique for devising a practical brain-computer interface

    NASA Astrophysics Data System (ADS)

    Shahid, Shahjahan; Prasad, Girijesh

    2011-04-01

    The extraction of distinctly separable features from the electroencephalogram (EEG) is one of the main challenges in designing a brain-computer interface (BCI). Existing feature extraction techniques for a BCI are mostly developed based on traditional signal processing techniques, assuming that the signal is Gaussian and has linear characteristics. But motor imagery (MI)-related EEG signals are highly non-Gaussian and non-stationary and have nonlinear dynamic characteristics. This paper proposes an advanced, robust but simple feature extraction technique for an MI-related BCI. The technique uses one of the higher-order statistics methods, the bispectrum, and extracts features of the nonlinear interactions over several frequency components in MI-related EEG signals. Along with a linear discriminant analysis classifier, the proposed technique has been used to design an MI-based BCI. Three performance measures, classification accuracy, mutual information, and Cohen's kappa, have been evaluated and compared with a BCI using a contemporary power spectral density-based feature extraction technique. It is observed that the proposed technique extracts distinct, nearly recording-session-independent features, resulting in significantly higher and more consistent MI task detection accuracy and Cohen's kappa. It is therefore concluded that bispectrum-based feature extraction is a promising technique for detecting different brain states.
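
    For reference, the bispectrum at a frequency pair (f1, f2) is the third-order spectrum B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)], which is large only where phase coupling ties the three frequencies together, exactly the nonlinear interaction the paper exploits. The sketch below is a generic direct estimator over signal segments, not the paper's feature pipeline; the test tones and their coupled phases are invented.

```python
import numpy as np

def bispectrum(x, seg_len=256):
    """Direct bispectrum estimate: average X(f1) X(f2) conj(X(f1+f2)) over segments."""
    n_segs = len(x) // seg_len
    nf = seg_len // 4                       # keep f1 + f2 well below Nyquist
    acc = np.zeros((nf, nf), dtype=complex)
    win = np.hanning(seg_len)
    for s in range(n_segs):
        seg = x[s * seg_len:(s + 1) * seg_len]
        X = np.fft.fft((seg - seg.mean()) * win)
        for f1 in range(nf):
            for f2 in range(nf):
                acc[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return acc / n_segs

# Test signal with quadratic phase coupling: tones at 12 Hz and 20 Hz plus a tone
# at their sum (32 Hz) whose phase is the sum of their phases (0.7 + 1.3 = 2.0).
fs = 256.0
t = np.arange(8192) / fs
rng = np.random.default_rng(3)
x = (np.cos(2 * np.pi * 12 * t + 0.7) + np.cos(2 * np.pi * 20 * t + 1.3)
     + np.cos(2 * np.pi * 32 * t + 2.0) + 0.5 * rng.standard_normal(t.size))

B = np.abs(bispectrum(x))
f1, f2 = np.unravel_index(np.argmax(B), B.shape)
print(f"largest bispectral peak at ({f1} Hz, {f2} Hz)")  # expect the coupled pair (12, 20)
```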

  18. Comparison of laser-based rapid prototyping techniques

    NASA Astrophysics Data System (ADS)

    Humphreys, Hugh; Wimpenny, David

    2002-04-01

    A diverse range of rapid prototyping, or layer manufacturing, techniques have evolved since the introduction of the first process in the late 1980s. Many, although not all, rapid prototyping processes rely on lasers to provide a localised and controllable source of light to cure a liquid photopolymer, or of heat to fuse thermoplastic powders, in order to form objects. This paper provides an overview of laser-based rapid prototyping methods and discusses the future direction of this technology in light of the threats posed by low-cost 3D printing techniques and the opportunity for the direct manufacture of metal components.

  19. New physical techniques for IC functional analysis of on-chip devices and interconnects

    NASA Astrophysics Data System (ADS)

    Boit, Christian

    2005-09-01

    Localization of functional fails in ICs makes use of physical interactions that the devices produce under electrical operation. The focus is on electroluminescence (keyword: photon emission) and on signal responses to stimulation by scanned beams of laser light or particles. In modern chip technologies, access to this information is available only through the chip backside. This paradigm shift requires a full revision of chip analysis techniques and processes, and it has kicked off a rush of development in new methodologies. Here, an overview is given of which parameters are crucial for successful analysis techniques of the future, and of how photon emission, laser-based techniques, and new preparation techniques based on focused ion beam (FIB) open the path in this direction.