Science.gov

Sample records for analysis technique based

  1. New Flutter Analysis Technique for CFD-based Unsteady Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Jutte, Christine V.

    2009-01-01

    This paper presents a flutter analysis technique for the transonic flight regime. The technique uses an iterative approach to determine the critical dynamic pressure for a given Mach number. Unlike other CFD-based flutter analysis methods, each iteration solves for the critical dynamic pressure and uses this value in subsequent iterations until the value converges. This process reduces the iterations required to determine the critical dynamic pressure. To improve the accuracy of the analysis, the technique employs a known structural model, leaving only the aerodynamic model as the unknown. The aerodynamic model is estimated using unsteady aeroelastic CFD analysis combined with a parameter estimation routine. The technique executes as follows. The known structural model is represented as a finite element model. Modal analysis determines the frequencies and mode shapes for the structural model. At a given Mach number and dynamic pressure, the unsteady CFD analysis is performed. The output time history of the surface pressure is converted to a nodal aerodynamic force vector. The forces are then normalized by the given dynamic pressure. The multi-input multi-output parameter estimation software ERA estimates the aerodynamic model from the time histories of nodal aerodynamic forces and structural deformations. The critical dynamic pressure is then calculated using the known structural model and the estimated aerodynamic model. This output is used as the dynamic pressure in subsequent iterations until the critical dynamic pressure is determined. This technique is demonstrated on the Aerostructures Test Wing-2 model at NASA's Dryden Flight Research Center.
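
    A minimal sketch of the iterative search described above, with the CFD solve, the ERA-style parameter estimation, and the aeroelastic flutter solve passed in as placeholder callables (all hypothetical names, not the authors' software):

```python
# Sketch only: the heavy steps (unsteady CFD, ERA estimation, flutter solve)
# are supplied by the caller as functions; this shows the fixed-point
# iteration on the critical dynamic pressure at a fixed Mach number.
def find_critical_dynamic_pressure(mach, q_initial, run_unsteady_cfd,
                                   estimate_aero_model, solve_flutter_pressure,
                                   tol=1e-3, max_iter=20):
    q = q_initial
    for _ in range(max_iter):
        # Unsteady CFD at the current dynamic pressure; the nodal forces are
        # normalized by q so the estimated aerodynamic model is q-independent.
        forces, deformations = run_unsteady_cfd(mach, q)
        aero_model = estimate_aero_model(forces / q, deformations)
        q_new = solve_flutter_pressure(aero_model)   # uses the known structural model
        if abs(q_new - q) / abs(q_new) < tol:        # converged
            return q_new
        q = q_new                                    # reuse as the next iterate
    raise RuntimeError("critical dynamic pressure did not converge")
```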

  2. Computer Based Economic Analysis Techniques to Support Functional Economic Analysis

    DTIC Science & Technology

    1993-09-01

    is one of the most frequently used tools to uncover and explore profit potential. B. CALCULATION OF BREAK EVEN ANALYSIS. Haga and Lang (1992) state... BENEFITS. For benefits that are quantifiable, Haga and Lang (1992) express BCR in the following notation: BCR = QOM / UAC (Equation 9-1), where QOM is a... emulation. In addition to the software requirements, FEAM has the following hardware criteria: a mouse, 2 MB of RAM, 20 MB of hard disk space, and an EGA

  3. Image analysis techniques associated with automatic data base generation.

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  4. GC-Based Techniques for Breath Analysis: Current Status, Challenges, and Prospects.

    PubMed

    Xu, Mingjun; Tang, Zhentao; Duan, Yixiang; Liu, Yong

    2016-07-03

    Breath analysis is a noninvasive diagnostic method that profiles a person's physical state by volatile organic compounds in the breath. It has huge potential in the field of disease diagnosis. In order to offer opportunities for practical applications, various GC-based techniques have been investigated for on-line breath analysis since GC is the most preferred technique for mixed gas separation. This article reviews the development of breath analysis and GC-based techniques in basic breath research, involving sampling methods, preconcentration methods, conventional GC-based techniques, and newly developed GC techniques for breath analysis. The combination of GC and newly developed detection techniques takes advantages of the virtues of each. In addition, portable GC or micro GC are poised to become field GC-based techniques in breath analysis. Challenges faced in GC-based techniques for breath analysis are discussed candidly. Effective cooperation of experts from different fields is urgent to promote the development of breath analysis.

  5. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture analysis based method for diagnosing Basal Cell Carcinoma (BCC) skin cancer using optical images taken from suspicious skin regions. We first extracted the Run Length Matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a Multi-Layer Perceptron (MLP) classifier to classify the images as BCC or normal cases. Experiments showed that detecting BCC cancer based on optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.
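
    The following sketch is illustrative only: it computes Haralick-style GLCM texture features with scikit-image and trains an MLP classifier with scikit-learn, standing in for the paper's Run Length Matrix plus Haralick feature set and its feature selection step; function and variable names are assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # "greycomatrix" in skimage < 0.19
from sklearn.neural_network import MLPClassifier

def glcm_features(gray_patch):
    """Contrast/homogeneity/energy/correlation of an 8-bit patch, averaged over 4 directions."""
    glcm = graycomatrix(gray_patch, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).mean() for p in props])

def train_bcc_classifier(patches, labels):
    """patches: list of 8-bit grayscale image patches; labels: 0 = normal, 1 = BCC."""
    X = np.vstack([glcm_features(p) for p in patches])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    return clf.fit(X, labels)
```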

  6. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    NASA Astrophysics Data System (ADS)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent approaches to drift measurement relies on instrument-based measurements, e.g. using an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift on the received signal, where the main cause of Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even over Europe, is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to an ionosonde, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed by using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real electron density profile is obtained on which ray tracing can be performed. These profiles can be constructed periodically, with a period as short as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of interest. We test our technique by comparing the results to the drift measurements taken with the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.

  7. Analysis of High Contrast Imaging Techniques for Space Based Direct Planetary Imaging

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Gezari, Dan Y.; Nisenson, P.; Fisher, Richard R. (Technical Monitor)

    2001-01-01

    We report on our ongoing investigations of a number of techniques for direct detection and imaging of Earth-like planets around nearby stellar sources. Herein, we give a quantitative analysis of these techniques and compare and contrast them via computer simulations. The techniques we will be reporting on are Bracewell Interferometry, Nisenson Apodized Square Aperture, and Coronagraphic masking techniques. We parameterize our results with respect to wavelength, aperture size, effects of mirror speckle (both mid- and high-spatial frequency), detector and photon noise, as well as pointing error. The numerous recent detections of Jupiter- and Saturn-like planets have driven a resurgence in research on space based high contrast imaging techniques for direct planetary imaging. Work is currently ongoing for concepts for NASA's Terrestrial Planet Finder mission and a number of study teams have been funded. The authors are members of one team.

  8. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-10-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  9. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.

  10. Robust and discriminating method for face recognition based on correlation technique and independent component analysis model.

    PubMed

    Alfalou, A; Brosseau, C

    2011-03-01

    We demonstrate a novel technique for face recognition. Our approach relies on the performances of a strongly discriminating optical correlation method along with the robustness of the independent component analysis (ICA) model. Simulations were performed to illustrate how this algorithm can identify a face with images from the Pointing Head Pose Image Database. While maintaining algorithmic simplicity, this approach based on ICA representation significantly increases the true recognition rate compared to that obtained using our previously developed all-numerical ICA identity recognition method and another method based on optical correlation and a standard composite filter.

  11. Advanced SuperDARN meteor wind observations based on raw time series analysis technique

    NASA Astrophysics Data System (ADS)

    Tsutsumi, M.; Yukimatu, A. S.; Holdsworth, D. A.; Lester, M.

    2009-04-01

    The meteor observation technique based on SuperDARN raw time series analysis has been upgraded. This technique extracts meteor information as a byproduct and does not degrade the quality of normal SuperDARN operations. In the upgrade, the radar operating system (RADOPS) has been modified so that it can oversample every 15 km during the normal operations, which have a range resolution of 45 km. As an alternative method for better range determination, a frequency domain interferometry (FDI) capability was also coded in RADOPS, where the operating radio frequency can be changed every pulse sequence. Test observations were conducted using the CUTLASS Iceland East and Finland radars, where oversampling and FDI operation (two frequencies separated by 3 kHz) were simultaneously carried out. Meteor ranges obtained with both ranging techniques agreed very well. The ranges were then combined with the interferometer data to estimate meteor echo reflection heights. Although there were still some ambiguities in the arrival angles of echoes because of the rather long antenna spacing of the interferometers, the heights and arrival angles of most meteor echoes were determined more accurately than before. Wind velocities were successfully estimated over the height range of 84 to 110 km. The FDI technique developed here can be further applied to the common SuperDARN operation, and studies of fine horizontal structures of F-region plasma irregularities are expected in the future.

  12. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    NASA Astrophysics Data System (ADS)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served for centuries as an important primary approach to diagnosing cardiovascular diseases (CVDs). Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet, most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences at the heart valves. Applying the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five types of abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
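
    A minimal sketch (not the paper's exact pipeline, and the parameter choices are assumptions): wavelet-based smoothing with PyWavelets followed by a normalized Shannon energy envelope, which is one common way to expose murmur morphology in a heart-sound record.

```python
import numpy as np
import pywt

def shannon_envelope(hs, wavelet="db6", level=4, win=256):
    """Smoothed Shannon energy envelope of a 1-D heart-sound signal."""
    # Crude denoising: reconstruct from the approximation coefficients only.
    coeffs = pywt.wavedec(np.asarray(hs, dtype=float), wavelet, level=level)
    coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]
    rec = pywt.waverec(coeffs, wavelet)[: len(hs)]
    x = rec / (np.max(np.abs(rec)) + 1e-12)            # normalize to [-1, 1]
    se = -x ** 2 * np.log(x ** 2 + 1e-12)              # Shannon energy
    return np.convolve(se, np.ones(win) / win, mode="same")  # moving-average smoothing
```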

  13. Applications of synchrotron-based micro-imaging techniques for the analysis of Cultural Heritage materials

    SciTech Connect

    Cotte, Marine; Chilida, Javier; Walter, Philippe; Taniguchi, Yoko; Susini, Jean

    2009-01-29

    The analysis of Cultural Heritage objects is often technically challenging. When analyzing micro-fragments, the amount of matter is usually very small, hence requiring sensitive techniques. These samples, in particular painting fragments, may present multi-layered structures with layer thicknesses of ~10 μm. This favors micro-imaging techniques with good lateral resolution (about one micrometer) that allow each layer to be studied separately. Besides, the samples are usually very complex in terms of chemistry, as they are made of mineral and organic matter, amorphous and crystallized phases, and major and minor elements. Accordingly, a multi-modal approach is generally essential to resolve the chemical complexity of such hybrid materials. Different examples will be given to illustrate the various possibilities of synchrotron-based micro-imaging techniques, such as micro X-ray diffraction, micro X-ray fluorescence, micro X-ray absorption spectroscopy and micro FTIR spectroscopy. The focus will be on paintings, but the whole range of museum objects (from soft matter like paper or wood to hard matter like metal and glass) will also be considered.

  14. Dynamic programming based time-delay estimation technique for analysis of time-varying time-delay

    SciTech Connect

    Gupta, Deepak K.; McKee, George R.; Fonck, Raymond J.

    2010-01-15

    A new time-delay estimation (TDE) technique based on dynamic programming is developed to measure the time-varying time-delay between two signals. The dynamic programming based TDE technique provides a frequency response five to ten times better than previously known TDE techniques, namely, those based on time-lag cross-correlation or wavelet analysis. The effects of the frequency spectrum, signal-to-noise ratio, and amplitude of the time-delay on the response of the TDE technique (represented as a transfer function) are studied using simulated data signals. The transfer function for the technique decreases with increasing noise in the signal; however, it is independent of the shape of the signal spectrum. The dynamic programming based TDE technique is applied to beam emission spectroscopy diagnostic data to measure poloidal velocity fluctuations, which led to the observation of theoretically predicted zonal flows in high-temperature tokamak plasmas.
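
    The dynamic-programming estimator itself is not reproduced here; the sketch below shows the conventional time-lag cross-correlation baseline that the abstract compares against, where the delay in each short window is the lag maximizing the cross-correlation of the two signals (window and step sizes are arbitrary choices).

```python
import numpy as np

def windowed_xcorr_delay(x, y, win=256, step=128):
    """Time-varying delay (in samples) of x relative to y, one estimate per window."""
    delays = []
    for start in range(0, min(len(x), len(y)) - win + 1, step):
        xs = x[start:start + win] - x[start:start + win].mean()
        ys = y[start:start + win] - y[start:start + win].mean()
        corr = np.correlate(xs, ys, mode="full")
        delays.append(int(np.argmax(corr)) - (win - 1))  # > 0 means x lags y
    return np.array(delays)
```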

  15. Subdivision based isogeometric analysis technique for electric field integral equations for simply connected structures

    NASA Astrophysics Data System (ADS)

    Li, Jie; Dault, Daniel; Liu, Beibei; Tong, Yiying; Shanker, Balasubramaniam

    2016-08-01

    The analysis of electromagnetic scattering has long been performed on a discrete representation of the geometry. This representation is typically continuous but not differentiable. The need to define physical quantities on this geometric representation has led to the development of sets of basis functions that need to satisfy constraints at the boundaries of the elements/tessellations (viz., continuity of normal or tangential components across element boundaries). For electromagnetics, these result in either curl- or div-conforming basis sets. The geometric representation used for analysis is in stark contrast with that used for design, wherein the surface representation is higher order differentiable. Using this representation for both the geometry and the physics on the geometry has several advantages, as elucidated in Hughes et al. (2005) [7]. Until now, the bulk of the literature on isogeometric methods has been limited to solid mechanics, with some effort to create NURBS-based basis functions for electromagnetic analysis. In this paper, we present the first complete isogeometric solution methodology for the electric field integral equation as applied to simply connected structures. This paper proceeds systematically from surface representation using subdivision, through the definition of vector basis functions on this surface, to fidelity in the solution of integral equations. We also present techniques to stabilize the solution at low frequencies and to impose a Calderón preconditioner. Several results presented serve to validate the proposed approach as well as demonstrate some of its capabilities.

  16. Nanostructural defects evidenced in failing silicon-based NMOS capacitors by advanced failure analysis techniques

    NASA Astrophysics Data System (ADS)

    Faivre, Emilie; Llido, Roxane; Putero, Magali; Fares, Lahouari; Muller, Christophe

    2014-04-01

    An experimental methodology compliant with industrial constraints was deployed to uncover the origin of soft breakdown events in large planar silicon-based NMOS capacitors. Complementary advanced failure analysis techniques were employed to localize, isolate and observe structural defects at the nanoscale. After an accurate localization of the failing area by optical beam-induced resistance change (OBIRCH), the focused ion beam (FIB) technique enabled the preparation of thin specimens suitable for transmission electron microscopy (TEM). Characterization of the gate oxide microstructure was performed by high-resolution TEM imaging and energy-filtered spectroscopy. A dedicated experimental protocol relying on iterative FIB thinning and TEM observation improved the quality of electron imaging of defects at the atomic scale. In that way, the gate oxide integrity was evaluated and an electrical stress-induced silicon epitaxy was detected concomitantly with soft breakdown events appearing during constant voltage stress. The growth of silicon hillocks consumes part of the breakdown energy and may prevent a soft breakdown event from evolving towards a hard breakdown that is catastrophic for device functionality.

  17. An Evaluation of Microcomputer-Based Strain Analysis Techniques on Meteoritic Chondrules

    NASA Astrophysics Data System (ADS)

    Hill, H. G. M.

    1995-09-01

    Introduction: Chondrule flattening and distinct foliation are preserved in certain chondrites [1] and have been interpreted, by some, as evidence of shock-induced pressure through hypervelocity impacts on parent bodies [2]. Recently, mean aspect ratios of naturally and artificially shocked chondrules, in the Allende (CV3) chondrite, have been correlated with shock intensity [3] using established shock stage criteria [4]. Clearly, quantification of chondrule deformation and appropriate petrographic criteria can be useful tools for constraining parent body shock history and, possibly, post-shock heating [3]. Here, strain analysis techniques (Rf/phi and Fry), normally employed in structural geology, have been adapted and evaluated [5] for measuring mean chondrule strain and orientation. In addition, the possible use of such strain data for partial shock stage classification is considered. Rf/phi and Fry Analysis: The relationship between displacement and shape changes in rocks is known as strain [6] and assumes that an initial circle with a unit radius is deformed to form an ellipse, the finite strain ellipse (Rf). The strain ratio (Rs) is an expression of the change of shape. The orientation of the strain ellipse (phi) is the angle subtended between the semi-major axes and the direction of a fixed point of reference. Generally, log mean Rf ~ Rs and, therefore, the approximation Rf = Rs is valid. For chondrules, this is reasonable as they were originally molten, or partially-molten, droplets [7]. Fry's 'center-to-center' geological strain analysis technique [8] is based on the principle that the distribution of particle centers in rocks can sometimes be used to determine the state of finite strain (Rf). Experimental Techniques: The Bovedy (L3) chondrite was chosen for investigation as it contains abundant, oriented, elliptical chondrules [5]. Hardware employed consisted of a Macintosh microcomputer and a flat-bed scanner. Chondrule outlines, obtained
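
    A small numerical sketch (not the authors' software): for a digitized chondrule outline given as an (N, 2) array of x-y points, the best-fit ellipse aspect ratio Rf and long-axis orientation phi follow from the eigen-decomposition of the coordinate covariance, and the strain ratio Rs can be approximated by the log (geometric) mean of the individual Rf values.

```python
import numpy as np

def rf_phi(outline_xy):
    """Aspect ratio Rf and long-axis orientation phi (radians) of one outline."""
    cov = np.cov(outline_xy.T)                    # 2x2 covariance of the outline points
    evals, evecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    rf = np.sqrt(evals[1] / evals[0])             # semi-axis ratio
    phi = np.arctan2(evecs[1, 1], evecs[0, 1])    # direction of the major axis
    return rf, phi

def mean_strain(outlines):
    """Approximate Rs as the geometric (log) mean of the individual Rf values."""
    rfs = np.array([rf_phi(o)[0] for o in outlines])
    return np.exp(np.log(rfs).mean())
```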

  18. Digital Instrumentation and Control Failure Events Derivation and Analysis by Frame-Based Technique

    SciTech Connect

    Hui-Wen Huang; Chunkuan Shih; Swu Yih; Yen-Chang Tzeng; Ming-Huei Chen

    2006-07-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure events derivation and analysis for generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical model is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failure of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can then be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, chapter 15 accident analysis of generic SAR, and the reported NPP I and C software failure events. The case study of this research includes (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derivation from the actual happening of non-ABWR digital I and C software failure events, which were reported to LER of USNRC or IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  19. Operational modal analysis via image based technique of very flexible space structures

    NASA Astrophysics Data System (ADS)

    Sabatini, Marco; Gasbarri, Paolo; Palmerini, Giovanni B.; Monti, Riccardo

    2013-08-01

    Vibrations represent one of the most important topics of the engineering design relevant to flexible structures. The importance of this problem increases when a very flexible system is considered, and this is often the case of space structures. In order to identify the modal characteristics, in terms of natural frequencies and relevant modal parameters, ground tests are performed. However, these parameters could vary due to the operative conditions of the system. In order to continuously monitor the modal characteristics during the satellite lifetime, an operational modal analysis is mandatory. This kind of analysis is usually performed by using classical accelerometers or strain gauges and by properly analyzing the acquired output. In this paper a different approach for the vibrations data acquisition will be performed via image-based technique. In order to simulate a flexible satellite, a free flying platform is used; the problem is furthermore complicated by the fact that the overall system, constituted by a highly rigid bus and very flexible panels, must necessarily be modeled as a multibody system. In the experimental campaign, the camera, placed on the bus, will be used to identify the eigenfrequencies of the vibrating structure; in this case aluminum thin plates simulate very flexible solar panels. The structure is excited by a hammer or studied during a fast attitude maneuver. The results of the experimental activity will be investigated and compared with respect to the numerical simulation obtained via a FEM-multibody software and the relevant results will be proposed and discussed.

  20. Polyspectral signal analysis techniques for condition based maintenance of helicopter drive-train system

    NASA Astrophysics Data System (ADS)

    Hassan Mohammed, Mohammed Ahmed

    For an efficient maintenance of a diverse fleet of air- and rotorcraft, effective condition based maintenance (CBM) must be established based on rotating components monitored vibration signals. In this dissertation, we present theory and applications of polyspectral signal processing techniques for condition monitoring of critical components in the AH-64D helicopter tail rotor drive train system. Currently available vibration-monitoring tools are mostly built around auto- and cross-power spectral analysis which have limited performance in detecting frequency correlations higher than second order. Studying higher order correlations and their Fourier transforms, higher order spectra, provides more information about the vibration signals which helps in building more accurate diagnostic models of the mechanical system. Based on higher order spectral analysis, different signal processing techniques are developed to assess health conditions of different critical rotating-components in the AH-64D helicopter drive-train. Based on cross-bispectrum, quadratic nonlinear transfer function is presented to model second order nonlinearity in a drive-shaft running between the two hanger bearings. Then, quadratic-nonlinearity coupling coefficient between frequency harmonics of the rotating shaft is used as condition metric to study different seeded shaft faults compared to baseline case, namely: shaft misalignment, shaft imbalance, and combination of shaft misalignment and imbalance. The proposed quadratic-nonlinearity metric shows better capabilities in distinguishing the four studied shaft settings than the conventional linear coupling based on cross-power spectrum. We also develop a new concept of Quadratic-Nonlinearity Power-Index spectrum, QNLPI(f), that can be used in signal detection and classification, based on bicoherence spectrum. The proposed QNLPI(f) is derived as a projection of the three-dimensional bicoherence spectrum into two-dimensional spectrum that

  1. An efficient technique for nuclei segmentation based on ellipse descriptor analysis and improved seed detection algorithm.

    PubMed

    Xu, Hongming; Lu, Cheng; Mandal, Mrinal

    2014-09-01

    In this paper, we propose an efficient method for segmenting cell nuclei in the skin histopathological images. The proposed technique consists of four modules. First, it separates the nuclei regions from the background with an adaptive threshold technique. Next, an elliptical descriptor is used to detect the isolated nuclei with elliptical shapes. This descriptor classifies the nuclei regions based on two ellipticity parameters. Nuclei clumps and nuclei with irregular shapes are then localized by an improved seed detection technique based on voting in the eroded nuclei regions. Finally, undivided nuclei regions are segmented by a marked watershed algorithm. Experimental results on 114 different image patches indicate that the proposed technique provides a superior performance in nuclei detection and segmentation.
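
    Not the paper's full pipeline (the ellipse descriptor and the voting-based seed detection are omitted); the sketch below shows the same skeleton with scikit-image: adaptive thresholding, distance-transform seeds, then marker-controlled watershed. Parameter values are illustrative and nuclei are assumed darker than the background.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_local
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_nuclei(gray, block_size=51, min_distance=10):
    """Label candidate nuclei in a grayscale histopathology patch."""
    mask = gray < threshold_local(gray, block_size)       # adaptive threshold
    dist = ndi.distance_transform_edt(mask)               # distance to background
    peaks = peak_local_max(dist, min_distance=min_distance, labels=mask)
    seeds = np.zeros(dist.shape, dtype=int)
    seeds[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)  # one marker per peak
    return watershed(-dist, seeds, mask=mask)             # labeled nuclei regions
```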

  2. A neighbourhood analysis based technique for real-time error concealment in H.264 intra pictures

    NASA Astrophysics Data System (ADS)

    Beesley, Steven T. C.; Grecos, Christos; Edirisinghe, Eran

    2007-02-01

    H.264s extensive use of context-based adaptive binary arithmetic or variable length coding makes streams highly susceptible to channel errors, a common occurrence over networks such as those used by mobile devices. Even a single bit error will cause a decoder to discard all stream data up to the next fixed length resynchronisation point, the worst scenario is that an entire slice is lost. In cases where retransmission and forward error concealment are not possible, a decoder should conceal any erroneous data in order to minimise the impact on the viewer. Stream errors can often be spotted early in the decode cycle of a macroblock which if aborted can provide unused processor cycles, these can instead be used to conceal errors at minimal cost, even as part of a real time system. This paper demonstrates a technique that utilises Sobel convolution kernels to quickly analyse the neighbourhood surrounding erroneous macroblocks before performing a weighted multi-directional interpolation. This generates significantly improved statistical (PSNR) and visual (IEEE structural similarity) results when compared to the commonly used weighted pixel value averaging. Furthermore it is also computationally scalable, both during analysis and concealment, achieving maximum performance from the spare processing power available.

  3. Singular value decomposition based feature extraction technique for physiological signal analysis.

    PubMed

    Chang, Cheng-Ding; Wang, Chien-Chih; Jiang, Bernard C

    2012-06-01

    Multiscale entropy (MSE) is one of the popular techniques to calculate and describe the complexity of a physiological signal. Many studies use this approach to detect changes in the physiological conditions in the human body. However, MSE results are easily affected by noise and trends, leading to incorrect estimation of MSE values. In this paper, singular value decomposition (SVD) is adopted to replace MSE for extracting the features of physiological signals, and the support vector machine (SVM) is adopted to classify the different physiological states. A test data set from the PhysioNet website was used, and the classification results showed that using SVD to extract features of the physiological signal could attain a classification accuracy rate of 89.157%, which is higher than that obtained using the MSE value (71.084%). The results show that the proposed analysis procedure is effective and appropriate for distinguishing different physiological states. This promising result could be used as a reference for doctors in the diagnosis of congestive heart failure (CHF) disease.
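
    A hedged sketch of the idea (matrix shape and feature count are assumptions, not the paper's settings): the leading singular values of a matrix whose rows are consecutive segments of each physiological record are used as features and fed to an SVM.

```python
import numpy as np
from sklearn.svm import SVC

def svd_features(signal, rows=20, n_values=10):
    """Leading singular values of a segment matrix built from a 1-D signal."""
    cols = len(signal) // rows
    traj = np.reshape(np.asarray(signal, float)[: rows * cols], (rows, cols))
    s = np.linalg.svd(traj, compute_uv=False)
    return s[:n_values] / s.sum()                 # normalized singular spectrum

def train_svm(signals, labels):
    X = np.vstack([svd_features(s) for s in signals])
    return SVC(kernel="rbf").fit(X, labels)
```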

  4. Application of multivariate data-analysis techniques to biomedical diagnostics based on mid-infrared spectroscopy.

    PubMed

    Wang, Liqun; Mizaikoff, Boris

    2008-07-01

    The objective of this contribution is to review the application of advanced multivariate data-analysis techniques in the field of mid-infrared (MIR) spectroscopic biomedical diagnosis. MIR spectroscopy is a powerful chemical analysis tool for detecting biomedically relevant constituents such as DNA/RNA, proteins, carbohydrates, lipids, etc., and even diseases or disease progression that may induce changes in the chemical composition or structure of biological systems including cells, tissues, and bio-fluids. However, MIR spectra of multiple constituents are usually characterized by strongly overlapping spectral features reflecting the complexity of biological samples. Consequently, MIR spectra of biological samples are frequently difficult to interpret by simple data-analysis techniques. Hence, with increasing complexity of the sample matrix more sophisticated mathematical and statistical data analysis routines are required for deconvoluting spectroscopic data and for providing useful results from information-rich spectroscopic signals. A large body of work relates to the combination of multivariate data-analysis techniques with MIR spectroscopy, and has been applied by a variety of research groups to biomedically relevant areas such as cancer detection and analysis, artery diseases, biomarkers, and other pathologies. The reported results indeed reveal a promising perspective for more widespread application of multivariate data analysis in assisting MIR spectroscopy as a screening or diagnostic tool in biomedical research and clinical studies. While the authors do not mean to ignore any relevant contributions to biomedical analysis across the entire electromagnetic spectrum, they confine the discussion in this contribution to the mid-infrared spectral range as a potentially very useful, yet underutilized frequency region. Selected representative examples without claiming completeness will demonstrate a range of biomedical diagnostic applications with particular

  5. Spatio-temporal analysis of discharge regimes based on hydrograph classification techniques in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Chen, Xiaofei; Bloeschl, Guenter; Blaschke, Alfred Paul; Silasari, Rasmiaditya; Exner-Kittridge, Mike

    2016-04-01

    Stream discharge and groundwater hydrographs integrate the spatial and temporal variability of the small-scale hydrological response. Characterizing the discharge response regime of drained farmland is essential for irrigation strategies and hydrologic modeling. In agricultural basins in particular, diurnal hydrographs from drainage discharges have been investigated to infer drainage processes across varying magnitudes. To explore the variability of discharge responses, we developed an objective method to characterize and classify discharge hydrographs based on magnitude and time-series features. Cluster analysis (hierarchical k-means) and principal component analysis are applied to discharge time series and groundwater level hydrographs to analyze their event characteristics, tested on 8 different discharge and 18 groundwater level hydrographs. Reflecting the variability of rainfall activity, system location, discharge regime and pre-event soil moisture in the catchment, three main clusters of discharge hydrographs are identified. The results show that: (1) the hydrographs from these drainage discharges had similar shapes but different magnitudes for individual rainstorms, and the same similarity is seen in the overland flow discharge and the spring system; (2) within each cluster the shapes remain similar, but the rising slopes differ owing to different antecedent wetness conditions and rainfall accumulation, while the differences in regression slope can be explained by system location and discharge area; and (3) surface water is closely proportional to soil moisture throughout the year, whereas the outflow of tile drainage systems is directly related to soil moisture and inversely related to groundwater levels only after soil moisture exceeds a certain threshold. Finally, we discuss the potential application of hydrograph classification in a wider range of
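
    A compact illustration of the classification step using scikit-learn: event hydrographs are standardized, reduced with PCA, and clustered with k-means (the study uses hierarchical k-means); the number of components and clusters are assumptions.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def cluster_hydrographs(event_matrix, n_components=3, n_clusters=3):
    """event_matrix: one row per event hydrograph (equal-length discharge series)."""
    X = StandardScaler().fit_transform(np.asarray(event_matrix, float))
    scores = PCA(n_components=n_components).fit_transform(X)
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(scores)
```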

  6. Aerodynamic measurement techniques. [laser based diagnostic techniques

    NASA Technical Reports Server (NTRS)

    Hunter, W. W., Jr.

    1976-01-01

    Laser characteristics of intensity, monochromaticity, spatial coherence, and temporal coherence were developed to advance laser-based diagnostic techniques for aerodynamics-related research. Two broad categories, visualization and optical measurements, were considered, and three techniques received significant attention: holography, laser velocimetry, and Raman scattering. Examples of quantitative laser velocimeter and Raman scattering measurements of velocity, temperature, and density indicated the potential of these nonintrusive techniques.

  7. Wilcoxon signed-rank-based technique for the pulse-shape analysis of HPGe detectors

    NASA Astrophysics Data System (ADS)

    Martín, S.; Quintana, B.; Barrientos, D.

    2016-07-01

    The characterization of the electric response of segmented-contact high-purity germanium detectors requires scanning systems capable of accurately associating each pulse with the position of the interaction that generated it. This process requires an algorithm sensitive to changes above the electronic noise in the pulse shapes produced at different positions, depending on the resolution of the Ge crystal. In this work, a pulse-shape comparison technique based on the Wilcoxon signed-rank test has been developed. It provides a method to distinguish pulses coming from different interaction points in the germanium crystal. Therefore, this technique is a necessary step for building a reliable pulse-shape database that can be used later for the determination of the position of interaction for γ-ray tracking spectrometry devices such as AGATA, GRETA or GERDA. The method was validated by comparison with a χ2 test using simulated and experimental pulses corresponding to a Broad Energy germanium detector (BEGe).
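
    A minimal sketch of the comparison step only (the scanning and database-building logic is omitted): SciPy's Wilcoxon signed-rank test is applied to the paired samples of two aligned, normalized pulses, and a small p-value is taken to mean the pulses come from different interaction positions. The threshold is an arbitrary choice.

```python
import numpy as np
from scipy.stats import wilcoxon

def pulses_differ(pulse_a, pulse_b, alpha=0.01):
    """True if two equal-length, time-aligned pulses differ beyond noise."""
    a = np.asarray(pulse_a, float) / np.max(np.abs(pulse_a))
    b = np.asarray(pulse_b, float) / np.max(np.abs(pulse_b))
    # Note: wilcoxon() raises an error if the pulses are exactly identical
    # (all paired differences zero).
    _, p_value = wilcoxon(a, b)
    return p_value < alpha
```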

  8. Complexity analysis of sleep and alterations with insomnia based on non-invasive techniques

    PubMed Central

    Holloway, Philip M.; Angelova, Maia; Lombardo, Sara; St Clair Gibson, Alan; Lee, David; Ellis, Jason

    2014-01-01

    For the first time, fractal analysis techniques are implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia with comparisons made against healthy subjects. Analysis was carried out for 21 healthy individuals with no diagnosed sleep disorders and 26 subjects diagnosed with acute insomnia during night-time hours. Detrended fluctuation analysis was applied in order to look for 1/f-fluctuations indicative of high complexity. The aim is to investigate whether complexity analysis can differentiate between people who sleep normally and people who suffer from acute insomnia. We hypothesize that the complexity will be higher in subjects who suffer from acute insomnia owing to increased night-time arousals. This hypothesis, although contrary to much of the literature surrounding complexity in physiology, was found to be correct—for our study. The complexity results for nearly all of the subjects fell within a 1/f-range, indicating the presence of underlying control mechanisms. The subjects with acute insomnia displayed significantly higher correlations, confirmed by significance testing—possibly a result of too much activity in the underlying regulatory systems. Moreover, we found a linear relationship between complexity and variability, both of which increased with the onset of insomnia. Complexity analysis is very promising and could prove to be a useful non-invasive identifier for people who suffer from sleep disorders such as insomnia. PMID:24501273
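
    A compact detrended fluctuation analysis (DFA) sketch for an actigraphy count series; a scaling exponent near 1 corresponds to the 1/f-type correlations discussed above. Box sizes and the detrending order are illustrative choices, not the study's exact settings.

```python
import numpy as np

def dfa_alpha(x, order=1):
    """DFA scaling exponent of a 1-D series (alpha ~ 1 for 1/f fluctuations)."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                          # integrated profile
    scales = np.unique(np.logspace(np.log10(16), np.log10(len(x) // 4), 20).astype(int))
    flucts = []
    for s in scales:
        rms = []
        for i in range(len(y) // s):                     # non-overlapping boxes
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)   # local polynomial trend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]    # log-log slope = alpha
```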

  9. Complexity analysis of sleep and alterations with insomnia based on non-invasive techniques.

    PubMed

    Holloway, Philip M; Angelova, Maia; Lombardo, Sara; St Clair Gibson, Alan; Lee, David; Ellis, Jason

    2014-04-06

    For the first time, fractal analysis techniques are implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia with comparisons made against healthy subjects. Analysis was carried out for 21 healthy individuals with no diagnosed sleep disorders and 26 subjects diagnosed with acute insomnia during night-time hours. Detrended fluctuation analysis was applied in order to look for 1/f-fluctuations indicative of high complexity. The aim is to investigate whether complexity analysis can differentiate between people who sleep normally and people who suffer from acute insomnia. We hypothesize that the complexity will be higher in subjects who suffer from acute insomnia owing to increased night-time arousals. This hypothesis, although contrary to much of the literature surrounding complexity in physiology, was found to be correct-for our study. The complexity results for nearly all of the subjects fell within a 1/f-range, indicating the presence of underlying control mechanisms. The subjects with acute insomnia displayed significantly higher correlations, confirmed by significance testing-possibly a result of too much activity in the underlying regulatory systems. Moreover, we found a linear relationship between complexity and variability, both of which increased with the onset of insomnia. Complexity analysis is very promising and could prove to be a useful non-invasive identifier for people who suffer from sleep disorders such as insomnia.

  10. Analysis to feature-based video stabilization/registration techniques within application of traffic data collection

    NASA Astrophysics Data System (ADS)

    Sadat, Mojtaba T.; Viti, Francesco

    2015-02-01

    Machine vision is rapidly gaining popularity in the field of Intelligent Transportation Systems. In particular, advantages are foreseen by the exploitation of Aerial Vehicles (AV) in delivering a superior view of traffic phenomena. However, vibration on AVs makes it difficult to extract moving objects on the ground. To partly overcome this issue, image stabilization/registration procedures are adopted to correct and stitch multiple frames taken of the same scene but from different positions, angles, or sensors. In this study, we examine the impact of multiple feature-based techniques for stabilization, and we show that the SURF detector outperforms the others in terms of time efficiency and output similarity.
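
    The study finds SURF the best of the detectors it compares; SURF sits in OpenCV's non-free contrib module, so the sketch below swaps in ORB purely to show the same registration skeleton (detect features, match, estimate a RANSAC homography, warp the frame onto the reference).

```python
import cv2
import numpy as np

def stabilize_to_reference(ref_gray, frame_gray):
    """Warp frame_gray onto ref_gray using feature matches and a RANSAC homography."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(ref_gray, None)
    kp2, des2 = orb.detectAndCompute(frame_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    matches = sorted(matches, key=lambda m: m.distance)[:200]   # keep the best matches
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)        # frame -> reference
    h, w = ref_gray.shape
    return cv2.warpPerspective(frame_gray, H, (w, h))
```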

  11. Analysis of meteorological variables in the Australasian region using ground- and space-based GPS techniques

    NASA Astrophysics Data System (ADS)

    Kuleshov, Yuriy; Choy, Suelynn; Fu, Erjiang Frank; Chane-Ming, Fabrice; Liou, Yuei-An; Pavelyev, Alexander G.

    2016-07-01

    Results of analysis of meteorological variables (temperature and moisture) in the Australasian region using global positioning system (GPS) radio occultation (RO) and GPS ground-based observations, verified with in situ radiosonde (RS) data, are presented. The potential of using ground-based GPS observations for retrieving column-integrated precipitable water vapour (PWV) over the Australian continent has been demonstrated using the Australian ground-based GPS reference station network. Using data from the 15 ground-based GPS stations, the state of the atmosphere over Victoria during a significant weather event, the March 2010 Melbourne storm, has been investigated, and it has been shown that the GPS observations have potential for monitoring the movement of a weather front that has a sharp moisture contrast. Temperature and moisture variability in the atmosphere over various climatic regions (the Indian and Pacific Oceans, the Antarctic and Australia) has been examined using satellite-based GPS RO and in situ RS observations. Investigating recent atmospheric temperature trends over Antarctica, the time series of the collocated GPS RO and RS data were examined, and strong cooling in the lower stratosphere and warming through the troposphere over Antarctica have been identified, in agreement with outputs of climate models. With further expansion of the Global Navigation Satellite Systems (GNSS), it is expected that GNSS satellite- and ground-based measurements will be able to provide an order of magnitude more data, which in turn could significantly advance weather forecasting services, climate monitoring and analysis in the Australasian region.

  12. A discrete wavelet based feature extraction and hybrid classification technique for microarray data analysis.

    PubMed

    Bennet, Jaison; Ganaprakasam, Chilambuchelvan Arul; Arputharaj, Kannan

    2014-01-01

    In the past, cancer classification by doctors and radiologists was based on morphological and clinical features and had limited diagnostic ability. The recent arrival of DNA microarray technology has enabled the concurrent monitoring of thousands of gene expressions on a single chip, which has stimulated progress in cancer classification. In this paper, we propose a hybrid approach for microarray data classification based on k-nearest neighbor (KNN), naive Bayes, and support vector machine (SVM) classifiers. Feature selection prior to classification plays a vital role, and a feature selection technique which combines the discrete wavelet transform (DWT) and a moving window technique (MWT) is used. The performance of the proposed method is compared with that of the conventional classifiers, namely support vector machine, nearest neighbor, and naive Bayes. Experiments have been conducted on both real and benchmark datasets, and the results indicate that the ensemble approach produces higher classification accuracy than the conventional classifiers. This work serves as an automated system for cancer classification that can be applied by doctors in real cases, a boon to the medical community, and it further reduces the misclassification of cancers, which must be avoided in cancer detection.
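
    A hedged sketch of the hybrid idea using PyWavelets and scikit-learn: discrete-wavelet subband energies as features and a majority vote over KNN, naive Bayes and SVM. The moving-window feature selection of the paper is not reproduced, and all parameters are illustrative.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier

def dwt_energy_features(profile, wavelet="db4", level=3):
    """Per-subband energies of one expression profile (1-D array)."""
    coeffs = pywt.wavedec(np.asarray(profile, float), wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def train_hybrid(profiles, labels):
    X = np.vstack([dwt_energy_features(p) for p in profiles])
    ensemble = VotingClassifier([
        ("knn", KNeighborsClassifier(n_neighbors=3)),
        ("nb", GaussianNB()),
        ("svm", SVC(kernel="linear")),
    ], voting="hard")
    return ensemble.fit(X, labels)
```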

  13. Quantitative Techniques in Volumetric Analysis

    NASA Astrophysics Data System (ADS)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray. A robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  14. Sensitivity analysis of laboratory based mine overburden analytical techniques for the prediction of acidic mine drainage. Final report

    SciTech Connect

    Bradham, W.S.; Caruccio, F.T.

    1995-09-01

    A three-part sensitivity analysis was conducted to evaluate commonly used mine overburden analytical techniques. The primary objectives of the study were to: identify and evaluate the effects of variability in mine overburden geochemistry, as measured by pyrite weight percent and neutralization potential (NP), on the variability of contaminant production; determine which acid/base accounting interpretation technique best predicts both qualitative and quantitative leachate quality in laboratory analytical testing; and identify the predominant factors of weathering cells, soxhlet extraction, and column leaching tests, and evaluate the variability of contaminant production due to variations in storage conditions, leachant temperature, particle size, particle sorting efficiency, and leaching interval.

  15. Feature analysis of pathological speech signals using local discriminant bases technique.

    PubMed

    Umapathy, K; Krishnan, S

    2005-07-01

    Speech is an integral part of the human communication system. Various pathological conditions affect the vocal functions, inducing speech disorders. Acoustic parameters of speech are commonly used for the assessment of speech disorders and for monitoring the progress of the patient over the course of therapy. In the last two decades, signal-processing techniques have been successfully applied in screening speech disorders. In the paper, a novel approach is proposed to classify pathological speech signals using a local discriminant bases (LDB) algorithm and wavelet packet decompositions. The focus of the paper was to demonstrate the significance of identifying the signal subspaces that contribute to the discriminatory characteristics of normal and pathological speech signals in a computationally efficient way. Features were extracted from target subspaces for classification, and time-frequency decomposition was used to eliminate the need for segmentation of the speech signals. The technique was tested with a database of 212 speech signals (51 normal and 161 pathological) using the Daubechies wavelet (db4). Classification accuracies up to 96% were achieved for a two-group classification as normal and pathological speech signals, and 74% was achieved for a four-group classification as male normal, female normal, male pathological and female pathological signals.

  16. A Block-matching based technique for the analysis of 2D gel images.

    PubMed

    Freire, Ana; Seoane, José A; Rodríguez, Alvaro; Ruiz-Romero, Cristina; López-Campos, Guillermo; Dorado, Julián

    2010-01-01

    Research at the protein level is a useful practice in personalized medicine. More specifically, 2D gel images obtained after the electrophoresis process can lead to an accurate diagnosis. Several computational approaches try to help clinicians establish the correspondence between pairs of proteins across multiple 2D gel images. Most of them align a patient image to a reference image. In this work, an approach based on block-matching techniques is developed. Its main characteristic is that, by considering each protein separately, it does not need to perform a whole-image alignment between two images. A comparison with other published methods is presented. It can be concluded that this method works over a broad range of proteomic images, even those with a high level of difficulty.
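
    A small sketch of the block-matching idea (not the published tool): a block around a detected spot in the patient gel is matched against a search window in the reference gel by normalized cross-correlation, giving that spot's local displacement without aligning the whole image. It assumes the block lies well inside both images; sizes are illustrative.

```python
import numpy as np
from skimage.feature import match_template

def match_block(reference_gel, patient_gel, center, block=41, search=60):
    """Return the (row, col) displacement of one block around `center`."""
    r, c = center
    h = block // 2
    template = patient_gel[r - h:r + h + 1, c - h:c + h + 1]
    window = reference_gel[r - h - search:r + h + search + 1,
                           c - h - search:c + h + search + 1]
    score = match_template(window, template)            # normalized cross-correlation
    dr, dc = np.unravel_index(np.argmax(score), score.shape)
    return dr - search, dc - search                     # displacement relative to center
```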

  17. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
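
    A minimal sketch of the classification stage only: a distance-weighted k-nearest-neighbour classifier from scikit-learn applied to pre-computed texture feature vectors (the 45 statistical and spatial-frequency features themselves are not reproduced here, and k is an assumption).

```python
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def train_cloud_classifier(features, cloud_labels, k=5):
    """features: (n_images, n_texture_features); cloud_labels: observer-assigned classes."""
    clf = KNeighborsClassifier(n_neighbors=k, weights="distance")  # weighted KNN
    print("cross-validated accuracy:",
          cross_val_score(clf, features, cloud_labels, cv=5).mean())
    return clf.fit(features, cloud_labels)
```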

  18. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications in industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the data obtained, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed and potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capturing process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes two to four machine vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of the 3D measurements. A set of algorithms, both for detecting, identifying and tracking similar targets and for marker-less object motion capture, has been developed and tested. The evaluation results show high robustness and reliability for various motion analysis tasks in technical and biomechanical applications.

  19. Numerical analysis of radiation propagation in innovative volumetric receivers based on selective laser melting techniques

    NASA Astrophysics Data System (ADS)

    Alberti, Fabrizio; Santiago, Sergio; Roccabruna, Mattia; Luque, Salvador; Gonzalez-Aguilar, Jose; Crema, Luigi; Romero, Manuel

    2016-05-01

    Volumetric absorbers constitute one of the key elements in order to achieve high thermal conversion efficiencies in concentrating solar power plants. Regardless of the working fluid or thermodynamic cycle employed, design trends towards higher absorber output temperatures are widespread, which lead to the general need of components of high solar absorptance, high conduction within the receiver material, high internal convection, low radiative and convective heat losses and high mechanical durability. In this context, the use of advanced manufacturing techniques, such as selective laser melting, has allowed for the fabrication of intricate geometries that are capable of fulfilling the previous requirements. This paper presents a parametric design and analysis of the optical performance of volumetric absorbers of variable porosity conducted by means of detailed numerical ray tracing simulations. Sections of variable macroscopic porosity along the absorber depth were constructed by the fractal growth of single-cell structures. Measures of performance analyzed include optical reflection losses from the absorber front and rear faces, penetration of radiation inside the absorber volume, and radiation absorption as a function of absorber depth. The effects of engineering design parameters such as absorber length and wall thickness, material reflectance and porosity distribution on the optical performance of absorbers are discussed, and general design guidelines are given.

  20. A Perspective Study of Koumiss Microbiome by Metagenomics Analysis Based on Single-Cell Amplification Technique

    PubMed Central

    Yao, Guoqiang; Yu, Jie; Hou, Qiangchuan; Hui, Wenyan; Liu, Wenjun; Kwok, Lai-Yu; Menghe, Bilige; Sun, Tiansong; Zhang, Heping; Zhang, Wenyi

    2017-01-01

    Koumiss is a traditional fermented dairy product and a good source for isolating novel bacteria with biotechnological potential. In the present study, we applied the single-cell amplification technique in the metagenomics analysis of koumiss. This approach aimed at detecting low-abundance bacteria in the koumiss. Briefly, each sample was first serially diluted until reaching the level of approximately 100 cells. Then, three diluted bacterial suspensions were randomly picked for further study. By analyzing 30 diluted koumiss suspensions, a total of 24 bacterial species were identified. In addition to the previously reported koumiss-associated species, such as Lactobacillus (L.) helveticus, Lactococcus lactis, L. buchneri, L. kefiranofaciens, and Acetobacter pasteurianus, we successfully detected three low-abundance taxa in the samples, namely L. otakiensis, Streptococcus macedonicus, and Ruminococcus torques. The functional koumiss metagenomes carried putative genes related to lactose metabolism and the synthesis of typical flavor compounds. Our study encourages the use of modern metagenomics to discover novel species of bacteria that could be useful in the food industry. PMID:28223973

  1. AVR Microcontroller-based automated technique for analysis of DC motors

    NASA Astrophysics Data System (ADS)

    Kaur, P.; Chatterji, S.

    2014-01-01

    This paper provides essential information on the development of a 'dc motor test and analysis control card' using the AVR series ATMega32 microcontroller. The card can be interfaced to a PC, calculates parameters such as motor losses and efficiency, and plots characteristics for dc motors. Different tests and methods are presently available to evaluate motor parameters, but this paper discusses a single, universal, user-friendly automated set-up. It has been accomplished by designing data acquisition and SCR bridge firing hardware based on the AVR ATMega32 microcontroller. This hardware has the capability to drive phase-controlled rectifiers and acquire real-time values of motor current, voltage, temperature and speed. The various analyses feasible with the designed hardware are of immense importance for dc motor manufacturers and quality-sensitive users. Through this paper, the authors aim to provide details of this AVR-based hardware, which can be used for dc motor parameter analysis as well as for motor control applications.
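
    As a rough illustration of the kind of computation such a card supports (not the authors' firmware; the torque input and all numeric values are assumed for the example), input power, output power, total losses and efficiency follow directly from the acquired quantities:

```python
import math

def motor_analysis(voltage_v, current_a, speed_rpm, torque_nm):
    """Basic dc-motor figures derived from acquired measurements.

    voltage_v, current_a : armature voltage and current from the acquisition card.
    speed_rpm            : shaft speed from the speed sensor.
    torque_nm            : load torque (e.g. from a dynamometer) -- assumed input.
    """
    p_in = voltage_v * current_a                # electrical input power, W
    omega = 2.0 * math.pi * speed_rpm / 60.0    # angular speed, rad/s
    p_out = torque_nm * omega                   # mechanical output power, W
    losses = p_in - p_out                       # total losses, W
    efficiency = 100.0 * p_out / p_in           # percent
    return p_in, p_out, losses, efficiency

print(motor_analysis(voltage_v=220.0, current_a=4.2, speed_rpm=1450, torque_nm=5.0))
```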

  2. OSSE spectral analysis techniques

    NASA Technical Reports Server (NTRS)

    Purcell, W. R.; Brown, K. M.; Grabelsky, D. A.; Johnson, W. N.; Jung, G. V.; Kinzer, R. L.; Kroeger, R. A.; Kurfess, J. D.; Matz, S. M.; Strickman, M. S.

    1992-01-01

    Analysis of the spectra from the Oriented Scintillation Spectrometer Experiment (OSSE) is complicated by the typically low signal to noise (approx. 0.1 percent) and the large background variability. The OSSE instrument was designed to address these difficulties by periodically offset-pointing the detectors from the source to perform background measurements. These background measurements are used to estimate the background during each of the source observations. The resulting background-subtracted spectra can then be accumulated and fitted for spectral lines and/or continua. Data selection based on various environmental parameters can be performed at several stages during the analysis procedure. In order to achieve the instrument's statistical sensitivity, however, it will be necessary for investigators to develop a detailed understanding of the instrument operation, the data collection, and the background spectrum and its variability. The major steps in the OSSE spectral analysis process are briefly described, including a discussion of the OSSE background spectrum and examples of several observational strategies.
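
    A greatly simplified sketch of the basic idea is given below; the actual OSSE pipeline involves detailed background modeling and data selection, so this only illustrates estimating the background for a source pointing from the bracketing offset pointings and subtracting it channel by channel.

```python
import numpy as np

def background_subtract(source_counts, bkg_before, bkg_after):
    """Subtract a background estimate (mean of the two bracketing offset
    pointings) from a source spectrum, channel by channel."""
    bkg_estimate = 0.5 * (bkg_before + bkg_after)
    net = source_counts - bkg_estimate
    # Poisson errors propagated in quadrature (0.5**2 factor on each background term).
    err = np.sqrt(source_counts + 0.25 * (bkg_before + bkg_after))
    return net, err

rng = np.random.default_rng(0)
bkg_model = 1000.0 * np.exp(-np.linspace(0.0, 3.0, 256))   # counts per energy channel
source = rng.poisson(bkg_model * 1.001)                    # ~0.1 percent signal on top
before, after = rng.poisson(bkg_model), rng.poisson(bkg_model)
net, err = background_subtract(source, before, after)
```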

  3. Analysis of base fuze functioning of HESH ammunitions through high-speed photographic technique

    NASA Astrophysics Data System (ADS)

    Biswal, T. K.

    2007-01-01

    High-speed photography plays a major role at a test range, where direct visual access to an event through imaging makes it possible to understand a dynamic process thoroughly; both qualitative and quantitative data are then obtained through image processing and analysis. In one of the trials it was difficult to understand the performance of HESH ammunition against rolled homogeneous armour: there was no consistency in scab formation even though all other parameters, such as propellant charge mass, charge temperature and impact velocity, were maintained constant. To understand the event thoroughly, high-speed photography was deployed to provide a frontal view of the complete process. Clear information on shell impact, embedding of the HE propellant on the armour, and base fuze initiation was obtained. In the scab-forming rounds these three processes are clearly observed in sequence. In the non-scab rounds, however, the base fuze is initiated before the embedding process is complete, so the threshold thrust on the armour needed to cause a scab is not available. This was revealed in two rounds in which scab formation failed. As a quantitative measure, the fuze delay was calculated for each round, and premature functioning of the base fuze was thereby ascertained for the non-scab rounds. This capability of high-speed photography is described in detail in this paper.
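
    The quantitative step of such an analysis amounts to converting the frame indices of the observed events into times. A minimal sketch follows; the frame rate and frame numbers are illustrative, not trial data.

```python
def fuze_delay_ms(impact_frame, initiation_frame, frames_per_second):
    """Fuze delay from high-speed film: time between shell impact and base-fuze
    initiation, obtained from the frame indices of the two observed events."""
    return (initiation_frame - impact_frame) / frames_per_second * 1000.0

# Illustrative values: impact seen at frame 1520, fuze flash at frame 1544,
# camera running at 10,000 frames per second.
print(fuze_delay_ms(1520, 1544, 10_000))   # 2.4 ms
```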

  4. An Analysis Technique for Active Neutron Multiplicity Measurements Based on First Principles

    SciTech Connect

    Evans, Louise G; Goddard, Braden; Charlton, William S; Peerani, Paolo

    2012-08-13

    Passive neutron multiplicity counting is commonly used to quantify the total mass of plutonium in a sample, without prior knowledge of the sample geometry. However, passive neutron counting is less applicable to uranium measurements due to the low spontaneous fission rates of uranium. Active neutron multiplicity measurements are therefore used to determine the {sup 235}U mass in a sample. Unfortunately, there are additional challenges to overcome for uranium measurements, such as the coupling of the active source and the uranium sample. Techniques such as the coupling method have been developed to help reduce the dependence on calibration curves for active measurements of uranium samples, although they still require known standards of similar geometry. An advanced active neutron multiplicity measurement method is being developed by Texas A&M University, in collaboration with Los Alamos National Laboratory (LANL), in an attempt to overcome the calibration curve requirements. This method can be used to quantify the {sup 235}U mass in a sample containing uranium without using calibration curves. Furthermore, this method is based on existing detectors and nondestructive assay (NDA) systems, such as the LANL Epithermal Neutron Multiplicity Counter (ENMC). This method uses an inexpensive boron carbide liner to shield the uranium sample from thermal and epithermal neutrons while allowing fast neutrons to reach the sample. Due to the relatively low and constant energy-dependent fission and absorption cross-sections of the uranium isotopes at high neutron energies, fast neutrons can penetrate the sample without significant attenuation. Fast neutron interrogation therefore creates a homogeneous fission rate in the sample, allowing first-principles methods to be used to determine the {sup 235}U mass in the sample. This paper discusses the measurement method concept and development, including measurements and simulations performed to date, as well as the potential

  5. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Spatial non-uniformity of attention means that different locations in an image are of different importance for the perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or by objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign a different importance to each location in the image. Still, none of these objective metrics utilizes the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROI in fine quality while the rest of the image is reconstructed with lower quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and by subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
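
    One simple way to make a structural-similarity-type metric ROI-aware is to weight its local quality map by a region-of-interest mask. The sketch below is only illustrative of that idea and is not the criterion evaluated by the authors.

```python
import numpy as np
from skimage.metrics import structural_similarity

def roi_weighted_ssim(reference, reconstructed, roi_mask, roi_weight=0.8):
    """SSIM map pooled with an ROI weighting: ROI pixels contribute roi_weight
    of the final score, background pixels the remainder."""
    _, ssim_map = structural_similarity(reference, reconstructed,
                                        data_range=1.0, full=True)
    roi = roi_mask.astype(bool)
    return roi_weight * ssim_map[roi].mean() + (1.0 - roi_weight) * ssim_map[~roi].mean()

rng = np.random.default_rng(1)
ref = rng.random((128, 128))
rec = np.clip(ref + 0.05 * rng.standard_normal(ref.shape), 0.0, 1.0)
mask = np.zeros_like(ref, dtype=bool)
mask[32:96, 32:96] = True                 # "ground truth" attention region
print(roi_weighted_ssim(ref, rec, mask))
```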

  6. Comparative analysis of H.263 resilience techniques for H.223-based video transmission over slow-fading channels

    NASA Astrophysics Data System (ADS)

    Garzelli, Andrea; Abrardo, Andrea; Barni, Mauro; Marotta, D.

    2000-11-01

    The objective of this work is to analyze and compare H.263 resilience techniques for H.223-based real-time video transmission over narrow-band slow-fading channels. These channel conditions, which are typical of pedestrian video communications, are very critical: they require forward error correction (FEC), since data retransmission is not feasible due to the high network delay, and they reduce the effectiveness of FEC techniques due to the bursty nature of the channel. In this work, two different strategies for protecting H.263 video against channel errors are considered and compared. The strategies are tested over a slow-fading wireless channel, over which the H.263 video streams, organized and multiplexed by the H.223 Multiplex Protocol, are transmitted. Both the standard FEC techniques considered by the H.223 recommendation for equal error protection of the video stream and unequal error protection (UEP) through GOB synchronization are tested. The experimental results of this comparative analysis prove the superiority of the UEP technique for H.223-based video transmission.

  7. Experimental investigation of evanescence-based infrared biodetection technique for micro-total-analysis systems

    NASA Astrophysics Data System (ADS)

    Chandrasekaran, Arvind; Packirisamy, Muthukumaran

    2009-09-01

    The advent of microoptoelectromechanical systems (MOEMS) and their integration with other technologies such as microfluidics, microthermal systems and immunoproteomics has led to the concept of integrated micro-total-analysis systems (μTAS), or Lab-on-a-Chip, for chemical and biological applications. Recently, research and development of μTAS have grown significantly across several biodetection sciences, in situ medical diagnoses, and point-of-care testing applications. However, it is essential to develop suitable biophysical label-free detection methods for the success, reliability, and ease of use of μTAS. We propose an infrared (IR)-based evanescent wave detection system on the silicon-on-insulator platform for biodetection with μTAS. The system operates on the principle of the bio-optical interaction that occurs due to the evanescence of light from the waveguide device. The feasibility of biodetection has been experimentally investigated by the detection of horseradish peroxidase upon its reaction with hydrogen peroxide.

  8. A new approach to the analysis of alpha spectra based on neural network techniques

    NASA Astrophysics Data System (ADS)

    Baeza, A.; Miranda, J.; Guillén, J.; Corbacho, J. A.; Pérez, R.

    2011-10-01

    The analysis of alpha spectra requires good radiochemical procedures in order to obtain well differentiated alpha peaks in the spectrum, and the easiest way to analyze them is by directly summing the counts obtained in the Regions of Interest (ROIs). However, the low-energy tails of the alpha peaks frequently make this simple approach unworkable because some peaks partially overlap. Many fitting procedures have been proposed to solve this problem, most of them based on semi-empirical mathematical functions that emulate the shape of a theoretical alpha peak. The main drawback of these methods is that the great number of fitting parameters used means that their physical meaning is obscure or completely lacking. We propose another approach—the application of an artificial neural network. Instead of fitting the experimental data to a mathematical function, the fit is carried out by an artificial neural network (ANN) that has previously been trained to model the shape of an alpha peak using as training patterns several polonium spectra obtained from actual samples analyzed in our laboratory. In this sense, the ANN is able to learn the shape of an actual alpha peak. We have designed such an ANN as a feed-forward multi-layer perceptron with supervised training based on a back-propagation algorithm. The fitting procedure is based on the experimental observables that are characteristic of alpha peaks—the number of counts of the maximum and several peak widths at different heights. Polonium isotope spectra were selected because the alpha peaks corresponding to 208Po, 209Po, and 210Po are monoenergetic and well separated. The uncertainties introduced by this fitting procedure were less than the counting uncertainties. This new approach was applied to the problem of resolving overlapping peaks. Firstly, a theoretical study was carried out by artificially overlapping alpha peaks from actual samples in order to test the ability of the ANN to resolve each peak. Then, the ANN
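
    The following sketch illustrates the general idea on synthetic data; it is not the authors' network, architecture, or training set. A small feed-forward MLP trained by back-propagation learns the shape of an alpha peak with a low-energy tail.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic alpha peak: Gaussian above the peak channel, exponential low-energy tail.
channels = np.linspace(0.0, 1.0, 400)
peak_pos, sigma, tail = 0.7, 0.02, 0.05
shape = np.where(channels < peak_pos,
                 np.exp((channels - peak_pos) / tail),
                 np.exp(-0.5 * ((channels - peak_pos) / sigma) ** 2))
rng = np.random.default_rng(0)
counts = shape + rng.normal(0.0, 0.01, channels.size)      # noisy, normalised spectrum

# Feed-forward multilayer perceptron with supervised (back-propagation) training.
net = MLPRegressor(hidden_layer_sizes=(32, 32), activation='tanh',
                   max_iter=5000, random_state=0)
net.fit(channels.reshape(-1, 1), counts)
fitted_peak = net.predict(channels.reshape(-1, 1))          # learned peak shape
```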

  9. Advanced NMR-based techniques for pore structure analysis of coal. Final project report

    SciTech Connect

    Smith, D.M.; Hua, D.W.

    1996-02-01

    During the 3 year term of the project, new methods have been developed for characterizing the pore structure of porous materials such as coals, carbons, and amorphous silica gels. In general, these techniques revolve around; (1) combining multiple techniques such as small-angle x-ray scattering (SAXS) and adsorption of contrast-matched adsorbates or {sup 129}Xe NMR and thermoporometry (the change in freezing point with pore size), (2) combining adsorption isotherms over several pressure ranges to obtain a more complete description of pore filling, or (3) applying NMR ({sup 129}Xe, {sup 14}N{sub 2}, {sup 15}N{sub 2}) techniques with well-defined porous solids with pores in the large micropore size range (>1 nm).

  10. End effect analysis of linear induction motor based on the wavelet transform technique

    SciTech Connect

    Mori, Yoshihiko; Torii, Susumu; Ebihara, Daiki

    1999-09-01

    HSST (High Speed Surface Transport) is currently being developed for urban rail transportation systems in Japan. The HSST uses electromagnetic suspension and a short-stator linear induction motor (LIM). The performance of the LIM is degraded by the influence of end effects. To throw light on this problem, the LIM has been analyzed using Fourier series expansion; however, to obtain high accuracy with this technique, the number of calculations required increases. Since wavelet coefficients converge rapidly to zero, the wavelet transform technique has instead been applied to analyze the end effects of the LIM. In this paper, the authors investigate the method for determining the mother wavelet.

  11. Novel Laser-Based Technique is Ideal for Real-Time Environmental Analysis

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 2005

    2005-01-01

    Ocean Optics offers laser-induced breakdown spectrometer systems (LIBS) that can be used to identify light to heavy metals in a variety of sample types and geometries in environmental analysis applications. LIBS are versatile, real-time, high-resolution analyzers for qualitative analysis, in less than one second, of every element in solids,…

  12. Models, Web-Based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance

    DTIC Science & Technology

    2001-11-01

    may be complex or discontinuous, and to manage multiple, conflicting system objectives (Lu et al, 1991). Statistical methods, such as designed... reliability measurement methodologies, and application of advanced program management techniques. Positions have ranged from aircraft maintenance to system

  13. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    SciTech Connect

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-14

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  14. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    NASA Astrophysics Data System (ADS)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-01

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  15. Model building techniques for analysis.

    SciTech Connect

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased for analysis to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that, in essence, come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  16. Application of the windowed-Fourier-transform-based fringe analysis technique for investigating temperature and concentration fields in fluids.

    PubMed

    Mohanan, Sharika; Srivastava, Atul

    2014-04-10

    The present work is concerned with the development and application of a novel fringe analysis technique based on the principles of the windowed Fourier transform (WFT) for the determination of temperature and concentration fields from interferometric images for a range of heat and mass transfer applications. Depending on the noise level associated with the experimental data, the technique has been coupled with two different phase unwrapping methods for phase extraction: the Itoh algorithm and the quality-guided phase unwrapping technique. To generate the experimental data, a range of experiments has been carried out, including cooling of a vertical flat plate under free convection conditions, combustion of mono-propellant flames, and growth of organic as well as inorganic crystals from their aqueous solutions. The flat plate and combustion experiments are modeled as heat transfer applications wherein the interest is to determine the whole-field temperature distribution. Aqueous-solution-based crystal growth experiments are performed to simulate mass transfer phenomena, where the interest is to determine the two-dimensional solute concentration field around the growing crystal. A Mach-Zehnder interferometer has been employed to record the path-integrated quantity of interest (temperature and/or concentration) in the form of interferometric images. The potential of the WFT method has also been demonstrated on numerically simulated phase data for varying noise levels, and the accuracy of phase extraction has been quantified in terms of root mean square errors. Three levels of noise, i.e., 0%, 10%, and 20%, have been considered. Results of the present study show that the WFT technique allows an accurate extraction of phase values that can subsequently be converted into two-dimensional temperature and/or concentration distribution fields. Moreover, since WFT is a local processing technique, speckle patterns and the inherent
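
    A one-dimensional sketch of the windowed-Fourier (Gabor) ridge idea is given below. It is only illustrative: the actual work applies the WFT to two-dimensional interferograms and uses the Itoh or quality-guided unwrapping methods. Here the local phase is taken at the candidate frequency of maximum windowed-spectrum magnitude and then unwrapped.

```python
import numpy as np

def wft_phase_1d(signal, freqs, sigma=8.0):
    """Windowed-Fourier (Gabor) ridge phase extraction for a 1-D fringe signal."""
    n = np.arange(signal.size)
    wrapped = np.empty(signal.size)
    for c in n:
        w = np.exp(-0.5 * ((n - c) / sigma) ** 2)           # Gaussian window at c
        coeffs = np.array([np.sum(signal * w * np.exp(-2j * np.pi * f * (n - c)))
                           for f in freqs])
        k = int(np.argmax(np.abs(coeffs)))                  # ridge frequency index
        wrapped[c] = np.angle(coeffs[k])
    return wrapped

x = np.arange(512)
true_phase = 3.0 * np.sin(2 * np.pi * x / 512)              # field-induced phase
fringe = 1.0 + np.cos(2 * np.pi * 0.05 * x + true_phase)    # carrier frequency 0.05
phi = wft_phase_1d(fringe, freqs=np.linspace(0.03, 0.07, 21))
unwrapped = np.unwrap(phi)                                  # Itoh-style 1-D unwrapping
recovered = unwrapped - 2 * np.pi * 0.05 * x                # remove carrier; ~ true_phase
```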

  17. [Measurement Error Analysis and Calibration Technique of NTC - Based Body Temperature Sensor].

    PubMed

    Deng, Chi; Hu, Wei; Diao, Shengxi; Lin, Fujiang; Qian, Dahong

    2015-11-01

    An NTC thermistor-based wearable body temperature sensor was designed. This paper describes the design principles and realization method of the NTC-based body temperature sensor. The sources of temperature measurement error of the body temperature sensor are analyzed in detail, and an automatic measurement and calibration method for the ADC error is given. The results show that the measurement accuracy of the calibrated body temperature sensor is better than ±0.04 °C. The temperature sensor offers the advantages of high accuracy, small size and low power consumption.
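
    A minimal sketch of the sensing principle is shown below; the divider topology, component values and beta-equation model are illustrative assumptions, not the published design.

```python
import math

def ntc_temperature_c(adc_code, adc_max=4095, v_ref=3.3,
                      r_fixed=10_000.0, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """ADC reading from a resistor-divider NTC front end to temperature (beta equation)."""
    v_out = adc_code / adc_max * v_ref            # voltage across the NTC
    r_ntc = r_fixed * v_out / (v_ref - v_out)     # divider: Vout = Vref * Rntc / (Rfix + Rntc)
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ntc / r0) / beta
    return 1.0 / inv_t - 273.15

print(round(ntc_temperature_c(2048), 2))          # mid-scale code -> about 25 °C
```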

  18. Research on fault diagnosis technique on aerocamera communication based on fault tree analysis

    NASA Astrophysics Data System (ADS)

    Li, Lijuan; He, Binggao; Tian, Chengjun; Yang, Chengyu; Duan, Jie

    2008-12-01

    ARINC 429 is the standard for digital data transmission between avionic devices. This paper applies fault tree analysis to diagnose failures of the aerocamera's 429 communication: the fault tree of the aerocamera 429 communication was built, the failures were analyzed and diagnosed, the detection flow was designed, and the aerocamera 429 communication detection system was finally completed. This detection system can test the aerocamera 429 communication board quickly and effectively and shortens the fault-clearing time. In addition, it provides guidance for maintenance and repair and improves the overall functioning of the aerocamera.

  19. The Role of Liquid Based Cytology and Ancillary Techniques in the Peritoneal Washing Analysis: Our Institutional Experience

    PubMed Central

    Rossi, Esther; Bizzarro, Tommaso; Martini, Maurizio; Longatto-Filho, Adhemar; Schmitt, Fernando; Fagotti, Anna; Scambia, Giovanni; Zannoni, Gian Franco

    2017-01-01

    Background The cytological analysis of peritoneal effusions serves as a diagnostic and prognostic aid for either primary or metastatic diseases. Among the different cytological preparations, liquid based cytology (LBC) represents a feasible and reliable method that also enables the application of ancillary techniques (i.e., immunocytochemistry, ICC, and molecular testing). Methods We recorded 10348 LBC peritoneal effusions between January 2000 and December 2014. They were classified as non-diagnostic (ND), negative for malignancy (NM), atypical-suspicious for malignancy (SM) and positive for malignancy (PM). Results The cytological diagnoses included 218 ND, 9,035 NM, 213 SM and 882 PM. A total of 8048 cases (7228 NM, 115 SM, 705 PM) with histological follow-up were included. Our NM included 21 malignant and 7207 benign histological diagnoses. Our 820 SMs+PMs were diagnosed as 107 unknown malignancies (30 SM and 77 PM), 691 metastatic lesions (81 SM and 610 PM), 9 lymphomas (2 SM and 7 PM), 9 mesotheliomas (1 SM and 8 PM), and 4 sarcomas (1 SM and 3 PM). Primary gynecological cancers contributed 64% of the cases. We documented 97.4% sensitivity, 99.9% specificity, 98% diagnostic accuracy, a 99.7% negative predictive value (NPV) and a 99.7% positive predictive value (PPV). Furthermore, the morphological diagnoses were supported by either 173 conclusive ICC results or 50 molecular analyses. Specifically, molecular testing was performed for EGFR and KRAS mutational analysis based on previous or contemporary diagnoses of non-small cell lung cancer (NSCLC) and colon carcinomas. We identified 10 EGFR mutations in NSCLC and 7 KRAS mutations on LBC stored material. Conclusions Peritoneal cytology is an adjunctive tool in the surgical management of tumors, mostly gynecological cancers. LBC maximizes the application of ancillary techniques such as ICC and molecular analysis with feasible diagnostic and predictive yields, also in controversial cases. PMID:28099523

  20. High-precision technique for in-situ testing of the PZT scanner based on fringe analysis

    NASA Astrophysics Data System (ADS)

    Wang, Daodang; Yang, Yongying; Liu, Dong; Zhuo, Yongmo

    2010-08-01

    A technique based on fringe analysis is presented for the in-situ testing of the PZT scanner, including the end rotation analysis and displacement measurement. With the interferograms acquired in the Twyman-Green interferometer, the testing can be carried out in real time. The end rotation of the PZT scanner and its spatial displacement deviation are analyzed by processing the fringe rotation and interval changes; displacement of the PZT scanner is determined by fringe shift according to the algorithm of template-matching, from which the relation between the driving voltage and displacement is measured to calibrate the nonlinearity of the PZT scanner. It is shown by computer simulation and experiments that the proposed technique for in-situ testing of the PZT scanner takes a short time, and achieves precise displacement measurement as well as the end rotation angle and displacement deviation measurement. The proposed method has high efficiency and precision, and is of great practicality for in-situ calibration of the PZT scanner.

  1. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    SciTech Connect

    Miller, Mary A.; Tangyunyong, Paiboon; Edward I. Cole, Jr.

    2016-01-12

    In this study, laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  2. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    DOE PAGES

    Miller, Mary A.; Tangyunyong, Paiboon; Edward I. Cole, Jr.

    2016-01-12

    In this study, laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  3. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
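
    The modeling step described above amounts to fitting simple parametric trend models to time-series data. The sketch below, with illustrative data, shows one way such fits could be obtained.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(24, dtype=float)                        # e.g. 24 monthly observations
rng = np.random.default_rng(2)
y = 5.0 + 0.8 * t + rng.normal(0.0, 1.5, t.size)      # illustrative trend data

linear = np.polyfit(t, y, deg=1)                      # y ~ a*t + b
quadratic = np.polyfit(t, y, deg=2)                   # y ~ a*t**2 + b*t + c
exp_model = lambda t, a, b: a * np.exp(b * t)         # y ~ a*exp(b*t)
(exp_a, exp_b), _ = curve_fit(exp_model, t, y, p0=(5.0, 0.05))

print("linear trend per period:", round(linear[0], 3))
```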

  4. Urban major road extraction from IKONOS imagery based on modified texture progressing analysis technique

    NASA Astrophysics Data System (ADS)

    Wu, Xuewen; Xu, Hanqiu; Wu, Pingli

    2010-11-01

    A method for urban major road extraction from IKONOS imagery was proposed. The texture features of the image were first analyzed at three different levels. The first level calculated the Mahalanobis distance between test pixels and training pixels. The second level calculated the Bhattacharyya distance between the distribution of the pixels in the training area and that of the pixels within a 3×3 window in the test area. The third level employed co-occurrence matrices over a texture cube built around each pixel, and the Bhattacharyya distance was then used again. The processed results were thresholded and thinned, respectively. With the assistance of the geometrical characteristics of roads, the three resultant images corresponding to the three levels were assigned road-likelihood values using fuzzy mathematics and then merged together. A knowledge-based algorithm was used to link the segmented roads. The result was finally optimized by polynomial fitting. The experiment shows that the proposed method can effectively extract urban major roads from high-resolution imagery such as IKONOS.
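
    The first texture level described above is a per-pixel Mahalanobis distance to the road training class. A minimal sketch with synthetic band values (not IKONOS data) is given below.

```python
import numpy as np

def mahalanobis_map(test_pixels, training_pixels):
    """Mahalanobis distance of every test pixel (multispectral vector) to the
    road training class (first texture level)."""
    mean = training_pixels.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(training_pixels, rowvar=False))
    diff = test_pixels - mean
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))

rng = np.random.default_rng(3)
road_training = rng.normal([120, 118, 115, 60], 5.0, size=(200, 4))   # 4 bands
test_pixels = rng.normal([100, 110, 120, 80], 20.0, size=(1000, 4))
distances = mahalanobis_map(test_pixels, road_training)
road_candidates = distances < 3.0          # thresholded, as in the first level
```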

  5. Urban major road extraction from IKONOS imagery based on modified texture progressing analysis technique

    NASA Astrophysics Data System (ADS)

    Wu, Xuewen; Xu, Hanqiu; Wu, Pingli

    2009-09-01

    A method for urban major road extraction from IKONOS imagery was proposed. The texture features of the image were first analyzed at three different levels. The first level calculated the Mahalanobis distance between test pixels and training pixels. The second level calculated the Bhattacharyya distance between the distribution of the pixels in the training area and that of the pixels within a 3×3 window in the test area. The third level employed co-occurrence matrices over a texture cube built around each pixel, and the Bhattacharyya distance was then used again. The processed results were thresholded and thinned, respectively. With the assistance of the geometrical characteristics of roads, the three resultant images corresponding to the three levels were assigned road-likelihood values using fuzzy mathematics and then merged together. A knowledge-based algorithm was used to link the segmented roads. The result was finally optimized by polynomial fitting. The experiment shows that the proposed method can effectively extract urban major roads from high-resolution imagery such as IKONOS.

  6. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidics-related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell. PMID:26213918

  7. 2D wavelet-analysis-based calibration technique for flat-panel imaging detectors: application in cone beam volume CT

    NASA Astrophysics Data System (ADS)

    Tang, Xiangyang; Ning, Ruola; Yu, Rongfeng; Conover, David L.

    1999-05-01

    The application of the newly developed flat panel x-ray imaging detector in cone beam volume CT has attracted increasing interest recently. Due to an imperfect solid state array manufacturing process, however, defective elements, gain non-uniformity and an offset image unavoidably exist in all kinds of flat panel x-ray imaging detectors, which cause severe streak and ring artifacts in cone beam reconstruction images and severely degrade image quality. A calibration technique is presented in this paper in which the artifacts resulting from the defective elements, gain non-uniformity and offset image can be reduced significantly. The detection of defective elements is distinctively based upon two-dimensional (2D) wavelet analysis. Because of its inherent localizability in recognizing singularities or discontinuities, wavelet analysis possesses the capability of detecting defective elements over a rather large x-ray exposure range, e.g., 20% to approximately 60% of the dynamic range of the detector used. Three-dimensional (3D) images of a low-contrast CT phantom have been reconstructed from projection images acquired by a flat panel x-ray imaging detector with and without the calibration process applied. The artifacts caused individually by defective elements, gain non-uniformity and the offset image have been separated and investigated in detail, and their correlations with one another have also been exposed explicitly. The investigation is reinforced by quantitative analysis of the signal-to-noise ratio (SNR) and the image uniformity of the cone beam reconstruction image. It has been demonstrated that the ring and streak artifacts resulting from the imperfect performance of a flat panel x-ray imaging detector can be reduced dramatically, so that the image quality of a cone beam reconstruction image, such as contrast resolution and image uniformity, is improved significantly. Furthermore, with little modification, the calibration technique presented here is also applicable
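
    As a rough illustration of wavelet-based singularity detection for defective elements (this is not the authors' calibration algorithm; the wavelet choice, threshold rule and data are assumptions), isolated bad pixels in a flat-field frame stand out as outliers in the detail coefficients:

```python
import numpy as np
import pywt

def defective_pixel_map(flat_field, wavelet="haar", k=6.0):
    """Flag candidate defective elements as isolated singularities: pixels whose
    detail-coefficient energy exceeds k robust standard deviations."""
    _, (cH, cV, cD) = pywt.dwt2(flat_field.astype(float), wavelet)
    detail = np.sqrt(cH ** 2 + cV ** 2 + cD ** 2)
    sigma = np.median(np.abs(detail)) / 0.6745 + 1e-12      # robust scale estimate
    mask_half = detail > k * sigma
    # Upsample the half-resolution mask back to the detector grid.
    full = np.kron(mask_half.astype(int), np.ones((2, 2), dtype=int)).astype(bool)
    return full[:flat_field.shape[0], :flat_field.shape[1]]

rng = np.random.default_rng(5)
flat = rng.normal(1000.0, 10.0, (128, 128))
flat[40, 77] = 0.0                                          # a "dead" element
print(np.argwhere(defective_pixel_map(flat))[:4])
```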

  8. Comparative analysis of DNA polymorphisms and phylogenetic relationships among Syzygium cumini Skeels based on phenotypic characters and RAPD technique.

    PubMed

    Singh, Jitendra P; Singh, Ak; Bajpai, Anju; Ahmad, Iffat Zareen

    2014-01-01

    The Indian black berry (Syzygium cumini Skeels) has great nutraceutical and medicinal properties. As in other fruit crops, fruit characteristics are important attributes for differentiation and were therefore determined for different accessions of S. cumini. The fruit weight, length, breadth, length:breadth ratio, pulp weight, pulp content, seed weight and pulp:seed ratio varied significantly among accessions. Molecular characterization was carried out using the PCR-based RAPD technique. Out of 80 RAPD primers, only 18 produced stable polymorphisms that were used to examine the phylogenetic relationships. A total of 207 loci were generated, of which 201 were found to be polymorphic. The average genetic dissimilarity among the jamun accessions was 97 per cent. The phylogenetic relationships were also examined by principal coordinates analysis (PCoA), which explained 46.95 per cent cumulative variance. The two-dimensional PCoA analysis showed grouping of the different accessions, plotted in four sub-plots representing clusters of accessions. The UPGMA (r = 0.967) and NJ (r = 0.987) dendrograms constructed from the dissimilarity matrix revealed a good degree of fit with the cophenetic correlation value. The dendrogram grouped the accessions into three main clusters according to their eco-geographical regions, which gives useful insight into their phylogenetic relationships.
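
    The clustering step corresponds to standard UPGMA (average-linkage) hierarchical clustering on a dissimilarity matrix. A minimal sketch with an illustrative matrix (not the study's data) follows.

```python
import numpy as np
from scipy.cluster.hierarchy import cophenet, dendrogram, linkage
from scipy.spatial.distance import squareform

# Illustrative RAPD dissimilarity matrix for four accessions (not the study data).
labels = ["Acc-1", "Acc-2", "Acc-3", "Acc-4"]
dissimilarity = np.array([[0.00, 0.82, 0.95, 0.97],
                          [0.82, 0.00, 0.90, 0.96],
                          [0.95, 0.90, 0.00, 0.85],
                          [0.97, 0.96, 0.85, 0.00]])

condensed = squareform(dissimilarity)           # condensed form required by linkage
tree = linkage(condensed, method="average")     # UPGMA = average linkage
coph_corr, _ = cophenet(tree, condensed)        # cophenetic correlation (fit of the tree)
layout = dendrogram(tree, labels=labels, no_plot=True)   # set no_plot=False to draw (needs matplotlib)
print("cophenetic correlation:", round(coph_corr, 3), "| leaf order:", layout["ivl"])
```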

  9. Technique based on LED multispectral imaging and multivariate analysis for monitoring the conservation state of the Dead Sea Scrolls.

    PubMed

    Marengo, Emilio; Manfredi, Marcello; Zerbinati, Orfeo; Robotti, Elisa; Mazzucco, Eleonora; Gosetti, Fabio; Bearman, Greg; France, Fenella; Shor, Pnina

    2011-09-01

    The aim of this project is the development of a noninvasive technique based on LED multispectral imaging (MSI) for monitoring the conservation state of the Dead Sea Scrolls (DSS) collection. It is well known that changes in the parchment reflectance drive the transition of the scrolls from legible to illegible. Capitalizing on this fact, we will use spectral imaging to detect changes in the reflectance before they become visible to the human eye. The technique uses multivariate analysis and statistical process control theory. The present study was carried out on a "sample" parchment of calfskin. The monitoring of the surface of a commercial modern parchment, aged consecutively for 2 h and 6 h at 80 °C and 50% relative humidity (ASTM), was performed at the Imaging Lab of the Library of Congress (Washington, DC, U.S.A.). MSI is here carried out in the vis-NIR range limited to 1 μm, with 13 bands and bandwidths that range from about 10 nm in the UV to 40 nm in the IR. Results showed that we could detect and locate changing pixels, on the basis of reflectance changes, after only a few "hours" of aging.

  10. Near-infrared spectral image analysis of pork marbling based on Gabor filter and wide line detector techniques.

    PubMed

    Huang, Hui; Liu, Li; Ngadi, Michael O; Gariépy, Claude; Prasher, Shiv O

    2014-01-01

    Marbling is an important quality attribute of pork. Detection of pork marbling usually involves subjective scoring, which raises efficiency costs for the processor. In this study, the ability to predict pork marbling using near-infrared (NIR) hyperspectral imaging (900-1700 nm) together with appropriate image processing techniques was studied. Near-infrared images were collected from pork after marbling evaluation according to the current standard chart from the National Pork Producers Council. Image analysis techniques (Gabor filter, wide line detector, and spectral averaging) were applied to extract texture, line, and spectral features, respectively, from the NIR images of pork. Samples were grouped into calibration and validation sets. Wavelength selection was performed on the calibration set by a stepwise regression procedure. Prediction models of pork marbling scores were built using multiple linear regression based on derivatives of the mean spectra and line features at key wavelengths. The results showed that the derivatives of both texture and spectral features produced good results, with validation correlation coefficients of 0.90 and 0.86, respectively, using wavelengths of 961, 1186, and 1220 nm. The results reveal the great potential of the Gabor filter for analyzing NIR images of pork for the effective and efficient objective evaluation of pork marbling.
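
    As a rough sketch of the texture-feature step (not the authors' processing chain; the image and parameter values are illustrative), Gabor magnitude responses can be pooled into a compact marbling-texture descriptor:

```python
import numpy as np
from skimage.filters import gabor

def gabor_texture_features(nir_band,
                           frequencies=(0.1, 0.2, 0.3),
                           thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean Gabor magnitude response per (frequency, orientation) pair, used here
    as a simple texture descriptor for one NIR band image."""
    features = []
    for f in frequencies:
        for theta in thetas:
            real, imag = gabor(nir_band, frequency=f, theta=theta)
            features.append(np.hypot(real, imag).mean())
    return np.array(features)

rng = np.random.default_rng(4)
fake_nir = rng.random((64, 64))                    # stand-in for a 1220 nm band image
print(gabor_texture_features(fake_nir).shape)      # 12 texture features
```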

  11. Compendium on Risk Analysis Techniques

    DTIC Science & Technology

    The evolution of risk analysis in the materiel acquisition process is traced from the Secretary Packard memorandum to current AMC guidance. Risk analysis is defined and many of the existing techniques are described in light of this definition and their specific role in program management and

  12. Graph-Based Symbolic Technique and Its Application in the Frequency Response Bound Analysis of Analog Integrated Circuits

    PubMed Central

    Tlelo-Cuautle, E.; Rodriguez-Chavez, S.; Palma-Rodriguez, A. A.

    2014-01-01

    A new graph-based symbolic technique (GBST) for deriving exact analytical expressions like the transfer function H(s) of an analog integrated circuit (IC), is introduced herein. The derived H(s) of a given analog IC is used to compute the frequency response bounds (maximum and minimum) associated to the magnitude and phase of H(s), subject to some ranges of process variational parameters, and by performing nonlinear constrained optimization. Our simulations demonstrate the usefulness of the new GBST for deriving the exact symbolic expression for H(s), and the last section highlights the good agreement between the frequency response bounds computed by our variational analysis approach versus traditional Monte Carlo simulations. As a conclusion, performing variational analysis using our proposed GBST for computing the frequency response bounds of analog ICs, shows a gain in computing time of 100x for a differential circuit topology and 50x for a 3-stage amplifier, compared to traditional Monte Carlo simulations. PMID:25136650
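
    The idea of deriving an exact symbolic H(s) and then bounding its frequency response over component tolerance ranges can be illustrated on a trivial RC low-pass; this is generic symbolic circuit analysis, not the GBST of the paper, and all values are illustrative.

```python
import numpy as np
import sympy as sp

s, R, C = sp.symbols("s R C", positive=True)
H = 1 / (1 + s * R * C)                        # exact symbolic H(s) of an RC low-pass

# Magnitude response bounds at 1 kHz over +/-20 percent component variation.
f = 1_000.0
Hmag = sp.lambdify((R, C), sp.Abs(H.subs(s, sp.I * 2 * sp.pi * f)), "numpy")
Rs = np.linspace(0.8e3, 1.2e3, 41)             # nominal 1 kOhm
Cs = np.linspace(0.8e-7, 1.2e-7, 41)           # nominal 100 nF
grid = Hmag(Rs[:, None], Cs[None, :])
print("magnitude bounds at 1 kHz:", grid.min(), grid.max())
```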

  13. Comprehensive theoretical analysis and experimental exploration of ultrafast microchip-based high-field asymmetric ion mobility spectrometry (FAIMS) technique.

    PubMed

    Li, Lingfeng; Wang, Yonghuan; Chen, Chilai; Wang, Xiaozhi; Luo, Jikui

    2015-06-01

    High-field asymmetric ion mobility spectrometry (FAIMS) has become an efficient technique for the separation and characterization of gas-phase ions at ambient pressure, utilizing the differences in ion mobility at high and low fields. Micro FAIMS devices made by micro-electromechanical system technology have small channel gaps, high electric fields and good installation precision, and thus they have received great attention. However, their relatively low resolution limits their applications in some areas. In this study, theoretical analysis and experimental exploration were carried out to overcome this disadvantage. Multiple scans, characteristic decline curves of ion transmission and pattern recognition were proposed to improve the performance of the microchip-based FAIMS. The results showed that although micro FAIMS instruments as standalone chemical analyzers suffer from low resolution, by using one or more of the proposed methods they can identify chemicals precisely and provide quantitative analysis with low detection limits in some applications. Copyright © 2015 John Wiley & Sons, Ltd.

  14. Visualization and Analysis of Wireless Sensor Network Data for Smart Civil Structure Applications Based On Spatial Correlation Technique

    NASA Astrophysics Data System (ADS)

    Chowdhry, Bhawani Shankar; White, Neil M.; Jeswani, Jai Kumar; Dayo, Khalil; Rathi, Manorma

    2009-07-01

    Disasters affecting infrastructure, such as the 2001 earthquakes in India, 2005 in Pakistan, 2008 in China and the 2004 tsunami in Asia, provide a common need for intelligent buildings and smart civil structures. Now, imagine massive reductions in time to get the infrastructure working again, realtime information on damage to buildings, massive reductions in cost and time to certify that structures are undamaged and can still be operated, reductions in the number of structures to be rebuilt (if they are known not to be damaged). Achieving these ideas would lead to huge, quantifiable, long-term savings to government and industry. Wireless sensor networks (WSNs) can be deployed in buildings to make any civil structure both smart and intelligent. WSNs have recently gained much attention in both public and research communities because they are expected to bring a new paradigm to the interaction between humans, environment, and machines. This paper presents the deployment of WSN nodes in the Top Quality Centralized Instrumentation Centre (TQCIC). We created an ad hoc networking application to collect real-time data sensed from the nodes that were randomly distributed throughout the building. If the sensors are relocated, then the application automatically reconfigures itself in the light of the new routing topology. WSNs are event-based systems that rely on the collective effort of several micro-sensor nodes, which are continuously observing a physical phenomenon. WSN applications require spatially dense sensor deployment in order to achieve satisfactory coverage. The degree of spatial correlation increases with the decreasing inter-node separation. Energy consumption is reduced dramatically by having only those sensor nodes with unique readings transmit their data. We report on an algorithm based on a spatial correlation technique that assures high QoS (in terms of SNR) of the network as well as proper utilization of energy, by suppressing redundant data transmission

  15. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  16. Prefractionation techniques in proteome analysis.

    PubMed

    Righetti, Pier Giorgio; Castagna, Annalisa; Herbert, Ben; Reymond, Frederic; Rossier, Joël S

    2003-08-01

    The present review deals with a number of prefractionation protocols in preparation for two-dimensional map analysis, both in the field of chromatography and in the field of electrophoresis. In the first case, Fountoulaki's group has reported just about every chromatographic procedure useful as a prefractionation step, including affinity, ion-exchange, and reversed-phase resins. As a result of the various enrichment steps, several hundred new species, previously undetected in unfractionated samples, could be revealed for the first time. Electrophoretic prefractionation protocols include all those electrokinetic methodologies which are performed in free solution, essentially all relying on isoelectric focusing steps. The devices reviewed here include multichamber apparatus, such as the multicompartment electrolyzer with Immobiline membranes, Off-Gel electrophoresis in a multicup device, and the Rotofor, an instrument also based on a multichamber system but exploiting the conventional technique of carrier-ampholyte focusing. Other instruments of interest are the Octopus, a continuous-flow device for isoelectric focusing in an upward-flowing liquid curtain, and the Gradiflow, where different pI cuts are obtained by a multistep passage through two compartments buffered at different pH values. It is felt that this panoply of methods could offer a strong step forward in "mining below the tip of the iceberg" for detecting the "unseen proteome".

  17. A new technique for calculating reentry base heating. [analysis of laminar base flow field of two dimensional reentry body

    NASA Technical Reports Server (NTRS)

    Meng, J. C. S.

    1973-01-01

    The laminar base flow field of a two-dimensional reentry body has been studied by Telenin's method. The flow domain was divided into strips along the x-axis, and the flow variations were represented by Lagrange interpolation polynomials in the transformed vertical coordinate. The complete Navier-Stokes equations were used in the near wake region, and the boundary layer equations were applied elsewhere. The boundary conditions consisted of the flat plate thermal boundary layer in the forebody region and the near wake profile in the downstream region. The resulting two-point boundary value problem of 33 ordinary differential equations was then solved by the multiple shooting method. The detailed flow field and thermal environment in the base region are presented in the form of temperature contours, Mach number contours, velocity vectors, pressure distributions, and heat transfer coefficients on the base surface. The maximum heating rate was found on the centerline, and the two-dimensional stagnation point flow solution was adequate to estimate the maximum heating rate so long as the local Reynolds number could be obtained.

  18. Searching the most appropriate sample pretreatment for the elemental analysis of wines by inductively coupled plasma-based techniques.

    PubMed

    Gonzálvez, A; Armenta, S; Pastor, A; de la Guardia, M

    2008-07-09

    Different sample preparation methods were evaluated for the simultaneous multielement analysis of wine samples by inductively coupled plasma optical emission spectrometry (ICP-OES) and inductively coupled plasma mass spectrometry (ICP-MS). Microwave-assisted digestion in a closed vessel, thermal digestion in an open reactor, and direct sample dilution were considered for the determination of Li, Be, Na, Mg, Al, K, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Sr, Y, Mo, Cd, Ba, La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, Lu, Tl, Pb, and Bi in 12 samples of red wine from the Valencia and Utiel-Requena protected designations of origin. ICP-MS allowed the determination of 17 elements in most of the samples, and using ICP-OES a maximum of 15 elements were determined. On comparing the sample pretreatment methodologies, it can be concluded that the three assayed procedures provide comparable results for the concentrations of Li, Na, Mg, Al, K, Ca, Mn, Fe, Zn, and Sr by ICP-OES. Furthermore, the ICP-MS data found for Cu, Pb, and Ba were comparable. Digestion treatment provides comparable values using both total decomposition in an open system and microwave-assisted treatment for Cu by ICP-OES and for Cr, Ni, and Zn by ICP-MS. Open-vessel total digestion provides excessively high values for Cr, Mn, Fe, and Zn by ICP-OES and low values for Se. However, direct measurement of diluted wine samples provided results that were not comparable with the digestion treatments for Mn, Cu, Pb, Zn, Ba, and Bi by ICP-OES and for Mg, Cr, Fe, Ni, and Zn by ICP-MS. Therefore, it can be concluded that microwave-assisted digestion is the pretreatment procedure of choice for the elemental analysis of wine by ICP-based techniques.

  19. A preliminary structural analysis of space-base living quarters modules to verify a weight-estimating technique

    NASA Technical Reports Server (NTRS)

    Grissom, D. S.; Schneider, W. C.

    1971-01-01

    The determination of a base line (minimum weight) design for the primary structure of the living quarters modules in an earth-orbiting space base was investigated. Although the design is preliminary in nature, the supporting analysis is sufficiently thorough to provide a reasonably accurate weight estimate of the major components that are considered to comprise the structural weight of the space base.

  20. [Evidence-based TEP technique].

    PubMed

    Köckerling, F

    2017-01-13

    The guidelines of all international hernia societies recommend as procedures of choice the laparoendoscopic techniques total extraperitoneal patch plasty (TEP) and transabdominal preperitoneal patch plasty (TAPP) as well as the open Lichtenstein operation for elective inguinal hernia repair. The learning curve associated with the laparoendoscopic techniques, in particular TEP, is longer than that for the open Lichtenstein technique due to the complexity of the procedures. Accordingly, for laparoendoscopic techniques it is particularly important that the operations are conducted in a standardized manner in compliance with the evidence-based recommendations given for the technical details. When procedures are carried out in strict compliance with the guidelines of the international hernia societies, low rates of perioperative complications, complication-related reoperations, recurrences and chronic pain can be expected for TEP. Compliance with the guidelines can also positively impact mastery of the learning curve for TEP. The technical guidelines on TEP are based on study results and on the experiences of numerous experts; therefore, it is imperative that they are implemented in routine surgical practice.

  1. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  2. Proteomic data analysis of glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke

    2016-05-01

    Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the application of novel approaches for visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information is not yet sufficient for classifying GSCs and paving the way toward improved therapeutics for heterogeneous glioma.
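
    As a rough illustration of this kind of pipeline, the sketch below projects a placeholder protein-expression matrix with a nonlinear embedding (t-SNE, chosen here only as a stand-in for the paper's own set of techniques) and clusters the result; the data and parameter choices are assumptions, not the GSC data set.

```python
# Hedged sketch: nonlinear projection plus clustering of a proteomic matrix.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
expression = rng.normal(size=(60, 500))     # 60 samples x 500 proteins (placeholder)

embedding = TSNE(n_components=2, perplexity=15, random_state=0).fit_transform(expression)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(embedding)
print("cluster sizes:", np.bincount(clusters))
```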

  3. Global precipitation estimates based on a technique for combining satellite-based estimates, rain gauge analysis, and NWP model precipitation information

    NASA Technical Reports Server (NTRS)

    Huffman, George J.; Adler, Robert F.; Rudolf, Bruno; Schneider, Udo; Keehn, Peter R.

    1995-01-01

    The 'satellite-gauge model' (SGM) technique is described for combining precipitation estimates from microwave satellite data, infrared satellite data, rain gauge analyses, and numerical weather prediction models into improved estimates of global precipitation. Throughout, monthly estimates on a 2.5 degrees x 2.5 degrees lat-long grid are employed. First, a multisatellite product is developed using a combination of low-orbit microwave and geosynchronous-orbit infrared data in the latitude range 40 degrees N - 40 degrees S (the adjusted geosynchronous precipitation index) and low-orbit microwave data alone at higher latitudes. Then the rain gauge analysis is brought in, weighting each field by its inverse relative error variance to produce a nearly global, observationally based precipitation estimate. To produce a complete global estimate, the numerical model results are used to fill data voids in the combined satellite-gauge estimate. Our sequential approach to combining estimates allows a user to select the multisatellite estimate, the satellite-gauge estimate, or the full SGM estimate (observationally based estimates plus the model information). The primary limitation in the method is imperfections in the estimation of relative error for the individual fields. The SGM results for one year of data (July 1987 to June 1988) show important differences from the individual estimates, including model estimates as well as climatological estimates. In general, the SGM results are drier in the subtropics than the model and climatological results, reflecting the relatively dry microwave estimates that dominate the SGM in oceanic regions.
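
    The inverse-error-variance weighting step can be written compactly, as in the sketch below, which merges two toy gridded fields; the field values and error variances are illustrative assumptions rather than numbers from the SGM analysis.

```python
# Combine two precipitation fields by weighting each with its inverse
# relative error variance (the merging rule described in the abstract).
import numpy as np

sat   = np.array([[3.0, 5.0], [2.0, 7.0]])   # mm/day, multisatellite field (placeholder)
gauge = np.array([[2.5, 6.0], [2.2, 6.5]])   # mm/day, gauge analysis (placeholder)
sat_var, gauge_var = 1.0, 0.4                # assumed relative error variances

w_sat, w_gauge = 1.0 / sat_var, 1.0 / gauge_var
combined = (w_sat * sat + w_gauge * gauge) / (w_sat + w_gauge)
print(combined)   # each grid box is pulled toward the lower-variance field
```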

  4. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    SciTech Connect

    Zhang, Yonghua

    2000-01-01

    Sample preparation has been one of the major bottlenecks for many high throughput analyses. The purpose of this research was to develop a new sample preparation and integration approach for DNA sequencing, PCR based DNA analysis and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks to DNA capillary-array sequencers. Protocols for directly sequencing the plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or heating block. Upon heating, the plasmids were released while chromosomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR based DNA analysis, they demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical samples without purification. After PCR reaction using cheek cell, blood or HIV-1 gag DNA, the reaction mixtures were injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of purification of DNA sample before or after PCR reaction, will make this approach an

  5. Judging complex movement performances for excellence: a principal components analysis-based technique applied to competitive diving.

    PubMed

    Young, Cole; Reinkensmeyer, David J

    2014-08-01

    Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard half pipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring - gross body path, splash area, and board tip motion - to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization.
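
    A minimal sketch of the modelling step (PCA components followed by linear regression onto judges' scores) is given below; the arrays are random placeholders rather than the USA Diving data, and the feature construction is greatly simplified relative to the paper.

```python
# PCA "eigendive" components followed by linear regression of judges' scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
dives  = rng.normal(size=(60, 40))        # 60 dives x 40 pose/splash/board features (placeholder)
scores = rng.uniform(3.0, 9.0, size=60)   # judges' scores (placeholder)

pca = PCA(n_components=5).fit(dives)           # principal components ("eigendives")
coeffs = pca.transform(dives)                  # each dive's loadings
reg = LinearRegression().fit(coeffs, scores)   # linear prediction of the score
print("training R^2:", round(reg.score(coeffs, scores), 3))
```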

  6. Photogrammetric Techniques for Road Surface Analysis

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface is of great importance for convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction and allows analysing the characteristics of road texture and monitoring the pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  7. Insights into the prominent effect of mahanimbine on Acanthamoeba castellanii: Cell profiling analysis based on microscopy techniques

    NASA Astrophysics Data System (ADS)

    Hashim, Fatimah; Amin, Nakisah Mat

    2017-02-01

    Mahanimbine (MH) has been shown to have antiamoeba properties. Therefore, the aim of this study was to assess the growth inhibitory mechanisms of MH on Acanthamoeba castellanii, a causative agent of Acanthamoeba keratitis. The IC50 value obtained for MH against A. castellanii was 1.18 µg/ml. Light and scanning electron microscopy observation showed that most cells had a cystic appearance. Transmission electron microscopy revealed changes at the ultrastructural level, and fluorescence microscopy indicated the induction of apoptosis and autophagic activity in the amoeba cytoplasm. In conclusion, MH has very potent anti-amoebic properties on A. castellanii, as shown by cytotoxicity analyses based on microscopy techniques.

  8. Automatic control of a robot camera for broadcasting based on cameramen's techniques and subjective evaluation and analysis of reproduced images.

    PubMed

    Kato, D; Katsuura, T; Koyama, H

    2000-03-01

    With the goal of achieving an intelligent robot camera system that can take dynamic images automatically through humanlike, natural camera work, we analyzed how images were shot, subjectively evaluated reproduced images, and examined effects of camerawork, using camera control technique as a parameter. It was found that (1) A high evaluation is obtained when human-based data are used for the position adjusting velocity curve of the target; (2) Evaluation scores are relatively high for images taken with feedback-feedforward camera control method for target movement in one direction; (3) Keeping the target within the image area using the control method that imitates human camera handling becomes increasingly difficult when the target changes both direction and velocity and becomes bigger and faster, and (4) The mechanical feedback method can cope with rapid changes in the target's direction and velocity, constantly keeping the target within the image area, though the viewer finds the image rather mechanical as opposed to humanlike.

  9. Characterization of Grain Size Distribution and Grain Shape Analysis of Tephra Deposits: a New Approach Based on Automated Microscopy and Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Volentik, A. C.; Bonadonna, C.; Connor, C. B.

    2006-12-01

    Grain size distribution (GSD) is a key parameter in physical volcanology, not only for the characterization of tephra deposits, but also for the application of numerical models and for the compilation of reliable tephra hazard assessments. In addition, grain shape analysis (GSA) has important implications in the determination of particle settling velocities, a crucial factor in tephra dispersal models. We present a new method for the determination of reliable GSD together with 2D GSA for volcanic particles ranging between 0.5 μm and 2 mm in size (from 13 to -1 on the Φ scale): the application of the Malvern PharmaVision 830 (PVS) automated optical device. PVS provides several morphological parameters that can be used to determine GSD of volcanic ash. We have compared GSD results for 3 different morphological parameters (mean diameter, maximum length and width) of three samples collected along a dispersal axis of a Plinian eruption of Pululagua volcano (Ecuador). GSD based on particle width gives the best fit with GSD data resulting from sieving techniques and is thus recommended when GSD of volcanic ash (< 2 mm) has to be combined with GSD of volcanic lapilli (> 2 mm) that require hand sieving. GSA data were investigated to characterize morphology variations with magma composition and with distance from the volcanic vent. In fact, GSA results were analyzed for 3 different tephra deposits with different magma composition: (i) Cerro Negro (basaltic), (ii) Pululagua (dacitic) and (iii) Bishop Tuff (rhyolitic). In particular, we have found that particle intensity shows the same trend for all deposits, whereas trends of roundness and convexity are different for different magma compositions, suggesting that roundness and convexity are strongly dependent on magma fragmentation mechanisms. Preliminary results have also shown that mean roundness, mean convexity and mean intensity do not vary significantly with distance from vent for Pululagua. Finally, an attempt has been made

  10. Task-based strategy for optimized contrast enhanced breast imaging: Analysis of six imaging techniques for mammography and tomosynthesis

    SciTech Connect

    Ikejimba, Lynda C.; Kiarashi, Nooshin; Ghate, Sujata V.; Samei, Ehsan; Lo, Joseph Y.

    2014-06-15

    Purpose: The use of contrast agents in breast imaging has the capability of enhancing nodule detectability and providing physiological information. Accordingly, there has been a growing trend toward using iodine as a contrast medium in digital mammography (DM) and digital breast tomosynthesis (DBT). Widespread use raises concerns about the best way to use iodine in DM and DBT, and thus a comparison is necessary to evaluate typical iodine-enhanced imaging methods. This study used a task-based observer model to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: unsubtracted mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Methods: Imaging performance was characterized using a detectability index d{sup ′}, derived from the system task transfer function (TTF), an imaging task, iodine signal difference, and the noise power spectrum (NPS). The task modeled a 10 mm diameter lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d{sup ′} was generated as a function of dose and iodine concentration. Results: For all iodine concentrations and dose, temporal subtraction techniques for mammography and tomosynthesis yielded the highest d{sup ′}, while dual energy techniques for both modalities demonstrated the next best performance. Unsubtracted imaging resulted in the lowest d{sup ′} values for both modalities, with unsubtracted mammography performing the worst out of all six paradigms. Conclusions: At any dose, temporal subtraction imaging provides the greatest detectability, with temporally subtracted DBT performing the highest. The authors attribute the successful performance to excellent cancellation of
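
    The structure of such a detectability-index calculation can be sketched numerically. The example below uses a prewhitening-style d' built from a task function, a TTF and an NPS over radial spatial frequency; the functional forms and numbers are assumptions for illustration, and the specific observer model used in the study may differ.

```python
# Hedged sketch of a detectability index d' from task function W, TTF and NPS.
import numpy as np

f = np.linspace(0.01, 5.0, 500)              # radial spatial frequency (cycles/mm)
df = f[1] - f[0]
W   = np.exp(-(np.pi * 5.0 * f) ** 2 / 4)    # task: ~10 mm lesion (Gaussian stand-in)
TTF = np.exp(-f / 2.0)                       # task transfer function (assumed)
NPS = 2e-3 * (1.0 + f)                       # noise power spectrum (assumed)

# d'^2 = integral over 2D frequency of |W|^2 TTF^2 / NPS, evaluated radially here.
d_prime = np.sqrt(2 * np.pi * np.sum(f * W**2 * TTF**2 / NPS) * df)
print(f"d' = {d_prime:.2f}")
```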

  11. Task-based strategy for optimized contrast enhanced breast imaging: Analysis of six imaging techniques for mammography and tomosynthesis

    PubMed Central

    Ikejimba, Lynda C.; Kiarashi, Nooshin; Ghate, Sujata V.; Samei, Ehsan; Lo, Joseph Y.

    2014-01-01

    Purpose: The use of contrast agents in breast imaging has the capability of enhancing nodule detectability and providing physiological information. Accordingly, there has been a growing trend toward using iodine as a contrast medium in digital mammography (DM) and digital breast tomosynthesis (DBT). Widespread use raises concerns about the best way to use iodine in DM and DBT, and thus a comparison is necessary to evaluate typical iodine-enhanced imaging methods. This study used a task-based observer model to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: unsubtracted mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Methods: Imaging performance was characterized using a detectability index d′, derived from the system task transfer function (TTF), an imaging task, iodine signal difference, and the noise power spectrum (NPS). The task modeled a 10 mm diameter lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d′ was generated as a function of dose and iodine concentration. Results: For all iodine concentrations and dose, temporal subtraction techniques for mammography and tomosynthesis yielded the highest d′, while dual energy techniques for both modalities demonstrated the next best performance. Unsubtracted imaging resulted in the lowest d′ values for both modalities, with unsubtracted mammography performing the worst out of all six paradigms. Conclusions: At any dose, temporal subtraction imaging provides the greatest detectability, with temporally subtracted DBT performing the highest. The authors attribute the successful performance to excellent cancellation of inplane structures and

  12. A RT-based Technique for the Analysis and the Removal of Titan's Atmosphere by Cassini/VIMS-IR data

    NASA Astrophysics Data System (ADS)

    Sindoni, G.; Tosi, F.; Adriani, A.; Moriconi, M. L.; D'Aversa, E.; Grassi, D.; Oliva, F.; Dinelli, B. M.; Castelli, E.

    2015-12-01

    Since 2004, the Visual and Infrared Mapping Spectrometer (VIMS), together with the CIRS and UVIS spectrometers, aboard the Cassini spacecraft has provided insight into the atmospheres of Saturn and Titan through remote sensing observations. The presence of clouds and aerosols in Titan's dense atmosphere makes the analysis of the surface radiation a difficult task. For this purpose, an atmospheric radiative transfer (RT) model is required. The implementation of a RT code, which includes multiple scattering, in an inversion algorithm based on the Bayesian approach can provide strong constraints on both the surface albedo and the atmospheric composition. Applying the retrieval procedure we have developed to VIMS-IR spectra acquired in nadir or slant geometries allows us to retrieve the equivalent opacity of Titan's atmosphere in terms of variable aerosol and gaseous content. Thus, the separation of the atmospheric and surface contributions in the observed spectrum is possible. The atmospheric removal procedure was tested on the spectral range 1-2.2 μm of publicly available VIMS data covering the Ontario Lacus and Ligeia Mare regions. The retrieval of the accurate composition of Titan's atmosphere is a much more complex task. So far, the information about the vertical structure of the atmosphere from limb spectra was mostly derived under conditions where the scattering could be neglected [1,2]. Indeed, since the very high aerosol load in the middle-low atmosphere produces strong scattering effects on the measured spectra, the analysis requires RT modeling taking into account multiple scattering in a spherical-shell geometry. Therefore, an innovative method we are developing based on the Monte Carlo approach can provide important information about the vertical distribution of the aerosols and the gases composing Titan's atmosphere. [1] Bellucci et al. (2009), Icarus, 201(1), 198-216. [2] de Kok et al. (2007), Icarus, 191(1), 223-235.

  13. A PLL-based resampling technique for vibration analysis in variable-speed wind turbines with PMSG: A bearing fault case

    NASA Astrophysics Data System (ADS)

    Pezzani, Carlos M.; Bossio, José M.; Castellino, Ariel M.; Bossio, Guillermo R.; De Angelo, Cristian H.

    2017-02-01

    Condition monitoring in permanent magnet synchronous machines has gained interest due to the increasing use in applications such as electric traction and power generation. Particularly in wind power generation, non-invasive condition monitoring techniques are of great importance. Usually, in such applications the access to the generator is complex and costly, while unexpected breakdowns result in high repair costs. This paper presents a technique which allows using vibration analysis for bearing fault detection in permanent magnet synchronous generators used in wind turbines. Given that in wind power applications the generator rotational speed may vary during normal operation, it is necessary to use special sampling techniques to apply spectral analysis of mechanical vibrations. In this work, a resampling technique based on order tracking without measuring the rotor position is proposed. To synchronize sampling with rotor position, an estimation of the rotor position obtained from the angle of the voltage vector is proposed. This angle is obtained from a phase-locked loop synchronized with the generator voltages. The proposed strategy is validated by laboratory experimental results obtained from a permanent magnet synchronous generator. Results with single point defects in the outer race of a bearing under variable speed and load conditions are presented.
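
    The angular-resampling (order-tracking) step can be sketched as follows. In this toy example the rotor angle is synthesized directly rather than estimated by the voltage-vector PLL described in the paper, and the vibration signal is a single synthetic shaft order plus noise.

```python
# Resample a vibration signal at uniform rotor-angle increments so that
# shaft orders stay fixed even when the rotational speed varies.
import numpy as np

fs = 10_000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)

speed_hz = 20.0 + 5.0 * t                        # shaft speed ramps from 20 to 30 Hz
angle = 2 * np.pi * np.cumsum(speed_hz) / fs     # rotor angle (in practice: PLL estimate)
vib = np.sin(3 * angle) + 0.1 * rng.normal(size=t.size)   # 3rd-order component + noise

samples_per_rev = 64
uniform_angle = np.arange(angle[0], angle[-1], 2 * np.pi / samples_per_rev)
vib_angular = np.interp(uniform_angle, angle, vib)         # angular-domain signal

# The spectrum of the resampled signal is now an order spectrum.
orders = np.fft.rfftfreq(vib_angular.size, d=1.0 / samples_per_rev)
spectrum = np.abs(np.fft.rfft(vib_angular))
print("dominant order:", float(orders[np.argmax(spectrum[1:]) + 1]))
```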

  14. Terahertz Wide-Angle Imaging and Analysis on Plane-wave Criteria Based on Inverse Synthetic Aperture Techniques

    NASA Astrophysics Data System (ADS)

    Gao, Jing Kun; Qin, Yu Liang; Deng, Bin; Wang, Hong Qiang; Li, Jin; Li, Xiang

    2016-04-01

    This paper presents two parts of work around terahertz imaging applications. The first part aims at solving the problems that occur as the rotation angle increases. To compensate for the nonlinearity of terahertz radar systems, a calibration signal acquired from a bright target is always used. Generally, this compensation inserts an extra linear phase term in the intermediate frequency (IF) echo signal which is not expected in large-rotation-angle imaging applications. We carried out a detailed theoretical analysis of this problem, and a minimum entropy criterion was employed to estimate and compensate for the linear-phase errors. In the second part, the effects of spherical waves on terahertz inverse synthetic aperture imaging are analyzed. Analytic criteria for the plane-wave approximation were derived for different rotation angles. Experimental results of corner reflectors and an aircraft model based on a 330-GHz linear frequency-modulated continuous wave (LFMCW) radar system validated the necessity and effectiveness of the proposed compensation. A comparison of the experimental images obtained under the plane-wave assumption and with spherical-wave correction also showed them to be highly consistent with the analytic criteria we derived.

  15. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    PubMed Central

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587
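
    The representation can be sketched in a few lines: cut each series into fixed-length segments and replace every segment by the index of its nearest codeword. The codebook below is trained with plain k-means, which is only an assumption standing in for the paper's codebook construction, and the series is synthetic.

```python
# Piecewise vector-quantized approximation of a time series (sketch).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
series = np.sin(np.linspace(0, 20, 512)) + 0.1 * rng.normal(size=512)

seg_len = 16
segments = series.reshape(-1, seg_len)                # 32 non-overlapping segments

codebook = KMeans(n_clusters=8, n_init=10, random_state=0).fit(segments)
symbols = codebook.predict(segments)                  # symbolic representation
reconstruction = codebook.cluster_centers_[symbols].ravel()
print("symbols:", symbols)
print("mean squared reconstruction error:", round(np.mean((series - reconstruction) ** 2), 4))
```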

  16. LHC Olympics: Advanced Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Armour, Kyle; Larkoski, Andrew; Gray, Amanda; Ventura, Dan; Walsh, Jon; Schabinger, Rob

    2006-05-01

    The LHC Olympics is a series of workshops aimed at encouraging theorists and experimentalists to prepare for the soon-to-be-online Large Hadron Collider in Geneva, Switzerland. One aspect of the LHC Olympics program consists of the study of simulated data sets which represent various possible new physics signals as they would be seen in LHC detectors. Through this exercise, LHC Olympians learn the phenomenology of possible new physics models and gain experience in analyzing LHC data. Additionally, the LHC Olympics encourages discussion between theorists and experimentalists, and through this collaboration new techniques could be developed. The University of Washington LHC Olympics group consists of several first-year graduate and senior undergraduate students, in both theoretical and experimental particle physics. Presented here is a discussion of some of the more advanced techniques used and the recent results of one such LHC Olympics study.

  17. Application of spectral analysis techniques to the intercomparison of aerosol data - Part 4: Synthesized analysis of multisensor satellite and ground-based AOD measurements using combined maximum covariance analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Carlson, B. E.; Lacis, A. A.

    2014-08-01

    In this paper, we introduce the usage of a newly developed spectral decomposition technique - combined maximum covariance analysis (CMCA) - in the spatiotemporal comparison of four satellite data sets and ground-based observations of aerosol optical depth (AOD). This technique is based on commonly used principal component analysis (PCA) and maximum covariance analysis (MCA). By decomposing the cross-covariance matrix between the joint satellite data field and Aerosol Robotic Network (AERONET) station data, both parallel comparison across different satellite data sets and the evaluation of the satellite data against the AERONET measurements are simultaneously realized. We show that this new method not only confirms the seasonal and interannual variability of aerosol optical depth, aerosol-source regions and events represented by different satellite data sets, but also identifies the strengths and weaknesses of each data set in capturing the variability associated with sources, events or aerosol types. Furthermore, by examining the spread of the spatial modes of different satellite fields, regions with the largest uncertainties in aerosol observation are identified. We also present two regional case studies that respectively demonstrate the capability of the CMCA technique in assessing the representation of an extreme event in different data sets, and in evaluating the performance of different data sets on seasonal and interannual timescales. Global results indicate that different data sets agree qualitatively for major aerosol-source regions. Discrepancies are mostly found over the Sahel, India, eastern and southeastern Asia. Results for eastern Europe suggest that the intense wildfire event in Russia during summer 2010 was less well-represented by SeaWiFS (Sea-viewing Wide Field-of-view Sensor) and OMI (Ozone Monitoring Instrument), which might be due to misclassification of smoke plumes as clouds. Analysis for the Indian subcontinent shows that here SeaWiFS agrees
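
    The maximum-covariance core of such an analysis amounts to a singular value decomposition of the cross-covariance matrix between the two data fields. The sketch below uses random placeholders for a single satellite field and the station field; the published CMCA additionally joins several satellite data sets before the decomposition.

```python
# SVD of the cross-covariance matrix between a gridded field and station data,
# giving paired spatial modes (maximum covariance analysis, sketch only).
import numpy as np

rng = np.random.default_rng(0)
nt, ngrid, nsta = 120, 400, 30           # months, grid boxes, AERONET stations
sat = rng.normal(size=(nt, ngrid))       # satellite AOD anomalies (placeholder)
aer = rng.normal(size=(nt, nsta))        # AERONET AOD anomalies (placeholder)

C = sat.T @ aer / (nt - 1)               # cross-covariance matrix (grid x station)
U, s, Vt = np.linalg.svd(C, full_matrices=False)

k = 3                                    # leading coupled modes
sat_modes, station_modes = U[:, :k], Vt[:k].T
scf = s[:k] ** 2 / np.sum(s ** 2)        # squared-covariance fraction per mode
print("squared covariance fraction of leading modes:", np.round(scf, 3))
```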

  18. Application of Electromigration Techniques in Environmental Analysis

    NASA Astrophysics Data System (ADS)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentration of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made, widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, including the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the domain of capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is made on pre-capillary and on-capillary chromatography and electrophoresis-based concentration of analytes and detection improvement.

  19. Analysis of maximum reach in WDM PON architecture based on distributed Raman amplification and pump recycling technique.

    PubMed

    Kim, Chul Han; Lee, Ju Han; Lee, Kwanil

    2007-10-29

    We analyze the performance of bidirectional WDM PON architecture which utilizes distributed Raman amplification and pump recycling technique. The maximum reach at data rates of 622 Mb/s and 1.25 Gb/s in the proposed WDM PON architecture is calculated by taking into account the effects of power budget, chromatic dispersion of transmission fiber, and Raman amplification-induced noises with a given amount of Raman pump power. From the result, the maximum reach for 622 Mb/s and 1.25 Gb/s signal transmission is calculated to be 65 km and 60 km with a Raman pump power of 700 mW, respectively. We also find that the calculated results agree well with the experimental results which were reported previously.
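
    The attenuation part of such a reach estimate reduces to simple power-budget bookkeeping, as in the sketch below. All of the numbers are assumptions for illustration; the published analysis additionally accounts for chromatic dispersion and Raman-amplification-induced noise, which shorten the reach relative to a pure budget estimate.

```python
# Back-of-the-envelope, power-budget-limited reach estimate (sketch only).
launch_power_dbm     = 0.0      # transmitter output (assumed)
receiver_sens_dbm    = -28.0    # receiver sensitivity at 1.25 Gb/s (assumed)
raman_gain_db        = 10.0     # net distributed Raman gain (assumed)
splitter_loss_db     = 16.0     # AWG/splitter and connector losses (assumed)
fiber_loss_db_per_km = 0.25

margin_db = launch_power_dbm - receiver_sens_dbm + raman_gain_db - splitter_loss_db
print(f"power-budget-limited reach: {margin_db / fiber_loss_db_per_km:.0f} km")
```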

  20. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analysis are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences in the three techniques lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data will be presented.
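
    A small numerical sketch of the common starting point of these techniques is given below: the 3 x 3 spectral matrix is built from the Fourier transforms of the three magnetic components, and a degree-of-polarization estimate is derived from it (the Samson-Olson form is used here; the four techniques compared in the paper go further in different ways). The synthetic wave and noise levels are assumptions.

```python
# Spectral-matrix construction and a degree-of-polarization estimate (sketch).
import numpy as np

rng = np.random.default_rng(2)
fs, n = 10.0, 4096
t = np.arange(n) / fs
# Synthetic circularly polarized wave at 0.1 Hz in the x-y plane, plus noise.
bx = np.cos(2 * np.pi * 0.1 * t) + 0.2 * rng.normal(size=n)
by = np.sin(2 * np.pi * 0.1 * t) + 0.2 * rng.normal(size=n)
bz = 0.2 * rng.normal(size=n)

B = np.fft.rfft(np.vstack([bx, by, bz]), axis=1)
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
k = np.argmin(np.abs(freqs - 0.1))                  # bin nearest the wave frequency
band = slice(k - 2, k + 3)                          # average over 5 neighbouring bins

S = np.einsum('if,jf->ij', B[:, band], np.conj(B[:, band])) / 5   # 3x3 spectral matrix
trS, trS2 = np.trace(S).real, np.trace(S @ S).real
dop = np.sqrt((3 * trS2 - trS ** 2) / (2 * trS ** 2))              # Samson-Olson estimate
print(f"degree of polarization near 0.1 Hz: {dop:.2f}")
```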

  1. Data analysis techniques: Spectral processing

    NASA Technical Reports Server (NTRS)

    Strauch, R. G.

    1983-01-01

    The individual steps in the data processing scheme applied to most radars used for wind sounding are analyzed. This processing method uses spectral analysis and assumes a pulse Doppler radar. Improvement in the signal to noise ratio of some radars is discussed.

  2. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    SciTech Connect

    Tang, A; Samost, A; Viswanathan, A; Cormack, R; Damato, A

    2015-06-15

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning), and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in

  3. Analysis of fatty acids in 12 Mediterranean fish species: advantages and limitations of a new GC-FID/GC-MS based technique.

    PubMed

    Nevigato, Teresina; Masci, Maurizio; Orban, Elena; Di Lena, Gabriella; Casini, Irene; Caproni, Roberto

    2012-07-01

    When fatty acids in fish are analyzed, results in percentage form (profile analysis) are mostly reported. However, the much more useful result, expressed as mg/100 g (absolute analysis), is the main information required. Absolute methods based on calibration curves are of good accuracy but have a high degree of complexity if applied to a great number of analytes. Procedures based on the sequence profile analysis-total FA determination-absolute analysis may be suitable for routine use, but suffer from a number of uncertainties that have never been fully resolved. These uncertainties are mainly related to the profile analysis. In fact, most profile analyses reported in the literature disagree about the number and type of fatty acids monitored as well as about the total percentage to assign to their sum, leading to possible inaccuracies; in addition, the instrumental response factor for all FAME (fatty acid methyl esters) is often treated as a constant, but this is not exactly true. In this work, a set of 24 fatty acids was selected and studied in 12 fish species in the Mediterranean area (variable in lipid content and month of sampling): in our results, and in these species, this set constitutes, on average, 90 ± 3 % of the total fatty acid content. Moreover, the error derived from the assumption of a unique response factor was investigated. Two different detection techniques (GC-FID and GC-MS) together with two capillary columns (different in length and polarity) were used in order to acquire complementary data on the same sample. With the protocol proposed here, absolute analyses of the 12 cited species are easily achievable via the total FA determination procedure. The accuracy of this approach is good in general, but in some cases (DHA, for example) is lower than the accuracy of calibration-based methods. The differences were evaluated on a case by case basis.
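
    The profile-to-absolute conversion itself is a one-line scaling once the total fatty-acid content is known, as in the sketch below; the percentages and total are illustrative numbers, not values from the paper.

```python
# Convert an area-percentage fatty-acid profile to absolute amounts (mg/100 g).
profile_pct = {"C16:0": 22.0, "C18:1 n-9": 18.5, "DHA (C22:6 n-3)": 12.3}   # placeholder
total_fa_mg_per_100g = 950.0   # total fatty acids in the sample (assumed)

absolute = {fa: pct / 100.0 * total_fa_mg_per_100g for fa, pct in profile_pct.items()}
for fa, mg in absolute.items():
    print(f"{fa:18s} {mg:7.1f} mg/100 g")
```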

  4. [Acupoints selecting and medication rules analysis based on data mining technique for bronchial asthma treated with acupoint application].

    PubMed

    Wang, Zhaohui; Han, Dongyue; Qie, Lili; Liu, Chang; Wang, Fuchun

    2015-06-01

    Clinical literature on bronchial asthma treated with acupoint application, published from January 2000 to March 2014, was retrieved by computer from modern periodical databases. Using the cluster analysis and frequency analysis methods of data mining, the acupoint selection and medication rules for bronchial asthma treated with acupoint application were analyzed. A total of 38 articles were eventually included, covering 25 acupoints and 42 medicines. The results indicate that, for acupoint selection, Feishu (BL 13) is used as the main acupoint and three groups of bladder meridian and conception vessel acupoints are applied alternately; for medication, Baijiezi (Brassica alba Boiss), Xixin (Radix et Rhizoma Asari), Gansui (Radix Kansui), Yanhusuo (Corydalis) and Mahuang (Radix et Rhizoma Ephedrae) are primarily adopted, with epispastic medicines as the main medicines; most medicines belong to the lung meridian, the main medicines remain largely unchanged, and Shengjiang is used as the guiding drug.

  5. Novel Texture-based Probabilistic Object Recognition and Tracking Techniques for Food Intake Analysis and Traffic Monitoring

    DTIC Science & Technology

    2015-10-02

    ... nutritional intake analysis and a tracking application intended for surveillance in low-quality videos. Automated food recognition is useful for personal health applications as ... performance. Our tracking systems consider the problem of car and human tracking on potentially very low quality surveillance videos, from fixed camera or ...

  6. Alternate Spectrometric Oil Analysis Techniques

    DTIC Science & Technology

    1992-04-01

    [The indexed excerpt consists mainly of table-of-contents entries covering spectrometric analysis, conclusions, recommendations, and appendices with microfiltration test rig and membrane filtration test data.] For microfiltration, 10-20 ml portions of the used oil samples were passed through a 3 µm Nucleopore membrane filter.

  7. Bone quality around bioactive silica-based coated stainless steel implants: analysis by micro-Raman, XRF and XAS techniques.

    PubMed

    Ballarre, Josefina; Desimone, Paula M; Chorro, Matthieu; Baca, Matías; Orellano, Juan Carlos; Ceré, Silvia M

    2013-11-01

    Surface modification of surgical stainless steel implants by sol-gel coatings has been proposed as a tool to generate a surface that, besides being protective, could also create a "bioactive" interface to generate a natural bonding between the metal surface and the existing bone. The aim of this work is to analyze the quality of, and bone formation around, hybrid bioactive coatings containing glass-ceramic particles, made by the sol-gel process on 316L stainless steel used as a permanent implant, in terms of mineralization, calcium content and bone maturity, using micro-Raman, X-ray microfluorescence and X-ray absorption techniques. Uncoated implants seem to generate a thin bone layer at the beginning of the osseointegration process, with this layer then separating from the surface over time. The hybrid coatings without glass-ceramic particles generate new bone around implants, with a high concentration of Ca and P at the implant/tissue interface. This fact seems to be related to the presence of silica nanoparticles in the layer. The addition of bioactive particles promotes and enhances the bone quality, with a homogeneous Ca and P content and a low rate of beta carbonate substitution and crystallinity, similar to young and mechanically resistant bone.

  8. Real Time Data Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Silberberg, George G.

    1983-03-01

    By the early 1970s, classical photo-optical range instrumentation technology (as a means of gathering weapons' system performance data) had become a costly and inefficient process. Film costs were increasing due to soaring silver prices. Time required to process, read, and produce optical data was becoming unacceptable as a means of supporting weapon system development programs. NWC investigated the feasibility of utilizing Closed Circuit Television (CCTV) technology as an alternative solution for providing optical data. In 1978 a program entitled Metric Video (measurements from video images) was formulated at the Naval Weapons Center, China Lake, California. The purpose of this program was to provide timely data, to reduce the number of operating personnel, and to lower data acquisition costs. Some of the task elements for this program included a near real-time vector miss-distance system, a weapons scoring system, a velocity measuring system, a time-space position system, and a system to replace film cameras for gathering real-time engineering sequential data. These task elements and the development of special hardware and techniques to achieve real-time data will be discussed briefly in this paper.

  9. FDTD based SAR analysis in human head using irregular volume averaging techniques of different resolutions at GSM 900 band

    NASA Astrophysics Data System (ADS)

    Ali, Md Faruk; Ray, Sudhabindu

    2014-06-01

    Specific absorption rate (SAR) induced inside human head in the near-field of a mobile phone antenna has been investigated for three different SAR resolutions using Finite Difference in Time Domain (FDTD) method at GSM 900 band. Voxel based anthropomorphic human head model, consisting of different anatomical tissues, is used to calculate the peak SAR values averaged over 10-g, 1-g and 0.1-g mass. It is observed that the maximum local SAR increases significantly for smaller mass averages.
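
    The bookkeeping behind point SAR and a mass-averaged SAR can be sketched as below, with SAR = sigma |E|^2 / rho per voxel and a simple cubic averaging volume; the field values and tissue properties are placeholders, and the standardized averaging procedures used in such studies are considerably more involved than this.

```python
# Point SAR and a crude 1-g cube average over a voxel grid (sketch only).
import numpy as np

rng = np.random.default_rng(0)
voxel_mm = 2.0
rho = 1040.0                                   # tissue density, kg/m^3 (assumed)
sigma = 0.97                                   # conductivity at ~900 MHz, S/m (assumed)
E = rng.uniform(5.0, 60.0, size=(20, 20, 20))  # |E| field magnitude, V/m (placeholder)

sar = sigma * E ** 2 / rho                      # point SAR, W/kg
voxel_mass = rho * (voxel_mm * 1e-3) ** 3       # kg per voxel
n = int(round((0.001 / voxel_mass) ** (1 / 3))) # cube edge (in voxels) holding ~1 g
print(f"1-g cube SAR in one corner: {sar[:n, :n, :n].mean():.3f} W/kg")
```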

  10. Innovative Techniques Simplify Vibration Analysis

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  11. Analysis of Different Classification Techniques for Two-Class Functional Near-Infrared Spectroscopy-Based Brain-Computer Interface

    PubMed Central

    Qureshi, Nauman Khalid; Noori, Farzan Majeed; Hong, Keum-Shik

    2016-01-01

    We analyse and compare the classification accuracies of six different classifiers for a two-class mental task (mental arithmetic and rest) using functional near-infrared spectroscopy (fNIRS) signals. The signals of the mental arithmetic and rest tasks from the prefrontal cortex region of the brain for seven healthy subjects were acquired using a multichannel continuous-wave imaging system. After removal of the physiological noises, six features were extracted from the oxygenated hemoglobin (HbO) signals. Two- and three-dimensional combinations of those features were used for classification of mental tasks. In the classification, six different modalities, linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), k-nearest neighbour (kNN), the Naïve Bayes approach, support vector machine (SVM), and artificial neural networks (ANN), were utilized. With these classifiers, the average classification accuracies among the seven subjects for the 2- and 3-dimensional combinations of features were 71.6, 90.0, 69.7, 89.8, 89.5, and 91.4% and 79.6, 95.2, 64.5, 94.8, 95.2, and 96.3%, respectively. ANN showed the maximum classification accuracies: 91.4 and 96.3%. In order to validate the results, a statistical significance test was performed, which confirmed that the p values were statistically significant relative to all of the other classifiers (p < 0.005) using HbO signals. PMID:27725827
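
    The comparison itself can be reproduced in outline with scikit-learn, as sketched below for the same six model families; the synthetic features stand in for the HbO-derived features, and none of the settings are taken from the paper.

```python
# Cross-validated comparison of six classifier families on placeholder features.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=300, n_features=3, n_informative=3,
                           n_redundant=0, random_state=0)   # placeholder feature set

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(kernel="rbf"),
    "ANN": MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
}
for name, clf in classifiers.items():
    print(f"{name:12s} accuracy = {cross_val_score(clf, X, y, cv=5).mean():.3f}")
```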

  12. Analysis of algebraic reconstruction technique for accurate imaging of gas temperature and concentration based on tunable diode laser absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Hui-Hui, Xia; Rui-Feng, Kan; Jian-Guo, Liu; Zhen-Yu, Xu; Ya-Bai, He

    2016-06-01

    An improved algebraic reconstruction technique (ART) combined with tunable diode laser absorption spectroscopy (TDLAS) is presented in this paper for determining the two-dimensional (2D) distribution of H2O concentration and temperature in a simulated combustion flame. This work aims to simulate the reconstruction of spectroscopic measurements by a multi-view parallel-beam scanning geometry and analyze the effects of projection rays on reconstruction accuracy. It is shown that reconstruction quality increases dramatically with the number of projection rays up to about 180 for a 20 × 20 grid; beyond that point, the number of projection rays has little influence on reconstruction accuracy. It is clear that the temperature reconstruction results are more accurate than the water vapor concentration obtained by the traditional concentration calculation method. In the present study an innovative way to reduce the error of concentration reconstruction and greatly improve the reconstruction quality is also proposed, and the capability of this new method is evaluated by using appropriate assessment parameters. By using this new approach, not only is the concentration reconstruction accuracy greatly improved, but a suitable parallel-beam arrangement is also put forward for high reconstruction accuracy and simplicity of experimental validation. Finally, a bimodal structure of the combustion region is assumed to demonstrate the robustness and universality of the proposed method. Numerical investigation indicates that the proposed TDLAS tomographic algorithm is capable of detecting accurate temperature and concentration profiles. This feasible approach to reconstruction research is expected to resolve several key issues in practical combustion devices. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61205151), the National Key Scientific Instrument and Equipment Development Project of China (Grant
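
    The core ART update (a Kaczmarz-style iteration) that such a reconstruction relies on is sketched below on a tiny random system; the ray geometry, grid size and absorbance model are placeholders rather than the multi-view parallel-beam setup simulated in the paper.

```python
# Kaczmarz-style algebraic reconstruction from simulated projections (sketch).
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_rays = 100, 180
A = rng.random((n_rays, n_pixels))          # ray-weighting matrix (placeholder geometry)
x_true = rng.random(n_pixels)               # "true" distribution to recover
b = A @ x_true                              # simulated projection data

x = np.zeros(n_pixels)
for sweep in range(50):                     # repeated sweeps over all rays
    for i in range(n_rays):
        a = A[i]
        x += (b[i] - a @ x) / (a @ a) * a   # project onto the hyperplane of ray i

print("relative reconstruction error:",
      round(np.linalg.norm(x - x_true) / np.linalg.norm(x_true), 6))
```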

  13. Analysis of the nonlinear behavior of shear-Alfvén modes in tokamaks based on Hamiltonian mapping techniques

    SciTech Connect

    Briguglio, S. Vlad, G.; Fogaccia, G.; Di Troia, C.; Fusco, V.; Wang, X.; Zonca, F.

    2014-11-15

    We present a series of numerical simulation experiments set up to illustrate the fundamental physics processes underlying the nonlinear dynamics of Alfvénic modes resonantly excited by energetic particles in tokamak plasmas and of the ensuing energetic particle transports. These phenomena are investigated by following the evolution of a test particle population in the electromagnetic fields computed in self-consistent MHD-particle simulation performed by the HMGC code. Hamiltonian mapping techniques are used to extract and illustrate several features of wave-particle dynamics. The universal structure of resonant particle phase space near an isolated resonance is recovered and analyzed, showing that bounded orbits and untrapped trajectories, divided by the instantaneous separatrix, form phase space zonal structures, whose characteristic non-adiabatic evolution time is the same as the nonlinear time of the underlying fluctuations. Bounded orbits correspond to a net outward resonant particle flux, which produces a flattening and/or gradient inversion of the fast ion density profile around the peak of the linear wave-particle resonance. The connection of this phenomenon to the mode saturation is analyzed with reference to two different cases: a Toroidal Alfvén eigenmode in a low shear magnetic equilibrium and a weakly unstable energetic particle mode for stronger magnetic shear. It is shown that, in the former case, saturation is reached because of radial decoupling (resonant particle redistribution matching the mode radial width) and is characterized by a weak dependence of the mode amplitude on the growth rate. In the latter case, saturation is due to resonance detuning (resonant particle redistribution matching the resonance width) with a stronger dependence of the mode amplitude on the growth rate.

  14. Advanced techniques in current signature analysis

    NASA Astrophysics Data System (ADS)

    Smith, S. F.; Castleberry, K. N.

    1992-02-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (greater than 1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable 'smart' CSA instrumentation in the next several years.

  15. Advanced Techniques for Scene Analysis

    DTIC Science & Technology

    2010-06-01

    ... provide segmentation of the flow field. Wang and Adelson described in [21] a method to represent moving objects using sets of overlapping layers. ... Our tracking systems consider the problem of car and human tracking on potentially very low quality surveillance videos, from fixed camera or ... [21] Wang and E. H. Adelson, "Representing moving images with layers," IEEE Transactions on Image Processing, Sep. 1994, pp. 625-638. [22] ...; B. Han, C. Paulson, J. Wang, and D. Wu, "Depth-based image registration," in Proceedings of SPIE Defense & Security Symposium, vol. 7699, Orlando

  16. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin’ it REAL curriculum

    PubMed Central

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula, however, less is known about implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin’ REAL (kiR) substance use prevention curriculum. Each of the 10, 40–45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers’ delivery techniques (e.g. lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention. PMID:25274721
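
    In spirit, a latent profile analysis of such ratings can be approximated with a Gaussian mixture model, as in the sketch below; dedicated LPA software differs in estimation details, and the ratings here are random placeholders rather than the kiR observation codes.

```python
# Gaussian-mixture stand-in for a latent profile analysis of delivery ratings.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
ratings = rng.normal(size=(276, 7))        # 276 videos x 7 observer codes (placeholder)

gmm = GaussianMixture(n_components=5, covariance_type="diag", random_state=0).fit(ratings)
profiles = gmm.predict(ratings)            # most likely profile per video
print("videos per profile:", np.bincount(profiles))
print("BIC:", round(gmm.bic(ratings), 1))
```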

  17. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin' it REAL curriculum.

    PubMed

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L; Krieger, Janice L

    2014-12-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula, however, less is known about implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin' REAL (kiR) substance use prevention curriculum. Each of the 10, 40-45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers' delivery techniques (e.g., lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention.

  18. Physicochemical bases of differences between the sedimentometric and laser-diffraction techniques of soil particle-size analysis

    NASA Astrophysics Data System (ADS)

    Fedotov, G. N.; Shein, E. V.; Putlyaev, V. I.; Arkhangel'Skaya, T. A.; Eliseev, A. V.; Milanovskii, E. Yu.

    2007-03-01

    Comparison of the particle-size distributions in different soils showed that the sedimentation method (Kachinskii pipette method) gives higher (by 1.5-5 times) values of the clay content than the laser diffraction method. This is related to the significant variation in density of soil solids, which is taken to be constant in the sedimentation method. Therefore, particles of significantly larger size and lower density fall into this fraction. Using optical, electron, and confocal microscopy, it was shown that the low density of soil particles of silt size falling into the sedimentometric clay fraction is related to the organomineral shell (film) around the soil microparticles. This shell contributes to the linking of microparticles into aggregates with a lower average density. As a result, these aggregates have significantly larger size and lower density and settle with the same velocity as small particles of the average solid-phase density during the sedimentation particle-size analysis.
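    The sedimentation effect described above follows from Stokes' law. The sketch below, with assumed densities and viscosity, shows how a larger, low-density organomineral aggregate can settle about as slowly as a small dense clay particle and so be counted in the clay fraction by the pipette method.

```python
# Illustrative Stokes' law calculation with assumed densities and viscosity.
# Settling velocity: v = (2/9) * (rho_p - rho_f) * g * r**2 / mu
g = 9.81            # m/s^2
mu = 1.0e-3         # Pa*s, water near 20 C
rho_water = 1000.0  # kg/m^3

def settling_velocity(radius_m, rho_particle):
    return (2.0 / 9.0) * (rho_particle - rho_water) * g * radius_m**2 / mu

# A dense 1-um clay particle (assumed density 2650 kg/m^3)...
v_clay = settling_velocity(0.5e-6, 2650.0)
# ...settles about as fast as a 2.3-um aggregate of assumed density 1300 kg/m^3,
# so sedimentation assigns the larger aggregate to the clay fraction.
v_aggregate = settling_velocity(1.15e-6, 1300.0)
print(f"clay: {v_clay:.2e} m/s, aggregate: {v_aggregate:.2e} m/s")
```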

  19. A decision support system for fusion of hard and soft sensor information based on probabilistic latent semantic analysis technique

    NASA Astrophysics Data System (ADS)

    Shirkhodaie, Amir; Elangovan, Vinayak; Alkilani, Amjad; Habibi, Mohammad

    2013-05-01

    This paper presents an ongoing effort towards development of an intelligent Decision-Support System (iDSS) for fusion of information from multiple sources consisting of data from hard (physical sensor) and soft (textual) sources. Primarily, this paper defines a taxonomy of decision support systems for latent semantic data mining from heterogeneous data sources. A Probabilistic Latent Semantic Analysis (PLSA) approach is proposed for latent semantic concept search from heterogeneous data sources. An architectural model for generating semantic annotation of multi-modality sensors in a modified Transducer Markup Language (TML) is described. A method for TML message fusion is discussed for alignment and integration of spatiotemporally correlated and associated physical sensory observations. Lastly, the experimental results which exploit fusion of soft/hard sensor sources with support of iDSS are discussed.
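    As a minimal illustration of the PLSA step named above, the sketch below runs the standard EM updates for P(w|z) and P(z|d) on a synthetic document-term count matrix. It is not the paper's iDSS pipeline; the corpus size, number of latent concepts, and iteration count are arbitrary.

```python
# Minimal PLSA sketch (EM on a document-term count matrix); synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
n_docs, n_words, n_topics = 20, 50, 3
counts = rng.poisson(1.0, size=(n_docs, n_words)).astype(float)

p_w_z = rng.dirichlet(np.ones(n_words), size=n_topics)   # P(w|z), shape (z, w)
p_z_d = rng.dirichlet(np.ones(n_topics), size=n_docs)    # P(z|d), shape (d, z)

for _ in range(50):
    # E-step: P(z|d,w) proportional to P(w|z) * P(z|d)
    joint = p_z_d[:, :, None] * p_w_z[None, :, :]                       # (d, z, w)
    p_z_dw = joint / np.clip(joint.sum(axis=1, keepdims=True), 1e-12, None)
    # M-step: reweight the posteriors by the observed counts n(d,w)
    weighted = counts[:, None, :] * p_z_dw                              # (d, z, w)
    p_w_z = weighted.sum(axis=0)
    p_w_z /= np.clip(p_w_z.sum(axis=1, keepdims=True), 1e-12, None)
    p_z_d = weighted.sum(axis=2)
    p_z_d /= np.clip(p_z_d.sum(axis=1, keepdims=True), 1e-12, None)

print("Top word indices per latent concept:", np.argsort(-p_w_z, axis=1)[:, :5])
```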

  20. Comparing Techniques for Certified Static Analysis

    NASA Technical Reports Server (NTRS)

    Cachera, David; Pichardie, David

    2009-01-01

    A certified static analysis is an analysis whose semantic validity has been formally proved correct with a proof assistant. The recent increasing interest in using proof assistants for mechanizing programming language metatheory has given rise to several approaches for certification of static analysis. We propose a panorama of these techniques and compare their respective strengths and weaknesses.

  1. An inertial sensor-based system for spatio-temporal analysis in classic cross-country skiing diagonal technique.

    PubMed

    Fasel, Benedikt; Favre, Julien; Chardonnens, Julien; Gremion, Gérald; Aminian, Kamiar

    2015-09-18

    The present study proposes a method based on ski-fixed inertial sensors to automatically compute spatio-temporal parameters (phase durations, cycle speed and cycle length) for the diagonal stride in classical cross-country skiing. The proposed system was validated against a marker-based motion capture system during indoor treadmill skiing. Skiing movement of 10 junior to world-cup athletes was measured for four different conditions. The accuracy (i.e. median error) and precision (i.e. interquartile range of error) of the system were below 6 ms for cycle duration and ski thrust duration and below 35 ms for pole push duration. Cycle speed precision (accuracy) was below 0.1 m/s (0.005 m/s) and cycle length precision (accuracy) was below 0.15 m (0.005 m). The system was sensitive to changes of conditions and was accurate enough to detect significant differences reported in previous studies. Since capture volume is not limited and setup is simple, the system would be well suited for outdoor measurements on snow.

  2. A dual resolution measurement based Monte Carlo simulation technique for detailed dose analysis of small volume organs in the skull base region

    NASA Astrophysics Data System (ADS)

    Yeh, Chi-Yuan; Tung, Chuan-Jung; Chao, Tsi-Chain; Lin, Mu-Han; Lee, Chung-Chi

    2014-11-01

    The purpose of this study was to examine dose distribution of a skull base tumor and surrounding critical structures in response to high dose intensity-modulated radiosurgery (IMRS) with Monte Carlo (MC) simulation using a dual resolution sandwich phantom. The measurement-based Monte Carlo (MBMC) method (Lin et al., 2009) was adopted for the study. The major components of the MBMC technique involve (1) the BEAMnrc code for beam transport through the treatment head of a Varian 21EX linear accelerator, (2) the DOSXYZnrc code for patient dose simulation and (3) an EPID-measured efficiency map which describes non-uniform fluence distribution of the IMRS treatment beam. For the simulated case, five isocentric 6 MV photon beams were designed to deliver a total dose of 1200 cGy in two fractions to the skull base tumor. A sandwich phantom for the MBMC simulation was created based on the patient's CT scan of a skull base tumor [gross tumor volume (GTV)=8.4 cm3] near the right 8th cranial nerve. The phantom, consisting of a 1.2-cm-thick skull base region, had a voxel resolution of 0.05×0.05×0.1 cm3 and was sandwiched in between 0.05×0.05×0.3 cm3 slices of a head phantom. A coarser 0.2×0.2×0.3 cm3 single resolution (SR) phantom was also created for comparison with the sandwich phantom. A particle history of 3×10^8 for each beam was used for simulations of both the SR and the sandwich phantoms to achieve a statistical uncertainty of <2%. Our study showed that the planning target volume (PTV) receiving at least 95% of the prescribed dose (VPTV95) was 96.9%, 96.7% and 99.9% for the TPS, SR, and sandwich phantom, respectively. The maximum and mean doses to large organs such as the PTV, brain stem, and parotid gland for the TPS, SR and sandwich MC simulations did not show any significant difference; however, significant dose differences were observed for very small structures like the right 8th cranial nerve, right cochlea, right malleus and right semicircular canal. Dose
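    For readers unfamiliar with the VPTV95 metric quoted above, the sketch below computes it from a dose grid and a target mask; both arrays are random stand-ins, not the simulated phantom data.

```python
# Sketch of a dose-volume metric: the fraction of PTV voxels receiving at
# least 95% of the prescribed dose. Dose grid and mask are hypothetical.
import numpy as np

prescribed_dose = 1200.0                      # cGy
dose_grid = np.random.default_rng(1).normal(1230.0, 40.0, size=(60, 60, 40))
ptv_mask = np.zeros_like(dose_grid, dtype=bool)
ptv_mask[20:40, 20:40, 10:30] = True          # hypothetical target volume

ptv_doses = dose_grid[ptv_mask]
v95 = np.mean(ptv_doses >= 0.95 * prescribed_dose) * 100.0
print(f"V95 = {v95:.1f}% of PTV voxels receive >= 95% of the prescription")
```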

  3. Crude Oil Price Forecasting Based on Hybridizing Wavelet Multiple Linear Regression Model, Particle Swarm Optimization Techniques, and Principal Component Analysis

    PubMed Central

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelet and multiple linear regressions (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first selected to decompose an original time series into several subseries at different scales. Then, principal component analysis (PCA) is used in processing the subseries data in MLR for crude oil price forecasting. Particle swarm optimization (PSO) is used to select the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily crude oil market, West Texas Intermediate (WTI), has been used as the case study. The time series prediction capability of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series. PMID:24895666
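    A hedged sketch of the wavelet-plus-regression idea is shown below: a discrete wavelet transform splits a synthetic price series into subseries, which then serve as regressors for a one-step-ahead linear model. The PSO parameter search and PCA preprocessing from the paper are omitted, the wavelet and level are arbitrary choices, and the series is not WTI data.

```python
# Wavelet decomposition + multiple linear regression sketch on synthetic data.
import numpy as np
import pywt
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
price = np.cumsum(rng.normal(0, 1, 512)) + 60.0   # synthetic "price" series

# Multilevel DWT, then reconstruct one subseries per decomposition level.
coeffs = pywt.wavedec(price, "db4", level=3)
subseries = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    subseries.append(pywt.waverec(kept, "db4")[: len(price)])
X = np.column_stack(subseries)

# One-step-ahead regression: today's subseries values -> tomorrow's price.
model = LinearRegression().fit(X[:-1], price[1:])
print("In-sample R^2:", round(model.score(X[:-1], price[1:]), 3))
```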

  4. Ion beam analysis techniques in interdisciplinary applications

    SciTech Connect

    Respaldiza, Miguel A.; Ager, Francisco J.

    1999-11-16

    Ion beam analysis techniques have emerged in recent years as one of the main applications of electrostatic accelerators. A short summary of the most widely used IBA techniques will be given, as well as some examples of applications in interdisciplinary sciences.

  5. Ion Beam Analysis Techniques in Interdisciplinary Applications

    SciTech Connect

    Respaldiza, Miguel A.; Ager, Francisco J.

    1999-12-31

    Ion beam analysis techniques have emerged in recent years as one of the main applications of electrostatic accelerators. A short summary of the most widely used IBA techniques will be given, as well as some examples of applications in interdisciplinary sciences.

  6. A New Microcell Technique for NMR Analysis.

    ERIC Educational Resources Information Center

    Yu, Sophia J.

    1987-01-01

    Describes a new laboratory technique for working with small samples of compounds used in nuclear magnetic resonance (NMR) analysis. Demonstrates how microcells can be constructed for each experiment and samples can be recycled. (TW)

  7. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  8. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
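    The detection-performance quantification mentioned above can be illustrated as follows: probability of detection versus probability of false alarm is computed by sweeping a threshold over detector scores for healthy and faulty cases. The scores below are synthetic, not the study's vibration data.

```python
# Sketch of probability-of-detection vs probability-of-false-alarm (ROC) curve.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(2)
scores_healthy = rng.normal(0.0, 1.0, 200)    # detector output, no fault present
scores_faulty = rng.normal(1.5, 1.0, 200)     # detector output, seeded fault

y_true = np.r_[np.zeros(200), np.ones(200)]
y_score = np.r_[scores_healthy, scores_faulty]

pfa, pod, thresholds = roc_curve(y_true, y_score)   # false-alarm vs detection prob.
print("Area under POD-vs-PFA curve:", round(auc(pfa, pod), 3))
```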

  9. AI-based technique for tracking chains of discontinuous symbols and its application to the analysis of topographic maps

    NASA Astrophysics Data System (ADS)

    Mecocci, Alessandro; Lilla, Massimiliano

    1994-12-01

    Automatic digitization of topographic maps is a very important task nowadays. Among the different elements of a topographic map, discontinuous lines represent important information. Generally they are difficult to track because they show very large gaps and abrupt direction changes. In this paper an architecture that automates the digitization of discontinuous lines (dot-dot lines, dash-dot-dash lines, dash-asterisk lines, etc.) is presented. The tracking process must detect the elementary symbols and then concatenate these symbols into a significant chain that represents the line. The proposed architecture is composed of a common kernel, based on a suitable modification of the A* algorithm, that starts different auxiliary processes depending on the particular line to be tracked. Three auxiliary processes are considered: search strategy generation (SSG), which is responsible for the strategy used to scan the image pixels; low level symbol detection (LSD), which decides if a certain image region around the pixel selected by the SSG is an elementary symbol; and cost evaluation (CE), which gives the quality of each symbol with respect to the global course of the line. The whole system has been tested on a 1:50,000 map furnished by the Istituto Geografico Militare Italiano (IGMI). The results were very good for different types of discontinuous lines. Over the whole map (i.e. about 80 Mbytes of digitized data), 95% of the elementary symbols of the lines have been correctly chained. The operator time required to correct misclassifications is a small part of the time needed to manually digitize the discontinuous lines.

  10. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  11. A technique to reduce motion artifact for externally triggered cine-MRI(EC-MRI) based on detecting the onset of the articulated word with spectral analysis.

    PubMed

    Shimada, Yasuhiro; Nishimoto, Hironori; Kochiyama, Takanori; Fujimoto, Ichiro; Mano, Hiroaki; Masaki, Shinobu; Murase, Kenya

    2012-01-01

    One issue in externally triggered cine-magnetic resonance imaging (EC-MRI) for the dynamic observation of speech organs is motion artifact in the phase-encoding direction caused by unstable repetitions of speech during data acquisition. We propose a technique to reduce such artifact by rearranging the k-space data used to reconstruct MR images based on the analysis of recorded speech sounds. We recorded the subject's speech sounds during EC-MRI and used post hoc acoustical processing to reduce scanning noise and detect the onset of each utterance based on analysis of the recorded sounds. We selected each line of k-space from several data acquisition sessions and rearranged them to reconstruct a new series of dynamic MR images according to the analyzed time of utterance onset. Comparative evaluation showed significant reduction in motion artifact signal in the dynamic MR images reconstructed by the proposed method. The quality of the reconstructed images was sufficient to observe the dynamic aspects of speech production mechanisms.
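    As a simplified stand-in for the utterance-onset detection described above, the sketch below thresholds the short-time energy of a recorded signal; the actual study detected onsets with spectral analysis after scanner-noise reduction, which is not reproduced here, and the frame length and threshold are arbitrary.

```python
# Short-time-energy onset detection sketch on a synthetic speech-like signal.
import numpy as np

def detect_onset(signal, fs, frame_ms=10.0, threshold_ratio=0.1):
    """Return the time (s) when frame energy first exceeds a fraction of the peak."""
    frame_len = int(fs * frame_ms / 1000.0)
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    energy = (frames ** 2).sum(axis=1)
    onset_frame = np.argmax(energy > threshold_ratio * energy.max())
    return onset_frame * frame_len / fs

# Synthetic example: one second of silence followed by a burst of "speech".
fs = 16000
sig = np.r_[np.random.normal(0, 0.01, fs), np.random.normal(0, 0.5, fs)]
print("Estimated onset at", detect_onset(sig, fs), "s")  # expected near 1.0 s
```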

  12. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication (f4) and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  13. Multiattribute Decision Modeling Techniques: A Comparative Analysis

    DTIC Science & Technology

    1988-08-01

    Rating Technique (SMART) as a direct response to Raiffa's (1969) article on multiattribute utility theory, which Edwards found extremely stimulating but... approaches such as multiattribute utility/value assessment and hierarchical analysis and have applied these techniques to a number of non-military... multiattributed outcomes O(1)...O(k), and if the utility function is denoted by u and the probabilities of the k events are p(1)...p(k), then the

  14. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  15. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  16. Intrinsic biodegradation potential of aromatic hydrocarbons in an alluvial aquifer--potentials and limits of signature metabolite analysis and two stable isotope-based techniques.

    PubMed

    Morasch, Barbara; Hunkeler, Daniel; Zopfi, Jakob; Temime, Brice; Höhener, Patrick

    2011-10-01

    Three independent techniques were used to assess the biodegradation of monoaromatic hydrocarbons and low-molecular weight polyaromatic hydrocarbons in the alluvial aquifer at the site of a former cokery (Flémalle, Belgium). Firstly, a stable carbon isotope-based field method allowed quantifying biodegradation of monoaromatic compounds in situ and confirmed the degradation of naphthalene. No evidence could be deduced from stable isotope shifts for the intrinsic biodegradation of larger molecules such as methylnaphthalenes or acenaphthene. Secondly, using signature metabolite analysis, various intermediates of the anaerobic degradation of (poly-) aromatic and heterocyclic compounds were identified. The discovery of a novel metabolite of acenaphthene in groundwater samples permitted deeper insights into the anaerobic biodegradation of almost persistent environmental contaminants. A third method, microcosm incubations with 13C-labeled compounds under in situ-like conditions, complemented techniques one and two by providing quantitative information on contaminant biodegradation independent of molecule size and sorption properties. Thanks to stable isotope labels, the sensitivity of this method was much higher compared to classical microcosm studies. The 13C-microcosm approach allowed the determination of first-order rate constants for 13C-labeled benzene, naphthalene, or acenaphthene even in cases when degradation activities were only small. The plausibility of the third method was checked by comparing 13C-microcosm-derived rates to field-derived rates of the first approach. Further advantage of the use of 13C-labels in microcosms is that novel metabolites can be linked more easily to specific mother compounds even in complex systems. This was achieved using alluvial sediments where 13C-acenaphthyl methylsuccinate was identified as transformation product of the anaerobic degradation of acenaphthene.

  17. A review of sensitivity analysis techniques

    SciTech Connect

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of sensitivity techniques utilizes partial differentiation whereas the simplest approach requires varying parameter values one-at-a-time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
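    The simplest approach named in the review, one-at-a-time parameter variation, can be sketched as follows for an arbitrary toy model; the model and the 10% perturbation size are illustrative only.

```python
# One-at-a-time (OAT) sensitivity analysis sketch around a baseline point.
import numpy as np

def model(x):
    # toy example; in practice this is the complex simulation being studied
    return x[0] ** 2 + 3.0 * x[1] + 0.1 * x[0] * x[2]

baseline = np.array([1.0, 2.0, 5.0])
y0 = model(baseline)

for i, name in enumerate(["x1", "x2", "x3"]):
    perturbed = baseline.copy()
    perturbed[i] *= 1.10                      # +10% change, one parameter at a time
    sensitivity = (model(perturbed) - y0) / y0
    print(f"{name}: {sensitivity:+.3%} change in output for +10% input change")
```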

  18. Regional flood frequency analysis in eastern Australia: Bayesian GLS regression-based methods within fixed region and ROI framework - Quantile Regression vs. Parameter Regression Technique

    NASA Astrophysics Data System (ADS)

    Haddad, Khaled; Rahman, Ataur

    2012-04-01

    In this article, an approach using Bayesian Generalised Least Squares (BGLS) regression in a region-of-influence (ROI) framework is proposed for regional flood frequency analysis (RFFA) for ungauged catchments. Using the data from 399 catchments in eastern Australia, the BGLS-ROI is constructed to regionalise the flood quantiles (Quantile Regression Technique (QRT)) and the first three moments of the log-Pearson type 3 (LP3) distribution (Parameter Regression Technique (PRT)). This scheme firstly develops a fixed region model to select the best set of predictor variables for use in the subsequent regression analyses using an approach that minimises the model error variance while also satisfying a number of statistical selection criteria. The identified optimal regression equation is then used in the ROI experiment where the ROI is chosen for a site in question as the region that minimises the predictive uncertainty. To evaluate the overall performances of the quantiles estimated by the QRT and PRT, a one-at-a-time cross-validation procedure is applied. Results of the proposed method indicate that both the QRT and PRT in a BGLS-ROI framework lead to more accurate and reliable estimates of flood quantiles and moments of the LP3 distribution when compared to a fixed region approach. Also the BGLS-ROI can deal reasonably well with the heterogeneity in Australian catchments as evidenced by the regression diagnostics. Based on the evaluation statistics it was found that both BGLS-QRT and PRT-ROI perform similarly well, which suggests that the PRT is a viable alternative to QRT in RFFA. The RFFA methods developed in this paper are based on the database available in eastern Australia. It is expected that availability of a more comprehensive database (in terms of both quality and quantity) will further improve the predictive performance of both the fixed and ROI based RFFA methods presented in this study, which however needs to be investigated in future when such a

  19. Techniques for the Analysis of Human Movement.

    ERIC Educational Resources Information Center

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photopgraphic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  20. Quantifying sources of black carbon in western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    DOE PAGES

    Zhang, R.; Wang, H.; Hegg, D. A.; ...

    2015-11-18

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source–receptor relationships for atmospheric BC and its deposition to snow over western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over northwestern USA and western Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based positive matrix factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  1. Microextraction sample preparation techniques in biomedical analysis.

    PubMed

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure. Microextraction techniques are dominant. Metabolomic studies also require application of a proper analytical technique for the determination of endogenic metabolites present in a biological matrix at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis.

  2. Solution Techniques in Finite Element Analysis.

    DTIC Science & Technology

    1983-05-01

    CR 83.027, Naval Civil Engineering Laboratory, Port Hueneme, California; sponsored by the Naval Facilities Engineering Command. Report title: Solution Techniques in Finite Element Analysis. Keywords: finite elements; nonlinear algebraic equations; numerical solution methods.

  3. Dissociation techniques in mass spectrometry-based proteomics.

    PubMed

    Jones, Andrew W; Cooper, Helen J

    2011-09-07

    The field of proteomics, the large-scale analysis of proteins, has undergone a huge expansion over the past decade. Mass spectrometry-based proteomics relies on the dissociation of peptide and/or protein ions to provide information on primary sequence and sites of post-translational modifications. Fragmentation techniques include collision-induced dissociation, electron capture dissociation and electron transfer dissociation. Here, we describe each of these techniques and their use in proteomics. The principles, advantages, limitations, and applications are discussed.

  4. Signal Analysis Techniques for Interpreting Electroencephalograms

    DTIC Science & Technology

    1980-12-01

    ... et al. Near-orthogonal basis functions: A real-time fetal ECG technique. IEEE Trans Biomed Eng BME-24:39-43 (1971). 66. MacKay, D. M., and D. A. ... multivariate and nonlinear time series analysis of fetal heart rate. Ph.D. Dissertation, Carnegie-Mellon Univ., Pittsburgh, Pa., 1979. Jarisch, W. ... of fetal heart rate variability. IEEE Trans Biomed Eng (Submitted, 1979). Johnson, A. S. Automated detection of petit mal seizures in the human EEG

  5. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  6. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
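    Two of the analyses listed above, interval root-mean-square acceleration versus time and power spectral density, can be sketched as follows on a synthetic acceleration record; the sample rate, interval length, and signal content are assumptions, not values from the project.

```python
# Interval RMS and Welch PSD sketch on a synthetic acceleration record.
import numpy as np
from scipy import signal

fs = 250.0                                   # Hz, assumed sample rate
t = np.arange(0, 60, 1 / fs)
accel = (1e-3 * np.sin(2 * np.pi * 17 * t)
         + 2e-4 * np.random.default_rng(0).normal(size=t.size))

# Interval RMS acceleration over 10-second intervals.
interval = int(10 * fs)
n_int = accel.size // interval
rms = np.sqrt((accel[: n_int * interval].reshape(n_int, interval) ** 2).mean(axis=1))

# Power spectral density via Welch's method.
freqs, psd = signal.welch(accel, fs=fs, nperseg=4096)
print("Interval RMS:", rms.round(5))
print("Dominant frequency (Hz):", freqs[np.argmax(psd)])
```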

  7. Analysis techniques for residual acceleration data

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.

    1990-01-01

    Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by the use of a transformation matrix. Application of such analysis techniques to residual acceleration data provides more information than the time history alone and increases the effectiveness of post-flight analysis of low-gravity experiments.
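    The coordinate transformation mentioned above amounts to applying a rotation (transformation) matrix to each acceleration sample; a minimal sketch with an assumed rotation angle and illustrative data is shown below.

```python
# Rotate acceleration vectors into an experiment-fixed coordinate frame.
import numpy as np

theta = np.deg2rad(30.0)                       # assumed rotation about the z-axis
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

accel_body = np.array([[1.0e-4, 2.0e-5, -5.0e-5],   # rows: samples, columns: x, y, z
                       [9.0e-5, 1.5e-5, -4.0e-5]])

accel_experiment = accel_body @ R.T            # rotate each sample into the new frame
print(accel_experiment)
```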

  8. Multiview video codec based on KTA techniques

    NASA Astrophysics Data System (ADS)

    Seo, Jungdong; Kim, Donghyun; Ryu, Seungchul; Sohn, Kwanghoon

    2011-03-01

    Multi-view video coding (MVC) is a video coding standard developed by MPEG and VCEG for multi-view video. It showed an average PSNR gain of 1.5 dB compared with view-independent coding by H.264/AVC. However, because resolutions of multi-view video are getting higher for more realistic 3D effects, a higher-performance video codec is needed. MVC adopted the hierarchical B-picture structure and inter-view prediction as core techniques. The hierarchical B-picture structure removes the temporal redundancy, and the inter-view prediction reduces the inter-view redundancy by compensated prediction from the reconstructed neighboring views. Nevertheless, MVC has an inherent limitation in coding efficiency, because it is based on H.264/AVC. To overcome this limit, an enhanced video codec for multi-view video based on the Key Technology Area (KTA) is proposed. KTA is a high efficiency video codec by the Video Coding Expert Group (VCEG), and it was developed to achieve coding efficiency beyond H.264/AVC. The KTA software showed better coding gain than H.264/AVC by using additional coding techniques. These techniques and the inter-view prediction are implemented into the proposed codec, which showed high coding gain compared with the view-independent coding result by KTA. The results show that inter-view prediction can achieve higher efficiency in a multi-view video codec based on a high performance video codec such as HEVC.

  9. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (puc 19 plasmid from E. coli). DN...

  10. Multi-Variable Analysis and Design Techniques.

    DTIC Science & Technology

    1981-09-01

    by A. G. J. MacFarlane. 2. Multivariable Design Techniques Based on Singular Value Generalizations of Classical Control, by J. C. Doyle. 3. Limitations on... prototypes to complex mathematical representations. All of these assemblages of information or information generators can loosely be termed "models"... nonlinearities (e.g., control saturation); neglect of high-frequency dynamics. These approximations are well understood and in general their impact

  11. Respiratory monitoring system based on the nasal pressure technique for the analysis of sleep breathing disorders: Reduction of static and dynamic errors, and comparisons with thermistors and pneumotachographs

    NASA Astrophysics Data System (ADS)

    Alves de Mesquita, Jayme; Lopes de Melo, Pedro

    2004-03-01

    Thermally sensitive devices (thermistors) have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices are not able to provide a good characterization of fast events, like hypopneas. The nasal pressure recording technique (NPR) has recently been suggested to quantify airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique would allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) describe the development of a flexible NPR device and (2) evaluate the performance of this device when compared to pneumotachographs (PNTs) and thermistors. After the design details are described, the system static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p<0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations, and the results showed that the developed device response was as good as the PNT and around 100 times faster (τ = 5.3 ms) than thermistors (τ = 512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples, and confirmed the good features achieved in engineering tests. These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to a more accurate implementation of medical research projects and to improve the

  12. Electrical Load Profile Analysis Using Clustering Techniques

    NASA Astrophysics Data System (ADS)

    Damayanti, R.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.

    2017-03-01

    Data mining is one of the data processing techniques used to extract information from a set of stored data. Every day the consumption of electrical load is recorded by the electrical company, usually at intervals of 15 or 30 minutes. This paper uses clustering, one of the data mining techniques, to analyse the electrical load profiles during 2014. Three clustering methods were compared, namely K-Means (KM), Fuzzy C-Means (FCM), and K-Means Harmonics (KHM). The result shows that KHM is the most appropriate method to classify the electrical load profiles. The optimum number of clusters is determined using the Davies-Bouldin Index. By grouping the load profiles, analysis of demand variation and estimation of the energy loss for each group of load profiles with a similar pattern can be carried out. From the groups of electrical load profiles, the cluster load factor and the range of cluster loss factors can be determined, which can help to estimate the energy loss without performing load flow studies.
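    A hedged sketch of the clustering-plus-validation workflow is given below using plain K-means and the Davies-Bouldin index; K-harmonic means is not available in scikit-learn, so K-means stands in, and the half-hourly load profiles are synthetic.

```python
# Cluster daily load profiles and score the cluster count with Davies-Bouldin.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score

rng = np.random.default_rng(0)
# 365 days x 48 half-hourly readings built from three synthetic usage patterns.
base = np.vstack([np.roll(np.sin(np.linspace(0, 2 * np.pi, 48)), s) for s in (0, 8, 16)])
profiles = np.repeat(base, [150, 120, 95], axis=0) + rng.normal(0, 0.1, (365, 48))

scores = {}
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(profiles)
    scores[k] = davies_bouldin_score(profiles, labels)

best_k = min(scores, key=scores.get)           # lower Davies-Bouldin is better
print({k: round(v, 3) for k, v in scores.items()}, "-> best k =", best_k)
```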

  13. Forensic Analysis using Geological and Geochemical Techniques

    NASA Astrophysics Data System (ADS)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late 1800s, the last decade has seen a marked increase in geo-scientists initiating concept studies using the latest analytical techniques, including studying natural abundance isotope variations, micro analysis with laser ablation ICPMS and geochemical mapping. Most of the concept studies have shown good potential, but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the EGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project and other projects and literature to illustrate the important issues at hand.

  14. UPLC: a preeminent technique in pharmaceutical analysis.

    PubMed

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

    Pharmaceutical companies today are driven to create novel and more efficient tools to discover, develop, deliver and monitor drugs. In this context, the development of rapid chromatographic methods is crucial for analytical laboratories. In the past decade, substantial technological advances have been made in enhancing particle chemistry performance, improving detector design and optimizing the system, data processors and various controls of chromatographic techniques. When all of these were blended together, they resulted in the outstanding performance of ultra-high performance liquid chromatography (UPLC), which builds on the principles of the HPLC technique. UPLC shows a dramatic enhancement in speed, resolution and sensitivity of analysis by using particle sizes of less than 2 μm and operating the system at higher pressure, while the mobile phase can be run at greater linear velocities compared to HPLC. This technique is considered a new focal point in the field of liquid chromatography. This review focuses on the basic principle and instrumentation of UPLC and its advantages over HPLC; furthermore, this article emphasizes various pharmaceutical applications of this technique.

  15. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  16. Application of spectral analysis techniques to the intercomparison of aerosol data - Part 4: Combined maximum covariance analysis to bridge the gap between multi-sensor satellite retrievals and ground-based measurements

    NASA Astrophysics Data System (ADS)

    Li, J.; Carlson, B. E.; Lacis, A. A.

    2014-04-01

    The development of remote sensing techniques has greatly advanced our knowledge of atmospheric aerosols. Various satellite sensors and the associated retrieval algorithms all add to the information of global aerosol variability, while well-designed surface networks provide time series of highly accurate measurements at specific locations. In studying the variability of aerosol properties, aerosol climate effects, and constraining aerosol fields in climate models, it is essential to make the best use of all of the available information. In the previous three parts of this series, we demonstrated the usefulness of several spectral decomposition techniques in the analysis and comparison of temporal and spatial variability of aerosol optical depth using satellite and ground-based measurements. Specifically, Principal Component Analysis (PCA) successfully captures and isolates seasonal and interannual variability from different aerosol source regions, Maximum Covariance Analysis (MCA) provides a means to verify the variability in one satellite dataset against Aerosol Robotic Network (AERONET) data, and Combined Principal Component Analysis (CPCA) realized parallel comparison among multi-satellite, multi-sensor datasets. As the final part of the study, this paper introduces a novel technique that integrates both multi-sensor datasets and ground observations, and thus effectively bridges the gap between these two types of measurements. The Combined Maximum Covariance Analysis (CMCA) decomposes the cross covariance matrix between the combined multi-sensor satellite data field and AERONET station data. We show that this new method not only confirms the seasonal and interannual variability of aerosol optical depth, aerosol source regions and events represented by different satellite datasets, but also identifies the strengths and weaknesses of each dataset in capturing the variability associated with sources, events or aerosol types. Furthermore, by examining the spread of
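    The core of (combined) maximum covariance analysis is a singular value decomposition of the cross-covariance matrix between two data fields; the sketch below shows that step on random stand-ins for a satellite aerosol optical depth field and AERONET station series. The squared singular values give the squared-covariance fraction of each coupled mode, and projecting the anomalies onto the singular vectors yields the expansion-coefficient time series.

```python
# Maximum covariance analysis sketch: SVD of a cross-covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
n_time = 120
sat_field = rng.normal(size=(n_time, 500))      # time x satellite grid points (synthetic)
stations = rng.normal(size=(n_time, 30))        # time x AERONET stations (synthetic)

# Remove time means, form the cross-covariance matrix, and decompose it.
sat_anom = sat_field - sat_field.mean(axis=0)
stn_anom = stations - stations.mean(axis=0)
cross_cov = sat_anom.T @ stn_anom / (n_time - 1)          # (grid x station)
U, s, Vt = np.linalg.svd(cross_cov, full_matrices=False)

frac = s**2 / np.sum(s**2)
print("Squared-covariance fraction of leading modes:", frac[:3].round(3))
# Expansion coefficients (time series) of the leading coupled mode:
pc_sat, pc_stn = sat_anom @ U[:, 0], stn_anom @ Vt[0]
```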

  17. Analysis automation with paving: A new quadrilateral meshing technique

    SciTech Connect

    Blacker, T.D. ); Stephenson, M.B.; Canann, S. )

    1990-01-01

    This paper describes the impact of paving, a new automatic mesh generation algorithm, on the analysis portion of the design process. Paving generates an all-quadrilateral mesh in arbitrary 2D geometries. The paving technique significantly impacts the analysis process by drastically reducing the time and expertise requirements of traditional mesh generation. Paving produces a high quality mesh based on geometric boundary definitions and user specified element sizing constraints. In this paper we describe the paving algorithm, discuss varying aspects of the impact of the technique on design automation, and elaborate on current research into 3D all-hexahedral mesh generation. 11 refs., 10 figs.

  18. ROC Analysis of IR Segmentation Techniques.

    DTIC Science & Technology

    1994-12-01

    AFIT/GE/ENG/94D-15, ROC Analysis of IR Segmentation Techniques, thesis. ... classification systems was measured by the percentage of correct decisions, but "percent correct" does not account for the false-positive and... false-negative errors involved [13]. For example, if 5% of people have a particular disease, then a system can be 95% accurate by calling everyone

  19. COMBINING A NEW 3-D SEISMIC S-WAVE PROPAGATION ANALYSIS FOR REMOTE FRACTURE DETECTION WITH A ROBUST SUBSURFACE MICROFRACTURE-BASED VERIFICATION TECHNIQUE

    SciTech Connect

    Bob Hardage; M.M. Backus; M.V. DeAngelo; R.J. Graebner; S.E. Laubach; Paul Murray

    2004-02-01

    Fractures within the producing reservoirs at McElroy Field could not be studied with the industry-provided 3C3D seismic data used as a cost-sharing contribution in this study. The signal-to-noise character of the converted-SV data across the targeted reservoirs in these contributed data was not adequate for interpreting azimuth-dependent data effects. After illustrating the low signal quality of the converted-SV data at McElroy Field, the seismic portion of this report abandons the McElroy study site and defers to 3C3D seismic data acquired across a different fractured carbonate reservoir system to illustrate how 3C3D seismic data can provide useful information about fracture systems. Using these latter data, we illustrate how fast-S and slow-S data effects can be analyzed in the prestack domain to recognize fracture azimuth, and then demonstrate how fast-S and slow-S data volumes can be analyzed in the poststack domain to estimate fracture intensity. In the geologic portion of the report, we analyze published regional stress data near McElroy Field and numerous formation multi-imager (FMI) logs acquired across McElroy to develop possible fracture models for the McElroy system. Regional stress data imply a fracture orientation different from the orientations observed in most of the FMI logs. This report culminates Phase 2 of the study, ''Combining a New 3-D Seismic S-Wave Propagation Analysis for Remote Fracture Detection with a Robust Subsurface Microfracture-Based Verification Technique''. Phase 3 will not be initiated because wells were to be drilled in Phase 3 of the project to verify the validity of fracture-orientation maps and fracture-intensity maps produced in Phase 2. Such maps cannot be made across McElroy Field because of the limitations of the available 3C3D seismic data at the depth level of the reservoir target.

  20. Further development of ultrasonic techniques for non-destructive evaluation based on Fourier analysis of signals from irregular and inhomogeneous structures

    NASA Technical Reports Server (NTRS)

    Miller, J. G.

    1979-01-01

    To investigate the use of Fourier analysis techniques, model systems had to be designed to test some of the general properties of the interaction of sound with an inhomogeneity. The first models investigated were suspensions of solid spheres in water. These systems allowed comparison between theoretical computation of the frequency dependence of the attenuation coefficient and measurement of the attenuation coefficient over a range of frequencies. Ultrasonic scattering processes in both suspensions of hard spheres in water and suspensions of hard spheres in polyester resin were investigated. The second model system was constructed to test the applicability of partial wave analysis to the description of an inhomogeneity in a solid, and to test the range of material properties over which the measurement systems were valid.

  1. Visual exploratory analysis of integrated chromosome 19 proteomic data derived from glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Meyer-Bäse, Uwe; Meyer-Bäse, Anke

    2015-05-01

    Chromosome 19 is known to be linked to neurodegeneration and many cancers. Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the presentation of novel approaches for the visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the expression patterns for chromosome 19 proteins.
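    The paper develops novel nonlinear projection methods; as a generic stand-in, the sketch below shows the overall workflow of embedding a high-dimensional protein-expression matrix into two dimensions for visual exploration, here using the widely available t-SNE. The expression matrix is random, not the chromosome 19 data set, and the perplexity value is arbitrary.

```python
# Nonlinear dimensionality reduction workflow sketch (t-SNE as a stand-in).
import numpy as np
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
expression = rng.normal(size=(60, 400))           # samples x protein features (synthetic)

scaled = StandardScaler().fit_transform(expression)
embedding = TSNE(n_components=2, perplexity=15, random_state=0).fit_transform(scaled)
print(embedding.shape)                            # (60, 2) coordinates for plotting
```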

  2. Stratifying land use/land cover for spatial analysis of disease ecology and risk: an example using object-based classification techniques.

    PubMed

    Koch, David E; Mohler, Rhett L; Goodin, Douglas G

    2007-11-01

    Landscape epidemiology has made significant strides recently, driven in part by increasing availability of land cover data derived from remotely-sensed imagery. Using an example from a study of land cover effects on hantavirus dynamics at an Atlantic Forest site in eastern Paraguay, we demonstrate how automated classification methods can be used to stratify remotely-sensed land cover for studies of infectious disease dynamics. For this application, it was necessary to develop a scheme that could yield both land cover and land use data from the same classification. Hypothesizing that automated discrimination between classes would be more accurate using an object-based method compared to a per-pixel method, we used a single Landsat Enhanced Thematic Mapper+ (ETM+) image to classify land cover into eight classes using both per-pixel and object-based classification algorithms. Our results show that the object-based method achieves 84% overall accuracy, compared to only 43% using the per-pixel method. Producer's and user's accuracies for the object-based map were higher for every class compared to the per-pixel classification. The Kappa statistic was also significantly higher for the object-based classification. These results show the importance of using image information from domains beyond the spectral domain, and also illustrate the importance of object-based techniques for remote sensing applications in epidemiological studies.
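    The accuracy measures cited above (overall accuracy and the Kappa statistic) can be computed from validation labels as sketched below; the reference and predicted class labels are synthetic stand-ins for the study's accuracy-assessment samples.

```python
# Overall accuracy, Cohen's kappa, and a confusion matrix from class labels.
import numpy as np
from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

rng = np.random.default_rng(0)
reference = rng.integers(0, 8, 500)                       # 8 land-cover classes
predicted = np.where(rng.random(500) < 0.84, reference,   # roughly 84% agreement
                     rng.integers(0, 8, 500))

print("Overall accuracy:", round(accuracy_score(reference, predicted), 3))
print("Kappa:", round(cohen_kappa_score(reference, predicted), 3))
print(confusion_matrix(reference, predicted))
```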

  3. A review of residual stress analysis using thermoelastic techniques

    NASA Astrophysics Data System (ADS)

    Robinson, A. F.; Dulieu-Barton, J. M.; Quinn, S.; Burguete, R. L.

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.
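    For context, the linear TSA relationship referred to above is conventionally written as follows for an isotropic material under adiabatic, reversible loading (standard form, not quoted from the paper):

```latex
\Delta T = -K\, T\, \Delta(\sigma_1 + \sigma_2), \qquad K = \frac{\alpha}{\rho\, C_p}
```

    Here ΔT is the measured temperature change, T the absolute temperature, Δ(σ1 + σ2) the change in the sum of the principal stresses, α the coefficient of linear thermal expansion, ρ the density and Cp the specific heat at constant pressure. Because only the change in stress appears, a static residual (mean) stress cancels out, which is why the linear relationship alone cannot be used to evaluate residual stresses, as the abstract notes.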

  4. Multiclass pesticide analysis in fruit-based baby food: A comparative study of sample preparation techniques previous to gas chromatography-mass spectrometry.

    PubMed

    Petrarca, Mateus H; Fernandes, José O; Godoy, Helena T; Cunha, Sara C

    2016-12-01

    With the aim of developing a new gas chromatography-mass spectrometry method to analyze 24 pesticide residues in baby foods at the levels imposed by established regulation, two simple, rapid and environmentally friendly sample preparation techniques based on QuEChERS (quick, easy, cheap, effective, robust and safe) were compared - QuEChERS with dispersive liquid-liquid microextraction (DLLME) and QuEChERS with dispersive solid-phase extraction (d-SPE). Both sample preparation techniques achieved suitable performance criteria, including selectivity, linearity, acceptable recovery (70-120%) and precision (⩽20%). A higher enrichment factor was observed for DLLME and consequently better limits of detection and quantification were obtained. Nevertheless, d-SPE provided a more effective removal of matrix co-extractives from extracts than DLLME, which contributed to lower matrix effects. Twenty-two commercial fruit-based baby food samples were analyzed by the developed method, with procymidone detected in one sample at a level above the legal limit established by the EU.

  5. High-accuracy and long-range Brillouin optical time-domain analysis sensor based on the combination of pulse prepump technique and complementary coding

    NASA Astrophysics Data System (ADS)

    Sun, Qiao; Tu, Xiaobo; Lu, Yang; Sun, Shilin; Meng, Zhou

    2016-06-01

    A Brillouin optical time-domain analysis (BOTDA) sensor that combines conventional complementary coding with the pulse prepump technique for high-accuracy and long-range distributed sensing is implemented and analyzed. The employment of the complementary coding provides an enhanced signal-to-noise ratio (SNR) of the sensing system and an extended sensing distance, and the measurement time is also reduced compared with a BOTDA sensor using linear coding. The combination with the pulse prepump technique enables the establishment of a preactivated acoustic field in each pump pulse of the complementary codeword, which ensures measurements of high spatial resolution and high frequency accuracy. The feasibility of the prepumped complementary coding is analyzed theoretically and experimentally. The experiments are carried out over more than 50 km of single-mode fiber, and experimental results show the capabilities of the proposed scheme to achieve 1-m spatial resolution with temperature and strain resolutions equal to ˜1.6°C and ˜32 μɛ, and 2-m spatial resolution with temperature and strain resolutions equal to ˜0.3°C and ˜6 μɛ, respectively. A longer sensing distance with the same spatial resolution and measurement accuracy can be achieved by increasing the code length of the prepumped complementary code.
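
    The SNR benefit of complementary coding rests on the property that the autocorrelations of a complementary (Golay) code pair sum to a delta function, so decoding restores single-pulse spatial resolution while the coded pump raises the signal level. The sketch below illustrates only that property using a standard recursive Golay construction; it is not the authors' implementation and ignores the prepump stage:

    ```python
    import numpy as np

    def golay_pair(m):
        """Recursively build a Golay complementary pair of length 2**m (standard construction)."""
        a, b = np.array([1.0]), np.array([1.0])
        for _ in range(m):
            a, b = np.concatenate([a, b]), np.concatenate([a, -b])
        return a, b

    a, b = golay_pair(6)              # length-64 complementary pair
    acorr = np.correlate(a, a, "full") + np.correlate(b, b, "full")

    # The summed autocorrelation is 2N at zero lag and zero elsewhere, which is
    # why decoding recovers single-pulse resolution from the coded measurement.
    print(acorr[len(a) - 1])                            # 2*N = 128
    print(np.abs(np.delete(acorr, len(a) - 1)).max())   # vanishing side lobes
    ```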

  6. Oil species identification technique developed by Gabor wavelet analysis and support vector machine based on concentration-synchronous-matrix-fluorescence spectroscopy.

    PubMed

    Wang, Chunyan; Shi, Xiaofeng; Li, Wendong; Wang, Lin; Zhang, Jinliang; Yang, Chun; Wang, Zhendi

    2016-03-15

    Concentration-synchronous-matrix-fluorescence (CSMF) spectroscopy was applied to discriminate oil species by characterizing the concentration-dependent fluorescence properties of petroleum-related samples. A seven-day weathering experiment on three crude oil samples from the Bohai Sea platforms of China was carried out under controlled laboratory conditions and showed that weathering had no significant effect on the CSMF spectra. Different feature extraction methods, such as PCA, PLS and Gabor wavelet analysis, were applied to extract discriminative patterns from the CSMF spectra, and classifications were made via SVM to compare their respective oil species recognition performance. Ideal correct recognition rates of 100% for the different types of oil spill samples and 92% for the closely-related source oil samples were achieved by combining Gabor wavelet analysis with SVM, indicating its potential to be developed into a rapid, cost-effective, and accurate forensic oil spill identification technique.
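
    As a rough illustration of the recognition pipeline described above (Gabor-based texture features of the two-dimensional CSMF matrix fed to an SVM), the sketch below uses a hand-rolled Gabor kernel bank and scikit-learn on synthetic data; the filter parameters and feature choices are assumptions, not the paper's settings:

    ```python
    import numpy as np
    from scipy.signal import convolve2d
    from sklearn.svm import SVC

    def gabor_kernel(freq, theta, sigma=3.0, size=15):
        """Real part of a 2-D Gabor kernel (illustrative parameterisation)."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

    def gabor_features(csmf_matrix, freqs=(0.1, 0.2), thetas=(0.0, np.pi / 4, np.pi / 2)):
        """Mean/std of Gabor responses as a simple texture descriptor of one CSMF matrix."""
        feats = []
        for f in freqs:
            for t in thetas:
                resp = convolve2d(csmf_matrix, gabor_kernel(f, t), mode="same")
                feats += [resp.mean(), resp.std()]
        return np.array(feats)

    # Synthetic stand-ins for CSMF matrices (concentration x emission wavelength) and oil labels
    rng = np.random.default_rng(0)
    spectra = [rng.random((20, 60)) + label for label in (0, 1) for _ in range(10)]
    labels = [0] * 10 + [1] * 10
    X = np.vstack([gabor_features(m) for m in spectra])
    clf = SVC(kernel="rbf", C=10.0).fit(X, labels)    # oil-species classifier
    ```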

  7. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    SciTech Connect

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  8. Alternative Analysis Techniques for Needs and Needs Documentation Techniques,

    DTIC Science & Technology

    1980-06-20

    Forecasting in the seventies, if it is to be effective, must include strategic business planning and related research and development planning. A perspective-tree approach to the identification of threats is described, with reference to The Seventies: A Trend Analysis for Business Planning (New York: McGraw-Hill Book Company, 1970).

  9. Laser Scanning–Based Tissue Autofluorescence/Fluorescence Imaging (LS-TAFI), a New Technique for Analysis of Microanatomy in Whole-Mount Tissues

    PubMed Central

    Mori, Hidetoshi; Borowsky, Alexander D.; Bhat, Ramray; Ghajar, Cyrus M.; Seiki, Motoharu; Bissell, Mina J.

    2012-01-01

    Intact organ structure is essential in maintaining tissue specificity and cellular differentiation. Small physiological or genetic variations lead to changes in microanatomy that, if persistent, could have functional consequences and may easily be masked by the heterogeneity of tissue anatomy. Current imaging techniques rely on histological sections that require sample manipulation and are essentially two dimensional. We have developed a method for three-dimensional imaging of whole-mount, unsectioned mammalian tissues to elucidate subtle and detailed micro- and macroanatomies in adult organs and embryos. We analyzed intact or dissected organ whole mounts with laser scanning–based tissue autofluorescence/fluorescence imaging (LS-TAFI). We obtained clear visualization of microstructures within murine mammary glands and mammary tumors and other organs without the use of immunostaining and without probes or fluorescent reporter genes. Combining autofluorescence with reflected light signals from chromophore-stained tissues allowed identification of individual cells within three-dimensional structures of whole-mounted organs. This technique could be useful for rapid diagnosis of human clinical samples and possibly for assessing the effects of subtle variations such as low-dose radiation. PMID:22542846

  10. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    NASA Astrophysics Data System (ADS)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI, from launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established techniques as well as more speculative approaches; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  11. Which Combinations of Techniques and Modes of Delivery in Internet-Based Interventions Effectively Change Health Behavior? A Meta-Analysis

    PubMed Central

    van Genugten, Lenneke; Webb, Thomas Llewelyn; van Empelen, Pepijn

    2016-01-01

    Background Many online interventions designed to promote health behaviors combine multiple behavior change techniques (BCTs), adopt different modes of delivery (MoD) (eg, text messages), and range in how usable they are. Research is therefore needed to examine the impact of these features on the effectiveness of online interventions. Objective This study applies Classification and Regression Trees (CART) analysis to meta-analytic data, in order to identify synergistic effects of BCTs, MoDs, and usability factors. Methods We analyzed data from Webb et al. This review included effect sizes from 52 online interventions targeting a variety of health behaviors and coded the use of 40 BCTs and 11 MoDs. Our research also developed a taxonomy for coding the usability of interventions. Meta-CART analyses were performed using the BCTs and MoDs as predictors and using treatment success (ie, effect size) as the outcome. Results Factors related to the usability of the interventions influenced their efficacy. Specifically, subgroup analyses indicated that more efficient interventions (interventions that take little time to understand and use) are more likely to be effective than less efficient interventions. Meta-CART identified one synergistic effect: interventions that included barrier identification/problem solving and provided rewards for behavior change reported an average effect size that was smaller (ḡ=0.23, 95% CI 0.08-0.44) than interventions that used other combinations of techniques (ḡ=0.43, 95% CI 0.27-0.59). No synergistic effects were found for MoDs or for MoDs combined with BCTs. Conclusions Interventions that take little time to understand and use were more effective than those that require more time. Few specific combinations of BCTs that contribute to the effectiveness of online interventions were found. Furthermore, no synergistic effects between BCTs and MoDs were found, even though MoDs had strong effects when analyzed univariately in the original study.
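
    A minimal sketch of the meta-CART idea follows: study effect sizes are regressed on binary BCT/MoD indicators with a shallow, weighted regression tree so that splits (and their interactions) flag candidate synergistic combinations. The feature names, data, and weights below are invented for illustration; the original analysis used dedicated meta-analytic procedures rather than this simplified tree:

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeRegressor, export_text

    # Hypothetical coding: one row per intervention, binary columns for BCTs/MoDs,
    # effect size g as outcome, inverse-variance weights as sample weights.
    rng = np.random.default_rng(0)
    X = pd.DataFrame(rng.integers(0, 2, size=(52, 3)),
                     columns=["barrier_identification", "rewards", "text_messages"])
    g = 0.4 - 0.2 * (X.barrier_identification & X.rewards) + rng.normal(0, 0.1, 52)
    w = rng.uniform(5, 50, 52)        # stand-in for inverse-variance weights

    tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=10)
    tree.fit(X, g, sample_weight=w)
    print(export_text(tree, feature_names=list(X.columns)))  # leaves = technique subgroups
    ```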

  12. Developing techniques for cause-responsibility analysis of occupational accidents.

    PubMed

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and the role of the groups involved in work-related accidents. This study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for the determination of responsible groups and responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for determining a detailed list of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  13. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    NASA Astrophysics Data System (ADS)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of information can result in several software problems related to the huge amount of data available. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potentialities of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered, 10 different regions of interest were selected, and two different approaches were followed. The first one (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier. The second approach (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Two strategies were followed: four classes were considered using Spring and thereafter seven classes were considered for OTB/Monteverdi. The SVM classifier produces slightly better results and presents a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.

  14. Analysis of diagnostic calorimeter data by the transfer function technique

    SciTech Connect

    Delogu, R. S. Pimazzoni, A.; Serianni, G.; Poggi, C.; Rossi, G.

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
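
    The core of the transfer function technique can be pictured as a frequency-domain deconvolution: the measured rear-side temperature is modelled as the convolution of the incident energy flux with the calorimeter's thermal impulse response, so dividing their FFTs (with some regularisation against noise amplification) recovers the flux. The sketch below illustrates only that step under an assumed, synthetic impulse response, not the diagnostic's calibrated transfer function:

    ```python
    import numpy as np

    def deconvolve_flux(temperature, impulse_response, eps=1e-3):
        """Recover the front-side energy flux from the rear-side temperature history
        by FFT division (Wiener-like regularisation via eps limits noise blow-up)."""
        n = len(temperature)
        T = np.fft.rfft(temperature, n)
        H = np.fft.rfft(impulse_response, n)
        Q = T * np.conj(H) / (np.abs(H)**2 + eps)
        return np.fft.irfft(Q, n)

    # Illustrative data: synthetic impulse response and beam pulse (not measured values)
    t = np.linspace(0, 1, 1000)
    h = np.exp(-t / 0.05); h /= h.sum()                  # assumed thermal impulse response
    flux_true = (t > 0.2) & (t < 0.4)                    # 200-ms square beam pulse
    temp = np.convolve(flux_true.astype(float), h)[:1000] + 1e-3 * np.random.randn(1000)
    flux_est = deconvolve_flux(temp, h)                  # reconstructed flux profile
    ```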

  15. Numerical modeling techniques for flood analysis

    NASA Astrophysics Data System (ADS)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to find out the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because of their use of hydrological parameters, which are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and their estimation in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models, and the possible improvements over these models through 3D modeling, are also discussed. It is found that the HEC-RAS and FLO-2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness within grids, were found; these can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D floodplain model should be developed by considering all the hydrological and high-resolution topographic parameters discussed in this review, to enhance understanding of the causes and effects of flooding.

  16. Function Analysis and Decomposition using Function Analysis Systems Technique

    SciTech Connect

    J. R. Wixson

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Mile's function analysis concepts and introduced the methodology called Function Analysis Systems Techniques (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  17. Function Analysis and Decomposition using Function Analysis Systems Technique

    SciTech Connect

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Mile's function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  18. Evaluation of Mercury in Environmental Samples by a Supramolecular Solvent-Based Dispersive Liquid-Liquid Microextraction Method Before Analysis by a Cold Vapor Generation Technique.

    PubMed

    Ali, Jamshed; Tuzen, Mustafa; Kazi, Tasneem G

    2017-02-01

    Supramolecular solvent–based dispersive liquid–liquid microextraction was used as a preconcentration method for the determination of trace levels of Hg. This simple method accurately measured oxidized HgII content in claystone and sandstone samples obtained from the Thar Coalfield in Pakistan. Cold vapor atomic absorption spectrometry was used as the detection technique because it is reliable and accurate. The HgII in acidic media forms a complex with dithizone (DTz) in the presence of supramolecular solvent (tetrahydrofuran and 1-undecanol), forming reverse micelles. Formation of the Hg-DTz complex was achieved to increase the interactions with the supramolecular solvent phase at pH 2.5 under the optimized experimental conditions. After addition of the supramolecular solvent to the aqueous solution, the micelles were uniformly mixed using a vortex mixer. The cloudy solution was centrifuged, and the Hg-DTz complex was extracted into the supramolecular solvent phase. Under optimized experimental conditions, the LOD and enrichment factor were found to be 5.61 ng/L and 77.8, respectively. Accuracy of the developed method was checked with Certified Reference Materials. The developed method was successfully applied for the determination of HgII in claystone and sandstone samples from the Block VII and Block VIII areas of the Thar Coalfield on the basis of depth.
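
    For reference, the two figures of merit quoted above are commonly obtained as follows: the LOD as three times the standard deviation of blank measurements divided by the calibration slope, and the enrichment factor as the ratio of calibration slopes with and without the preconcentration step. The numbers in the sketch are placeholders, not the paper's data:

    ```python
    import numpy as np

    def lod_3sigma(blank_signals, slope):
        """Limit of detection as 3*sd(blank)/calibration slope."""
        return 3 * np.std(blank_signals, ddof=1) / slope

    def enrichment_factor(slope_preconcentrated, slope_direct):
        """Ratio of calibration slopes with and without the microextraction step."""
        return slope_preconcentrated / slope_direct

    # Placeholder values for illustration only
    blanks = [0.0021, 0.0018, 0.0024, 0.0020, 0.0019]    # blank-run signals
    print(lod_3sigma(blanks, slope=1.3e-4))              # hypothetical slope units
    print(enrichment_factor(7.8e-3, 1.0e-4))
    ```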

  19. Nuclear based techniques for detection of contraband

    SciTech Connect

    Gozani, T.

    1993-12-31

    The detection of contraband such as explosives and drugs concealed in luggage or other containers can be quite difficult. Nuclear techniques offer capabilities which are essential to having effective detection devices. This report describes the features of various nuclear techniques and instrumentation.

  20. Risk-based maintenance--techniques and applications.

    PubMed

    Arunraj, N S; Maiti, J

    2007-04-11

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to make use of the knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more rigorously, in order to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce the probability of failure of equipment and the consequences of failure. In this paper, risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions.
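
    The prioritisation step at the heart of risk-based maintenance is commonly expressed as risk = probability of failure x consequence, with items ranked against a tolerable-risk threshold. The sketch below illustrates only that ranking logic with invented equipment data; it is not a substitute for the structured methodologies reviewed in the paper:

    ```python
    # Minimal risk-ranking sketch: risk = probability of failure x consequence,
    # with inspection effort directed at items above a tolerable-risk threshold.
    equipment = [
        # (name, annual failure probability, consequence in arbitrary cost units)
        ("pump A",       0.10, 400),
        ("vessel B",     0.02, 5000),
        ("compressor C", 0.05, 900),
    ]
    tolerable_risk = 60.0

    ranked = sorted(((p * c, name) for name, p, c in equipment), reverse=True)
    for risk, name in ranked:
        action = "inspect/maintain first" if risk > tolerable_risk else "routine schedule"
        print(f"{name}: risk={risk:.1f} -> {action}")
    ```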

  1. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than...

  2. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than...

  3. Investigation of electroforming techniques, literature analysis report

    NASA Technical Reports Server (NTRS)

    Malone, G. A.

    1975-01-01

    A literature analysis is presented of reports, specifications, and documented experiences with the use of electroforming to produce copper and nickel structures for aerospace and other engineering applications. The literature period covered is from 1948 to 1974. Specific effort was made to correlate mechanical property data for the electrodeposited material with known electroforming solution compositions and operating conditions. From this survey, electrolytes are suggested for selection to electroform copper and nickel outer shells on regeneratively cooled thrust chamber liners, and other devices subject to thermal and pressure exposure, based on mechanical properties obtainable, performance under various thermal environments, and ease of process control for product reproducibility. Processes of potential value in obtaining sound bonds between electrodeposited copper and nickel and copper alloy substrates are also discussed.

  4. Near Real Time Quantitative Gas Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Herget, William F.; Tromp, Marianne L.; Anderson, Charles R.

    1985-12-01

    A Fourier transform infrared (FT-IR) - based system has been developed and is undergoing evaluation for near real time multicomponent quantitative analysis of undiluted gaseous automotive exhaust emissions. The total system includes: (1) a gas conditioning system (GCS) for tracer gas injection, gas mixing, and temperature stabilization; and (2) an exhaust gas analyzer (EGA) consisting of a sample cell, an FT-IR system, and a computerized data processing system. Tests have shown that the system can monitor about 20 individual species (concentrations down to the 1-20 ppm range) with a time resolution of one second. Tests have been conducted on a chassis dynamometer system utilizing different autos, different fuels, and different driving cycles. Results were compared with those obtained using a standard constant volume sampling (CVS) system.

  5. A technique for human error analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  6. DCT-based cyber defense techniques

    NASA Astrophysics Data System (ADS)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content in order to attack the end user. Most attack algorithms are robust to basic image processing techniques such as filtering, compression, and noise addition. Hence, in this article two novel real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
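
    The kind of operation such DCT-domain defenses build on can be illustrated by simple coefficient thresholding of an 8x8 block: small AC coefficients, where a hidden payload would typically reside, are zeroed before the block is transformed back. The threshold rule below is illustrative only and is not the authors' smart-threshold or anomaly-correction algorithm:

    ```python
    import numpy as np
    from scipy.fft import dctn, idctn

    def threshold_block(block, thresh=4.0):
        """Zero out small AC coefficients of an 8x8 block in the DCT domain,
        suppressing the low-amplitude perturbations a payload might hide in."""
        coeffs = dctn(block, norm="ortho")
        dc = coeffs[0, 0]
        coeffs[np.abs(coeffs) < thresh] = 0.0
        coeffs[0, 0] = dc                      # never touch the DC term
        return idctn(coeffs, norm="ortho")

    block = np.random.randint(0, 256, (8, 8)).astype(float)   # stand-in luma block
    cleaned = threshold_block(block)
    ```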

  7. Modular techniques for dynamic fault-tree analysis

    NASA Astrophysics Data System (ADS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
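
    The combinatorial side of such a modular analysis reduces, for static subtrees with independent basic events, to a bottom-up evaluation of AND/OR gate probabilities, while dynamic subtrees are handed to a Markov solver. The sketch below shows only the static part on an invented two-channel example:

    ```python
    # Bottom-up evaluation of a static fault tree with independent basic events.
    # Dynamic subtrees (spares, sequence dependencies) would instead be passed
    # to a Markov solver in the modular approach described above.
    def failure_prob(node):
        kind = node["type"]
        if kind == "basic":
            return node["p"]
        child_ps = [failure_prob(c) for c in node["children"]]
        if kind == "AND":                       # all children must fail
            out = 1.0
            for p in child_ps:
                out *= p
            return out
        if kind == "OR":                        # any child failing fails the gate
            out = 1.0
            for p in child_ps:
                out *= (1.0 - p)
            return 1.0 - out
        raise ValueError(kind)

    # Hypothetical two-channel system with a shared power supply
    tree = {"type": "OR", "children": [
        {"type": "basic", "p": 1e-4},                                 # power supply
        {"type": "AND", "children": [{"type": "basic", "p": 1e-2},
                                     {"type": "basic", "p": 1e-2}]},  # redundant channels
    ]}
    print(failure_prob(tree))   # ~2.0e-4
    ```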

  8. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
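
    The uncertain-evidence case described above can be illustrated on a two-node network with Jeffrey's rule: the posteriors obtained under each hard instantiation of the evidence node are mixed according to the user-specified distribution over its states. This sketch is a generic illustration, not the paper's augmentation technique or its commercial-software implementation:

    ```python
    import numpy as np

    # Two-node network H -> E with a hypothesis node H and an evidence node E.
    # Under uncertain evidence, the state of E is given as a probability
    # distribution q over its states rather than observed exactly.
    p_h = np.array([0.3, 0.7])                 # prior P(H)
    p_e_given_h = np.array([[0.9, 0.1],        # P(E | H=h0)
                            [0.2, 0.8]])       # P(E | H=h1)

    def posterior_given_hard(e_idx):
        """Posterior over H given a hard observation E = e_idx."""
        unnorm = p_h * p_e_given_h[:, e_idx]
        return unnorm / unnorm.sum()

    q_e = np.array([0.6, 0.4])                 # user-specified distribution over E's states
    p_h_updated = sum(q_e[e] * posterior_given_hard(e) for e in range(2))
    print(p_h_updated)                         # P(H) under the uncertain evidence
    ```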

  9. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  10. Cochlear implant simulator for surgical technique analysis

    NASA Astrophysics Data System (ADS)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  11. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    PubMed Central

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the amount of retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them using a retention apparatus. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique, and the least retention was seen with the conventional molding technique. PMID:27382542

  12. Applications of external cavity diode laser-based technique to noninvasive clinical diagnosis using expired breath ammonia analysis: chronic kidney disease, epilepsy

    NASA Astrophysics Data System (ADS)

    Bayrakli, Ismail; Turkmen, Aysenur; Akman, Hatice; Sezer, M. Tugrul; Kutluhan, Suleyman

    2016-08-01

    An external cavity laser (ECL)-based off-axis cavity-enhanced absorption spectroscopy was applied to noninvasive clinical diagnosis using expired breath ammonia analysis: (1) the correlation between breath ammonia levels and blood parameters related to chronic kidney disease (CKD) was investigated and (2) the relationship between breath ammonia levels and blood concentrations of valproic acid (VAP) was studied. The concentrations of breath ammonia in 15 healthy volunteers, 10 epilepsy patients (before and after taking VAP), and 27 patients with different stages of CKD were examined. The range of breath ammonia levels was 120 to 530 ppb for healthy subjects and 710 to 10,400 ppb for patients with CKD. There was a statistically significant positive correlation between breath ammonia concentrations and urea, blood urea nitrogen, creatinine, or estimated glomerular filtration rate in 27 patients. It was demonstrated that taking VAP gave rise to increasing breath ammonia levels. A statistically significant difference was found between the levels of exhaled ammonia (NH3) in healthy subjects and in patients with epilepsy before and after taking VAP. The results suggest that our breath ammonia measurement system has great potential as an easy, noninvasive, real-time, and continuous monitor of the clinical parameters related to epilepsy and CKD.

  13. Calculation of the elastic properties of prosthetic knee components with an iterative finite element-based modal analysis: quantitative comparison of different measuring techniques.

    PubMed

    Woiczinski, Matthias; Tollrian, Christopher; Schröder, Christian; Steinbrück, Arnd; Müller, Peter E; Jansson, Volkmar

    2013-08-01

    With an aging but still active population, research on total joint replacements relies increasingly on numerical methods, such as finite element analysis, to improve the wear resistance of components. However, the validity of finite element models largely depends on the accuracy of their material behavior and geometrical representation. In particular, material properties are often based on manufacturer data or literature reports, but can alternatively be estimated by matching experimental measurements and structural predictions through modal analyses and identification of eigenfrequencies. The aim of the present study was to compare the accuracy of common setups used for estimating the eigenfrequencies of typical components often used in prosthetized joints. Eigenfrequencies of cobalt-chrome and ultra-high-molecular-weight polyethylene components were therefore measured with four different setups and used in modal analyses of corresponding finite element models for an iterative adjustment of their material properties. Results show that for the low-damped cobalt-chromium endoprosthesis components, all common measuring setups provided accurate measurements. In the case of highly damped structures, measurements were only possible with setups including a continuous excitation system such as an electrodynamic shaker. This study demonstrates that the iterative back-calculation of eigenfrequencies can be a reliable method to estimate the elastic properties for finite element models.
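
    The back-calculation loop can be sketched on a lumped two-degree-of-freedom stand-in for the finite element model: the modal analysis is re-run while the unknown elastic modulus (here a simple stiffness scale) is adjusted until the predicted first eigenfrequency matches the measured one. The masses, the stiffness-modulus mapping, and the measured frequency below are invented for illustration:

    ```python
    import numpy as np
    from scipy.linalg import eigh
    from scipy.optimize import brentq

    # Lumped 2-DOF stand-in for the FE model: stiffness scales linearly with the modulus E.
    M = np.diag([0.5, 0.3])                                   # kg, assumed masses

    def first_eigenfrequency(E):
        k = E * 1e-3                                          # toy modulus-to-stiffness mapping
        K = np.array([[2 * k, -k], [-k, k]])
        evals = eigh(K, M, eigvals_only=True)                 # generalized eigenproblem K v = w^2 M v
        return np.sqrt(evals[0]) / (2 * np.pi)                # first natural frequency in Hz

    f_measured = 250.0                                        # Hz, hypothetical measurement
    E_est = brentq(lambda E: first_eigenfrequency(E) - f_measured, 1e6, 1e12)
    print(E_est, first_eigenfrequency(E_est))                 # back-calculated modulus and check
    ```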

  14. A Quantitative Study of Gully Erosion Based on Object-Oriented Analysis Techniques: A Case Study in Beiyanzikou Catchment of Qixia, Shandong, China

    PubMed Central

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion in a small watershed gully system at Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high spatial resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract information on gullies; the average distance between points field-measured along the edges of the gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m2 and 5074.1790 m3, and 1316.1250 m2 and 1591.5784 m3, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626

  15. A quantitative study of gully erosion based on object-oriented analysis techniques: a case study in Beiyanzikou catchment of Qixia, Shandong, China.

    PubMed

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion in a small watershed gully system at Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high spatial resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract information on gullies; the average distance between points field-measured along the edges of the gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m(2) and 5074.1790 m(3), and 1316.1250 m(2) and 1591.5784 m(3), respectively. The results of the study provide a new method for the quantitative study of small gully erosion.
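
    The final volume-estimation step described in both records can be illustrated as a plane fit plus grid integration: the pre-erosion surface z = ax + by + c is estimated by least squares from the elevations of the extracted gully edges, and the erosion volume is the summed difference between that surface and the present elevations over the gully cells. The coordinates below are synthetic placeholders:

    ```python
    import numpy as np

    def erosion_volume(edge_xyz, gully_xy, gully_z, cell_area):
        """Fit the original surface z = a*x + b*y + c to edge points by least squares,
        then sum (fitted - present) elevation differences over gully cells."""
        x, y, z = edge_xyz.T
        A = np.column_stack([x, y, np.ones_like(x)])
        (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
        fitted = a * gully_xy[:, 0] + b * gully_xy[:, 1] + c
        depth = np.clip(fitted - gully_z, 0, None)            # ignore cells above the fitted plane
        return depth.sum() * cell_area

    # Synthetic illustration (coordinates and elevations in metres)
    edge = np.array([[0, 0, 10.0], [10, 0, 10.2], [0, 10, 9.9], [10, 10, 10.1]])
    cells_xy = np.array([[5, 5], [5, 6], [6, 5]])
    cells_z = np.array([8.5, 8.7, 8.6])
    print(erosion_volume(edge, cells_xy, cells_z, cell_area=1.0))   # m^3
    ```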

  16. Using remote sensing techniques and field-based structural analysis to explore new gold and associated mineral sites around Al-Hajar mine, Asir terrane, Arabian Shield

    NASA Astrophysics Data System (ADS)

    Sonbul, Abdullah R.; El-Shafei, Mohamed K.; Bishta, Adel Z.

    2016-05-01

    Modern earth resource satellites provide huge amounts of digital imagery at different resolutions. These satellite imageries are considered one of the most significant sources of data for mineral exploration. Image processing techniques were applied to the exposed rocks around the Al-Aqiq area of the Asir terrane in the southern part of the Arabian Shield. The area under study has two sub-parallel N-S trending metamorphic belts of green-schist facies. The first belt is located southeast of Al-Aqiq, where the Al-Hajar Gold Mine is situated. It is essentially composed of metavolcanic and metasedimentary rocks, and it is intruded by different plutonic rocks of primarily diorite, syenite and porphyritic granite. The second belt is located northwest of Al-Aqiq, and it is composed of metavolcanic and metasedimentary rocks and is intruded by granite bodies. The current study aimed to distinguish the lithological units, detect and map the alteration zones, and extract the major fault lineaments around the Al-Hajar gold prospect. Digital satellite imagery, including Landsat 7 ETM+ multispectral and panchromatic data and SPOT-5, was used in addition to field verification. Areas with spectral signatures similar to the prospect were identified in the nearby metamorphic belt; this belt was considered a target area and was inspected in the field. The metasedimentary units of the target area showed dextral-ductile, top-to-the-north shearing and the presence of a dominant mineralized quartz vein system. The area to the north of the Al-Hajar prospect also showed sub-parallel shear zones along which different types of alteration were detected. Field-based criteria such as hydrothermal breccia, jasper, iron gossans and porphyritic granite strongly indicate the presence of porphyry-type ore deposits in the Al-Hajar metamorphic belt that

  17. Analysis and calibration techniques for superconducting resonators.

    PubMed

    Cataldo, Giuseppe; Wollack, Edward J; Barrentine, Emily M; Brown, Ari D; Moseley, S Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.

  18. A procedural analysis of correspondence training techniques

    PubMed Central

    Paniagua, Freddy A.

    1990-01-01

    A variety of names have been given to procedures used in correspondence training, some more descriptive than others. In this article I argue that a terminology more accurately describing actual procedures, rather than the conceptual function that those procedures are assumed to serve, would benefit the area of correspondence training. I identify two documented procedures during the reinforcement of verbalization phase and five procedures during the reinforcement of correspondence phase and suggest that those procedures can be classified, or grouped into nonoverlapping categories, by specifying the critical dimensions of those procedures belonging to a single category. I suggest that the names of such nonoverlapping categories should clearly specify the dimensions on which the classification is based in order to facilitate experimental comparison of procedures, and to be able to recognize when a new procedure (as opposed to a variant of one already in existence) is developed. Future research involving comparative analysis across and within procedures is discussed within the framework of the proposed classification. PMID:22478059

  19. EEG source analysis using space mapping techniques

    NASA Astrophysics Data System (ADS)

    Crevecoeur, G.; Hallez, H.; van Hese, P.; D'Asseler, Y.; Dupre, L.; van de Walle, R.

    2008-06-01

    The electroencephalogram (EEG) measures potential differences, generated by electrical activity in brain tissue, between scalp electrodes. The EEG potentials can be calculated by the quasi-static Poisson equation in a certain head model. It is well known that the electrical dipole (source) which best fits the measured EEG potentials is obtained by an inverse problem. The dipole parameters are obtained by finding the global minimum of the relative residual energy (RRE). For the first time, the space mapping technique (SM technique) is used for minimizing the RRE. The SM technique aims at aligning two different simulation models: a fine model, accurate but CPU-time expensive, and a coarse model, computationally fast but less accurate than the fine one. The coarse model is a semi-analytical model, the so-called three-shell concentric sphere model. The fine model numerically solves the Poisson equation in a realistic head model. If we use the aggressive space mapping (ASM) algorithm, the errors on the dipole location are too large. The hybrid aggressive space mapping (HASM) on the other hand has better convergence properties, yielding a reduction in dipole location errors. The computational effort of HASM is greater than ASM but smaller than using direct optimization techniques.

  20. Recent trends in particle size analysis techniques

    NASA Technical Reports Server (NTRS)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed in terms of three operating principles, including descriptions of particle size and shape. Significant trends in recently developed particle size analysis equipment show that compact electronic circuitry and rapid data processing systems have mainly been adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  1. On Bitstream Based Edge Detection Techniques

    DTIC Science & Technology

    2009-01-01

  2. Noninvasive in vivo glucose sensing using an iris based technique

    NASA Astrophysics Data System (ADS)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane and an insulin/dextrose protocol was used to control blood glucose concentration. To further help restrict eye movement, a developed ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations are within 20% of the reference values.
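
    Image-based PLS calibration of the kind described above amounts to regressing flattened image (or extracted feature) vectors onto reference glucometer readings and then predicting on held-out data. The sketch below uses synthetic arrays and a crude within-20% score as a stand-in for Clarke error grid analysis; it omits the image preprocessing used in the study:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    # X: one flattened near-infrared iris image (or feature vector) per sample,
    # y: reference blood glucose from the handheld glucometer (mg/dL). Synthetic here.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(120, 500))
    y = 90 + X[:, :5].sum(axis=1) * 10 + rng.normal(0, 5, 120)

    X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
    pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
    y_pred = pls.predict(X_val).ravel()
    within_20pct = np.mean(np.abs(y_pred - y_val) / y_val < 0.20)   # crude stand-in for CEGA
    print(f"{100 * within_20pct:.0f}% of predictions within 20% of reference")
    ```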

  3. Classification Techniques for Multivariate Data Analysis.

    DTIC Science & Technology

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern...the determinantal equation |B - λW| = 0. The solutions λ are the eigenvalues of the matrix W^(-1)B, as in discriminant analysis. There are t non...The Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor
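
    The determinantal equation above is the generalized eigenproblem of discriminant analysis, |B - λW| = 0, whose solutions are the eigenvalues of W^(-1)B. A minimal numerical sketch with synthetic between-groups (B) and within-groups (W) scatter matrices:

    ```python
    import numpy as np
    from scipy.linalg import eig

    # Synthetic between-groups (B) and within-groups (W) scatter matrices
    W = np.array([[4.0, 1.0], [1.0, 3.0]])
    B = np.array([[2.0, 0.5], [0.5, 1.0]])

    # Solve |B - lambda*W| = 0, i.e. the generalized eigenproblem B v = lambda W v;
    # equivalently the eigenvalues of W^(-1) B, as in discriminant analysis.
    lams, vecs = eig(B, W)
    print(np.real(lams))
    print(np.real(np.linalg.eigvals(np.linalg.inv(W) @ B)))   # same values
    ```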

  4. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    PubMed Central

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed predecessor approaches like Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more general rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA), for clustering categorical data using rough set indiscernibility relations, is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effects of the number of clusters on rough accuracy, purity and entropy are described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344
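
    The building block that MIA and the other rough set approaches share is the indiscernibility relation, which partitions objects into equivalence classes according to their values on a candidate attribute; roughness, purity and entropy are then computed over those classes. The toy information system below illustrates only that partitioning step, not the full MIA algorithm:

    ```python
    from collections import defaultdict

    def indiscernibility_classes(table, attribute):
        """Group object ids into equivalence classes of IND({attribute})."""
        classes = defaultdict(list)
        for obj_id, row in table.items():
            classes[row[attribute]].append(obj_id)
        return list(classes.values())

    # Toy categorical information system (invented values)
    table = {
        1: {"colour": "red",  "shape": "round"},
        2: {"colour": "red",  "shape": "square"},
        3: {"colour": "blue", "shape": "round"},
        4: {"colour": "blue", "shape": "round"},
    }
    print(indiscernibility_classes(table, "colour"))   # [[1, 2], [3, 4]]
    print(indiscernibility_classes(table, "shape"))    # [[1, 3, 4], [2]]
    ```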

  5. Image analysis techniques for automated IVUS contour detection.

    PubMed

    Papadogiorgaki, Maria; Mezaris, Vasileios; Chatzizisis, Yiannis S; Giannoglou, George D; Kompatsiaris, Ioannis

    2008-09-01

    Intravascular ultrasound (IVUS) constitutes a valuable technique for the diagnosis of coronary atherosclerosis. The detection of lumen and media-adventitia borders in IVUS images represents a necessary step towards the reliable quantitative assessment of atherosclerosis. In this work, a fully automated technique for the detection of lumen and media-adventitia borders in IVUS images is presented. This comprises two different steps for contour initialization, one for each contour of interest, and a procedure for the refinement of the detected contours. Intensity information, as well as the result of texture analysis generated by means of a multilevel discrete wavelet frames decomposition, is used in two different techniques for contour initialization. For subsequently producing smooth contours, three techniques based on low-pass filtering and radial basis functions are introduced. The different combinations of the proposed methods are experimentally evaluated on large datasets of IVUS images derived from human coronary arteries. It is demonstrated that our proposed segmentation approaches can quickly and reliably perform automated segmentation of IVUS images.

  6. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same...

  7. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same...

  8. Three Techniques for Task Analysis: Examples from the Nuclear Utilities.

    ERIC Educational Resources Information Center

    Carlisle, Kenneth E.

    1984-01-01

    Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)

  9. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks that is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol Empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can be used to complement the results of high energy-based techniques.

  10. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 9, October 1, 1993--December 30, 1993

    SciTech Connect

    Smith, D.M.

    1993-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores but they also exhibit meso and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, the microporosity and reactive nature of coal, the requirement that samples be completely dried, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. We will investigate the dependence of the common NMR parameters such as chemical shifts and relaxation times of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules ({sup 129}Xe, {sup 3}He, {sup 14}N{sub 2}, {sup 14}NH{sub 3}, {sup 15}N{sub 2}, {sup 13}CH{sub 4}, {sup 13}CO{sub 2}) and pore surface. Our current work may be divided into three areas: small-angle X-ray scattering (SAXS), adsorption, and NMR.

  11. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  12. Objective Analysis and Prediction Techniques - 1985

    DTIC Science & Technology

    1985-11-30

    Only OCR fragments of the report survive in this record: table-of-contents entries on Automatic Mesocyclone Detection and Mesocyclone Discriminators, and a reference fragment to a 1975 paper, "High order accurate difference solutions of fluid-mechanics problems by a compact differencing technique," J. Comput. Physics, 383-390.

  13. Bayesian Analysis of the Pattern Informatics Technique

    NASA Astrophysics Data System (ADS)

    Cho, N.; Tiampo, K.; Klein, W.; Rundle, J.

    2007-12-01

    Pattern informatics (PI) [Rundle et al., 2000; Tiampo et al., 2002; Holliday et al., 2005] is a technique that uses phase dynamics in order to quantify temporal variations in seismicity patterns. This technique has shown interesting results for forecasting earthquakes with magnitude greater than or equal to 5 in southern California from 2000 to 2010 [Rundle et al., 2002]. In this work, a Bayesian approach is used to obtain a modified, updated version of the PI, called Bayesian pattern informatics (BPI). This alternative method uses the PI result as a prior probability and models such as ETAS [Ogata, 1988, 2004; Helmstetter and Sornette, 2002] or BASS [Turcotte et al., 2007] to obtain the likelihood. Its result is similar to that obtained by the PI: the determination of regions, known as hotspots, that are most susceptible to the occurrence of events with M = 5 and larger during the forecast period. As an initial test, retrospective forecasts for the southern California region from 1990 to 2000 were made with both the BPI and the PI techniques, and the results are discussed in this work.
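
    A minimal sketch of the prior-times-likelihood combination described above, with entirely synthetic inputs: a PI-style hotspot map acts as the prior over spatial cells and an ETAS-like rate map stands in for the likelihood. Cell counts, distributions, and the hotspot threshold are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        n_cells = 100

        # Prior: PI-style probability of a large event in each spatial cell.
        pi_prior = rng.random(n_cells)
        pi_prior /= pi_prior.sum()

        # Likelihood: relative probability of the observed recent seismicity in each
        # cell, e.g. from an ETAS- or BASS-type rate model (a stand-in here).
        etas_rate = rng.gamma(shape=2.0, scale=1.0, size=n_cells)
        likelihood = etas_rate / etas_rate.max()

        # Posterior via Bayes' rule, cell by cell.
        posterior = pi_prior * likelihood
        posterior /= posterior.sum()

        # "Hotspots": cells whose posterior probability stands out.
        hotspots = np.flatnonzero(posterior > posterior.mean() + 2 * posterior.std())
        print("hotspot cells:", hotspots)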

  14. PIE: A Dynamic Failure-Based Technique

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.

    1990-01-01

    This paper presents a dynamic technique for statistically estimating three program characteristics that affect a program's computational behavior: (1) the probability that a particular section of a program is executed, (2) the probability that the particular section affects the data state, and (3) the probability that a data state produced by that section has an effect on program output. These three characteristics can be used to predict whether faults are likely to be uncovered by software testing. Index Terms: Software testing, data state, fault, failure, testability.
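
    A toy Python illustration of estimating the three probabilities by random sampling for a single program location. The miniature program, the mutant, the perturbation, and the input distribution are all invented for this example and are not taken from the paper.

        import random

        def program(x):
            y = x * 2
            if y > 10:          # location L: the branch body below
                y = y - 7       # data-state change made at L
            return y // 4       # program output

        def pie_estimates(trials=100_000, seed=1):
            rng = random.Random(seed)
            executed = infected = propagated = 0
            for _ in range(trials):
                x = rng.randint(0, 20)
                y = x * 2
                if y > 10:
                    executed += 1
                    original = y - 7              # data state produced by L
                    mutated = y - 6               # data state produced by a mutant of L
                    if mutated != original:       # (always true for this simple mutant)
                        infected += 1
                        if mutated // 4 != original // 4:
                            propagated += 1       # the change survives to the output
            return executed / trials, infected / executed, propagated / infected

        p_exec, p_infect, p_prop = pie_estimates()
        print(f"execution={p_exec:.2f}  infection={p_infect:.2f}  propagation={p_prop:.2f}")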

  15. Liquid Tunable Microlenses based on MEMS techniques

    PubMed Central

    Zeng, Xuefeng; Jiang, Hongrui

    2013-01-01

    The recent rapid development in microlens technology has provided many opportunities for miniaturized optical systems, and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest as their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlenses generally relies on microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses is first discussed, followed by a description of three main categories of tunable microlenses involving MEMS techniques: mechanically driven, electrically driven, and those integrated within microfluidic systems. PMID:24163480

  16. Soil Analysis using the semi-parametric NAA technique

    SciTech Connect

    Zamboni, C. B.; Silveira, M. A. G.; Medina, N. H.

    2007-10-26

    The semi-parametric Neutron Activation Analysis technique, using Au as a flux monitor, was applied to measure element concentrations of Br, Ca, Cl, K, Mn and Na for soil characterization. The results were compared with those obtained using the Instrumental Neutron Activation Analysis technique and were found to be compatible. The viability, advantages, and limitations of using these two analytical methodologies are discussed.

  17. Assembly, checkout, and operation optimization analysis technique for complex systems

    NASA Technical Reports Server (NTRS)

    1968-01-01

    Computerized simulation model of a launch vehicle/ground support equipment system optimizes assembly, checkout, and operation of the system. The model is used to determine performance parameters in three phases or modes - /1/ systems optimization techniques, /2/ operation analysis methodology, and /3/ systems effectiveness analysis technique.

  18. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ...)(1), which sets forth the requirements of adequate price competition. However, only FAR 15.403-1(c)(1... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price....

  19. GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES

    SciTech Connect

    J. R. KAMM; ET AL

    2001-01-01

    The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i. e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. (13), which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.

  20. GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES.

    SciTech Connect

    Kamm, J. R.; Rider, William; Rightley, P. M.; Prestridge, K. P.; Benjamin, R. F.; Vorobieff, P. V.

    2001-01-01

    The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i.e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. [13], which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.

  1. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered when dealing with the analysis of space-flight data is examined. Because of the small amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on considerations needed to choose the most appropriate test for a given type of analysis.
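
    As a brief illustration of the kind of small-sample testing discussed above, the snippet below compares two synthetic samples of size 5 with a two-sample t-test and, as a nonparametric alternative, a Mann-Whitney U test. The data and the choice of tests are illustrative only.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        sample_a = rng.normal(loc=10.0, scale=1.0, size=5)
        sample_b = rng.normal(loc=11.5, scale=1.0, size=5)

        # The t-test assumes approximately normal populations; with n = 5 this
        # assumption matters far more than it would for large samples.
        t_stat, p_t = stats.ttest_ind(sample_a, sample_b, equal_var=False)
        print(f"Welch t = {t_stat:.2f}, p = {p_t:.3f}")

        # A nonparametric alternative when normality is doubtful.
        u_stat, p_u = stats.mannwhitneyu(sample_a, sample_b, alternative="two-sided")
        print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_u:.3f}")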

  2. Overview on techniques in cluster analysis.

    PubMed

    Frades, Itziar; Matthiesen, Rune

    2010-01-01

    Clustering is the unsupervised, semisupervised, and supervised classification of patterns into groups. The clustering problem has been addressed in many contexts and disciplines. Cluster analysis encompasses different methods and algorithms for grouping objects of similar kinds into respective categories. In this chapter, we describe a number of methods and algorithms for cluster analysis in a stepwise framework. The steps of a typical cluster analysis process are, in sequence: pattern representation, the choice of the similarity measure, the choice of the clustering algorithm, the assessment of the output, and the representation of the clusters.
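
    The snippet below walks through those steps with scikit-learn as a compact, illustrative example; the synthetic data, the choice of k-means, and the silhouette assessment are assumptions made for the sketch rather than recommendations from the chapter.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        # 1. Pattern representation: objects as feature vectors (synthetic here).
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
        X = StandardScaler().fit_transform(X)

        # 2. Similarity measure: k-means implicitly uses Euclidean distance.
        # 3. Clustering algorithm:
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

        # 4. Assessment of the output:
        print("silhouette:", round(silhouette_score(X, labels), 3))

        # 5. Representation of the clusters, e.g. by their centroids:
        for k in range(2):
            print("cluster", k, "centroid:", X[labels == k].mean(axis=0).round(2))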

  3. Morphometric techniques for orientation analysis of karst in northern Florida

    SciTech Connect

    Jenkins, D.T.; Beck, B.F.

    1985-01-01

    Morphometric techniques for the analysis of karst landscape orientation data based on swallet catchment areas can be highly inadequate. The long axes of catchment areas may not coincide with structural control, especially in regions having very low relief. Better structural correlation was observed using multiple linear trend measurements of closed depressions rather than drainage basins. Trend analysis was performed on four areas, approximately 25 km² each, forming a sequence from the Suwannee River to the Cody Escarpment in northern Florida. This area is a karst plain, mantled by 12 to 25 meters of unconsolidated sands and clays. Structural control was examined by tabulating the azimuths of distinct linear trends as determined from depression shape based on 1:24,000 topographic maps. The topography was characterized by 1872 individual swallet catchment areas or 1457 closed depressions. The common geomorphic technique of analyzing orientation data in 10° increments beginning with 0° may yield incorrect peak width and placement. To correctly detect all significant orientation peaks, all possible combinations of peak width and placement must be tested. Fifty-five different plots were reviewed and tested for each area.

  4. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis

    PubMed Central

    Lambalk, Joep; Ottiger, Marcel

    2016-01-01

    Introduction An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a common method for cellular characterisation in microbiology and medicine during the last decade. The aim of this study is to demonstrate the potential of IFC in plant cell analysis with the focus on pollen. Method Developing and mature pollen grains were analysed during their passage through a microfluidic chip to which radio frequencies of 0.5 to 12 MHz were applied. The acquired data provided information about the developmental stage, viability, and germination capacity. The biological relevance of the acquired IFC data was confirmed by classical staining methods, inactivation controls, as well as pollen germination assays. Results Different stages of developing pollen, dead, viable and germinating pollen populations could be detected and quantified by IFC. Pollen viability analysis by classical FDA staining showed a high correlation with IFC data. In parallel, pollen with active germination potential could be discriminated from the dead and the viable but non-germinating population. Conclusion The presented data demonstrate that IFC is an efficient, label-free, reliable and non-destructive technique to analyse pollen quality in a species-independent manner. PMID:27832091

  5. Typology of Delivery Quality: Latent Profile Analysis of Teacher Engagement and Delivery Techniques in a School-Based Prevention Intervention, "Keepin' It REAL" Curriculum

    ERIC Educational Resources Information Center

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may…

  6. Apprenticeship Learning Techniques for Knowledge Based Systems

    DTIC Science & Technology

    1988-12-01

    domain, such as medicine. The Odysseus explanation-based learning program constructs explanations of problem-solving actions in the domain of medical...theories and empirical methods so as to allow construction of an explanation. The Odysseus learning program provides the first demonstration of using the... Odysseus explanation-based learning program is presented, which constructs explanations of human problem-solving actions in the domain of medical diagnosis.

  7. BaTMAn: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.

  8. Trends and Techniques for Space Base Electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1979-01-01

    Simulations of various phosphorus and boron diffusions in SOS were completed and a sputtering system, furnaces, and photolithography-related equipment were set up. Double-layer metal experiments initially utilized wet chemistry techniques. By incorporating ultrasonic etching of the vias, premetal cleaning with a modified buffered HF, phosphorus-doped vapox, and extended sintering, yields of 98% were obtained using the standard test pattern. A two-dimensional modeling program was written for simulating short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption used is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. Although the program is incomplete, solution of the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2-D MOSFET simulation programs is summarized.

  9. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line...

  10. Uncertainty analysis technique for OMEGA Dante measurements

    NASA Astrophysics Data System (ADS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
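
    A minimal sketch of the Monte Carlo parameter-variation idea described above: perturb the recorded channel voltages with their one-sigma Gaussian errors, push each realization through an unfold, and take statistics over the resulting fluxes. The 18-channel count follows the abstract, but the voltages, error levels, and the placeholder "unfold" are invented for the example and do not represent the actual Dante unfold algorithm.

        import numpy as np

        rng = np.random.default_rng(42)
        n_channels, n_trials = 18, 1000

        measured_v = rng.uniform(0.5, 2.0, n_channels)     # recorded channel voltages
        sigma_v = 0.05 * measured_v                        # combined 1-sigma errors
        response = rng.uniform(0.8, 1.2, n_channels)       # stand-in calibration factors

        def unfold(voltages):
            """Placeholder unfold: a weighted channel sum standing in for the flux."""
            return np.sum(voltages / response)

        fluxes = np.empty(n_trials)
        for i in range(n_trials):
            trial_v = rng.normal(measured_v, sigma_v)      # one perturbed test voltage set
            fluxes[i] = unfold(trial_v)

        print(f"flux = {fluxes.mean():.2f} +/- {fluxes.std():.2f} (arbitrary units)")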

  11. Uncertainty analysis technique for OMEGA Dante measurements.

    PubMed

    May, M J; Widmann, K; Sorce, C; Park, H-S; Schneider, M

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  12. Injection Locking Techniques for Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Gathma, Timothy D.; Buckwalter, James F.

    2011-04-01

    Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks high-Q passives and wideband resonator tunability that is necessary for heterodyne implementations of spectrum analyzers. As an alternative to the heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.

  13. Injection Locking Techniques for Spectrum Analysis

    SciTech Connect

    Gathma, Timothy D.; Buckwalter, James F.

    2011-04-19

    Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks high-Q passives and wideband resonator tunability that is necessary for heterodyne implementations of spectrum analyzers. As an alternative to the heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.

  14. Automated fluid analysis apparatus and techniques

    DOEpatents

    Szecsody, James E.

    2004-03-16

    An automated device that couples a pair of differently sized sample loops with a syringe pump and a source of degassed water. A fluid sample is mounted at an inlet port and delivered to the sample loops. A selected sample from the sample loops is diluted in the syringe pump with the degassed water and fed to a flow through detector for analysis. The sample inlet is also directly connected to the syringe pump to selectively perform analysis without dilution. The device is airtight and used to detect oxygen-sensitive species, such as dithionite in groundwater following a remedial injection to treat soil contamination.

  15. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    SciTech Connect

    Cuesta, C; Abgrall, N.; Arnquist, I. J.; Avignone, III, F. T.; Barabash, A.S.; Bertrand, F. E.; Bradley, A. W.; Brudanin, V.; Busch, M.; Byram, D.; Caldwell, A. S.; Chan, Y-D; Christofferson, C. D.; Detwiler, J. A.; Efremenko, Yu.; Ejiri, H.; Elliott, S. R.; Galindo-Uribarri, A.; Gilliss, T.; Green, M. P.; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R.; Howard, S.; Howe, M. A.; Keeter, K.J.; Kidd, M. F.; Konovalov, S.I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J.; MacMullin, J.; Meijer, S. J.; Orrell, J. L.; O'Shaughnessy, C.; Radford, D. C.; Rager, J.; Robertson, R.G.H.; Romero-Romero, E.; Snyder, N; Suriano, A. M.; Tedeschi, D; Trimble, J. E.; Vasilyev, S.; Vetter, K. [University of California et al.

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in Ge-76. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  16. Visualization techniques for tongue analysis in traditional Chinese medicine

    NASA Astrophysics Data System (ADS)

    Pham, Binh L.; Cai, Yang

    2004-05-01

    Visual inspection of the tongue has been an important diagnostic method of Traditional Chinese Medicine (TCM). Clinic data have shown significant connections between various viscera cancers and abnormalities in the tongue and the tongue coating. Visual inspection of the tongue is simple and inexpensive, but the current practice in TCM is mainly experience-based and the quality of the visual inspection varies between individuals. The computerized inspection method provides quantitative models to evaluate color, texture and surface features on the tongue. In this paper, we investigate visualization techniques and processes to allow interactive data analysis with the aim to merge computerized measurements with human expert's diagnostic variables based on five-scale diagnostic conditions: Healthy (H), History Cancers (HC), History of Polyps (HP), Polyps (P) and Colon Cancer (C).

  17. Analysis techniques for background rejection at the Majorana Demonstrator

    SciTech Connect

    Cuestra, Clara; Rielage, Keith Robert; Elliott, Steven Ray; Xu, Wenqin; Goett, John Jerome III

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  18. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    SciTech Connect

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H.; Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P.; Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L.; Avignone, F. T.; Baldenegro-Barrera, C. X.; Bertrand, F. E.; and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  19. Microelectronics Failure Analysis Techniques. A Procedural Guide

    DTIC Science & Technology

    1980-01-01

    magnitude. Since then a number of round robin tests have been conducted under carefully controlled conditions, i.e., MIL-STD-883, Method 1018.2...components in general take second priority. This, in turn, results in limited sizes and kinds of passive components available in monolithic form. Capacitors...the conductivity modulation within the high resistivity base region. For a given base bias VB, negligible anode current flows until the anode

  20. Gearbox diagnostics using wavelet-based windowing technique

    NASA Astrophysics Data System (ADS)

    Omar, F. K.; Gaouda, A. M.

    2009-08-01

    In extracting gearbox acoustic signals embedded in excessive noise, an online and automated tool becomes a crucial necessity. One of the recent approaches that has gained some acceptance within the research arena is wavelet multi-resolution analysis (WMRA). However, selecting an appropriate mother wavelet, defining dynamic threshold values, and identifying the resolution levels to be considered in gearbox fault detection and diagnosis are still challenging tasks. This paper proposes a novel wavelet-based technique for detecting, locating and estimating the severity of defects in gear tooth fracture. The proposed technique enhances the WMRA by decomposing the noisy data into different resolution levels while sliding the data through a Kaiser window. Only the maximum expansion coefficients at each resolution level are used in de-noising, detecting and measuring the severity of the defects. A small set of coefficients is used in the monitoring process without assigning threshold values or performing signal reconstruction. The proposed monitoring technique has been applied to laboratory data corrupted with a high noise level.
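
    A minimal Python sketch of the idea: slide a Kaiser window along the vibration record, decompose each frame with a discrete wavelet transform, and keep only the maximum absolute coefficient at each resolution level as a trend indicator. The window length, hop size, wavelet, Kaiser beta, and synthetic defect are illustrative assumptions, not the settings used in the paper.

        import numpy as np
        import pywt

        def max_coeff_trend(signal, frame_len=1024, hop=512, wavelet="db4", levels=4, beta=8.6):
            window = np.kaiser(frame_len, beta)
            trend = []                                          # one row of level maxima per frame
            for start in range(0, len(signal) - frame_len + 1, hop):
                frame = signal[start:start + frame_len] * window
                coeffs = pywt.wavedec(frame, wavelet, level=levels)
                trend.append([np.max(np.abs(c)) for c in coeffs[1:]])   # detail levels only
            return np.asarray(trend)

        # Synthetic vibration record: noise plus a short burst mimicking a tooth defect.
        rng = np.random.default_rng(0)
        x = rng.normal(0, 1, 16384)
        x[8000:8050] += 6.0 * np.sin(2 * np.pi * 0.3 * np.arange(50))

        trend = max_coeff_trend(x)
        print("frame with the largest finest-level coefficient:", int(trend[:, -1].argmax()))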

  1. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes and the increase in the requirements of quality and service have forced a change in the design and application of maintenance, as well as the way in which it is considered within the managerial strategy. There are numerous maintenance activities that must be developed in a service company. As a result, the maintenance functions as a whole have to be outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibilities, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on expert systems. The expert system, by means of rules, uses the weighting technique SMART and value analysis to obtain the weights between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper are related to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the weighting technique SMART with value analysis techniques, typical in the design of new products, in the area of rule-based expert systems.

  2. Comparison of Hydrogen Sulfide Analysis Techniques

    ERIC Educational Resources Information Center

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  3. Permethylation Linkage Analysis Techniques for Residual Carbohydrates

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Permethylation analysis is the classic approach to establishing the position of glycosidic linkages between sugar residues. Typically, the carbohydrate is derivatized to form acid-stable methyl ethers, hydrolyzed, peracetylated, and analyzed by gas chromatography-mass spectrometry (GC-MS). The pos...

  4. Accelerator based techniques for contraband detection

    NASA Astrophysics Data System (ADS)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials, contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays have the ability to penetrate through various materials at large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and have the ability to image the object with an appreciable degree of reliability. Neutron induced reactions such as (n, γ), (n, n') (n, p) or proton induced γ-resonance absorption are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed and their advantages, characteristics, and current progress are shown. Areas where use of these methods is currently under evaluation are detection of hidden explosives, illicit drug interdiction, chemical war agents identification, nuclear waste assay, nuclear weapons destruction and others.

  5. A comparison of wavelet analysis techniques in digital holograms

    NASA Astrophysics Data System (ADS)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis, and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques: mean filtering, median filtering, and Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation and resolution preservation.

  6. Autoregressive and bispectral analysis techniques: EEG applications.

    PubMed

    Ning, T; Bronzino, J D

    1990-01-01

    Some basic properties of autoregressive (AR) modeling and bispectral analysis are reviewed, and examples of their application in electroencephalography (EEG) research are provided. A second-order AR model was used to score cortical EEGs. In tests performed on five adult rats to distinguish between different vigilance states such as quiet-waking (QW), rapid-eye-movement (REM) sleep, and slow-wave sleep (SWS), SWS activity was correctly identified over 96% of the time, and a 95% agreement rate was achieved in recognizing the REM sleep stage. In a bispectral analysis of the rat EEG, third-order cumulant (TOC) sequences of 32 epochs belonging to the same vigilance state were estimated and then averaged. Preliminary results have shown that bispectra of hippocampal EEGs during REM sleep exhibit significant quadratic phase couplings between frequencies in the 6-8-Hz range, associated with the theta rhythm.
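
    As a small illustration of the second-order AR modeling mentioned above, the snippet below fits an AR(2) model to a synthetic theta-like oscillation via the Yule-Walker equations. The signal, sampling rate, and use of the coefficients are assumptions for the example, not the scoring procedure of the study.

        import numpy as np

        def ar2_yule_walker(x):
            """Estimate AR(2) coefficients and residual variance from a 1-D signal."""
            x = x - x.mean()
            r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(3)])
            R = np.array([[r[0], r[1]],
                          [r[1], r[0]]])
            a = np.linalg.solve(R, r[1:3])          # coefficients a1, a2
            noise_var = r[0] - a @ r[1:3]
            return a, noise_var

        # Synthetic "theta-like" rhythm (~7 Hz at 100 Hz sampling) plus noise.
        fs = 100
        t = np.arange(0, 10, 1 / fs)
        eeg = np.sin(2 * np.pi * 7 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

        coeffs, var = ar2_yule_walker(eeg)
        print("AR(2) coefficients:", coeffs.round(3), " residual variance:", round(float(var), 3))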

  7. Envelopment technique and topographic overlays in bite mark analysis

    PubMed Central

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    Aims and Objectives: The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Materials and Methods: Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. Results: The scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare four sequential overlays and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Conclusion: Through our study, we conclude that the third and fourth cuts were the best among the four cuts and inter- and intraoperator reliability were found to be statistically significant at 5% level that is 95% confidence interval (P < 0.05). PMID:26816458

  8. [Applicability of laser-based geological techniques in bone research: analysis of calcium oxide distribution in thin-cut animal bones].

    PubMed

    Andrássy, László; Maros, Gyula; Kovács, István János; Horváth, Ágnes; Gulyás, Katalin; Bertalan, Éva; Besnyi, Anikó; Füri, Judit; Fancsik, Tamás; Szekanecz, Zoltán; Bhattoa, Harjit Pal

    2014-11-09

    The structural similarities between the inorganic component of bone tissue and geological formations make it possible to use mathematical models to determine the weight percentage composition of the different mineral element oxides constituting the inorganic component of bone tissue. The determined weight percentage composition can be verified by determining element oxide concentration values with laser induced plasma spectroscopy and inductively coupled plasma optical emission spectrometry. It can be concluded from the calculated weight percentage composition of the inorganic component of bone tissue and laboratory analyses that the properties of bone tissue are determined primarily by hydroxylapatite. The inorganic bone structure can be studied well by determining the calcium oxide concentration distribution using the laser induced plasma spectroscopy technique. In the present study, thin polished bone slides prepared from male bovine tibia were examined with laser induced plasma spectroscopy in a regular network and combined sampling system to derive the calculated calcium oxide concentration distribution. The superficial calcium oxide concentration distribution, as supported by "frequency distribution" curves, can be categorized into a number of groups. This helps in clearly demarcating the cortical and trabecular bone structures. Following analyses of bovine tibial bone, the authors found a positive association between the attenuation value, as determined by quantitative computed tomography, and the "ρ" density as used in geology. Furthermore, the calculated "ρ" density and the measured average calcium oxide concentration values showed an inverse correlation.

  9. Multiwavelet-transform-based image compression techniques

    NASA Astrophysics Data System (ADS)

    Rao, Sathyanarayana S.; Yoon, Sung H.; Shenoy, Deepak

    1996-10-01

    Multiwavelet transforms are a new class of wavelet transforms that use more than one prototype scaling function and wavelet in the multiresolution analysis/synthesis. The popular Geronimo-Hardin-Massopust multiwavelet basis functions have properties of compact support, orthogonality, and symmetry which cannot be obtained simultaneously in scalar wavelets. The performance of multiwavelets in still image compression is studied using vector quantization of multiwavelet subbands with a multiresolution codebook. The coding gain of multiwavelets is compared with that of other well-known wavelet families using performance measures such as unified coding gain. Implementation aspects of multiwavelet transforms such as pre-filtering/post-filtering and symmetric extension are also considered in the context of image compression.

  10. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    PubMed

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of the Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded intensity speckled distributions, corresponding to two states of a diffuser coherently illuminated, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their core's structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture from the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography in an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.
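
    The sketch below illustrates, with synthetic data, the first part of the procedure as described above: build an analytic signal from a speckle intensity pattern with an isotropic Riesz-type (spiral-phase) transform implemented via the FFT, take its argument as a pseudophase map, and locate vortices as points where the phase winds by ±2π. The specific transform kernel, speckle simulation, and vortex detector are assumptions for the example, not the authors' exact implementation.

        import numpy as np

        def pseudophase(intensity):
            i0 = intensity - intensity.mean()                   # remove the DC term
            ky, kx = np.meshgrid(np.fft.fftfreq(i0.shape[0]),
                                 np.fft.fftfreq(i0.shape[1]), indexing="ij")
            spiral = np.exp(1j * np.arctan2(ky, kx))            # unit spiral-phase filter
            spiral[0, 0] = 0.0
            analytic = np.fft.ifft2(spiral * np.fft.fft2(i0))   # Riesz-type analytic signal
            return np.angle(analytic)

        def vortex_charge_map(phase):
            """Net phase winding (in units of 2*pi) around every 2x2 plaquette."""
            wrap = lambda d: (d + np.pi) % (2 * np.pi) - np.pi
            loop = (wrap(np.diff(phase, axis=1)[:-1, :]) +
                    wrap(np.diff(phase, axis=0)[:, 1:]) -
                    wrap(np.diff(phase, axis=1)[1:, :]) -
                    wrap(np.diff(phase, axis=0)[:, :-1]))
            return np.round(loop / (2 * np.pi)).astype(int)

        # Synthetic speckle pattern: low-pass-filtered random field, intensity only.
        rng = np.random.default_rng(0)
        fy, fx = np.meshgrid(np.fft.fftfreq(256), np.fft.fftfreq(256), indexing="ij")
        lowpass = np.hypot(fx, fy) < 0.1
        speckle = np.abs(np.fft.ifft2(lowpass * np.fft.fft2(rng.random((256, 256))))) ** 2

        charges = vortex_charge_map(pseudophase(speckle))
        print("positive vortices:", int((charges > 0).sum()),
              " negative vortices:", int((charges < 0).sum()))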

  11. Techniques of DNA methylation analysis with nutritional applications.

    PubMed

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from single nucleotide to global measurement depending on the study goal and scope. In addition, this study highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments concerning epigenetic technologies are showing promising results of DNA methylation levels at a single-base resolution and provide the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profile from microarray or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the best method fitting for their nutritional research interests.

  12. Techniques for geothermal liquid sampling and analysis

    SciTech Connect

    Kindle, C.H.; Woodruff, E.M.

    1981-07-01

    A methodology has been developed that is particularly suited to liquid-dominated resources and adaptable to a variety of situations. It is intended to be a base methodology upon which variations can be made to meet specific needs or situations. The approach consists of recording flow conditions at the time of sampling, a specific insertable probe sampling system, a sample stabilization procedure, commercially available laboratory instruments, and data quality check procedures.

  13. Diesel Combustion Analysis Using Rapid Sampling Techniques.

    DTIC Science & Technology

    1982-08-01

    fuel injection is stopped and the engine valves are deactivated. The quench tank, after a dump, contains 80 to 90 percent of the cylinder gas. This...and was based on an I.H.C. four-valve prototype head. This head was operated with only two valves using specially designed valve gear. The other two... valve locations were modified to become instrumentation ports. A number of factors caused this first modified head to fall short of the research needs

  14. Speckle-adaptive VISAR fringe analysis technique

    NASA Astrophysics Data System (ADS)

    Erskine, David

    2017-01-01

    A line-VISAR (velocity interferometer) is an important diagnostic in shock physics, simultaneously measuring many fringe histories of adjacent portions of a target splayed along a line, with fringes recorded versus time and space by a streak camera. Due to laser speckle the reflected intensity may be uneven spatially, and due to irregularities in the streak camera electron optics the phase along the slit may be slightly nonlinear. Conventional fringe analysis algorithms which do not properly model these variations can suffer from inferred velocity errors. A speckle-adaptive algorithm has been developed which senses the interferometer and illumination parameters for each individual row (spatial position Y) of the 2d interferogram, so that the interferogram can be compensated for Y-dependent nonfringing intensity, fringe visibility, and nonlinear phase distribution. In numerical simulations and on actual data we have found this individual row-by-row modeling improves the accuracy of the result, compared to a conventional column-by-column analysis approach.

  15. FDI and Accommodation Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by means of a trained backpropagation NN, which is then applied to residual generation. Process supervision is then applied to discriminate faults in process sensors and process plant parameters. A rule-based expert system is used to implement the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results for a heat exchanger application show an efficient and robust FDI system, which could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA.
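
    A minimal sketch of the residual-generation idea: a small network trained only on fault-free process data predicts the next output, and an unusually large prediction residual flags a possible sensor or plant fault. The toy process model, network size, fault type, and threshold are assumptions made for the example.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(3)

        def plant(u, y_prev):
            """Toy first-order nonlinear process with a little measurement noise."""
            return 0.8 * y_prev + 0.3 * np.tanh(u) + rng.normal(0, 0.01)

        # Fault-free training data: inputs are (u_k, y_{k-1}), target is y_k.
        u = rng.uniform(-1, 1, 2000)
        y = np.zeros_like(u)
        for k in range(1, len(u)):
            y[k] = plant(u[k], y[k - 1])
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
        model.fit(np.column_stack([u[1:], y[:-1]]), y[1:])

        # On-line monitoring: a sensor stuck at 2.0 after k = 100 inflates the residual.
        y_prev = 0.0
        for k in range(200):
            uk = float(rng.uniform(-1, 1))
            y_meas = 2.0 if k >= 100 else plant(uk, y_prev)     # simulated stuck-at fault
            residual = abs(y_meas - model.predict([[uk, y_prev]])[0])
            if residual > 0.2:
                print(f"k={k}: residual {residual:.2f} -> possible fault")
                break
            y_prev = y_meas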

  16. [The test system to identify mucin MUC1 in human blood serum using the technique of immune-enzyme analysis based on monoclonal antibody ICO25].

    PubMed

    Karmakova, T A; Vorontsova, M S; Skripnik, V V; Bezborodova, O A; Iakubovskaia, R I

    2012-02-01

    On the basis of the original mouse monoclonal antibody ICO25, the test system IEA ICO25 was developed and standardized for the quantitative detection of the tumor-associated antigen mucin 1 (MUC1) in human blood serum in an inhibitory enzyme-immunoassay format. The analytical characteristics of the test system correspond to the standards applied to enzyme-immunoassay diagnostic kits. The results of the identification of MUC1 in the blood serum of healthy donors and of female patients with breast pathology using IEA ICO25 correlate fully with the data on the detection of the antigen CA15-3 obtained using certified commercial kits. The test system IEA ICO25 can be used to detect MUC1 in human blood serum for research purposes.

  17. Comparison Of Four FFT-Based Frequency-Acquisition Techniques

    NASA Technical Reports Server (NTRS)

    Shah, Biren N.; Hinedi, Sami M.; Holmes, Jack K.

    1993-01-01

    The report presents a comparative theoretical analysis of four conceptual techniques for the initial estimation of the carrier frequency of a suppressed-carrier, binary-phase-shift-keyed radio signal. Each technique is effected by an open-loop analog/digital signal-processing subsystem that forms part of a Costas-loop phase-error detector functioning in a closed-loop manner overall.
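
    As a generic illustration of FFT-based carrier acquisition for a suppressed-carrier BPSK signal (not one of the four techniques analyzed in the report), the snippet below squares the received signal to remove the ±1 modulation, which produces a spectral line at twice the carrier frequency, and then locates that line with an FFT peak search. All signal parameters are invented for the example.

        import numpy as np

        fs, n = 100_000.0, 65536                 # sample rate (Hz) and FFT length
        f_carrier, snr_db = 12_345.0, 0.0

        rng = np.random.default_rng(7)
        t = np.arange(n) / fs
        bits = rng.choice([-1.0, 1.0], size=n // 100 + 1)            # ~1000 baud data
        signal = bits[np.arange(n) // 100] * np.cos(2 * np.pi * f_carrier * t)
        noise = rng.normal(0, 10 ** (-snr_db / 20) / np.sqrt(2), n)  # matches the 0 dB SNR
        received = signal + noise

        squared = received ** 2                                      # line appears at 2*fc
        spectrum = np.abs(np.fft.rfft(squared * np.hanning(n)))
        freqs = np.fft.rfftfreq(n, 1 / fs)
        spectrum[freqs < 100.0] = 0.0                                # ignore the DC term of squaring
        f_est = freqs[np.argmax(spectrum)] / 2.0

        print(f"true carrier {f_carrier:.1f} Hz, estimated {f_est:.1f} Hz")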

  18. Validation of Design and Analysis Techniques of Tailored Composite Structures

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed form analysis based on a theoretical model of a single cell tailored box beam and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed form code is consistently able to predict the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed form analysis. Differences between the closed form code representation and the wing box specimen caused large errors in the twist prediction. The closed form analysis prediction of twist has not been validated from this test.

  19. Improvement of Rocket Engine Plume Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Smith, S. D.

    1982-01-01

    A nozzle plume flow field code was developed. The RAMP code which was chosen as the basic code is of modular construction and has the following capabilities: two phase with two phase transonic solution; a two phase, reacting gas (chemical equilibrium reaction kinetics), supersonic inviscid nozzle/plume solution; and is operational for inviscid solutions at both high and low altitudes. The following capabilities were added to the code: a direct interface with JANNAF SPF code; shock capturing finite difference numerical operator; two phase, equilibrium/frozen, boundary layer analysis; a variable oxidizer to fuel ratio transonic solution; an improved two phase transonic solution; and a two phase real gas semiempirical nozzle boundary layer expansion.

  20. Evaluation of energy system analysis techniques for identifying underground facilities

    SciTech Connect

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C.

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  1. Energy minimization versus pseudo force technique for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Hayduk, R. J.

    1980-01-01

    The effectiveness of using minimization techniques for the solution of nonlinear structural analysis problems is discussed and demonstrated by comparison with the conventional pseudo force technique. The comparison involves nonlinear problems with relatively few degrees of freedom. A survey of the state of the art of algorithms for unconstrained minimization reveals that extension of the technique to large-scale nonlinear systems is possible.
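
    A minimal sketch of the energy-minimization idea: the equilibrium configuration of a nonlinear structure is found by minimizing its total potential energy with a general-purpose unconstrained minimizer. The two-bar toy truss, the hardening spring law, and the load below are invented for the illustration and are not the problems studied in the paper.

        import numpy as np
        from scipy.optimize import minimize

        k_lin, k_cub = 1000.0, 5.0e4            # spring constants (N/m, N/m^3)
        load = np.array([0.0, -40.0])           # applied nodal force (N)
        anchors = [np.array([-1.0, 0.0]), np.array([1.0, 0.0])]
        node0 = np.array([0.0, 0.5])            # undeformed position of the free node

        def strain_energy(stretch):
            # nonlinear (hardening) bar: U = 1/2 k e^2 + 1/4 k3 e^4
            return 0.5 * k_lin * stretch ** 2 + 0.25 * k_cub * stretch ** 4

        def total_potential(u):
            node = node0 + u
            internal = sum(strain_energy(np.linalg.norm(node - a) - np.linalg.norm(node0 - a))
                           for a in anchors)
            return internal - load @ u          # internal energy minus work of the load

        result = minimize(total_potential, x0=np.zeros(2), method="BFGS")
        print("equilibrium displacement (m):", result.x.round(4))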

  2. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    SciTech Connect

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  3. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1...

  4. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 10, January 1, 1994--March 31, 1994

    SciTech Connect

    Smith, D.M.

    1994-06-01

    In the present quarter, results from ¹²⁹Xe NMR experiments were made available that allowed the determination of the mean free path of a xenon molecule within the pores of the material. The chemical shift at various loadings of xenon was determined and the shift at zero loading was obtained by extrapolating the data to zero pressure. At zero loading, the collisions suffered by a xenon molecule can be regarded as being entirely with the pore walls, since the concentration of xenon molecules in the system is very low. Thus, the mean free path λ is a measure of the distance travelled by a xenon molecule before colliding with a wall, and hence is also a measure of the pore dimension. SAXS data reported in previous quarters gave the average radius of gyration R_g, which is also a measure of the average dimension of the pores of the material. In addition, application of the potential theory to the CO₂ (274 K) adsorption data allowed the determination of a characteristic adsorption potential E, which is inversely proportional to the width of the pore. Thus, E should correlate inversely with the mean free path λ as determined using ¹²⁹Xe NMR. Also, E should correlate inversely with the radius of gyration R_g from SAXS experiments. Another parameter obtained by analysis of the CO₂ (274 K) adsorption data is the exponent n in the Dubinin-Astakhov equation. We had shown in previous quarters that this is a measure of the heterogeneity of the material.
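
    For orientation only, the Dubinin-Astakhov relation invoked above is commonly written in the standard textbook form below; this is quoted as background and is not reproduced from the report itself.

```latex
% Dubinin-Astakhov isotherm: W is the adsorbed volume at relative pressure p/p_0,
% W_0 the limiting micropore volume, E the characteristic adsorption energy
% (inversely related to pore width), and n the heterogeneity exponent.
W = W_0 \exp\left[-\left(\frac{A}{E}\right)^{n}\right],
\qquad A = R T \ln\left(\frac{p_0}{p}\right)
```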

  5. A dried blood spots technique based LC-MS/MS method for the analysis of posaconazole in human whole blood samples.

    PubMed

    Reddy, Todime M; Tama, Cristina I; Hayes, Roger N

    2011-11-15

    A rugged and robust liquid chromatographic tandem mass spectrometric (LC-MS/MS) method utilizing dried blood spots (DBS) was developed and validated for the analysis of posaconazole in human whole blood. Posaconazole-fortified blood samples were spotted (15 μL) onto Ahlstrom Alh-226 DBS cards and dried for at least 2 h. Punched spots were then extracted by using a mixture of acetonitrile and water containing a stable-labeled internal standard (IS). Posaconazole and its IS were separated from endogenous matrix components on a Kinetex™ C18 column under gradient conditions with a mobile phase A consisting of 0.1% formic acid and a mobile phase B consisting of 0.1% formic acid in acetonitrile/methanol (70/30, v/v). The analyte and IS were detected using a Sciex API 4000 triple quadrupole LC-MS/MS system equipped with a TurboIonSpray™ source operated in the positive ion mode. The assay was linear over the concentration range of 5-5000 ng/mL. The inter-run accuracy and precision of the assay were -1.8% to 0.8% and 4.0% to 10.4%, respectively. Additional assessments unique to DBS were investigated, including sample spot homogeneity, spot volume, and hematocrit. Blood spot homogeneity was maintained, and accurate and precise quantitation results were obtained when using a blood spot volume of between 15 and 35 μL. Human blood samples with hematocrit values ranging between 25% and 41% gave acceptable quantitation results. The validation results indicate that the method is accurate, precise, sensitive, selective and reproducible.

  6. Chaos based Analytical techniques for daily extreme hydrological observations

    NASA Astrophysics Data System (ADS)

    Ng, W. W.; Panu, U. S.; Lennox, W. C.

    2007-08-01

    The existence of outliers in data sets affects the decision-making process related to design, operation, and management of water resources. Insufficient information on outliers limits our understanding and predictive ability of such extreme hydrologic phenomena. Hydrologic systems are complex and dynamic in nature, where the current state and future evolution depend on numerous physical variables and their interactions. Such systems can be represented in a simplified form through a chaotic approach, which can determine the level of complexity of a system and thereby provide the information and parameters required for subsequent predictive analyses. This research focuses on the application of chaotic analytical techniques to daily hydrologic series containing outliers. Different techniques and concepts of chaos theory are adopted to enhance our understanding of the phenomena of outliers. Employing the streamflow data of the Saugeen River in Ontario, Canada, this paper illustrates the use of autocorrelation functions, mutual information, power spectrum analysis, phase space reconstruction, correlation dimension, surrogate tests, and Hurst coefficients for the analysis of chaotic systems. Based on the results of the analyses, one can arrive at the following conclusions: (1) The analyzed series exhibited random-like fluctuations; however, by rejecting the hypothesis of a random process, the analyzed series were found to be non-random. (2) The existence of outliers was found to increase the complexity of the analyzed series; the high embedding dimensionalities obtained from the correlation analysis of the analyzed series support this conclusion. (3) The differentiation of a highly complex system from a random process, and the impact of outliers on the complexity of a system, were presented quantitatively as well as visually from a chaotic perspective.
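
    As a rough illustration of the phase-space analysis mentioned above, the Python sketch below performs a time-delay embedding and estimates the Grassberger-Procaccia correlation sum on a synthetic series; the data, delay, and embedding dimension are placeholder choices, not those of the study.

```python
import numpy as np

def delay_embed(x, m, tau):
    """Time-delay embedding of a 1-D series into an m-dimensional phase space."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def correlation_sum(points, r):
    """Grassberger-Procaccia correlation sum C(r): fraction of point pairs closer than r."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return np.mean(d[iu] < r)

# Toy usage: the slope of log C(r) versus log r approximates the correlation dimension.
rng = np.random.default_rng(0)
x = np.sin(0.1 * np.arange(1500)) + 0.05 * rng.standard_normal(1500)  # stand-in for a daily series
points = delay_embed(x, m=4, tau=5)[::3]   # subsample to keep the number of pairs manageable
radii = np.logspace(-1.5, 0, 8)
C = np.array([correlation_sum(points, r) for r in radii])
mask = C > 0
slope, _ = np.polyfit(np.log(radii[mask]), np.log(C[mask]), 1)
print("estimated correlation dimension ~", round(slope, 2))
```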

  7. On the factors governing water vapor turbulence mixing in the convective boundary layer over land: Concept and data analysis technique using ground-based lidar measurements.

    PubMed

    Pal, Sandip

    2016-06-01

    The convective boundary layer (CBL) turbulence is the key process for exchanging heat, momentum, moisture and trace gases between the earth's surface and the lower part of the troposphere. The turbulence parameterization of the CBL is a challenging but important component in numerical models. In particular, correct estimation of CBL turbulence features, their parameterization, and the determination of the contribution of eddy diffusivity are important for simulating convection initiation and the dispersion of health-hazardous air pollutants and greenhouse gases. In general, measurements of higher-order moments of water vapor mixing ratio (q) variability yield unique estimates of turbulence in the CBL. Using high-resolution lidar-derived profiles of q variance, third-order moment, and skewness, and analyzing concurrent profiles of vertical velocity, potential temperature, and horizontal wind together with time series of near-surface measurements of surface flux and meteorological parameters, a conceptual framework based on a bottom-up approach is proposed here for the first time for a robust characterization of the turbulent structure of the CBL over land, so that our understanding of the processes governing CBL q turbulence can be improved. Finally, principal component analyses will be applied to the lidar-derived long-term data sets of q turbulence statistics to identify the meteorological factors and the dominant physical mechanisms governing the CBL turbulence features.
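
    As a hedged sketch of the statistics named above, the following Python snippet computes profile-resolved variance, third-order moment, and skewness of q fluctuations from a time-height array; the synthetic data and array layout are assumptions for illustration, not the lidar retrieval itself.

```python
import numpy as np

def q_turbulence_moments(q):
    """q: 2-D array (time, altitude) of water vapor mixing ratio.
    Returns profiles of variance, third-order moment, and skewness of the fluctuations."""
    q_prime = q - q.mean(axis=0)               # fluctuations about the temporal mean at each height
    var = np.mean(q_prime ** 2, axis=0)        # second-order moment profile
    m3 = np.mean(q_prime ** 3, axis=0)         # third-order moment profile
    skew = m3 / np.maximum(var, 1e-12) ** 1.5  # skewness profile
    return var, m3, skew

# Toy usage: 600 time samples x 100 range gates of positively skewed synthetic data.
rng = np.random.default_rng(1)
q = 8.0 + rng.gamma(shape=2.0, scale=0.5, size=(600, 100))
var, m3, skew = q_turbulence_moments(q)
print(var[:3].round(3), m3[:3].round(3), skew[:3].round(3))
```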

  8. Comparative analysis of NDE techniques with image processing

    NASA Astrophysics Data System (ADS)

    Rathod, Vijay R.; Anand, R. S.; Ashok, Alaknanda

    2012-12-01

    The paper reports comparative results of nondestructive testing (NDT)-based experimentation on created flaws in castings at the Central Foundry Forge Plant (CFFP) of Bharat Heavy Electricals Ltd., India (BHEL). The present experimental study is aimed at comparing the evaluation of image processing methods applied to the radiographic images of welding defects such as slag inclusion, porosity, lack of root penetration and cracks with other NDT methods. Different image segmentation techniques are proposed here for identifying the above-mentioned welding defects. Currently, a large amount of research work is going on in the field of automated systems for the inspection, analysis and detection of flaws in weldments. The comparison of other NDT methods and the application of image processing to the radiographic images of weld defects are aimed at detecting defects reliably and making accept/reject decisions as per the international standard.
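
    As a minimal, hedged example of one common segmentation step for such radiographs (global Otsu thresholding followed by connected-component labelling), the sketch below assumes scikit-image is available and that defects appear darker than sound metal; it is illustrative only and not one of the methods evaluated in the paper.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def segment_defects(radiograph):
    """Global Otsu threshold plus connected-component labelling of candidate weld defects."""
    t = threshold_otsu(radiograph)
    defects = radiograph < t                     # binary mask: darker regions flagged as defects
    regions = regionprops(label(defects))
    return [(r.area, r.bbox) for r in regions]   # area and bounding box per candidate defect

# Toy usage: a synthetic "radiograph" with a bright background and two dark indications.
img = np.full((128, 128), 200.0)
img[30:40, 30:45] = 60.0
img[80:85, 90:110] = 70.0
print(segment_defects(img))
```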

  9. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). The approach is based on an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, acquired along a pre-defined alignment on a suitably illuminated sample surface. The study investigated the possibility of applying HSI techniques for the classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near-infrared field (1000-1700 nm). The hypercubes were analyzed by applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only when considering the entire investigated wavelength range, but also when selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.
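
    The PCA-plus-PLS-DA chain described above can be sketched as follows in Python with scikit-learn, where PLS-DA is implemented in the usual way as PLS regression against one-hot class indicators; the spectra, class labels, and component counts are synthetic placeholders, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import label_binarize

rng = np.random.default_rng(2)
# Placeholder data: 90 kernels x 121 wavelengths, three classes (vitreous, yellow berry, fusarium).
X = rng.normal(size=(90, 121)) + np.repeat(np.arange(3), 30)[:, None] * 0.5
y = np.repeat([0, 1, 2], 30)

# PCA to reduce dimensionality and inspect dominant spectral variation.
pca = PCA(n_components=5).fit(X)
scores = pca.transform(X)

# PLS-DA: PLS regression on one-hot class indicators; predicted class = argmax of the response.
Y = label_binarize(y, classes=[0, 1, 2])
pls = PLSRegression(n_components=3).fit(X, Y)
y_pred = pls.predict(X).argmax(axis=1)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
print("training accuracy        :", (y_pred == y).mean())
```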

  10. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation) to a real-client, team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  11. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly-seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
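
    A hedged sketch of the descriptor-and-matching idea is given below, using the mahotas library's zernike_moments helper (an assumption of this example, not the patented system's implementation) on a binary segment, with matching shown as a simple Euclidean nearest-template search.

```python
import numpy as np
import mahotas

def zernike_descriptor(segment, radius=60, degree=8):
    """Rotation-invariant Zernike moment magnitudes of a binary segment (palm or finger).
    The radius and degree are illustrative choices, not values from the patent."""
    return mahotas.features.zernike_moments(segment.astype(np.uint8), radius, degree=degree)

def match(descriptor, enrollment_templates):
    """Index and distance of the closest enrolled template (Euclidean distance)."""
    dists = [np.linalg.norm(descriptor - t) for t in enrollment_templates]
    return int(np.argmin(dists)), float(min(dists))

# Toy usage: a filled disc stands in for a segmented finger region.
yy, xx = np.mgrid[:128, :128]
segment = ((yy - 64) ** 2 + (xx - 64) ** 2) < 50 ** 2
d = zernike_descriptor(segment)
print(match(d, [d, d + 0.3]))
```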

  12. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George

    2013-01-01

    Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  13. A Word-Based Compression Technique for Text Files.

    ERIC Educational Resources Information Center

    Vernor, Russel L., III; Weiss, Stephen F.

    1978-01-01

    Presents a word-based technique for storing natural language text in compact form. The compressed text consists of a dictionary and a text that is a combination of actual running text and pointers to the dictionary. This technique has shown itself to be effective for both text storage and retrieval. (VT)
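
    The dictionary-plus-pointers idea can be sketched in a few lines of Python; this is a hedged modern illustration of the general scheme, not the 1978 implementation.

```python
def compress(text):
    """Word-based compression: a dictionary of words plus a token stream mixing
    literal words (first occurrences) with integer pointers into the dictionary."""
    dictionary, index, tokens = [], {}, []
    for word in text.split():
        if word in index:
            tokens.append(index[word])    # pointer to an existing dictionary entry
        else:
            index[word] = len(dictionary)
            dictionary.append(word)
            tokens.append(word)           # literal running text
    return dictionary, tokens

def decompress(dictionary, tokens):
    return " ".join(t if isinstance(t, str) else dictionary[t] for t in tokens)

text = "the cat sat on the mat and the cat slept"
dictionary, tokens = compress(text)
assert decompress(dictionary, tokens) == text
print(tokens)
```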

  14. Comparison of laser transit anemometry data analysis techniques

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Gartrell, Luther R.

    1991-01-01

    Two techniques for the extraction of two-dimensional flow information from laser transit anemometry (LTA) data sets are presented and compared via a simulation study and experimental investigation. The methods are a probability density function (PDF) estimation technique and a marginal distribution analysis technique. The simulation study builds on the results of previous work and provides a quantification of the accuracy of both techniques for various LTA data acquisition scenarios. The experimental comparison consists of using an LTA system to survey the flow downstream of a turbulence generator in a small low-speed wind tunnel. The collected data sets are analyzed and compared.

  15. Model-checking techniques based on cumulative residuals.

    PubMed

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
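
    The flavor of the method can be illustrated with the hedged Python sketch below: residuals of a deliberately misspecified linear fit are cumulated over the sorted covariate, and the observed supremum is compared against null realizations generated by random sign flips. This is a simplified stand-in for the zero-mean Gaussian-process approximation developed in the paper, not the authors' procedure.

```python
import numpy as np

def cumulative_residual_process(x, resid):
    """Cumulative sum of residuals ordered by the covariate x (scaled by sqrt(n))."""
    order = np.argsort(x)
    return np.cumsum(resid[order]) / np.sqrt(len(resid))

rng = np.random.default_rng(3)
n = 200
x = rng.uniform(0, 3, n)
y = 1.0 + 0.5 * x ** 2 + rng.normal(scale=0.3, size=n)   # true relationship is quadratic

b1, b0 = np.polyfit(x, y, 1)                             # fit a misspecified linear model
resid = y - (b0 + b1 * x)

observed = np.max(np.abs(cumulative_residual_process(x, resid)))
# Null realizations: residuals randomly re-signed, mimicking absence of a systematic trend.
null = [np.max(np.abs(cumulative_residual_process(x, resid * rng.choice([-1, 1], n))))
        for _ in range(500)]
p_value = np.mean(np.array(null) >= observed)
print("sup|W| =", round(observed, 2), " approximate p =", round(p_value, 3))
```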

  16. Investigations on landmine detection by neutron-based techniques.

    PubMed

    Csikai, J; Dóczi, R; Király, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of an anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m²/1.5 min; the reflection cross section of thermal neutrons rendered the determination of the equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and, through them, the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires investigations of the angular distribution of the 6.13 MeV gamma-ray emitted in the ¹⁶O(n,n'γ) reaction. These results, in addition to the identification of landmines, render the improvement of the non-intrusive neutron methods possible.

  17. Techniques for the Analysis of Extracellular Vesicles Using Flow Cytometry

    PubMed Central

    Inglis, Heather; Norris, Philip; Danesh, Ali

    2015-01-01

    Extracellular Vesicles (EVs) are small, membrane-derived vesicles found in bodily fluids that are highly involved in cell-cell communication and help regulate a diverse range of biological processes. Analysis of EVs using flow cytometry (FCM) has been notoriously difficult due to their small size and lack of discrete populations positive for markers of interest. Methods for EV analysis, while considerably improved over the last decade, are still a work in progress. Unfortunately, there is no one-size-fits-all protocol, and several aspects must be considered when determining the most appropriate method to use. Presented here are several different techniques for processing EVs and two protocols for analyzing EVs using either individual detection or a bead-based approach. The methods described here will assist with eliminating the antibody aggregates commonly found in commercial preparations, increasing signal–to-noise ratio, and setting gates in a rational fashion that minimizes detection of background fluorescence. The first protocol uses an individual detection method that is especially well suited for analyzing a high volume of clinical samples, while the second protocol uses a bead-based approach to capture and detect smaller EVs and exosomes. PMID:25867010

  18. Specimen preparation and image processing and analysis techniques for automated quantification of concrete microcracks and voids

    SciTech Connect

    Soroushian, Parviz; Elzafraney, Mohamed; Nossoni, Ali

    2003-12-01

    Specimen preparation and image processing/analysis techniques were developed for use in automated quantitative microstructural investigation of concrete, focusing on concrete microcracks and voids. Different specimen preparation techniques were developed for use in fluorescent and scanning electron microscopy (SEM) of concrete; these techniques produce a sharp contrast between microcracks/voids and the body of concrete. The image processing/analysis techniques developed specifically for use with concrete address the following: automatic thresholding; separation of intersecting microcracks/voids and connected voids; distinction of microcracks from voids based on geometric attributes; and noise filtration.

  19. A New MRI-Based Pediatric Subcortical Segmentation Technique (PSST).

    PubMed

    Loh, Wai Yen; Connelly, Alan; Cheong, Jeanie L Y; Spittle, Alicia J; Chen, Jian; Adamson, Christopher; Ahmadzai, Zohra M; Fam, Lillian Gabra; Rees, Sandra; Lee, Katherine J; Doyle, Lex W; Anderson, Peter J; Thompson, Deanne K

    2016-01-01

    Volumetric and morphometric neuroimaging studies of the basal ganglia and thalamus in pediatric populations have utilized existing automated segmentation tools including FIRST (Functional Magnetic Resonance Imaging of the Brain's Integrated Registration and Segmentation Tool) and FreeSurfer. These segmentation packages, however, are mostly based on adult training data. Given that there are marked differences between the pediatric and adult brain, it is likely an age-specific segmentation technique will produce more accurate segmentation results. In this study, we describe a new automated segmentation technique for analysis of 7-year-old basal ganglia and thalamus, called Pediatric Subcortical Segmentation Technique (PSST). PSST consists of a probabilistic 7-year-old subcortical gray matter atlas (accumbens, caudate, pallidum, putamen and thalamus) combined with a customized segmentation pipeline using existing tools: ANTs (Advanced Normalization Tools) and SPM (Statistical Parametric Mapping). The segmentation accuracy of PSST in 7-year-old data was compared against FIRST and FreeSurfer, relative to manual segmentation as the ground truth, utilizing spatial overlap (Dice's coefficient), volume correlation (intraclass correlation coefficient, ICC) and limits of agreement (Bland-Altman plots). PSST achieved spatial overlap scores ≥90% and ICC scores ≥0.77 when compared with manual segmentation, for all structures except the accumbens. Compared with FIRST and FreeSurfer, PSST showed higher spatial overlap (p_FDR < 0.05) and ICC scores, with less volumetric bias according to Bland-Altman plots. PSST is a customized segmentation pipeline with an age-specific atlas that accurately segments typical and atypical basal ganglia and thalami at age 7 years, and has the potential to be applied to other pediatric datasets.
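
    For reference, the spatial-overlap metric used above (Dice's coefficient) is simple to compute; the sketch below assumes two binary NumPy masks and is not part of the PSST pipeline itself.

```python
import numpy as np

def dice_coefficient(auto_mask, manual_mask):
    """Dice's coefficient 2|A∩B| / (|A|+|B|) between two binary segmentation masks."""
    a = np.asarray(auto_mask, dtype=bool)
    b = np.asarray(manual_mask, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy usage: two overlapping rectangular "thalamus" masks.
auto = np.zeros((50, 50), dtype=bool);   auto[10:30, 10:30] = True
manual = np.zeros((50, 50), dtype=bool); manual[12:32, 12:32] = True
print("Dice =", round(dice_coefficient(auto, manual), 3))
```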

  20. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    PubMed Central

    Al-Kadi, Mahmoud I.; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-01-01

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provide an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device. PMID:23686141

  1. Separation/Preconcentration Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Hu, Bin; He, Man; Chen, Beibei; Jiang, Zucheng

    2016-10-01

    The title of this chapter exactly characterizes its contribution. The analytical chemistry of the rare earth elements (REEs) is very often highly complicated, and the determination of a specific element is impossible without sample pre-concentration. Sample preparation can be carried out either by separation of the REEs from the matrix or by concentrating the REEs. The separation of REEs from each other is mainly made by chromatography. At the beginning of REE analysis, the method of precipitation/coprecipitation was applied for the treatment of REE mixtures; the method is not applicable for the separation of trace amounts of REEs. The majority of the methods used are based on the distribution of REEs in a two-phase system, either liquid-liquid or liquid-solid. Various techniques have been developed for liquid-liquid extraction (LLE), in particular liquid phase micro-extraction. The extraction is always combined with a pre-concentration of the REEs in a single drop of extractant or in a hollow fiber filled with the extractant. Further modified techniques for special applications and for difficult REE separations have been developed. Compared to LLE, solid phase micro-extraction is preferred: the method is robust and easy to handle, and the solid phase loaded with the REEs can be used directly in subsequent determination methods. At present, new solid materials, such as nanotubes, are being developed and tested for solid phase extraction.

  2. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    PubMed

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provide an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.

  3. The detection of bulk explosives using nuclear-based techniques

    SciTech Connect

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  4. Metrology optical power budgeting in SIM using statistical analysis techniques

    NASA Astrophysics Data System (ADS)

    Kuan, Gary M.

    2008-07-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arc second resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be bookkept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
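
    The statistical budgeting idea can be sketched as a small Monte Carlo exercise in Python: each loss mechanism is drawn as a random attenuation in dB and the confidence of meeting a required optical power is estimated. All values and distributions below are invented placeholders, not SIM numbers.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
p_source_dbm = 0.0      # assumed launched power (dBm), placeholder
required_dbm = -30.0    # assumed minimum power needed at the detector (dBm), placeholder

# Each mechanism modeled as a mean loss with an uncertainty (dB); illustrative values only.
losses_db = (
    rng.normal(6.0, 0.5, N) +   # design / splitting efficiency
    rng.normal(3.0, 0.8, N) +   # material attenuation and aging
    rng.normal(2.0, 1.0, N) +   # element misalignment
    rng.normal(1.5, 0.5, N) +   # diffraction
    rng.normal(4.0, 1.2, N)     # fiber coupling efficiency
)
delivered_dbm = p_source_dbm - losses_db
confidence = np.mean(delivered_dbm >= required_dbm)
margin_db = np.percentile(delivered_dbm, 5) - required_dbm
print(f"confidence of meeting the requirement: {confidence:.3f}")
print(f"power margin at 95% confidence       : {margin_db:.1f} dB")
```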

  5. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arc second resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be bookkept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.

  6. Basic Sequence Analysis Techniques for Use with Audit Trail Data

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2008-01-01

    Audit trail analysis can provide valuable insights to researchers and evaluators interested in comparing and contrasting designers' expectations of use and students' actual patterns of use of educational technology environments (ETEs). Sequence analysis techniques are particularly effective but have been neglected to some extent because of real…

  7. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  8. Considerations and Techniques for the Analysis of NAEP Data.

    ERIC Educational Resources Information Center

    Johnson, Eugene G.

    1989-01-01

    The effects of certain characteristics (e.g., sample design) of National Assessment of Educational Progress (NAEP) data on statistical analysis techniques are considered. Ignoring special features of NAEP data and proceeding with a standard analysis can produce inferences that underestimate the true variability and overestimate the true degrees of…

  9. Considerations and Techniques for the Analysis of NAEP Data.

    ERIC Educational Resources Information Center

    Johnson, Eugene

    The special characteristics of the data from the National Assessment of Educational Progress (NAEP) that affect the validity of conventional techniques of statistical analysis are considered. In contrast to the assumptions underlying standard methods of statistical analysis, the NAEP samples are obtained via a stratified multi-stage probability…

  10. TU-EF-BRD-02: Indicators and Technique Analysis

    SciTech Connect

    Carlone, M.

    2015-06-15

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such a system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this the type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  11. Determination of Volatile Organic Compounds in the Atmosphere Using Two Complementary Analysis Techniques.

    PubMed

    Alonso, L; Durana, N; Navazo, M; García, J A; Ilardia, J L

    1999-08-01

    During a preliminary field campaign of volatile organic compound (VOC) measurements carried out in an urban area, two complementary analysis techniques were applied to establish the technical and scientific bases for a strategy to monitor and control VOCs and photochemical oxidants in the Autonomous Community of the Basque Country. Integrated sampling was conducted using Tenax sorbent tubes and laboratory analysis by gas chromatography, and grab sampling and in situ analysis also were conducted using a portable gas chromatograph. With the first technique, monocyclic aromatic hydrocarbons appeared as the compounds with the higher mean concentrations. The second technique allowed the systematic analysis of eight chlorinated and aromatic hydrocarbons. Results of comparing both techniques, as well as the additional information obtained with the second technique, are included.

  12. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-02-28

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.
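
    One way to picture a hierarchical Bayesian treatment of source-to-source variability is the Gamma-Poisson sketch below, in which event rates from several data sources share a common prior whose parameter is itself sampled by a small Gibbs sampler. The model structure, priors, and data are illustrative assumptions only, not the model proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
# Observed event counts x over exposure times t (years) for four data sources.
x = np.array([2, 0, 5, 1])
t = np.array([10.0, 6.0, 20.0, 8.0])

alpha = 2.0        # fixed shape of the Gamma prior on each source rate (assumption)
a0, b0 = 1.0, 1.0  # Gamma hyperprior on the shared rate parameter beta (assumption)
beta = 1.0
samples = []
for it in range(5000):
    # lambda_i | beta, data ~ Gamma(alpha + x_i, rate = beta + t_i)
    lam = rng.gamma(alpha + x, 1.0 / (beta + t))
    # beta | lambda ~ Gamma(a0 + K*alpha, rate = b0 + sum(lambda))
    beta = rng.gamma(a0 + alpha * len(x), 1.0 / (b0 + lam.sum()))
    if it >= 1000:                       # discard burn-in
        samples.append(lam)
post = np.array(samples)
print("posterior mean rate per source:", post.mean(axis=0).round(3))
print("95% interval for source 1:", np.percentile(post[:, 0], [2.5, 97.5]).round(3))
```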

  13. Initial planetary base construction techniques and machine implementation

    NASA Technical Reports Server (NTRS)

    Crockford, William W.

    1987-01-01

    Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.

  14. Geotechnical Analysis of Paleoseismic Shaking Using Liquefaction Features: Part I. Major Updating of Analysis Techniques

    USGS Publications Warehouse

    Olson, Scott M.; Green, Russell A.; Obermeier, Stephen F.

    2003-01-01

    A new methodology is proposed for the geotechnical analysis of strength of paleoseismic shaking using liquefaction effects. The proposed method provides recommendations for selection of both individual and regionally located test sites, techniques for validation of field data for use in back-analysis, and use of a recently developed energy-based solution to back-calculate paleoearthquake magnitude and strength of shaking. The proposed method allows investigators to assess the influence of post-earthquake density change and aging. The proposed method also describes how the back-calculations from individual sites should be integrated into a regional assessment of paleoseismic parameters.

  15. Damage detection technique by measuring laser-based mechanical impedance

    SciTech Connect

    Lee, Hyeonseok; Sohn, Hoon

    2014-02-18

    This study proposes a method for measurement of mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM) since mechanical impedance is sensitive even to small-sized structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, which show deteriorated performance induced by the effects of a) Curie temperature limitations, b) electromagnetic interference, c) bonding layers, etc. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by shooting the pulse laser beam at the target structure, and is acquired by measuring the out-of-plane velocity using a laser vibrometer. The formation of the LMI response is observed through a thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen in a high-temperature environment.

  16. Crystallographic texture analysis of archaeological metals: interpretation of manufacturing techniques

    NASA Astrophysics Data System (ADS)

    Artioli, G.

    2007-12-01

    Neutron probes and high energy X-rays are sources of primary importance for the non-invasive characterization of materials related to cultural heritage. Their employment in the characterization of archaeological metal objects, combined with the recent instrumental and computational developments in the field of crystallographic texture analysis (CTA) from diffraction data proves to be a powerful tool for the interpretation of ancient metal working techniques. Diffraction based CTA, when performed using penetrating probes and adequate detector coverage of reciprocal space, for example using large detector arrays and/or ToF mode, allows simultaneous identification and quantification of crystalline phases, besides the microstructural and textural characterization of the object, and it can be effectively used as a totally non-invasive tool for metallographic analysis. Furthermore, the chemical composition of the object may also be obtained by the simultaneous detection of prompt gamma rays induced by neutron activation, or by the fluorescence signal from high energy X-rays, in order to obtain a large amount of complementary information in a single experiment. The specific application of neutron CTA to the characterization of the manufacturing processes of prehistoric copper axes is discussed in detail.

  17. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.

  18. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy was analyzed via simulation and found to be significant. Results of employing conventional as well as special, unresolved object, processing techniques were disappointing due, at least in part, to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.

  19. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is

  20. Analysis techniques used on field degraded photovoltaic modules

    SciTech Connect

    Hund, T.D.; King, D.L.

    1995-09-01

    Sandia National Laboratory's PV System Components Department performs comprehensive failure analysis of photovoltaic modules after extended field exposure at various sites around the world. A full spectrum of analytical techniques is used to help identify the causes of degradation. The techniques are used to make solder fatigue life predictions for PV concentrator modules, identify cell damage or current mismatch, and measure the adhesive strength of the module encapsulant.

  1. Efficient Plant Supervision Strategy Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; Rolle, Jose Luis Calvo; Castelo, Francisco Javier Perez

    Most nonlinear type-one and type-two control systems suffer from a lack of detectability when model-based techniques are applied to FDI (fault detection and isolation) tasks. In general, all types of processes suffer from a lack of detectability, also because of the ambiguity in discriminating among the process, sensors and actuators in order to isolate any given fault. This work deals with a strategy to detect and isolate faults that combines massive neural-network-based functional approximation procedures with recursive rule-based techniques applied to a parity space approach.

  2. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also presented are methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  3. Diode laser based water vapor DIAL using modulated pulse technique

    NASA Astrophysics Data System (ADS)

    Pham, Phong Le Hoai; Abo, Makoto

    2014-11-01

    In this paper, we propose a diode laser based differential absorption lidar (DIAL) for measuring lower-tropospheric water vapor profiles using the modulated pulse technique. The transmitter is based on a single-mode diode laser and a tapered semiconductor optical amplifier with a peak power of 10 W around the 800 nm absorption band, and the receiver telescope diameter is 35 cm. The selected wavelengths are compared to reference wavelengths in terms of random and systematic errors. The key component of the modulated pulse technique, a macropulse, is generated with a repetition rate of 10 kHz, and the modulation within the macropulse is coded according to a pseudorandom sequence with a 100 ns chip width. As a result, we evaluate both the single pulse modulation and the pseudorandom coded pulse modulation techniques. The water vapor profiles derived from these modulation techniques are compared to real observational data from summer in Japan.
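
    The pseudorandom code underlying the macropulse modulation can be illustrated with a maximal-length sequence from a linear-feedback shift register; the PRBS-7 register and taps below are an illustrative choice, with only the 100 ns chip width taken from the abstract.

```python
import numpy as np

def prbs7():
    """Maximal-length (length-127) sequence from a 7-bit LFSR, taps at bits 7 and 6."""
    state = 0x7F
    bits = []
    for _ in range(127):
        newbit = ((state >> 6) ^ (state >> 5)) & 1
        bits.append(state & 1)
        state = ((state << 1) | newbit) & 0x7F
    return np.array(bits)

chip_width_s = 100e-9            # 100 ns chip width, as stated in the abstract
code = prbs7()                   # 0/1 on-off code applied within the macropulse
macropulse_length_s = len(code) * chip_width_s
print("chips:", len(code), " ones:", int(code.sum()),
      " macropulse length: %.2f us" % (macropulse_length_s * 1e6))
```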

  4. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
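
    A few of the simpler measures from the statistical and informational domains mentioned above can be sketched as follows; the metric choices and the synthetic beat-to-beat series are illustrative assumptions, and the review itself covers more than 70 techniques.

```python
import numpy as np

def variability_metrics(series, bins=16):
    """Basic variability measures of a time series (e.g. R-R intervals in ms)."""
    s = np.asarray(series, dtype=float)
    sd = s.std(ddof=1)                          # statistical domain: standard deviation
    rmssd = np.sqrt(np.mean(np.diff(s) ** 2))   # short-term (beat-to-beat) variability
    cv = sd / s.mean()                          # coefficient of variation
    counts, _ = np.histogram(s, bins=bins)
    p = counts[counts > 0] / counts.sum()
    entropy = -np.sum(p * np.log2(p))           # informational domain: Shannon entropy (bits)
    return {"SD": sd, "RMSSD": rmssd, "CV": cv, "entropy_bits": entropy}

rng = np.random.default_rng(6)
rr = 800 + 40 * rng.standard_normal(500)        # synthetic R-R interval series, ms
print({k: round(v, 3) for k, v in variability_metrics(rr).items()})
```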

  5. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  6. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    SciTech Connect

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
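
    The kind of comparison described can be sketched with scikit-learn as below: classification accuracy versus number of retained dimensions for PCA against a simple correlation-ranked feature subset (a stand-in for the paper's sorted-covariance and CFS selectors), on synthetic data rather than object-code features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=60, n_informative=8, random_state=0)

def accuracy(features):
    """Cross-validated accuracy of a simple classifier on the reduced feature set."""
    return cross_val_score(GaussianNB(), features, y, cv=5).mean()

for k in (2, 4, 8, 16):
    pca_acc = accuracy(PCA(n_components=k).fit_transform(X))
    # Correlation ranking: keep the k features most correlated (in absolute value) with the label.
    corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    sel_acc = accuracy(X[:, np.argsort(corr)[::-1][:k]])
    print(f"k={k:2d}  PCA accuracy={pca_acc:.3f}  correlation-ranked accuracy={sel_acc:.3f}")
```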

  7. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    PubMed Central

    Almeida, Vânia G.; Vieira, João; Santos, Pedro; Pereira, Tânia; Pereira, H. Catarina; Correia, Carlos; Pego, Mariano; Cardoso, João

    2013-01-01

    The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most of APW analysis frameworks individually process each hemodynamic parameter and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from APW. With this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used in the data collection from 50 subjects; (2) the acquired position and amplitude of onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of some morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluate the trained models, using the majority-voting system, comparatively to the respective calculated Augmentation Index (AIx). Classification algorithms have been proved to be efficient, in particular Random Forest has shown good accuracy (96.95%) and high area under the curve (AUC) of a Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high risk labels, retrieved from the multi-parametric approach, and positive AIx values was verified. This approach gives allowance for designing new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving the arterial pulse understanding, especially when compared to traditional single-parameter analysis, where the failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation. PMID
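
    The classification stage on vectorized APW morphology features can be sketched with scikit-learn as below; the feature vectors and risk labels are random placeholders standing in for the onset, SP, Pi and DW attributes of the study, so only the workflow, not the result, is meaningful.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
# Placeholder feature vectors: [SP amplitude, Pi amplitude, DW amplitude, SP time, Pi time, DW time]
n = 50
X = rng.normal(size=(n, 6))
y = (X[:, 0] - 0.8 * X[:, 1] + 0.1 * rng.standard_normal(n) > 0).astype(int)  # low/high risk stand-in

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```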

  8. Emerging techniques for soil analysis via mid-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, and in particular: 1. Attenuated total reflectance (ATR) Attenuated total reflectance is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range as well as the absorbance of some soil constituents (e.g., calcium carbonate) interfere with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species. 2. Photo-acoustic spectroscopy Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are

  9. Toner and paper-based fabrication techniques for microfluidic applications.

    PubMed

    Coltro, Wendell Karlos Tomazelli; de Jesus, Dosil Pereira; da Silva, José Alberto Fracassi; do Lago, Claudimir Lucio; Carrilho, Emanuel

    2010-08-01

    The interest in low-cost microfluidic platforms as well as emerging microfabrication techniques has increased considerably over the last few years. Toner- and paper-based techniques have appeared as two of the most promising platforms for the production of disposable devices for on-chip applications. This review focuses on recent advances in the fabrication techniques and in the analytical/bioanalytical applications of toner and paper-based devices. The discussion is divided into two parts dealing with (i) toner and (ii) paper devices. Examples of miniaturized devices fabricated by using direct-printing or toner transfer masking in polyester-toner, glass, PDMS, as well as conductive platforms such as recordable compact disks and printed circuit boards, are presented. The construction and the use of paper-based devices for off-site diagnosis and bioassays are also described to cover this emerging platform for low-cost diagnostics.

  10. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  11. Surface plasmon resonance based biosensor technique: a review.

    PubMed

    Guo, Xiaowei

    2012-07-01

    Optical surface plasmon resonance (SPR) biosensors represent the most advanced and developed optical label-free biosensor technology. Optical SPR biosensors are a powerful detection and analysis tool that has vast applications in environmental protection, biotechnology, medical diagnostics, drug screening, food safety and security. This article reviews the recent development of SPR biosensor techniques, including bulk SPR and localized SPR (LSPR) biosensors, for detecting interactions between an analyte of interest in solution and a biomolecular recognition element. The concepts of bulk and localized SPs and the working principles of both sensing techniques are introduced. Major sensing advances on biorecognition elements, measurement formats, and sensing platforms are presented. Finally, both biosensor techniques are discussed and compared.

  12. Laser-based direct-write techniques for cell printing.

    PubMed

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2010-09-01

    Fabrication of cellular constructs with spatial control of cell location (+/-5 microm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The preponderance of work to date has focused not on applications of the technique, but rather on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing.

  13. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The preponderance of work to date has focused not on applications of the technique, but rather on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  14. [THE COMPARATIVE ANALYSIS OF TECHNIQUES OF IDENTIFICATION OF CORYNEBACTERIUM NON DIPHTHERIAE].

    PubMed

    Kharseeva, G G; Voronina, N A; Mironov, A Yu; Alutina, E L

    2015-12-01

    A comparative analysis was carried out of the effectiveness of three techniques for the identification of Corynebacterium non diphtheriae: bacteriological, molecular genetic (16S rRNA sequencing) and mass-spectrometric (MALDI-ToF MS). The analysis covered 49 strains of Corynebacterium non diphtheriae (C. pseudodiphtheriticum, C. amycolatum, C. propinquum, C. falsenii) and 2 strains of Corynebacterium diphtheriae isolated under various pathologies of the urogenital tract and upper respiratory ways. The corynebacteria were identified using the bacteriological technique, 16S rRNA sequencing and the mass-spectrometric technique (MALDI-ToF MS). Full concordance of species identification across all three techniques was observed for 26 (51%) of the strains of Corynebacterium non diphtheriae; for 43 (84.3%) strains when the bacteriological technique was compared with 16S rRNA sequencing; and for 29 (57%) strains when mass-spectrometric analysis was compared with 16S rRNA sequencing. The bacteriological technique is effective for identification of Corynebacterium diphtheriae. For precise establishment of the species of corynebacteria with variable biochemical characteristics, the molecular genetic technique of analysis should be applied. The mass-spectrometric technique (MALDI-ToF MS) requires further updating of its databases to identify a larger spectrum of representatives of the genus Corynebacterium.

  15. Ivory species identification using electrophoresis-based techniques.

    PubMed

    Kitpipit, Thitika; Thanakiatkrai, Phuvadol; Penchart, Kitichaya; Ouithavon, Kanita; Satasook, Chutamas; Linacre, Adrian

    2016-12-01

    Despite continuous conservation efforts by national and international organizations, the populations of the three extant elephant species are still dramatically declining due to the illegal trade in ivory leading to the killing of elephants. A requirement to aid investigations and prosecutions is the accurate identification of the elephant species from which the ivory was removed. We report on the development of the first fully validated multiplex PCR-electrophoresis assay for ivory DNA analysis that can be used as a screening or confirmatory test. SNPs from the NADH dehydrogenase 5 and cytochrome b gene loci were identified and used in the development of the assay. The three extant elephant species could be identified based on three peaks/bands. Elephas maximus exhibited two distinct PCR fragments at approximately 129 and 381 bp; Loxodonta cyclotis showed two PCR fragments at 89 and 129 bp; and Loxodonta africana showed a single fragment of 129 bp. The assay correctly identified the elephant species for all 113 ivory and blood samples tested in this report. We also report on the high sensitivity and specificity of the assay. All single-blinded samples were correctly classified, which demonstrated the assay's ability to be used for real casework. In addition, the assay could be used in conjunction with the technique of direct amplification. We propose that the test will benefit wildlife forensic laboratories and aid in the transition to the criminal justice system.

  16. Magnetic separation techniques in sample preparation for biological analysis: a review.

    PubMed

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetism, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existing problems and possible future trends of magnetic separation techniques for biological analysis are discussed.

  17. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.
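
    A highly simplified sketch of the underlying idea, with hypothetical numbers and function names, is the consistency search below: each PRF yields the centroid only modulo that PRF, and the absolute value is taken as the candidate that best agrees with all of the modulo estimates (the actual algorithm derives a probability of correct resolution from the attitude and drift-rate bounds, which is not modeled here):

    ```python
    import numpy as np

    def resolve_centroid(frac_estimates, prfs, f_max):
        """Pick the absolute Doppler centroid most consistent with the
        modulo-PRF estimates obtained at several hopped PRFs (sketch only)."""
        k_max = int(np.ceil(f_max / prfs[0]))
        best, best_err = None, np.inf
        for k in range(-k_max, k_max + 1):
            candidate = frac_estimates[0] + k * prfs[0]   # one candidate per ambiguity number
            err = 0.0
            for f, prf in zip(frac_estimates, prfs):
                d = (candidate - f) % prf                 # wrap-around distance to each estimate
                err += min(d, prf - d) ** 2
            if err < best_err:
                best, best_err = candidate, err
        return best

    # Example: a true centroid of 3150 Hz observed modulo three hopped PRFs
    prfs = [1400.0, 1550.0, 1700.0]
    frac = [3150.0 % p for p in prfs]
    print(resolve_centroid(frac, prfs, f_max=6000.0))     # recovers ~3150 Hz
    ```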

  18. A comparative analysis of soft computing techniques for gene prediction.

    PubMed

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Besides its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided.

  19. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  20. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  1. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
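
    For the two-predictor case the decomposition behind commonality analysis can be written down directly; the sketch below (synthetic data, hypothetical coefficients) partitions the regression R² into the unique contribution of each predictor and their common portion:

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    def r2(X, y):
        """R-squared of an ordinary least-squares fit."""
        return LinearRegression().fit(X, y).score(X, y)

    rng = np.random.default_rng(0)
    n = 200
    x1 = rng.normal(size=n)
    x2 = 0.6 * x1 + rng.normal(size=n)              # correlated predictors
    y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)
    X = np.column_stack([x1, x2])

    r2_full = r2(X, y)
    r2_x1 = r2(X[:, [0]], y)
    r2_x2 = r2(X[:, [1]], y)

    unique_1 = r2_full - r2_x2                      # variance only x1 explains
    unique_2 = r2_full - r2_x1                      # variance only x2 explains
    common = r2_full - unique_1 - unique_2          # variance shared by x1 and x2
    print(unique_1, unique_2, common)
    ```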

  2. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  3. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  4. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.

  5. Recent Electrochemical and Optical Sensors in Flow-Based Analysis

    PubMed Central

    Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn

    2006-01-01

    Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen in this overview. A brief description of fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA) were reviewed.

  6. Canonical Analysis as a Generalized Regression Technique for Multivariate Analysis.

    ERIC Educational Resources Information Center

    Williams, John D.

    The use of characteristic coding (dummy coding) is made in showing solutions to four multivariate problems using canonical analysis. The canonical variates can be themselves analyzed by the use of multiple linear regression. When the canonical variates are used as criteria in a multiple linear regression, the R2 values are equal to 0, where 0 is…
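
    A minimal sketch of the idea, with hypothetical data and scikit-learn's CCA standing in for whatever software the article assumed, dummy-codes a grouping variable and correlates it with a multivariate criterion set:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    groups = rng.integers(0, 3, size=150)             # three-group membership variable
    dummies = np.eye(3)[groups][:, :2]                # characteristic (dummy) coding, last column dropped
    Y = rng.normal(size=(150, 4)) + groups[:, None]   # multivariate criteria shifted by group

    cca = CCA(n_components=2).fit(dummies, Y)
    U, V = cca.transform(dummies, Y)
    # squared correlations of the canonical variate pairs
    print([np.corrcoef(U[:, i], V[:, i])[0, 1] ** 2 for i in range(2)])
    ```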

  7. Projectile Base Flow Analysis

    DTIC Science & Technology

    2007-11-02


  8. A Novel Nanofabrication Technique of Silicon-Based Nanostructures.

    PubMed

    Meng, Lingkuan; He, Xiaobin; Gao, Jianfeng; Li, Junjie; Wei, Yayi; Yan, Jiang

    2016-12-01

    A novel nanofabrication technique which can produce highly controlled silicon-based nanostructures at wafer scale has been proposed using a simple amorphous silicon (α-Si) material as an etch mask. SiO2 nanostructures directly fabricated can serve as nanotemplates to transfer into the underlying substrates such as silicon, germanium, transistor gate, or other dielectric materials to form electrically functional nanostructures and devices. In this paper, two typical silicon-based nanostructures such as nanoline and nanofin have been successfully fabricated by this technique, demonstrating excellent etch performance. In addition, silicon nanostructures fabricated above can be further trimmed to less than 10 nm by combining them with assisted post-treatment methods. The novel nanofabrication technique is expected to become a new emerging technology with low process complexity and good compatibility with existing silicon integrated circuits, and it is an important step towards the easy fabrication of a wide variety of nanoelectronics, biosensors, and optoelectronic devices.

  9. Membrane-based microextraction techniques in analytical chemistry: A review.

    PubMed

    Carasek, Eduardo; Merib, Josias

    2015-06-23

    The use of membrane-based sample preparation techniques in analytical chemistry has gained growing attention from the scientific community since the development of miniaturized sample preparation procedures in the 1990s. The use of membranes makes the microextraction procedures more stable, allowing the determination of analytes in complex and "dirty" samples. This review describes some characteristics of classical membrane-based microextraction techniques (membrane-protected solid-phase microextraction, hollow-fiber liquid-phase microextraction and hollow-fiber renewal liquid membrane) as well as some alternative configurations (thin film and electromembrane extraction) used successfully for the determination of different analytes in a large variety of matrices; some critical points regarding each technique are highlighted.

  10. A Novel Nanofabrication Technique of Silicon-Based Nanostructures

    NASA Astrophysics Data System (ADS)

    Meng, Lingkuan; He, Xiaobin; Gao, Jianfeng; Li, Junjie; Wei, Yayi; Yan, Jiang

    2016-11-01

    A novel nanofabrication technique which can produce highly controlled silicon-based nanostructures at wafer scale has been proposed using a simple amorphous silicon (α-Si) material as an etch mask. SiO2 nanostructures directly fabricated can serve as nanotemplates to transfer into the underlying substrates such as silicon, germanium, transistor gate, or other dielectric materials to form electrically functional nanostructures and devices. In this paper, two typical silicon-based nanostructures such as nanoline and nanofin have been successfully fabricated by this technique, demonstrating excellent etch performance. In addition, silicon nanostructures fabricated above can be further trimmed to less than 10 nm by combining them with assisted post-treatment methods. The novel nanofabrication technique is expected to become a new emerging technology with low process complexity and good compatibility with existing silicon integrated circuits, and it is an important step towards the easy fabrication of a wide variety of nanoelectronics, biosensors, and optoelectronic devices.

  11. What Child Analysis Can Teach Us about Psychoanalytic Technique.

    PubMed

    Ablon, Steven Luria

    2014-01-01

    Child analysis has much to teach us about analytic technique. Children have an innate, developmentally driven sense of analytic process. Children in analysis underscore the importance of an understanding and belief in the therapeutic action of play, the provisional aspects of play, and that not all play will be understood. Each analysis requires learning a new play signature that is constantly reorganized. Child analysis emphasizes the emergence and integration of dissociated states, the negotiation of self-other relationships, the importance of co-creation, and the child's awareness of the analyst's sensibility. Child analysis highlights the robust nature of transference and how working through and repairing is related to the initiation of coordinated patterns of high predictability in the context of deep attachments. I will illustrate these and other ideas in the description of the analysis of a nine-year-old boy.

  12. Knowledge-based GIS techniques applied to geological engineering

    USGS Publications Warehouse

    Usery, E. Lynn; Altheide, Phyllis; Deister, Robin R.P.; Barr, David J.

    1988-01-01

    A knowledge-based geographic information system (KBGIS) approach which requires development of a rule base for both GIS processing and for the geological engineering application has been implemented. The rule bases are implemented in the Goldworks expert system development shell interfaced to the Earth Resources Data Analysis System (ERDAS) raster-based GIS for input and output. GIS analysis procedures including recoding, intersection, and union are controlled by the rule base, and the geological engineering map product is generated by the expert system. The KBGIS has been used to generate a geological engineering map of Creve Coeur, Missouri.
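
    A toy sketch of how such a rule base can drive raster GIS operations is shown below; the layers, codes and the suitability rule are hypothetical and stand in for the Goldworks/ERDAS combination described above:

    ```python
    import numpy as np

    # Hypothetical raster layers (numeric codes) standing in for the GIS inputs
    slope = np.array([[3, 12, 25], [8, 18, 5], [30, 2, 9]])     # percent slope
    soil = np.array([[1, 2, 1], [1, 1, 2], [2, 1, 1]])          # 1 = stable, 2 = expansive
    flood = np.array([[0, 0, 1], [0, 1, 0], [0, 0, 0]])         # 1 = inside floodplain

    # Rule base (assumed, for illustration): a cell is suitable for foundations
    # if slope < 15 %, the soil is stable, and it lies outside the floodplain.
    suitable = (slope < 15) & (soil == 1) & (flood == 0)        # intersection of rule conditions

    engineering_map = np.where(suitable, 1, 0)                  # recode into output map classes
    print(engineering_map)
    ```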

  13. Design, data analysis and sampling techniques for clinical research.

    PubMed

    Suresh, Karthik; Thomas, Sanjeev V; Suresh, Geetha

    2011-10-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inference from their data analysis. Improper study design and data analysis may yield insufficient or misleading results and conclusions. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design and then back-translating the statistical results into relevant medical knowledge is a real challenge. This article explains various sampling methods that can be appropriately used in medical research with different scenarios and challenges.

  14. Environmental Immunoassays: Alternative Techniques for Soil and Water Analysis

    USGS Publications Warehouse

    Aga, D.S.; Thurman, E.M.

    1996-01-01

    Analysis of soil and water samples for environmental studies and compliance testing can be formidable, time consuming, and costly. As a consequence, immunochemical techniques have become popular for environmental analysis because they are reliable, rapid, and cost effective. During the past 5 years, the use of immunoassays for environmental monitoring has increased substantially, and their use as an integral analytical tool in many environmental laboratories is now commonplace. This chapter will present the basic concept of immunoassays, recent advances in the development of immunochemical methods, and examples of successful applications of immunoassays in environmental analysis.

  15. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    NASA Astrophysics Data System (ADS)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation of protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. The analysis of these phosphorylation cascades will provide new insights into their physiological roles in many biological processes. Unfortunately, existing methods are of limited use for analyzing this cascade activity. Therefore, we suggest a microfluidic isoelectric focusing technique (μIEF) for the analysis of the cascade activity. Using the technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. In addition, the inhibition assay for kinase activity and the analysis on a real sample have also been conducted. The results indicate that μIEF is an excellent means for studies on phosphorylation cascade activity.

  16. Optical supervised filtering technique based on Hopfield neural network

    NASA Astrophysics Data System (ADS)

    Bal, Abdullah

    2004-11-01

    Hopfield neural network is commonly preferred for optimization problems. In image segmentation, conventional Hopfield neural networks (HNN) are formulated as a cost-function-minimization problem to perform gray level thresholding on the image histogram or the pixels' gray levels arranged in a one-dimensional array [R. Sammouda, N. Niki, H. Nishitani, Pattern Rec. 30 (1997) 921-927; K.S. Cheng, J.S. Lin, C.W. Mao, IEEE Trans. Med. Imag. 15 (1996) 560-567; C. Chang, P. Chung, Image and Vision comp. 19 (2001) 669-678]. In this paper, a new high speed supervised filtering technique is proposed for image feature extraction and enhancement problems by modifying the conventional HNN. The essential improvement in this technique is to use a 2D convolution operation instead of weight-matrix multiplication. Thereby, a new neural-network-based filtering technique has been obtained that requires just a 3 × 3 filter mask matrix instead of a large weight coefficient matrix. Optical implementation of the proposed filtering technique is executed easily using the joint transform correlator. The requirement of non-negative data for optical implementation is met by a bias technique that converts the bipolar data to non-negative data. Simulation results of the proposed optical supervised filtering technique are reported for various feature extraction problems such as edge detection, corner detection, horizontal and vertical line extraction, and fingerprint enhancement.
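
    The core substitution, replacing the large weight-coefficient matrix with a small convolution mask, can be sketched as follows (hypothetical mask and image; the optical joint-transform-correlator implementation is not modeled):

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    rng = np.random.default_rng(1)
    image = rng.random((64, 64))                      # stand-in input image, values in [0, 1]

    # Hypothetical 3 x 3 mask replacing the large Hopfield weight-coefficient matrix
    mask = np.array([[-1., -1., -1.],
                     [-1.,  8., -1.],
                     [-1., -1., -1.]])                # Laplacian-style edge-detection mask

    response = convolve(image, mask, mode="nearest")  # 2D convolution instead of weight-matrix multiplication
    edges = (response > 0).astype(float)              # simple thresholded feature map
    ```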

  17. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    PubMed Central

    Ibrahim, Mohamed M.; Abdel Kader, Neamat S.; Zorkany, M.

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In the nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570
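
    A minimal, hedged sketch of the embedding side, using PyWavelets for the three-level DWT and a basic Arnold cat map for scrambling (frame size, watermark size and the embedding strength alpha are assumptions, and only a grayscale single-frame case is shown):

    ```python
    import numpy as np
    import pywt

    def arnold_scramble(block, iterations=5):
        """Arnold cat-map scrambling of a square array (the encryption step)."""
        n = block.shape[0]
        out = block.copy()
        for _ in range(iterations):
            scrambled = np.empty_like(out)
            for x in range(n):
                for y in range(n):
                    scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
            out = scrambled
        return out

    def embed_watermark(host, watermark, alpha=0.05, wavelet="haar", level=3):
        """Embed a scrambled watermark additively into the level-3 approximation band."""
        coeffs = pywt.wavedec2(host.astype(float), wavelet, level=level)
        approx = coeffs[0]
        scrambled = arnold_scramble(watermark.astype(float))
        coeffs[0] = approx + alpha * np.resize(scrambled, approx.shape)  # crude size match
        return pywt.waverec2(coeffs, wavelet)

    rng = np.random.default_rng(0)
    host = rng.random((256, 256))               # stand-in for one grayscale video frame
    watermark = rng.integers(0, 2, (32, 32))    # stand-in binary watermark
    marked = embed_watermark(host, watermark)
    ```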

  18. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In the nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.

  19. Practical applications of activation analysis and other nuclear techniques

    SciTech Connect

    Lyon, W S

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity measured. This measurement of gamma rays emitted from specific radionuclides makes possible the quantitative determination of elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle induced x-ray emission and synchrotron produced x-ray fluorescence, are also briefly discussed.
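
    The quantitative basis of the method is the standard activation equation, stated here in hedged textbook form (the symbols below are not taken from the abstract itself):

    $$ A = \frac{m\,N_A\,\theta}{M}\,\sigma\,\phi\,\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)\,e^{-\lambda t_{\mathrm{d}}} $$

    where A is the induced activity of the product radionuclide, m the mass of the element in the sample, N_A Avogadro's number, θ the isotopic abundance of the target isotope, M its atomic mass, σ the activation cross-section, φ the neutron flux, λ the decay constant, and t_irr and t_d the irradiation and decay times. Measuring A from a characteristic gamma-ray peak therefore yields the elemental mass m.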

  20. Machinery Diagnostics Via Mechanical Vibration Analysis using Spectral Analysis Techniques

    DTIC Science & Technology

    1988-09-01


  1. Matrix Factorization Techniques for Analysis of Imaging Mass Spectrometry Data

    PubMed Central

    Siy, Peter W.; Moffitt, Richard A.; Parry, R. Mitchell; Chen, Yanfeng; Liu, Ying; Sullards, M. Cameron; Merrill, Alfred H.; Wang, May D.

    2016-01-01

    Imaging mass spectrometry is a method for understanding the molecular distribution in a two-dimensional sample. This method is effective for a wide range of molecules, but generates a large amount of data. It is difficult to extract important information from these large datasets manually and automated methods for discovering important spatial and spectral features are needed. Independent component analysis and non-negative matrix factorization are explained and explored as tools for identifying underlying factors in the data. These techniques are compared and contrasted with principal component analysis, the more standard analysis tool. Independent component analysis and non-negative matrix factorization are found to be more effective analysis methods. A mouse cerebellum dataset is used for testing.
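
    The contrast between non-negative matrix factorization and PCA can be sketched as follows (synthetic non-negative data standing in for the pixels-by-m/z matrix; component counts and parameters are assumptions):

    ```python
    import numpy as np
    from sklearn.decomposition import NMF, PCA

    # Hypothetical data matrix: one row per pixel, one column per m/z channel,
    # with non-negative intensities as in imaging mass spectrometry
    rng = np.random.default_rng(0)
    X = rng.poisson(5.0, size=(1000, 300)).astype(float)

    nmf = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
    abundances = nmf.fit_transform(X)     # per-pixel loadings of the 5 factors (non-negative)
    factor_spectra = nmf.components_      # 5 factor spectra, directly readable as spectra

    pca = PCA(n_components=5)
    scores = pca.fit_transform(X)         # PCA scores and loadings may be negative,
    loadings = pca.components_            # which makes them harder to interpret as real spectra
    ```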

  2. Golden glazes analysis by PIGE and PIXE techniques

    NASA Astrophysics Data System (ADS)

    Fonseca, M.; Luís, H.; Franco, N.; Reis, M. A.; Chaves, P. C.; Taborda, A.; Cruz, J.; Galaviz, D.; Fernandes, N.; Vieira, P.; Ribeiro, J. P.; Jesus, A. P.

    2011-12-01

    We present the analysis performed on the chemical composition of two golden glazes available in the market using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis on thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  3. Comparison of laser-based rapid prototyping techniques

    NASA Astrophysics Data System (ADS)

    Humphreys, Hugh; Wimpenny, David

    2002-04-01

    A diverse range of Rapid Prototyping, or layer manufacturing techniques have evolved since the introduction of the first process in the late 1980s. Many, although not all, rapid prototyping processes rely on lasers to provide a localised and controllable source of light for curing a liquid photopolymer or heat to fuse thermoplastic powders to form objects. This paper will provide an overview of laser based rapid prototyping methods and discuss the future direction of this technology in light of the threats posed by low cost 3D printing techniques and the opportunity for the direct manufacture of metal components.

  4. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
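
    A heavily simplified sketch of the probabilistic idea, with a hypothetical power model and made-up input uncertainties in place of the SPACE model, propagates the uncertainties by Monte Carlo sampling:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    # Hypothetical uncertain inputs for a toy power-capability model (illustrative only)
    solar_flux = rng.normal(1367.0, 10.0, n)      # W/m^2
    array_eff = rng.normal(0.28, 0.01, n)         # photovoltaic conversion efficiency
    array_area = 75.0                             # m^2, treated as known
    degradation = rng.uniform(0.90, 0.98, n)      # lifetime degradation factor
    line_loss = rng.normal(0.95, 0.01, n)         # distribution efficiency

    power = solar_flux * array_eff * array_area * degradation * line_loss

    print(f"mean = {power.mean():.0f} W, "
          f"5th-95th percentile = {np.percentile(power, [5, 95])}")
    ```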

  5. Development of the variational SEASAT data analysis technique

    NASA Technical Reports Server (NTRS)

    Sasaki, Y. K.; Chang, L. P.; Goerss, J. S.

    1985-01-01

    Surface winds are closely associated with the surface pressure gradient. The variational SEASAT data analysis technique was designed to improve the sea level pressure analysis in the data sparse areas. The SEASAT-derived surface wind data were compared with observations from the Joint Air Sea Interaction Experiment (JASIN) and it was found that the satellite-derived sea surface wind has an accuracy of up to ±2 m/s in speed and ±20 deg in direction. These numbers are considered characteristic of the retrieved SEASAT wind field. By combining the densely spaced SEASAT-derived wind data with the sparsely distributed sea-level pressure observation via a variational adjustment technique subject to some appropriate physical constraint(s), an improvement in the sea-level pressure analysis is expected. It is demonstrated that a simple marine boundary layer scheme in conjunction with a variational adjustment technique can be developed to help improve the sea-level pressure analysis by the SEASAT-derived wind of a limited-area domain in the ocean.

  6. Development of a variational SEASAT data analysis technique

    NASA Technical Reports Server (NTRS)

    Sasaki, Y. K.; Chang, L. P.; Goerss, J. S.

    1986-01-01

    Oceans are data-sparse areas in terms of conventional weather observations. The surface pressure field obtained solely by analyzing the conventional weather data is not expected to possess high accuracy. On the other hand, in entering asynoptic data such as satellite-derived temperature soundings into an atmospheric prediction system, an improved surface analysis is crucial for obtaining more accurate weather predictions because the mass distribution of the entire atmosphere will be better represented in the system as a result of the more accurate surface pressure field. In order to obtain improved surface pressure analyses over the oceans, a variational adjustment technique was developed to help blend the densely distributed surface wind data derived from the SEASAT-A radar observations into the sparsely distributed conventional pressure data. A simple marine boundary layer scheme employed in the adjustment technique was discussed. In addition, a few aspects of the current technique were determined by numerical experiments.

  7. Large areas elemental mapping by ion beam analysis techniques

    NASA Astrophysics Data System (ADS)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-rays Emission (PIXE) measurements, a Gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results, using a large (60-cm range) XYZ computer controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sorts of samples. Due to its particular characteristics, this is a unique device in the sense of multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with the traditional in-vacuum ion-beam-analysis with the advantage of automatic rastering.

  8. Graphene-based terahertz photodetector by noise thermometry technique

    SciTech Connect

    Wang, Ming-Jye; Wang, Ji-Wun; Wang, Chun-Lun; Chiang, Yen-Yu; Chang, Hsian-Hong

    2014-01-20

    We report the characteristics of a graphene-based terahertz (THz) photodetector based on the noise thermometry technique, measured via its noise power at frequencies from 4 to 6 GHz. A hot electron system is generated in the graphene microbridge after THz photon pumping and creates extra noise power. The equivalent noise temperature and electron temperature increase rapidly in the low THz pumping regime and saturate gradually in the high THz power regime, which is attributed to a faster energy relaxation process caused by stronger electron-phonon interaction. Based on this detector, a conversion efficiency of around 0.15 from THz power to noise power in the 4–6 GHz span has been achieved.

  9. Flow analysis techniques as effective tools for the improved environmental analysis of organic compounds expressed as total indices.

    PubMed

    Maya, Fernando; Estela, José Manuel; Cerdà, Víctor

    2010-04-15

    The scope of this work is an overview of current state-of-the-art flow analysis techniques applied to the environmental determination of organic compounds expressed as total indices. Flow analysis techniques are proposed as effective tools for quickly obtaining preliminary chemical information about the occurrence of organic compounds in the environment prior to the use of more complex, time-consuming and expensive instrumental techniques. Recently improved flow-based methodologies for the determination of chemical oxygen demand, halogenated organic compounds and phenols are presented and discussed in detail. The aim of the present work is to highlight flow-based techniques as vanguard tools for the determination of organic compounds in environmental water samples.

  10. Optical center alignment technique based on inner profile measurement method

    NASA Astrophysics Data System (ADS)

    Wakayama, Toshitaka; Yoshizawa, Toru

    2014-05-01

    Center alignment is an important technique for tuning up the spindle of various precision machines in the manufacturing industry. Conventionally, a tool such as a dial indicator has been used to adjust and position the axis through the manual operations of a technician. However, it is not easy to precisely control its axis. In this paper, we develop an optical center alignment technique based on inner profile measurement using a ring beam device. In this case, the center position of the cylinder hole can be determined from the circular profile detected by the optical sectioning method using the ring beam device. In our trials, the resolution of the center position proved to be less than 10 micrometers in extreme cases. This technique is suitable for practical applications in the machine tool industry.
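
    One way the hole centre can be recovered from the optically sectioned ring profile is an algebraic least-squares circle fit; the sketch below uses synthetic profile points and the Kasa formulation (the paper's actual processing is not reproduced):

    ```python
    import numpy as np

    def fit_circle(x, y):
        """Algebraic (Kasa) least-squares circle fit: centre (cx, cy) and radius r."""
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        b = x ** 2 + y ** 2
        cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
        return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)

    # Synthetic profile: points on a 20 mm radius hole centred at (1.2, -0.8) mm, with noise
    theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
    rng = np.random.default_rng(0)
    x = 1.2 + 20.0 * np.cos(theta) + rng.normal(0.0, 0.02, theta.size)
    y = -0.8 + 20.0 * np.sin(theta) + rng.normal(0.0, 0.02, theta.size)
    print(fit_circle(x, y))   # ~ (1.2, -0.8, 20.0)
    ```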

  11. Copyright protection scheme based on chaos and secret sharing techniques

    NASA Astrophysics Data System (ADS)

    Lou, Der-Chyuan; Shieh, Jieh-Ming; Tso, Hao-Kuan

    2005-11-01

    A copyright protection scheme based on chaos and secret sharing techniques is proposed. Instead of modifying the original image to embed a watermark in it, the proposed scheme extracts a feature from the image first. Then, the extracted feature and the watermark are scrambled by a chaos technique. Finally, the secret sharing technique is used to construct a shadow image. The watermark can be retrieved by performing an XOR operation between the shadow images. The proposed scheme has the following advantages. Firstly, the watermark retrieval does not need the original image. Secondly, the scheme does not need to modify the original image for embedding the watermark. Thirdly, compared with several schemes, the scheme is secure and robust in resisting various attacks.
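
    A minimal sketch of the scheme's three ingredients, a binary image feature, logistic-map (chaotic) scrambling, and XOR-based sharing, is given below with hypothetical bit vectors standing in for the extracted feature and the watermark:

    ```python
    import numpy as np

    def logistic_permutation(n, x0=0.7, r=3.99):
        """Derive a scrambling order from a logistic-map (chaotic) sequence."""
        x, seq = x0, np.empty(n)
        for i in range(n):
            x = r * x * (1.0 - x)
            seq[i] = x
        return np.argsort(seq)

    rng = np.random.default_rng(0)
    feature = rng.integers(0, 2, size=1024, dtype=np.uint8)     # stand-in bits extracted from the image
    watermark = rng.integers(0, 2, size=1024, dtype=np.uint8)   # stand-in ownership watermark bits

    perm = logistic_permutation(watermark.size)                 # chaos-based scrambling order
    shadow = np.bitwise_xor(feature, watermark[perm])           # shadow image (the stored share)

    # Retrieval: XOR the shadow with the re-extracted feature, then undo the permutation
    recovered = np.empty_like(watermark)
    recovered[perm] = np.bitwise_xor(shadow, feature)
    assert np.array_equal(recovered, watermark)
    ```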

  12. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  13. MEMS-based power generation techniques for implantable biosensing applications.

    PubMed

    Lueke, Jonathan; Moussa, Walied A

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  14. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    PubMed Central

    Lueke, Jonathan; Moussa, Walied A.

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient. PMID:22319362

  15. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from - 125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to - 170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
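
    The conversion mentioned in the last step follows k = α·ρ·c_p; a worked example with illustrative room-temperature values of the order reported for PTFE (assumed here, not quoted from the abstract) is:

    ```python
    # Illustrative room-temperature values for a PTFE-like polymer (assumptions, not measured data)
    alpha = 1.1e-7    # thermal diffusivity from the laser flash method, m^2/s
    cp = 1050.0       # specific heat from DSC, J/(kg*K)
    rho = 2160.0      # density from dilatometry/buoyancy, kg/m^3

    k = alpha * rho * cp              # thermal conductivity, W/(m*K)
    print(f"k ~ {k:.2f} W/(m*K)")     # about 0.25 W/(m*K), the expected order for PTFE
    ```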

  16. Techniques of preparing plant material for chromatographic separation and analysis.

    PubMed

    Romanik, G; Gilgenast, E; Przyjazny, A; Kamiński, M

    2007-03-10

    This paper discusses preparation techniques of samples of plant material for chromatographic analysis. Individual steps of the procedures used in sample preparation, including sample collection from the environment or from tissue cultures, drying, comminution, homogenization, leaching, extraction, distillation and condensation, analyte enrichment, and obtaining the final extracts for chromatographic analysis are discussed. The techniques most often used for isolation of analytes from homogenized plant material, i.e., Soxhlet extraction, ultrasonic solvent extraction (sonication), accelerated solvent extraction, microwave-assisted extraction, supercritical-fluid extraction, steam distillation, as well as membrane processes are emphasized. Sorptive methods of sample enrichment and removal of interferences, i.e., solid-phase extraction, and solid-phase micro-extraction are also discussed.

  17. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for the discovery, monitoring, and improvement of processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
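
    As a minimal illustration of the kind of event-log analysis described above (not the component developed in the cited work), the sketch below derives a directly-follows relation, one of the basic building blocks of process discovery, from a small hypothetical hospital event log (Python; case identifiers, activities, and timestamps are invented for illustration).

        from collections import Counter, defaultdict

        # Hypothetical event log: (case_id, activity, timestamp).
        event_log = [
            ("p1", "admission", 1), ("p1", "triage", 2), ("p1", "lab test", 3), ("p1", "discharge", 4),
            ("p2", "admission", 1), ("p2", "lab test", 2), ("p2", "triage", 3), ("p2", "discharge", 5),
        ]

        # Group activities per case, preserving temporal order.
        traces = defaultdict(list)
        for case_id, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
            traces[case_id].append(activity)

        # Count directly-follows pairs across all cases.
        directly_follows = Counter()
        for activities in traces.values():
            for a, b in zip(activities, activities[1:]):
                directly_follows[(a, b)] += 1

        for (a, b), n in directly_follows.most_common():
            print(f"{a} -> {b}: {n}")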

  18. Scalable Analysis Techniques for Microprocessor Performance Counter Metrics

    SciTech Connect

    Ahn, D H; Vetter, J S

    2002-07-24

    Contemporary microprocessors provide a rich set of integrated performance counters that allow application developers and system architects alike the opportunity to gather important information about workload behaviors. These counters can capture instruction, memory, and operating system behaviors. Current techniques for analyzing data produced from these counters use raw counts, ratios, and visualization techniques to help users make decisions about their application source code. While these techniques are appropriate for analyzing data from one process, they do not scale easily to new levels demanded by contemporary computing systems. Indeed, the amount of data generated by these experiments is on the order of tens of thousands of data points. Furthermore, if users execute multiple experiments, then we add yet another dimension to this already knotty picture. This flood of multidimensional data can swamp efforts to harvest important ideas from these valuable counters. Very simply, this paper addresses these concerns by evaluating several multivariate statistical techniques on these datasets. We find that several techniques, such as statistical clustering, can automatically extract important features from this data. These derived results can, in turn, be fed directly back to an application developer, or used as input to a more comprehensive performance analysis environment, such as a visualization or an expert system.
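
    The statistical clustering step described above can be sketched as follows (Python with scikit-learn; the counter-derived metrics, process counts, and cluster count are assumptions for illustration): each process becomes a point whose coordinates are derived counter metrics, and k-means groups processes with similar behavior.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)

        # Hypothetical per-process counter-derived metrics:
        # columns = [instructions per cycle, L2 miss rate, branch mispredict rate]
        metrics = np.vstack([
            rng.normal([1.8, 0.02, 0.01], 0.05, size=(40, 3)),   # compute-bound group
            rng.normal([0.6, 0.15, 0.02], 0.05, size=(40, 3)),   # memory-bound group
        ])

        # Standardize so no single metric dominates the distance measure.
        X = StandardScaler().fit_transform(metrics)

        # Cluster processes with similar performance signatures.
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        for k in range(2):
            print(f"cluster {k}: {np.sum(labels == k)} processes, "
                  f"mean IPC {metrics[labels == k, 0].mean():.2f}")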

  19. Refinement of Techniques Metallographic Analysis of Highly Dispersed Structures

    NASA Astrophysics Data System (ADS)

    Khammatov, A.; Belkin, D.; Barbina, N.

    2016-01-01

    Flaws are regularly introduced while developing standards and technical specifications. They can take the form of minor misprints or of an insufficient description of a technique. Although these flaws are well known, changes to the standards are rarely introduced. This paper shows that the normative documents need to clarify the requirements for the metallurgical microscopes used for the analysis of finely dispersed structures.

  20. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Functional modeling techniques or object-oriented graphical representations: which are more useful to someone trying to understand the general design or high-level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development, while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  1. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  2. Gabor-based fusion technique for Optical Coherence Microscopy.

    PubMed

    Rolland, Jannick P; Meemon, Panomsak; Murali, Supraja; Thompson, Kevin P; Lee, Kye-sung

    2010-02-15

    We recently reported on an Optical Coherence Microscopy technique whose innovation intrinsically builds on a recently reported liquid-lens-based dynamic focusing optical probe with a 2 micron invariant lateral resolution by design throughout a 2 mm cubic full field of view [Murali et al., Optics Letters 34, 145-147, 2009]. We report in this paper on the image acquisition enabled by this optical probe when combined with an automatic data fusion method developed and described here to produce an in-focus high-resolution image throughout the imaging depth of the sample. An African frog tadpole (Xenopus laevis) was imaged with the novel probe and the Gabor-based fusion technique, demonstrating subcellular resolution in a 0.5 mm (lateral) x 0.5 mm (axial) field of view without the need, for the first time, for x-y translation stages, depth scanning, high-cost adaptive optics, or manual intervention. In vivo images of human skin are also presented.

  3. Metabolic engineering: techniques for analysis of targets for genetic manipulations.

    PubMed

    Nielsen, J

    Metabolic engineering has been defined as the purposeful modification of intermediary metabolism using recombinant DNA techniques. With this definition metabolic engineering includes: (1) inserting new pathways in microorganisms with the aim of producing novel metabolites, e.g., production of polyketides by Streptomyces; (2) production of heterologous peptides, e.g., production of human insulin, erythropoitin, and tPA; and (3) improvement of both new and existing processes, e.g., production of antibiotics and industrial enzymes. Metabolic engineering is a multidisciplinary approach, which involves input from chemical engineers, molecular biologists, biochemists, physiologists, and analytical chemists. Obviously, molecular biology is central in the production of novel products, as well as in the improvement of existing processes. However, in the latter case, input from other disciplines is pivotal in order to target the genetic modifications; with the rapid developments in molecular biology, progress in the field is likely to be limited by procedures to identify the optimal genetic changes. Identification of the optimal genetic changes often requires a meticulous mapping of the cellular metabolism at different operating conditions, and the application of metabolic engineering to process optimization is, therefore, expected mainly to have an impact on the improvement of processes where yield, productivity, and titer are important design factors, i.e., in the production of metabolites and industrial enzymes. Despite the prospect of obtaining major improvement through metabolic engineering, this approach is, however, not expected to completely replace the classical approach to strain improvement-random mutagenesis followed by screening. Identification of the optimal genetic changes for improvement of a given process requires analysis of the underlying mechanisms, at best, at the molecular level. To reveal these mechanisms a number of different techniques may be applied

  4. Analysis of Self-Excited Combustion Instabilities Using Decomposition Techniques

    DTIC Science & Technology

    2013-01-01

    This work applies decomposition techniques to unsteady combustor data, separating the measurements into spatial modes and corresponding temporal coefficients. The proper orthogonal decomposition (POD) analysis is compared with traditional band-pass filtering based analysis for the study of self-excited combustion instabilities in a longitudinal mode rocket combustor.
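
    A proper orthogonal decomposition of the kind referred to above can be computed from the singular value decomposition of a snapshot matrix. The sketch below (Python/NumPy, on synthetic snapshots standing in for combustor pressure data) is a generic illustration of the technique, not the procedure of the cited report.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic snapshot matrix: rows = spatial points, columns = time samples.
        x = np.linspace(0.0, 1.0, 64)
        t = np.linspace(0.0, 1.0, 200)
        data = (np.sin(np.pi * x)[:, None] * np.cos(2 * np.pi * 10 * t)[None, :]
                + 0.1 * rng.standard_normal((x.size, t.size)))

        # Subtract the temporal mean before decomposing the fluctuations.
        fluct = data - data.mean(axis=1, keepdims=True)

        # POD via singular value decomposition: columns of U are spatial modes,
        # rows of Vt (scaled by s) are the corresponding temporal coefficients.
        U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
        energy = s**2 / np.sum(s**2)

        print("energy captured by first 3 modes:", np.round(energy[:3], 3))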

  5. AGARD flight test techniques series. Volume 9: Aircraft exterior noise measurement and analysis techniques

    NASA Astrophysics Data System (ADS)

    Heller, H.

    1991-04-01

    Testing and analysis techniques to measure aircraft noise primarily for purposes of noise certification as specified by the 'International Civil Aviation Organization', ICAO are described. The relevant aircraft noise certification standards and recommended practices are presented in detail for subsonic jet aircraft, for heavy and light propeller-driven aircraft, and for helicopters. The practical execution of conducting noise certification tests is treated in depth. The characteristics and requirements of the acoustic and non-acoustic instrumentation for data acquisition and data processing are discussed, as are the procedures to determine the special noise measures - effective perceived noise level (EPNL) and maximum overall A-weighted noise level (LpA,max) - that are required for the noise certification of different types of aircraft. The AGARDograph also contains an extensive, although selective, discussion of test and analysis techniques for more detailed aircraft noise studies by means of either flight experiments or full-scale and model-scale wind tunnel experiments. Appendices provide supplementary information.

  6. Metal trace analysis by PIXE and PDMS techniques

    NASA Astrophysics Data System (ADS)

    Dias da Cunha, K.; Barros Leite, C. V.

    2002-03-01

    The risk to human health due to exposure to aerosols depends on the intake pattern, the mass concentration, and the speciation of the elements present in airborne particles. In this work plasma desorption mass spectrometry (PDMS) was used as a complementary technique to the particle-induced X-ray emission (PIXE) technique to characterize aerosol samples collected in the environment. The PIXE technique allows the identification of the elements present in the sample and the determination of their mass concentrations. PDMS was used to identify the speciation of the elements present in the samples. The aerosol samples were collected using a 6-stage cascade impactor (CI) in two sites of Rio de Janeiro City. One is an island (Fundão Island) in the Guanabara Bay close to an industrial zone and the other, in Gávea, is a residential zone close to a lagoon and to the seashore. The measured mass median aerodynamic diameters (MMAD) indicated that the airborne particulates were in the fine fraction of the aerosols collected in both locations. In order to identify the contribution of seawater particles from the Guanabara Bay to the aerosols, seawater samples were also collected at Fundão Island. The samples were analyzed by PIXE and PDMS techniques. The analysis of the results suggests that the aerosols differ between the two sampling sites and that there is a contribution from Guanabara Bay seawater particles to the aerosols collected on Fundão Island. PIXE allows identification and quantification of elements heavier than Na (Z=11), while PDMS allows identification of the organic and inorganic compounds present in the samples; used as complementary techniques, they provide important information for aerosol characterization.

  7. Kinematic Analysis of Five Different Anterior Cruciate Ligament Reconstruction Techniques.

    PubMed

    Gadikota, Hemanth R; Hosseini, Ali; Asnis, Peter; Li, Guoan

    2015-06-01

    Several anatomical anterior cruciate ligament (ACL) reconstruction techniques have been proposed to restore normal joint kinematics. However, the relative superiorities of these techniques with one another and traditional single-bundle reconstructions are unclear. Kinematic responses of five previously reported reconstruction techniques (single-bundle reconstruction using a bone-patellar tendon-bone graft [SBR-BPTB], single-bundle reconstruction using a hamstring tendon graft [SBR-HST], single-tunnel double-bundle reconstruction using a hamstring tendon graft [STDBR-HST], anatomical single-tunnel reconstruction using a hamstring tendon graft [ASTR-HST], and a double-tunnel double-bundle reconstruction using a hamstring tendon graft [DBR-HST]) were systematically analyzed. The knee kinematics were determined under anterior tibial load (134 N) and simulated quadriceps load (400 N) at 0°, 15°, 30°, 60°, and 90° of flexion using a robotic testing system. Anterior joint stability under anterior tibial load was qualified as normal for ASTR-HST and DBR-HST and nearly normal for SBR-BPTB, SBR-HST, and STDBR-HST as per the International Knee Documentation Committee knee examination form categorization. The analysis of this study also demonstrated that SBR-BPTB, STDBR-HST, ASTR-HST, and DBR-HST restored the anterior joint stability to normal condition while the SBR-HST resulted in a nearly normal anterior joint stability under the action of simulated quadriceps load. The medial-lateral translations were restored to normal level by all the reconstructions. The internal tibial rotations under the simulated muscle load were over-constrained by all the reconstruction techniques, and more so by the DBR-HST. All five ACL reconstruction techniques could provide either normal or nearly normal anterior joint stability; however, the techniques over-constrained internal tibial rotation under the simulated quadriceps load.

  8. Application of mass spectrometry-based proteomics techniques for the detection of protein doping in sports.

    PubMed

    Kay, Richard G; Creaser, Colin S

    2010-04-01

    Mass spectrometry-based proteomic approaches have been used to develop methodologies capable of detecting the abuse of protein therapeutics such as recombinant human erythropoietin and recombinant human growth hormone. Existing detection methods use antibody-based approaches that, although effective, suffer from long assay development times and specificity issues. The application of liquid chromatography with tandem mass spectrometry and selected reaction-monitoring-based analysis has demonstrated the ability to detect and quantify existing protein therapeutics in plasma. Furthermore, the multiplexing capability of selected reaction-monitoring analysis has also aided in the detection of multiple downstream biomarkers in a single analysis, requiring less sample than existing immunological techniques. The flexibility of mass spectrometric instrumentation has shown that the technique is capable of detecting the abuse of novel and existing protein therapeutics, and has a vital role in the fight to keep sports drug-free.

  9. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding, the process of finding a position based on descriptive data such as an address or postal code, is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when using the available online geocoding services. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search for an address only by address matching based on descriptive data. In addition, there are limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides users with capabilities such as searching multi-part addresses, searching for places based on their location, non-point representation of results, and displaying search results based on their priority.
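
    The nearness modelling and fuzzy overlay described above can be sketched as follows (Python/NumPy; the membership-function shape, grid, and place locations are assumptions for illustration): each place of interest yields a fuzzy distance map, and the maps are combined with a fuzzy AND (element-wise minimum) overlay.

        import numpy as np

        # Hypothetical 100 x 100 grid of candidate locations (row, col coordinates).
        rows, cols = np.mgrid[0:100, 0:100]

        def nearness(row0, col0, half_distance=20.0):
            """Fuzzy 'near' membership: 1 at the place, 0.5 at half_distance cells away."""
            d = np.hypot(rows - row0, cols - col0)
            return 1.0 / (1.0 + (d / half_distance) ** 2)

        # Fuzzy distance maps for two hypothetical places in a query,
        # e.g. "near the hospital and near the park".
        map_hospital = nearness(20, 30)
        map_park = nearness(70, 60)

        # Fuzzy overlay (AND = element-wise minimum); other operators (max, product)
        # could be used depending on how the query combines the constraints.
        combined = np.minimum(map_hospital, map_park)

        best = np.unravel_index(np.argmax(combined), combined.shape)
        print("best-matching cell:", best, "membership:", round(float(combined[best]), 3))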

  10. Surgical technique for repair of complex anterior skull base defects

    PubMed Central

    Reinard, Kevin; Basheer, Azam; Jones, Lamont; Standring, Robert; Lee, Ian; Rock, Jack

    2015-01-01

    Background: Modern microsurgical techniques enable en bloc resection of complex skull base tumors. Anterior cranial base surgery, particularly, has been associated with a high rate of postoperative cerebrospinal fluid (CSF) leak, meningitis, intracranial abscess, and pneumocephalus. We introduce simple modifications to already existing surgical strategies designed to minimize the incidence of postoperative CSF leak and associated morbidity and mortality. Methods: Medical records from 1995 to 2013 were reviewed in accordance with the Institutional Review Board. We identified 21 patients who underwent operations for repair of large anterior skull base defects following removal of sinonasal or intracranial pathology using standard craniofacial procedures. Patient charts were screened for CSF leak, meningitis, or intracranial abscess formation. Results: A total of 15 male and 6 female patients with an age range of 26–89 years were included. All patients were managed with the same operative technique for reconstruction of the frontal dura and skull base defect. Spinal drainage was used intraoperatively in all cases but the lumbar drain was removed at the end of each case in all patients. Only one patient required re-operation for repair of persistent CSF leak. None of the patients developed meningitis or intracranial abscess. There were no perioperative mortalities. Median follow-up was 10 months. Conclusion: The layered reconstruction of large anterior cranial fossa defects resulted in postoperative CSF leak in only 5% of the patients and represents a simple and effective closure option for skull base surgeons. PMID:25722926

  11. Wavelet-based techniques for the gamma-ray sky

    DOE PAGES

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias; ...

    2016-07-01

    Here, we demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.

  12. Wavelet-based techniques for the gamma-ray sky

    SciTech Connect

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias; Lee, Samuel K.

    2016-07-01

    Here, we demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
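
    A generic sketch of the multi-scale separation idea described in the two records above is given below (Python with the PyWavelets package applied to a synthetic 2-D map; the wavelet family, decomposition level, and data are assumptions, not the authors' pipeline): coefficients at different levels isolate structure on different angular scales, so large-scale emission and small-scale residuals can be separated.

        import numpy as np
        import pywt

        rng = np.random.default_rng(2)

        # Synthetic 128 x 128 "sky map": smooth large-scale background plus a few
        # point-like sources and noise.
        y, x = np.mgrid[0:128, 0:128]
        background = np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / (2 * 40.0 ** 2))
        sky = background + 0.05 * rng.standard_normal((128, 128))
        sky[30, 40] += 2.0   # point source
        sky[90, 100] += 2.0  # point source

        # Multi-level 2-D wavelet decomposition.
        coeffs = pywt.wavedec2(sky, wavelet="db2", level=4)

        # Keep only the coarsest approximation to estimate the large-scale emission;
        # zero the detail coefficients, then reconstruct.
        smooth_only = [coeffs[0]] + [tuple(np.zeros_like(c) for c in d) for d in coeffs[1:]]
        large_scale = pywt.waverec2(smooth_only, wavelet="db2")
        small_scale = sky - large_scale[:128, :128]

        print("peak of small-scale residual:", round(float(small_scale.max()), 2))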

  13. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample so that near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  14. X-Ray microanalytical techniques based on synchrotron radiation.

    PubMed

    Snigireva, Irina; Snigirev, Anatoly

    2006-01-01

    The development of 3rd generation synchrotron radiation sources like the European Synchrotron Radiation Facility (ESRF), in parallel with recent advances in the technology of X-ray microfocusing elements like Kirkpatrick-Baez (KB) mirrors, diffractive (Fresnel zone plates, FZP) and refractive (compound refractive lenses, CRL) optics, makes it possible to use X-ray microscopy techniques with high energy X-rays (energies above 4 keV). Spectroscopy, imaging, tomography and diffraction studies of samples with hard X-rays at micrometre and sub-micrometre spatial resolutions are now possible. The concept of combining these techniques as high-energy microscopy has been proposed and successfully realized at the ESRF beamlines. Therefore a short summary of X-ray microscopy techniques is presented first. The main emphasis will be put on those methods which aim to produce sub-micron and nanometre resolution. These methods fall into three broad categories: reflective, refractive and diffractive optics. The basic principles and recent achievements will be discussed for all optical devices. Recent applications of synchrotron based microanalytical techniques to characterise radioactive fuel particles (UO2) released from the Chernobyl reactor are reported.

  15. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  16. Apprenticeship learning techniques for knowledge-based systems

    SciTech Connect

    Wilkins, D.C.

    1987-01-01

    This thesis describes apprenticeship learning techniques for automation of the transfer of expertise. Apprenticeship learning is a form of learning by watching, in which learning occurs as a byproduct of building explanations of human problem-solving actions. Apprenticeship is the most powerful method that human experts use to refine and debug their expertise in knowledge-intensive domains such as medicine; this motivates giving such capabilities to an expert system. The major accomplishment in this thesis is showing how an explicit representation of the strategy knowledge to solve a general problem class, such as diagnosis, can provide a basis for learning the knowledge that is specific to a particular domain, such as medicine. The Odysseus learning program provides the first demonstration of using the same technique for transfer of expertise both to and from an expert system knowledge base. Another major focus of this thesis is the limitations of apprenticeship learning. It is shown that extant techniques for reasoning under uncertainty for expert systems lead to a sociopathic knowledge base.

  17. Recording and analysis techniques for high-frequency oscillations.

    PubMed

    Worrell, G A; Jerbi, K; Kobayashi, K; Lina, J M; Zelmann, R; Le Van Quyen, M

    2012-09-01

    In recent years, new recording technologies have advanced such that, at high temporal and spatial resolutions, high-frequency oscillations (HFO) can be recorded in human partial epilepsy. However, because of the deluge of multichannel data generated by these experiments, achieving the full potential of parallel neuronal recordings depends on the development of new data mining techniques to extract meaningful information relating to time, frequency and space. Here, we aim to bridge this gap by focusing on up-to-date recording techniques for measurement of HFO and new analysis tools for their quantitative assessment. In particular, we emphasize how these methods can be applied, what property might be inferred from neuronal signals, and potentially productive future directions.

  18. Recording and analysis techniques for high-frequency oscillations

    PubMed Central

    Worrell, G.A.; Jerbi, K.; Kobayashi, K.; Lina, J.M.; Zelmann, R.; Le Van Quyen, M.

    2013-01-01

    In recent years, new recording technologies have advanced such that, at high temporal and spatial resolutions, high-frequency oscillations (HFO) can be recorded in human partial epilepsy. However, because of the deluge of multichannel data generated by these experiments, achieving the full potential of parallel neuronal recordings depends on the development of new data mining techniques to extract meaningful information relating to time, frequency and space. Here, we aim to bridge this gap by focusing on up-to-date recording techniques for measurement of HFO and new analysis tools for their quantitative assessment. In particular, we emphasize how these methods can be applied, what property might be inferred from neuronal signals, and potentially productive future directions. PMID:22420981

  19. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  20. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    The paper deals with the infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper also describes an anomaly edge detection technique, called the half-max technique, which is used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.
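
    As an illustration of the half-max width idea referred to above, the sketch below (Python/NumPy; the contrast profile is synthetic and the implementation is generic, not the IR Contrast software) estimates the width of an indication as the distance between the two points where a spatial contrast profile falls to half of its peak value.

        import numpy as np

        # Synthetic spatial contrast profile across an indication (arbitrary units).
        x = np.linspace(0.0, 50.0, 501)                     # position, mm
        profile = 1.2 * np.exp(-((x - 25.0) / 4.0) ** 2)    # peak contrast 1.2 at 25 mm

        peak = profile.max()
        half = 0.5 * peak
        above = np.where(profile >= half)[0]

        # Half-max width: distance between first and last samples above half the peak.
        # (Linear interpolation at the crossings would refine this estimate.)
        width = x[above[-1]] - x[above[0]]
        print(f"peak contrast {peak:.2f}, half-max width {width:.1f} mm")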

  1. Workflow-based approaches to neuroimaging analysis.

    PubMed

    Fissell, Kate

    2007-01-01

    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.

  2. New modulation-based watermarking technique for video

    NASA Astrophysics Data System (ADS)

    Lemma, Aweke; van der Veen, Michiel; Celik, Mehmet

    2006-02-01

    Successful watermarking algorithms have already been developed for various applications ranging from meta-data tagging to forensic tracking. Nevertheless, it is worthwhile to develop alternative watermarking techniques that provide a broader basis for meeting emerging services, usage models and security threats. To this end, we propose a new multiplicative watermarking technique for video, which is based on the principles of our successful MASK audio watermark. Audio-MASK embeds the watermark by modulating the short-time envelope of the audio signal and performs detection using a simple envelope detector followed by a SPOMF (symmetrical phase-only matched filter). Video-MASK takes a similar approach and modulates the image luminance envelope. In addition, it incorporates a simple model to account for the luminance sensitivity of the HVS (human visual system). Preliminary tests show the algorithm's transparency and robustness to lossy compression.
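
    The multiplicative luminance-modulation principle described above can be sketched as follows (Python/NumPy; the embedding gain, frame sizes, and chip sequence are assumptions, and this generic sketch is not the MASK algorithm itself): each frame's luminance is scaled by (1 + gain x chip), and a correlation detector recovers the embedded sequence from the per-frame luminance envelope.

        import numpy as np

        rng = np.random.default_rng(6)

        # Hypothetical grayscale video: 64 frames of 32 x 32 pixels.
        frames = rng.uniform(60.0, 200.0, size=(64, 32, 32))

        # Watermark: one +/-1 chip per frame, multiplicative luminance modulation.
        chips = rng.choice([-1.0, 1.0], size=frames.shape[0])
        gain = 0.02                                            # 2 % luminance modulation
        marked = frames * (1.0 + gain * chips[:, None, None])

        # Blind detector: correlate per-frame mean luminance (mean removed) with chips.
        envelope = marked.mean(axis=(1, 2))
        envelope -= envelope.mean()
        score = float(np.dot(envelope, chips) / frames.shape[0])

        print("detector score (large positive value suggests watermark present):", round(score, 3))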

  3. An osmolyte-based micro-volume ultrafiltration technique.

    PubMed

    Ghosh, Raja

    2014-12-07

    This paper discusses a novel, simple, and inexpensive micro-volume ultrafiltration technique for protein concentration, desalting, buffer exchange, and size-based protein purification. The technique is suitable for processing protein samples in a high-throughput mode. It utilizes a combination of capillary action, and osmosis for drawing water and other permeable species from a micro-volume sample droplet applied on the surface of an ultrafiltration membrane. A macromolecule coated on the permeate side of the membrane functions as the osmolyte. The action of the osmolyte could, if required, be augmented by adding a supersorbent polymer layer over the osmolyte. The mildly hydrophobic surface of the polymeric ultrafiltration membrane used in this study minimized sample droplet spreading, thus making it easy to recover the retained material after separation, without sample interference and cross-contamination. High protein recoveries were observed in the micro-volume ultrafiltration experiments described in the paper.

  4. Nuclear and radiochemical techniques in chemical analysis. Final report

    SciTech Connect

    Finston, H.L.; Williams, E.T.

    1981-06-01

    The areas studied during the period of the contract included determinations of cross sections for nuclear reactions, determination of neutron capture cross sections of radionuclides, application of special activation techniques and x-ray counting, elucidation of synergic solvent extraction mechanisms and development of new solvent extraction techniques, and the development of a PIXE analytical facility. The thermal neutron capture cross section of ²²Na was determined, and cross sections and energy levels were determined for ²⁰Ne(n,α)¹⁷O, ²⁰Ne(n,p)²⁰F, and ⁴⁰Ar(n,α)³⁷S. Inelastic scattering with 2 to 3 MeV neutrons followed by counting of the metastable states permits analysis of the following elements: In, Sr, Cd, Hg, and Pb. Bromine can be detected in the presence of a 500-fold excess of Na and/or K by thermal neutron activation and x-ray counting, and as little as 0.3 x 10⁻⁹ g of Hg can be detected by this technique. Medium energy neutrons (10 to 160 MeV) have been used to determine Tl, Pb, and Bi by (n,Xn) and (n,pXn) reactions. The reaction ¹⁹F(p,α)¹⁶O has been used to determine as little as 50 μmol of Freon-14. Mechanisms for synergic solvent extractions have been elucidated and a new technique of homogeneous liquid-liquid solvent extraction has been developed in which the neutral complex is rapidly extracted into propylene carbonate by raising and lowering the temperature of the system. An external-beam PIXE system has been developed for trace element analyses of a variety of sample types. Various sample preparation techniques have been applied to a diverse range of samples including marine sediment, coral, coal, and blood.

  5. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    NASA Technical Reports Server (NTRS)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scale information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'Direct Transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is faster than the direct transform method in computation. There are practical limitations, however, as to how high in frequency an accurate estimate can be made for a given mean sampling rate. These high frequency estimates are important in obtaining the microscale information of turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of low frequency estimates. The prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable
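
    A minimal sketch of the slotting idea mentioned above is shown below (Python/NumPy, on a synthetic randomly sampled signal; the slot width, record length, and data rate are assumptions): products of sample pairs are accumulated into lag slots to form an autocorrelation estimate, which could then be Fourier transformed to obtain a spectrum.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic randomly (Poisson) sampled signal: 40 Hz tone, ~500 samples/s mean rate.
        duration, mean_rate = 10.0, 500.0
        t = np.sort(rng.uniform(0.0, duration, size=int(mean_rate * duration)))
        u = np.sin(2 * np.pi * 40.0 * t) + 0.2 * rng.standard_normal(t.size)
        u -= u.mean()

        # Slotted autocorrelation: accumulate products u_i * u_j into lag slots.
        slot_width = 1.0e-3                      # s
        max_lag = 0.05                           # s
        n_slots = int(max_lag / slot_width)
        sums = np.zeros(n_slots)
        counts = np.zeros(n_slots)

        for i in range(t.size):
            j = i + 1
            while j < t.size and t[j] - t[i] < max_lag:
                k = int((t[j] - t[i]) / slot_width)
                sums[k] += u[i] * u[j]
                counts[k] += 1
                j += 1

        acf = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0) / np.var(u)
        print("autocorrelation in first 5 slots:", np.round(acf[:5], 2))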

  6. Investigating Coincidence Techniques in Biomedical Applications of Neutron Activation Analysis

    NASA Astrophysics Data System (ADS)

    Chowdhury, P.; Gramer, R.; Tandel, S. K.; Reinhardt, C. J.

    2004-05-01

    While neutron activation analysis has been widely used in biomedical applications for some time, the use of non-radioactive tracer techniques, to monitor, for example, organ blood flow, is more recent. In these studies, pre-clinical animal models are injected with micro-spheres labeled with stable isotopes of elements that have a high neutron absorption cross-section. Subsequently, samples of blood and/or tissue from different locations in the body are subjected to neutron activation analysis to measure the propagation of the labeled micro-spheres through the body. Following irradiation, the counting (with high-resolution Ge detectors) is typically delayed by a few days to dissipate short-lived activity in the samples and improve signal-to-noise for the peaks of interest in the activation spectrum. The aim of the present study was to investigate whether coincidence techniques (for isotopes which decay via two-photon cascades) could improve signal-to-noise and turn-around times. The samples were irradiated at the 1 MW research reactor at the UMass Lowell Radiation Laboratory. The analysis of the multi-parameter coincidence data recorded in event-mode will be presented and compared with the standard method of recording singles spectra.

  7. A new technique for dynamic analysis of bladder compliance.

    PubMed

    Gilmour, R F; Churchill, B M; Steckler, R E; Houle, A M; Khoury, A E; McLorie, G A

    1993-10-01

    We propose an alternative method of measuring compliance that takes into account the multiple phases of bladder filling. We describe our new technique, dynamic compliance analysis, and evaluate its clinical applicability. To perform the analysis we digitized a cystometrogram curve at a sampling rate of 2 samples per second using an MS-DOS computer system. A program designed to retrieve the stored data was used to analyze the subtracted bladder pressure. The result yielded a value of compliance every half second that was then plotted on an x-y graph, with instantaneous compliance as the dependent variable and per cent of total volume infused as the independent variable. To determine the clinical applicability of this technique we chose 63 curves from clinically normal patients. The results of the dynamic compliance analyses were predictable. The dynamic compliance values for the normal group had a minimum that was always greater than 10 ml./cm. water throughout the tonus limb (phase 2) of the cystometrogram. We conclude that dynamic compliance analysis yields more information about bladder response during filling, similar to the stress-strain curve used in the study of solid mechanics.
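
    The instantaneous compliance computation described above can be sketched as follows (Python/NumPy; the sampling rate matches the 2 samples per second mentioned in the record, but the pressure and volume traces are hypothetical): compliance is the ratio of volume change to subtracted bladder pressure change over each half-second interval, indexed by per cent of total volume infused.

        import numpy as np

        rng = np.random.default_rng(4)

        # Hypothetical filling cystometrogram digitized at 2 samples per second.
        fs = 2.0                                   # Hz
        n = 600                                    # 5 minutes of filling
        t = np.arange(n) / fs                      # s
        infusion_rate = 30.0 / 60.0                # ml/s (hypothetical 30 ml/min fill)
        volume = infusion_rate * t                 # ml infused

        # Hypothetical subtracted (detrusor) bladder pressure in cm H2O.
        pressure = 5.0 + volume / 12.0 + 0.002 * rng.standard_normal(n)

        # Instantaneous compliance every half second: dV/dP in ml per cm H2O.
        # (In practice, intervals with near-zero dP would need special handling.)
        dV = np.diff(volume)
        dP = np.diff(pressure)
        compliance = dV / dP

        percent_infused = 100.0 * volume[1:] / volume[-1]   # x-axis of the compliance plot
        print("median compliance during filling:",
              round(float(np.median(compliance)), 1), "ml/cm H2O")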

  8. Overview of independent component analysis technique with an application to synthetic aperture radar (SAR) imagery processing.

    PubMed

    Fiori, Simone

    2003-01-01

    We present an overview of independent component analysis, an emerging signal processing technique based on neural networks, with the aim to provide an up-to-date survey of the theoretical streams in this discipline and of the current applications in the engineering area. We also focus on a particular application, dealing with a remote sensing technique based on synthetic aperture radar imagery processing: we briefly review the features and main applications of synthetic aperture radar and show how blind signal processing by neural networks may be advantageously employed to enhance the quality of remote sensing data.
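
    As a generic illustration of independent component analysis (not the SAR processing pipeline surveyed above), the sketch below separates two linearly mixed synthetic sources with FastICA (Python with scikit-learn; the sources and mixing matrix are assumptions for illustration).

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(5)

        # Two synthetic, statistically independent source signals.
        t = np.linspace(0.0, 1.0, 2000)
        sources = np.column_stack([
            np.sign(np.sin(2 * np.pi * 7 * t)),          # square wave
            np.sin(2 * np.pi * 3 * t + 0.5),             # sine
        ])

        # Linear mixing (the "sensors" observe mixtures of the sources).
        mixing = np.array([[1.0, 0.6],
                           [0.4, 1.0]])
        observed = sources @ mixing.T + 0.02 * rng.standard_normal((t.size, 2))

        # Blind separation with FastICA.
        ica = FastICA(n_components=2, random_state=0)
        recovered = ica.fit_transform(observed)

        # Correlation (up to sign and permutation) between recovered and true sources.
        corr = np.corrcoef(recovered.T, sources.T)[:2, 2:]
        print("abs correlation matrix:\n", np.round(np.abs(corr), 2))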

  9. Vision based techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except in regions close to the focus of expansion (FOE). Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.

  10. Methodologies and techniques for analysis of network flow data

    SciTech Connect

    Bobyshev, A.; Grigoriev, M.; /Fermilab

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

  11. Review of Preliminary Analysis Techniques for Tension Structures.

    DTIC Science & Technology

    1984-02-01

    Report documentation page details only: report CR 84.017, final report covering 1 Oct 82 - 30 Sep 83, author John W. Leonard, contract N62583/82, Port Hueneme, CA 93043.

  12. Technologies and microstructures for separation techniques in chemical analysis

    NASA Astrophysics Data System (ADS)

    Spiering, Vincent L.; Lammerink, Theo S. J.; Jansen, Henri V.; Fluitman, Jan H.; van den Berg, Albert

    1996-09-01

    The possibilities for microtechnology in chemical analysis and separation techniques are discussed. The combination of the materials and the dimensions of the structures can limit the sample and waste volumes on the one hand, and increase the performance of the chemical systems on the other. Especially in high performance chromatography separation systems, where the separation quality directly depends on the length-to-width ratio of the fluid channels, there is a large potential for applications. Novel technologies as well as demonstrator devices for different applications will be presented in this paper. Finally, a modular concept for microfluidic systems, in which these micromachined structures can be incorporated, is described and illustrated with a demonstrator.

  13. Reduction and analysis techniques for infrared imaging data

    NASA Technical Reports Server (NTRS)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. Turning pictures into papers, or equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages: acquisition, reduction, and analysis, concentrating in particular on several of the questions most relevant to the techniques currently applied to near infrared imaging.

  14. Development of solution techniques for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Andrews, J. S.

    1974-01-01

    Nonlinear structural solution methods in the current research literature are classified according to order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.

  15. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  16. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  17. Antimisting kerosene: Base fuel effects, blending and quality control techniques

    NASA Technical Reports Server (NTRS)

    Yavrouian, A. H.; Ernest, J.; Sarohia, V.

    1984-01-01

    The problems associated with blending of the AMK additive with Jet A, and the base fuel effects on AMK properties are addressed. The results from the evaluation of some of the quality control techniques for AMK are presented. The principal conclusions of this investigation are: significant compositional differences for base fuel (Jet A) within the ASTM specification D1655; higher aromatic content of the base fuel was found to be beneficial for the polymer dissolution at ambient (20 C) temperature; using static mixer technology, the antimisting additive (FM-9) is in-line blended with Jet A, producing AMK which has adequate fire-protection properties 15 to 20 minutes after blending; degradability of freshly blended and equilibrated AMK indicated that maximum degradability is reached after adequate fire protection is obtained; the results of AMK degradability as measured by filter ratio confirmed previous RAE data that power requirements to degrade freshly blended AMK are significantly higher than for equilibrated AMK; blending of the additive by using FM-9 concentrate in Jet A produces equilibrated AMK almost instantly; nephelometry offers a simple continuous monitoring capability and is used as a real time quality control device for AMK; and trajectory (jet thrust) and pressure drop tests are useful laboratory techniques for evaluating AMK quality.

  18. Color demosaicking using deinterlacing and median-based filtering techniques

    NASA Astrophysics Data System (ADS)

    Huang, Wen-Tsung; Chen, Wen-Jan; Tai, Shen-Chuan

    2010-10-01

    Color demosaicking is critical to the image quality of single-sensor-based imaging devices. Caused by the sampling pattern of the color filter array (CFA), demosaicked images typically suffer from visual color artifacts in regions of high frequency and sharp edge structures, degrading the quality of the camera output. We present a new high-quality demosaicking algorithm by taking advantage of deinterlacing and median-based filtering techniques. We treat the sampled green data of the Bayer CFA as a form of diagonally interlaced green planes and make use of some key concepts from spatial deinterlacing to help the edge estimation in terms of both various directions and accuracy. In addition, a specific edge feature, a sharp line edge of width 1 pixel, can also be handled well by the proposed method. The median-based filtering techniques are developed for suppressing most visual demosaicking artifacts, such as the zipper effect, false color artifacts, and interpolation artifacts. Experimental results show that our algorithm is effective in suppressing visual artifacts, preserving the edges of the image with sharpness and satisfying visual inspection, while keeping computational efficiency.
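
    One common median-based artifact-suppression step can be illustrated as follows (Python with NumPy/SciPy; this is a generic sketch on a synthetic image, not the authors' algorithm): false colors are reduced by median filtering the R-G and B-G color-difference planes and rebuilding R and B from the filtered differences.

        import numpy as np
        from scipy.ndimage import median_filter

        rng = np.random.default_rng(7)

        # Hypothetical demosaicked RGB image (float, 0..1) with color speckle near an edge.
        h, w = 64, 64
        img = np.zeros((h, w, 3))
        img[:, : w // 2] = 0.2
        img[:, w // 2 :] = 0.8
        img += 0.05 * rng.standard_normal(img.shape)           # interpolation noise
        img = np.clip(img, 0.0, 1.0)

        r, g, b = img[..., 0], img[..., 1], img[..., 2]

        # Median filter the color-difference planes (3 x 3 window) to suppress
        # zipper/false-color artifacts, then rebuild R and B from the filtered differences.
        rg = median_filter(r - g, size=3)
        bg = median_filter(b - g, size=3)
        out = np.stack([np.clip(g + rg, 0, 1), g, np.clip(g + bg, 0, 1)], axis=-1)

        print("mean |R-G| before:", round(float(np.abs(r - g).mean()), 4),
              "after:", round(float(np.abs(out[..., 0] - g).mean()), 4))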

  19. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low-cost computer systems for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  20. Lidar analysis techniques for use in the atmospheric boundary layer

    NASA Technical Reports Server (NTRS)

    Eichinger, William E.; Cooper, Daniel I.; Hof, Doug; Holtkamp, David; Quick, Robert, Jr.; Tiee, Joe; Karl, Robert

    1992-01-01

    There is a growing body of observational and theoretical evidence which suggests that local climate characteristics are associated with variations in the earth's surface. The link between surface variability and local-scale processes must be made if we are to improve our understanding of the feedback mechanisms involved in surface-atmosphere dynamics. However, to understand these interactions, the surface-atmosphere interface must be studied as a large-scale spatial system. Lidars are ideal tools to study the spatial properties of the atmosphere. The described techniques were developed for use with the Los Alamos Water Raman-Lidar, but are applicable to many other types of lidar. The methodology of the analysis of lidar data is summarized in order to determine meteorological parameters in the atmospheric boundary layer. The techniques are not exhaustive but are intended to show the depth and breadth of the information which can be obtained from lidars. Two methods for the computation of water-vapor fluxes were developed. The first uses the fact that the water vapor concentration in the vertical direction follows a logarithmic profile when corrected for atmospheric stability. The second method involves using inertial dissipation techniques in which lidar-derived spatial and temporal power spectra are used to determine the flux.

  1. Sensitivity analysis techniques for models of human behavior.

    SciTech Connect

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
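
    A small sketch of a global, interaction-aware sensitivity check of the kind contrasted above with traditional methods is given below (Python/NumPy; the behavior model and input ranges are hypothetical): first-order, variance-based indices are estimated by comparing the variance of the conditional mean of the output, given each input, with the total output variance.

        import numpy as np

        rng = np.random.default_rng(8)

        # Hypothetical behavior model with an interaction between x1 and x3.
        def model(x1, x2, x3):
            return x1 + 0.3 * x2 + 2.0 * x1 * x3 + 0.1 * rng.standard_normal(x1.shape)

        n = 20000
        x1, x2, x3 = (rng.uniform(0.0, 1.0, n) for _ in range(3))
        y = model(x1, x2, x3)

        def first_order_index(x, y, bins=20):
            """Variance of the conditional mean E[y | x], as a fraction of Var(y)."""
            edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
            idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
            cond_means = np.array([y[idx == k].mean() for k in range(bins)])
            return float(np.var(cond_means) / np.var(y))

        for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
            print(name, "first-order sensitivity ~", round(first_order_index(x, y), 2))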

  2. Stalked protozoa identification by image analysis and multivariable statistical techniques.

    PubMed

    Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2008-06-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups and several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determining the geometrical, morphological and signature data and subsequent processing by discriminant analysis and neural network techniques. Geometrical descriptors were found to be responsible for the best identification ability and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence to establish their presence in wastewater treatment plants.

  3. Radiation-Based Medical Imaging Techniques: An Overview

    NASA Astrophysics Data System (ADS)

    Prior, John O.; Lecoq, Paul

    This chapter will present an overview of two radiation-based medical imaging techniques using radiopharmaceuticals used in nuclear medicine/molecular imaging, namely, single-photon emission computed tomography (SPECT) and positron emission tomography (PET). The relative merits in terms of radiation sensitivity and image resolution of SPECT and PET will be compared to the main conventional radiologic modalities that are computed tomography (CT) and magnetic resonance (MR) imaging. Differences in terms of temporal resolution will also be outlined, as well as the other similarities and dissimilarities of these two techniques, including their latest and upcoming multimodality combination. The main clinical applications are briefly described and examples of specific SPECT and PET radiopharmaceuticals are listed. SPECT and PET imaging will then be further detailed in the two subsequent chapters describing in greater depth the basics and future trends of each technique (see Chaps. 37, "SPECT Imaging: Basics and New Trends" 10.1007/978-3-642-13271-1_37 and 38, "PET Imaging: Basics and New Trends" 10.1007/978-3-642-13271-1_38).

  4. [Applications of spectral analysis technique to monitoring grasshoppers].

    PubMed

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, predicting grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitats is expounded, and the development of monitoring grasshopper populations and the common algorithms of the spectral analysis technique are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from the remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing, which can be used to monitor grasshoppers more exactly, has advantages in measuring the damage degree and classifying damaged areas, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyperspectra and leaf area index (LAI), and to indicate the intensity of grasshopper damage. The technology of near-infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring the humidity and nutrients of soil, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that the spectral analysis technique could be used as a quick and exact tool in monitoring and forecasting grasshopper infestations, and will become an important means in such research for its advantages in determining spatial orientation, information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
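
    The NDVI mentioned above is a simple band ratio; a minimal sketch follows, assuming red and near-infrared reflectance arrays have already been extracted from the imagery.

```python
import numpy as np

# Minimal NDVI sketch; band handling and file I/O are omitted on purpose.
def ndvi(red, nir, eps=1e-9):
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + eps)  # ranges roughly from -1 to +1

# Dense, healthy vegetation tends toward high NDVI values.
print(ndvi([0.05, 0.20], [0.45, 0.25]))     # -> approx. [0.8, 0.11]
```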

  5. Analysis of filter tuning techniques for sequential orbit determination

    NASA Technical Reports Server (NTRS)

    Lee, T.; Yee, C.; Oza, D.

    1995-01-01

    This paper examines filter tuning techniques for a sequential orbit determination (OD) covariance analysis. Recently, there has been a renewed interest in sequential OD, primarily due to the successful flight qualification of the Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) using Doppler data extracted onboard the Extreme Ultraviolet Explorer (EUVE) spacecraft. TONS computes highly accurate orbit solutions onboard the spacecraft in realtime using a sequential filter. As the result of the successful TONS-EUVE flight qualification experiment, the Earth Observing System (EOS) AM-1 Project has selected TONS as the prime navigation system. In addition, sequential OD methods can be used successfully for ground OD. Whether data are processed onboard or on the ground, a sequential OD procedure is generally favored over a batch technique when a realtime automated OD system is desired. Recently, OD covariance analyses were performed for the TONS-EUVE and TONS-EOS missions using the sequential processing options of the Orbit Determination Error Analysis System (ODEAS). ODEAS is the primary covariance analysis system used by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). The results of these analyses revealed a high sensitivity of the OD solutions to the state process noise filter tuning parameters. The covariance analysis results show that the state estimate error contributions from measurement-related error sources, especially those due to the random noise and satellite-to-satellite ionospheric refraction correction errors, increase rapidly as the state process noise increases. These results prompted an in-depth investigation of the role of the filter tuning parameters in sequential OD covariance analysis. This paper analyzes how the spacecraft state estimate errors due to dynamic and measurement-related error sources are affected by the process noise level used. This information is then used to establish
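
    To make the tuning sensitivity concrete, the sketch below shows, for a generic scalar Kalman filter (not the ODEAS covariance-analysis formulation), how the steady-state estimation variance grows as the state process-noise level is increased.

```python
import numpy as np

# Hedged sketch: scalar random-walk state x_k = x_{k-1} + w, measured with
# noise variance r. The steady-state error covariance rises with process noise q.
def steady_state_variance(q, r, n_iter=200):
    p = 1.0
    for _ in range(n_iter):
        p_pred = p + q              # time update adds process noise
        k = p_pred / (p_pred + r)   # Kalman gain
        p = (1.0 - k) * p_pred      # measurement update
    return p

for q in (1e-4, 1e-2, 1.0):
    print(f"q = {q:g}: steady-state variance ~ {steady_state_variance(q, r=1.0):.3f}")
```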

  6. Application of transport phenomena analysis technique to cerebrospinal fluid.

    PubMed

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have proceeded in the past using mathematical analysis that was capable of phenomenological prediction but was not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained using established equations of transport phenomena. Then, microscopic- and molecular-level modeling techniques are described using porous media theory and chemical kinetic theory and applied to cerebrospinal fluid (CSF) dynamics. Using techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.
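
    As a small illustration of the porous-media ingredient mentioned above, the sketch below evaluates Darcy's law, which relates volumetric flow to the pressure gradient through a permeable medium; the parameter values are arbitrary and not CSF-specific.

```python
# Hedged sketch of Darcy's law: Q = -(k*A/mu) * (dP/dx).
def darcy_flow(permeability, area, viscosity, dp, length):
    """Volumetric flow rate through a porous layer of the given length."""
    return -(permeability * area / viscosity) * (dp / length)

# Example: a 100 Pa pressure drop across a 1 mm porous layer.
print(darcy_flow(permeability=1e-14, area=1e-4, viscosity=1e-3, dp=-100.0, length=1e-3))
```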

  7. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique

    PubMed Central

    TORABIPOUR, Amin; NAJARZADEH, Maryam; ARAB, Mohammad; FARZIANPOUR, Freshteh; GHASEMZADEH, Roya

    2014-01-01

    Abstract Background This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. Methods This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Results Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity, and the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. Conclusion The productivity rate of hospitals generally had an increasing trend. However, the total average productivity decreased in hospitals. Besides, among the several components of total productivity, variation in technological efficiency had the highest impact on the reduction of the total average productivity. PMID:26060727

  8. Measured extent of agricultural expansion depends on analysis technique

    DOE PAGES

    Dunn, Jennifer B.; Merz, Dylan; Copenhaver, Ken L.; ...

    2017-01-31

    Concern is rising that ecologically important, carbon-rich natural lands in the United States are losing ground to agriculture. We investigate how quantitative assessments of historical land use change to address this concern differ in their conclusions depending on the data set used. We examined land use change between 2006 and 2014 in 20 counties in the Prairie Pothole Region using the Cropland Data Layer, a modified Cropland Data Layer, data from the National Agricultural Imagery Program, and in-person ground-truthing. The Cropland Data Layer analyses overwhelmingly returned the largest amount of land use change, with associated error that limits drawing conclusions from it. Analysis with visual imagery estimated a fraction of this land use change. Clearly, analysis technique drives understanding of the measured extent of land use change; different techniques produce vastly different results that would inform land management policy in strikingly different ways. As a result, best practice guidelines are needed.

  9. The Fourier analysis technique and epsilon-pseudo-eigenvalues

    SciTech Connect

    Donato, J.M.

    1993-07-01

    The spectral radii of iteration matrices and the spectra and condition numbers of preconditioned systems are important in forecasting the convergence rates of iterative methods. Unfortunately, the spectra of iteration matrices or preconditioned systems are rarely readily available. The Fourier analysis technique has been shown to be a useful tool in studying the effectiveness of iterative methods by determining approximate expressions for the eigenvalues or condition numbers of matrix systems. For non-symmetric matrices the eigenvalues may be highly sensitive to perturbations. The spectral radii of nonsymmetric iteration matrices may not give a numerically realistic indication of the convergence of the iterative method. Trefethen and others have presented a theory on the use of ε-pseudo-eigenvalues in the study of matrix equations. For Toeplitz matrices, we show that the theory of ε-pseudo-eigenvalues includes the Fourier analysis technique as a limiting case. For non-Toeplitz matrices, the relationship is not clear. We shall examine this relationship for non-Toeplitz matrices that arise when studying preconditioned systems for methods applied to a two-dimensional discretized elliptic differential equation.
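
    A brief sketch of the ε-pseudo-eigenvalue test itself: a complex point z lies in the ε-pseudospectrum of A when the smallest singular value of (zI - A) is at most ε. The matrix and test points below are arbitrary examples, not the discretized elliptic operator studied in the report.

```python
import numpy as np

# Hedged sketch of the epsilon-pseudospectrum membership test via the smallest
# singular value of (zI - A). The 2x2 matrix is an arbitrary nonsymmetric example.
def min_singular_value(A, z):
    n = A.shape[0]
    return np.linalg.svd(z * np.eye(n) - A, compute_uv=False)[-1]

A = np.array([[0.0, 1.0],
              [-0.01, -0.1]])
eps = 0.1
for z in (0.2 + 0.3j, -0.05 + 0.1j):
    inside = min_singular_value(A, z) <= eps
    print(f"z = {z}: in eps-pseudospectrum -> {inside}")
```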

  10. Application of image processing techniques to fluid flow data analysis

    NASA Technical Reports Server (NTRS)

    Giamati, C. C.

    1981-01-01

    The application of color coding techniques used in processing remote sensing imagery to analyze and display fluid flow data is discussed. A minicomputer based color film recording and color CRT display system is described. High quality, high resolution images of two-dimensional data are produced on the film recorder. Three dimensional data, in large volume, are used to generate color motion pictures in which time is used to represent the third dimension. Several applications and examples are presented. System hardware and software is described.

  11. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.

  12. Thiophene-based monolayer OFETs prepared by Langmuir techniques

    NASA Astrophysics Data System (ADS)

    Agina, Elena V.; Sizov, Alexey S.; Anisimov, Daniil S.; Trul, Askold A.; Borshchev, Oleg V.; Paraschuk, Dmitry Y.; Shcherbina, Maxim A.; Chvalun, Sergey N.; Ponomarenko, Sergey A.

    2015-08-01

    A novel fast, easily processible and highly reproducible approach to thiophene-based monolayer OFET fabrication by Langmuir-Blodgett or Langmuir-Schaefer techniques was developed and successfully applied. It is based on the self-assembly of organosilicon derivatives of oligothiophenes or benzothienobenzothiophene at the water-air interface. The influence of the conjugation length and the anchor-group chemistry of the self-assembling molecules on the monolayer structure and the electrical performance of monolayer OFETs was systematically investigated. Efficient monolayer OFETs with charge carrier mobilities up to 0.01 cm^2/(V s) and on/off ratios up to 10^6 were fabricated, and their functionality in integrated circuits under normal air conditions was demonstrated.

  13. Evolutionary Based Techniques for Fault Tolerant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Lohn, Jason D.

    2006-01-01

    The use of SRAM-based Field Programmable Gate Arrays (FPGAs) is becoming more and more prevalent in space applications. Commercial-grade FPGAs are potentially susceptible to permanently debilitating Single-Event Latchups (SELs). Repair methods based on Evolutionary Algorithms may be applied to FPGA circuits to enable successful fault recovery. This paper presents the experimental results of applying such methods to repair four commonly used circuits (quadrature decoder, 3-by-3-bit multiplier, 3-by-3-bit adder, 440-7 decoder) into which a number of simulated faults have been introduced. The results suggest that evolutionary repair techniques can improve the process of fault recovery when used instead of or as a supplement to Triple Modular Redundancy (TMR), which is currently the predominant method for mitigating FPGA faults.

  14. Vibration Analysis using 3D Image Correlation Technique

    NASA Astrophysics Data System (ADS)

    Siebert, T.; Splitthof, K.

    2010-06-01

    Digital speckle correlation techniques have already been successfully proven to be an accurate displacement analysis tool for a wide range of applications. With the use of two cameras, three dimensional measurements of contours and displacements can be carried out. With a simple setup it opens a wide range of applications. Rapid new developments in the field of digital imaging and computer technology opens further applications for these measurement methods to high speed deformation and strain analysis, e.g. in the fields of material testing, fracture mechanics, advanced materials and component testing. The high resolution of the deformation measurements in space and time opens a wide range of applications for vibration analysis of objects. Since the system determines the absolute position and displacements of the object in space, it is capable of measuring high amplitudes and even objects with rigid body movements. The absolute resolution depends on the field of view and is scalable. Calibration of the optical setup is a crucial point which will be discussed in detail. Examples of the analysis of harmonic vibration and transient events from material research and industrial applications are presented. The results show typical features of the system.

  15. Direct analysis of transonic rotor noise with CFD technique

    NASA Astrophysics Data System (ADS)

    Aoyama, Takashi; Saito, Shigeru

    1994-06-01

    Three-dimensional Euler equations are directly solved to analyze the high-speed impulsive (HSI) noise of a helicopter rotor by using a CFD technique. The HSI noise is one of the most important sources of helicopter noise. It is generated on the advancing side of a helicopter and is caused by the shock wave on a blade surface. Although the method which solves the Ffowcs Williams-Hawkings equation has often been used to analyze subsonic rotor noise, it does not succeed in predicting transonic rotor noise such as the HSI noise. With the advance of CFD techniques, the calculation of the HSI noise has recently been performed by a combined method of CFD with the Kirchhoff equation or by direct simulation using a CFD technique. The latter has not been studied enough because a huge number of grid points is needed to capture the propagation of sound from a blade to an observer located in the far field. Therefore, the powerful supercomputer of NAL, the Numerical Wind Tunnel (NWT), is employed to calculate the HSI noise of a non-lifting hovering rotor directly by using this method. The numerical method used to solve the governing equations is an implicit finite-difference scheme which utilizes a higher-order upwind scheme based on TVD. As a result, it is observed that the calculated waveform is in very good agreement with experimental data at the sonic cylinder. The agreement is not very good at about three rotor radii but is reasonable at about two rotor radii.

  16. RBF-based technique for statistical demodulation of pathological tremor.

    PubMed

    Gianfelici, Francesco

    2013-10-01

    This paper presents an innovative technique based on the joint approximation capabilities of radial basis function (RBF) networks and the estimation capability of the multivariate iterated Hilbert transform (IHT) for the statistical demodulation of pathological tremor from electromyography (EMG) signals in patients with Parkinson's disease. We define a stochastic model of the multichannel high-density surface EMG by means of the RBF networks applied to the reconstruction of the stochastic process (characterizing the disease) modeled by the multivariate relationships generated by the Karhunen-Loève transform in Hilbert spaces. Next, we perform a demodulation of the entire random field by means of the estimation capability of the multivariate IHT in a statistical setting. The proposed method is applied to both simulated signals and data recorded from three Parkinsonian patients, and the results show that the amplitude modulation components of the tremor oscillation can be estimated with a signal-to-noise ratio close to 30 dB, together with root-mean-square error estimates of the tremor instantaneous frequency. Additionally, the comparisons with a large number of techniques based on all the combinations of the RBF, extreme learning machine, backpropagation, support vector machine used in the first step of the algorithm; and IHT, empirical mode decomposition, multiband energy separation algorithm, periodic algebraic separation and energy demodulation used in the second step of the algorithm, clearly show the effectiveness of our technique. These results show that the proposed approach is a potentially useful tool for advanced neurorehabilitation technologies that aim at tremor characterization and suppression.

  17. Modern Micro and Nanoparticle-Based Imaging Techniques

    PubMed Central

    Ryvolova, Marketa; Chomoucka, Jana; Drbohlavova, Jana; Kopel, Pavel; Babula, Petr; Hynek, David; Adam, Vojtech; Eckschlager, Tomas; Hubalek, Jaromir; Stiborova, Marie; Kaiser, Jozef; Kizek, Rene

    2012-01-01

    The requirements for early diagnostics as well as effective treatment of insidious diseases such as cancer constantly increase the pressure on development of efficient and reliable methods for targeted drug/gene delivery as well as imaging of the treatment success/failure. One of the most recent approaches covering both the drug delivery as well as the imaging aspects is benefitting from the unique properties of nanomaterials. Therefore a new field called nanomedicine is attracting continuously growing attention. Nanoparticles, including fluorescent semiconductor nanocrystals (quantum dots) and magnetic nanoparticles, have proven their excellent properties for in vivo imaging techniques in a number of modalities such as magnetic resonance and fluorescence imaging, respectively. In this article, we review the main properties and applications of nanoparticles in various in vitro imaging techniques, including microscopy and/or laser breakdown spectroscopy and in vivo methods such as magnetic resonance imaging and/or fluorescence-based imaging. Moreover the advantages of the drug delivery performed by nanocarriers such as iron oxides, gold, biodegradable polymers, dendrimers, lipid based carriers such as liposomes or micelles are also highlighted. PMID:23202187

  18. A polarization-based Thomson scattering technique for burning plasmas

    NASA Astrophysics Data System (ADS)

    Parke, E.; Mirnov, V. V.; Den Hartog, D. J.

    2014-02-01

    The traditional Thomson scattering diagnostic is based on measurement of the wavelength spectrum of scattered light, where electron temperature measurements are inferred from thermal broadening of the spectrum. At sufficiently high temperatures, especially those predicted for ITER and other burning plasmas, relativistic effects cause a change in the degree of polarization (P) of the scattered light; for fully polarized incident laser light, the scattered light becomes partially polarized. The resulting reduction of polarization is temperature dependent and has been proposed by other authors as a potential alternative to the traditional spectral decomposition technique. Following the previously developed Stokes vector approach, we analytically calculate the degree of polarization for incoherent Thomson scattering. For the first time, we obtain exact results valid for the full range of incident laser polarization states, scattering angles, and electron temperatures. While previous work focused only on linear polarization, we show that circularly polarized incident light optimizes the degree of depolarization for a wide range of temperatures relevant to burning plasmas. We discuss the feasibility of a polarization based Thomson scattering diagnostic for ITER-like plasmas with both linearly and circularly polarized light and compare to the traditional technique.

  19. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  20. Capillary electrophoresis-based proteomic techniques for biomarker discovery.

    PubMed

    Fang, Xueping; Wang, Chenchen; Lee, Cheng S

    2013-01-01

    Besides proteome complexity, the greatest bioanalytical challenge facing comprehensive proteomic analysis, particularly in the identification of low abundance proteins, is related to the large variation of protein relative abundances. In contrast to universally enriching all analytes by a similar degree, the result of the capillary isotachophoresis (CITP) stacking process is that major components may be diluted, but trace compounds are concentrated. Such selective enhancement toward low abundance proteins drastically reduces the range of relative protein abundances within complex proteomes and greatly enhances the resulting proteome coverage. Furthermore, CITP offers seamless combination with nano-reversed phase liquid chromatography (nano-RPLC) as two highly resolving and completely orthogonal separation techniques critically needed for analyzing complex proteomes.

  1. Method of pectus excavatum measurement based on structured light technique

    NASA Astrophysics Data System (ADS)

    Glinkowski, Wojciech; Sitnik, Robert; Witkowski, Marcin; Kocoń, Hanna; Bolewicki, Pawel; Górecki, Andrzej

    2009-07-01

    We present an automatic method for assessment of pectus excavatum severity based on an optical 3-D markerless shape measurement. A four-directional measurement system based on a structured light projection method is built to capture the shape of the body surface of the patients. The system setup is described and typical measurement parameters are given. The automated data analysis path is explained. Its main steps are: normalization of trunk model orientation, cutting the model into slices, analysis of each slice shape, selecting the proper slice for the assessment of pectus excavatum of the patient, and calculating its shape parameter. We develop a new shape parameter (I3ds) that shows high correlation with the computed tomography (CT) Haller index widely used for assessment of pectus excavatum. Clinical results and the evaluation of developed indexes are presented.
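
    For context on the reference measure, the sketch below computes a crude Haller-type index (transverse chest width over anteroposterior depth) from a single slice contour; it does not reproduce the paper's I3ds parameter, and the contour handling is a deliberate simplification.

```python
import numpy as np

# Hedged sketch: the CT Haller index compares transverse chest width to the
# sternum-to-spine depth. Here the depth is approximated crudely by the full
# anteroposterior extent of the slice contour.
def haller_index(slice_points):
    """slice_points: (N, 2) array of contour points (x = transverse, y = AP)."""
    pts = np.asarray(slice_points, dtype=float)
    transverse = pts[:, 0].max() - pts[:, 0].min()
    depth = pts[:, 1].max() - pts[:, 1].min()   # crude proxy for AP depth
    return transverse / depth

contour = np.array([[-12.0, 0.0], [12.0, 0.0], [0.0, 5.0], [0.0, -4.0]])
print(f"Haller-type index ~ {haller_index(contour):.1f}")   # 24 / 9 ~ 2.7
```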

  2. Geospatial Products and Techniques at the Center for Transportation Analysis

    SciTech Connect

    Chin, Shih-Miao; Hwang, Ho-Ling; Peterson, Bruce E

    2008-01-01

    This paper highlights geospatial science-related innovations and developments conducted by the Center for Transportation Analysis (CTA) at the Oak Ridge National Laboratory. CTA researchers have been developing integrated inter-modal transportation solutions through innovative and cost-effective research and development for many years. Specifically, this paper profiles CTA-developed Geographic Information System (GIS) products that are publicly available. Examples of these GIS-related products include: the CTA Transportation Networks; GeoFreight system; and the web-based Multi-Modal Routing Analysis System. In addition, an application on assessment of railroad Hazmat routing alternatives is also discussed.

  3. Polarization-based material classification technique using passive millimeter-wave polarimetric imagery.

    PubMed

    Hu, Fei; Cheng, Yayun; Gui, Liangqi; Wu, Liang; Zhang, Xinyi; Peng, Xiaohui; Su, Jinlong

    2016-11-01

    The polarization properties of thermal millimeter-wave emission capture inherent information of objects, e.g., material composition, shape, and surface features. In this paper, a polarization-based material-classification technique using passive millimeter-wave polarimetric imagery is presented. Linear polarization ratio (LPR) is created to be a new feature discriminator that is sensitive to material type and to remove the reflected ambient radiation effect. The LPR characteristics of several common natural and artificial materials are investigated by theoretical and experimental analysis. Based on a priori information about LPR characteristics, the optimal range of incident angle and the classification criterion are discussed. Simulation and measurement results indicate that the presented classification technique is effective for distinguishing between metals and dielectrics. This technique suggests possible applications for outdoor metal target detection in open scenes.

  4. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    PubMed Central

    Wang, Chuji; Sahay, Peeyush

    2009-01-01

    Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high-sensitivity and high-selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc. are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503

  5. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Postflight test analysis data processing techniques for shuttle approach and landing tests (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.

  6. Electron Microprobe Analysis Techniques for Accurate Measurements of Apatite

    NASA Astrophysics Data System (ADS)

    Goldoff, B. A.; Webster, J. D.; Harlov, D. E.

    2010-12-01

    Apatite [Ca5(PO4)3(F, Cl, OH)] is a ubiquitous accessory mineral in igneous, metamorphic, and sedimentary rocks. The mineral contains halogens and hydroxyl ions, which can provide important constraints on fugacities of volatile components in fluids and other phases in igneous and metamorphic environments in which apatite has equilibrated. Accurate measurements of these components in apatite are therefore necessary. Analyzing apatite by electron microprobe (EMPA), which is a commonly used geochemical analytical technique, has often been found to be problematic and previous studies have identified sources of error. For example, Stormer et al. (1993) demonstrated that the orientation of an apatite grain relative to the incident electron beam could significantly affect the concentration results. In this study, a variety of alternative EMPA operating conditions for apatite analysis were investigated: a range of electron beam settings, count times, crystal grain orientations, and calibration standards were tested. Twenty synthetic anhydrous apatite samples that span the fluorapatite-chlorapatite solid solution series, and whose halogen concentrations were determined by wet chemistry, were analyzed. Accurate measurements of these samples were obtained with many EMPA techniques. One effective method includes setting a static electron beam to 10-15nA, 15kV, and 10 microns in diameter. Additionally, the apatite sample is oriented with the crystal’s c-axis parallel to the slide surface and the count times are moderate. Importantly, the F and Cl EMPA concentrations are in extremely good agreement with the wet-chemical data. We also present EMPA operating conditions and techniques that are problematic and should be avoided. J.C. Stormer, Jr. et al., Am. Mineral. 78 (1993) 641-648.

  7. Plasma and trap-based techniques for science with positrons

    NASA Astrophysics Data System (ADS)

    Danielson, J. R.; Dubin, D. H. E.; Greaves, R. G.; Surko, C. M.

    2015-01-01

    In recent years, there has been a wealth of new science involving low-energy antimatter (i.e., positrons and antiprotons) at energies ranging from 10^2 to less than 10^-3 eV. Much of this progress has been driven by the development of new plasma-based techniques to accumulate, manipulate, and deliver antiparticles for specific applications. This article focuses on the advances made in this area using positrons. However, many of the resulting techniques are relevant to antiprotons as well. An overview is presented of relevant theory of single-component plasmas in electromagnetic traps. Methods are described to produce intense sources of positrons and to efficiently slow the typically energetic particles thus produced. Techniques are described to trap positrons efficiently and to cool and compress the resulting positron gases and plasmas. Finally, the procedures developed to deliver tailored pulses and beams (e.g., in intense, short bursts, or as quasimonoenergetic continuous beams) for specific applications are reviewed. The status of development in specific application areas is also reviewed. One example is the formation of antihydrogen atoms for fundamental physics [e.g., tests of invariance under charge conjugation, parity inversion, and time reversal (the CPT theorem), and studies of the interaction of gravity with antimatter]. Other applications discussed include atomic and materials physics studies and the study of the electron-positron many-body system, including both classical electron-positron plasmas and the complementary quantum system in the form of Bose-condensed gases of positronium atoms. Areas of future promise are also discussed. The review concludes with a brief summary and a list of outstanding challenges.

  8. Modeling and simulation of atmosphere interference signal based on FTIR spectroscopy technique

    NASA Astrophysics Data System (ADS)

    Zhang, Yugui; Li, Qiang; Yu, Zhengyang; Liu, Zhengmin

    2016-09-01

    Fourier Transform Infrared spectroscopy technique, featured with large frequency range and high spectral resolution, is becoming the research focus in spectrum analysis area, and is spreading in atmosphere detection applications in the aerospace field. In this paper, based on FTIR spectroscopy technique, the principle of atmosphere interference signal generation is deduced in theory, and also its mathematical model and simulation are carried out. Finally, the intrinsic characteristics of the interference signal in time domain and frequency domain, which give a theoretical foundation to the performance parameter design of electrical signal processing, are analyzed.
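
    A minimal sketch of the signal model described above: a single spectral line produces a cosine interferogram in optical path difference, and a Fourier transform recovers the line position. The wavenumber and scan parameters are illustrative assumptions, not instrument values.

```python
import numpy as np

# Hedged sketch of a monochromatic FTIR interference signal and its recovery.
nu = 1500.0                        # line position [cm^-1], illustrative
opd = np.linspace(0.0, 1.0, 4096)  # optical path difference [cm]

interferogram = 0.5 * (1.0 + np.cos(2.0 * np.pi * nu * opd))

# Remove the DC level, then transform back to the wavenumber domain.
spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
wavenumber = np.fft.rfftfreq(opd.size, d=opd[1] - opd[0])
print(f"recovered line near {wavenumber[np.argmax(spectrum)]:.1f} cm^-1")
```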

  9. Human reliability analysis (HRA) techniques and observational clinical HRA.

    PubMed

    Cuschieri, Alfred; Tang, B

    2010-01-01

    This review explains the nature of human reliability analysis (HRA) methods developed and used for predicting safety in high-risk human activities. HRA techniques have evolved over the years and have become less subjective as a result of inclusion of (i) cognitive factors in the man-machine interface and (ii) high and low dependency levels between human failure events (HFEs). All however remain probabilistic in the assessment of safety. In the translation of these techniques, developed for assessment of safety of high-risk industries (nuclear, aerospace etc.) where catastrophic failures from the man-machine complex interface are fortunately rare, to the clinical operative surgery (with its high incidence of human errors), the system loses subjectivity since the documentation of HFEs can be assessed and studied prospectively on the basis of an objective data capture of errors enacted during a defined clinical activity. The observational clinical-HRA (OC-HRA) was developed specifically for this purpose, initially for laparoscopic general surgery. It has however been used by other surgical specialties. OC-HRA has the additional merit of objective determination of the proficiency of a surgeon in executing specific interventions and is adaptable to the evaluation of safety and proficiency in clinical activities within the preoperative and postoperative periods.

  10. An objective isobaric/isentropic technique for upper air analysis

    NASA Technical Reports Server (NTRS)

    Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.

    1981-01-01

    An objective meteorological analysis technique is presented whereby both horizontal and vertical upper-air analyses are performed. The process used to interpolate grid-point values from the upper-air station data is the same for grid points on an isobaric surface as for those on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure-height values are interpolated from data that lie on the isentropic surface that passes through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The objective products of version A generally show fair correspondence with the subjective analyses and with the station data, and depict the structure of the upper fronts, tropopauses, and jet streams fairly well. The version B objective products correspond more closely to the subjective analyses, and show the same strong gradients across the upper front with only minor smoothing.
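
    A hedged sketch of anisotropic distance weighting of nearby observations at one grid point; the elliptical scale factors and the inverse-distance weight function are illustrative choices, not the scheme used in version A or B.

```python
import numpy as np

# Hedged sketch: weight observations by an elliptical (direction-dependent)
# distance so stations along one axis count more than stations across it.
def anisotropic_interp(grid_xy, obs_xy, obs_val, along=3.0, across=1.0):
    """Interpolate one grid point from nearby observations.

    along/across stretch the distance metric along the x/y axes here; in
    practice the axes would be rotated into the local flow direction.
    """
    d = obs_xy - grid_xy
    dist2 = (d[:, 0] / along) ** 2 + (d[:, 1] / across) ** 2
    w = 1.0 / (dist2 + 1e-6)   # inverse elliptical distance squared
    return np.sum(w * obs_val) / np.sum(w)

obs_xy = np.array([[1.0, 0.0], [0.0, 1.0], [-2.0, 0.5]])
obs_val = np.array([280.0, 285.0, 282.0])
print(anisotropic_interp(np.zeros(2), obs_xy, obs_val))
```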

  11. Transit Spectroscopy: new data analysis techniques and interpretation

    NASA Astrophysics Data System (ADS)

    Tinetti, Giovanna; Waldmann, Ingo P.; Morello, Giuseppe; Tessenyi, Marcell; Varley, Ryan; Barton, Emma; Yurchenko, Sergey; Tennyson, Jonathan; Hollis, Morgan

    2014-11-01

    Planetary science beyond the boundaries of our Solar System is today in its infancy. Until a couple of decades ago, the detailed investigation of the planetary properties was restricted to objects orbiting inside the Kuiper Belt. Today, we cannot ignore that the number of known planets has increased by two orders of magnitude nor that these planets resemble anything but the objects present in our own Solar System. A key observable for planets is the chemical composition and state of their atmosphere. To date, two methods can be used to sound exoplanetary atmospheres: transit and eclipse spectroscopy, and direct imaging spectroscopy. Although the field of exoplanet spectroscopy has been very successful in past years, there are a few serious hurdles that need to be overcome to progress in this area: in particular instrument systematics are often difficult to disentangle from the signal, data are sparse and often not recorded simultaneously causing degeneracy of interpretation. We will present here new data analysis techniques and interpretation developed by the “ExoLights” team at UCL to address the above-mentioned issues. Said techniques include statistical tools, non-parametric, machine-learning algorithms, optimized radiative transfer models and spectroscopic line-lists. These new tools have been successfully applied to existing data recorded with space and ground instruments, shedding new light on our knowledge and understanding of these alien worlds.

  12. Radial velocity data analysis with compressed sensing techniques

    NASA Astrophysics Data System (ADS)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2017-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
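
    A small sketch of the underlying idea, assuming an ordinary sparse regression (scikit-learn's Lasso) over a sinusoid dictionary rather than the authors' Gaussian-process-compatible algorithm; the epochs, amplitudes, trial frequencies, and regularization strength are made up.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hedged sketch: represent an irregularly sampled radial-velocity series in a
# dictionary of sinusoids and look for a sparse set of active frequencies,
# instead of scanning one trial period at a time.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 200.0, 80))                  # irregular epochs [days]
rv = 3.0 * np.sin(2 * np.pi * t / 17.0) + 0.5 * rng.normal(size=t.size)

freqs = np.linspace(1.0 / 100.0, 1.0 / 2.0, 400)           # trial frequencies [1/day]
X = np.hstack([np.sin(2 * np.pi * np.outer(t, freqs)),
               np.cos(2 * np.pi * np.outer(t, freqs))])

coef = Lasso(alpha=0.1, max_iter=50_000).fit(X, rv).coef_
amp = np.hypot(coef[:400], coef[400:])                     # amplitude per frequency
print(f"strongest period ~ {1.0 / freqs[np.argmax(amp)]:.1f} days")
```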

  13. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a 'domain theory'), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.

  14. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. (Dept. of Computer Sciences); Noordewier, M.O. (Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.

  15. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. This analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system, with variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, the relative stability criteria based on the concept of singular values were explored.
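
    A brief sketch of the singular-value relative-stability measure mentioned above: the smallest singular value of the return-difference matrix I + L(jω), swept over frequency, serves as a multivariable stability margin. The 2x2 loop transfer function below is a toy example, not the yaw/roll damper model from the report.

```python
import numpy as np

# Hedged sketch: minimum singular value of the return-difference matrix as a
# frequency-dependent multivariable stability margin.
def min_sv_return_difference(L_of_s, omegas):
    margins = []
    for w in omegas:
        L = L_of_s(1j * w)
        s = np.linalg.svd(np.eye(L.shape[0]) + L, compute_uv=False)
        margins.append(s[-1])   # smallest singular value at this frequency
    return np.array(margins)

def L_toy(s):
    return np.array([[2.0 / (s + 1.0), 0.2 / (s + 2.0)],
                     [0.1 / (s + 1.0), 1.0 / (s + 3.0)]])

omegas = np.logspace(-1, 2, 200)
print(f"worst-case margin ~ {min_sv_return_difference(L_toy, omegas).min():.2f}")
```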

  16. Detecting Molecular Properties by Various Laser-Based Techniques

    SciTech Connect

    Hsin, Tse-Ming

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of the dimer geometry. Direct observation of vibrational peaks and evolution of coumarin 153 in the electronic excited state was demonstrated by using the fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibration peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with biomimetic containers (liposomes), allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interferences from glass surfaces. The result showed a wide distribution of the enzyme reactivity. Protein structural variation is one of the major reasons that are responsible for this highly heterogeneous behavior.

  17. An improved visualization-based force-measurement technique for short-duration hypersonic facilities

    NASA Astrophysics Data System (ADS)

    Laurence, Stuart J.; Karl, Sebastian

    2010-06-01

    This article is concerned with describing and exploring the limitations of an improved version of a recently proposed visualization-based technique for the measurement of forces and moments in short-duration hypersonic wind tunnels. The technique is based on tracking the motion of a free-flying body over a sequence of high-speed visualizations; while this idea is not new in itself, the use of high-speed digital cinematography combined with a highly accurate least-squares tracking algorithm allows improved results over what has previously been possible with such techniques. The technique precision is estimated through the analysis of artificially constructed and experimental test images, and the resulting error in acceleration measurements is characterized. For wind-tunnel scale models, position measurements to within a few microns are shown to be readily attainable. Image data from two previous experimental studies in the T5 hypervelocity shock tunnel are then reanalyzed with the improved technique: the uncertainty in the mean drag acceleration is shown to be reduced to the order of the flow unsteadiness, 2-3%, and time-resolved acceleration measurements are also shown to be possible. The response time of the technique for the configurations studied is estimated to be ˜0.5 ms. Comparisons with computations using the DLR TAU code also yield agreement to within the overall experimental uncertainty. Measurement of the pitching moment for blunt geometries still appears challenging, however.
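
    As a rough illustration of how tracked positions become force measurements, the sketch below fits a quadratic to position versus time and reads the acceleration from the curvature; the frame rate, motion, mass, and noise level are invented, and the actual technique relies on a more elaborate least-squares tracking algorithm.

```python
import numpy as np

# Hedged sketch: recover a (drag) acceleration from tracked model positions by
# a least-squares quadratic fit, then force follows from F = m * a.
def acceleration_from_track(t, x):
    """Fit x(t) ~ a2*t^2 + a1*t + a0 and return the acceleration 2*a2."""
    a2, a1, a0 = np.polyfit(t, x, deg=2)
    return 2.0 * a2

fps = 10_000.0
t = np.arange(50) / fps                    # 5 ms of frames
true_accel = -85.0                         # m/s^2, illustrative deceleration
x = 0.5 * true_accel * t**2 + 1e-6 * np.random.default_rng(3).normal(size=t.size)

a = acceleration_from_track(t, x)
print(f"recovered acceleration ~ {a:.1f} m/s^2; force = m*a for the model mass m")
```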

  18. Adolescent baseball pitching technique: lower extremity biomechanical analysis.

    PubMed

    Milewski, Matthew D; Õunpuu, Sylvia; Solomito, Matthew; Westwell, Melany; Nissen, Carl W

    2012-11-01

    Documentation of the lower extremity motion patterns of adolescent pitchers is an important part of understanding the pitching motion and the implication of lower extremity technique on upper extremity loads, injury and performance. The purpose of this study was to take the initial step in this process by documenting the biomechanics of the lower extremities during the pitching cycle in adolescent pitchers and to compare these findings with the published data for older pitchers. Three-dimensional motion analysis using a comprehensive lower extremity model was used to evaluate the fast ball pitch technique in adolescent pitchers. Thirty-two pitchers with a mean age of 12.4 years (range 10.5-14.7 years) and at least 2 years of experience were included in this study. The pitchers showed a mean of 49 ± 12° of knee flexion of the lead leg at foot contact. They tended to maintain this position through ball release, and then extended their knee during the follow through phase (ball release to maximal internal glenohumeral rotation). The lead leg hip rapidly progressed into adduction and flexion during the arm cocking phase with a range of motion of 40 ± 10° adduction and 30 ± 13° flexion. The lead hip mean peak adduction velocity was 434 ± 83°/s and flexion velocity was 456 ± 156°/s. Simultaneously, the trailing leg hip rapidly extended, approaching a mean peak extension of -8 ± 5° at 39% of the pitch cycle, which is close to passive range of motion constraints. Peak hip abduction of the trailing leg at foot contact was -31 ± 12°, which also approached passive range of motion constraints. Differences and similarities were also noted between the adolescent lower extremity kinematics and adult pitchers; however, a more comprehensive analysis using similar methods is needed for a complete comparison.

  19. A new technique for the separation and analysis of organomercury compounds: HPLC-PCO-CVAAS

    SciTech Connect

    Engelhart, W.G.

    1994-12-31

    While methodologies and instrumentation for mercury are well established, a simple, reliable technique for quantifying organomercury compounds has not emerged. The environmental impact of organomercurials cannot be accurately assessed without data from reliable, standardized analytical procedures. AOAC methods do exist for the analysis of methylmercury in fish tissue and are used for compliance monitoring of the FDA's 1 ppm action level. However, these gas chromatographic based methods exhibit poor selectivity for organomercury compounds and limited sensitivity due to the small injection volumes used. Virtually all other publications in the field are feasibility studies reporting results obtained using modified, experimental instrumentation. Difficulties in interfacing the instruments required for separation with the instruments performing the quantitation function have hindered adoption of these experimental approaches as routine analytical methods. A new technique for the separation and analysis of organomercury compounds that overcomes the limitations of other techniques has recently been demonstrated. This technique, termed HPLC-PCO-CVAAS, combines high performance liquid chromatography with a post-column oxidation step followed by cold vapor atomic absorption spectroscopy. The underlying principles of the HPLC-PCO-CVAAS technique will be discussed and contrasted with other techniques. Analytical results obtained with methyl, phenyl and ethyl mercury species, and inorganic mercury (II) will be reported.

  20. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication

  1. Retinoblastoma-comparative analysis of external radiotherapy techniques, including an IMRT technique

    SciTech Connect

    Reisner, Marcio Lemberg, E-mail: mreisner@uol.com.br; Viegas, Celia Maria Pais; Grazziotin, Rachele Zanchet; Santos Batista, Delano Valdivino; Carneiro, Tulio Meneses; Mendonca de Araujo, Carlos Manoel; Marchiori, Edson

    2007-03-01

    Purpose: To compare the numerous external radiotherapy (RT) techniques for the treatment of retinoblastoma, as well as an intensity-modulated RT (IMRT) technique. The latter was elaborated to evaluate the potential dose reduction in the surrounding tissue, as well as the potential avoidance of subdosage in the ora serrata retinae. Methods and Materials: A 2-year-old patient with unilateral retinoblastoma underwent CT. With the aid of an ophthalmologist, the ocular structures were delimited, and 13 techniques described in published reports were reproduced on three-dimensional planning software and identified according to their authors. A technique with four noncoplanar fields using IMRT was also elaborated. These techniques were compared according to the dose to the ora serrata retinae, lens, orbit (volume that received a dose of ≥20 Gy), vitreous, optic nerve, lacrimal gland (volume that received a dose of ≥34 Gy), and cornea and according to their ease of reproducibility. Results: The techniques that attained the therapeutic dose to the ora serrata retinae were the IMRT technique and the techniques of Haye, Cassady, Cormack, and al-Beteri. The Cormack technique had the lowest volume that received a dose of ≥20 Gy in the orbit, followed by the IMRT technique. The IMRT technique also achieved the lowest volume that received a dose of ≥34 Gy (14%) in the lacrimal gland. The Abramson/McCormick/Blach, Cassady, Reese, and Schipper techniques were the easiest to reproduce and the Chin the most complex. Conclusion: Retinoblastoma treatment with IMRT has an advantage over the other techniques, because it allows for the greatest reduction of dose to the orbit and lacrimal gland, while maintaining the therapeutic dose to the ora serrata retinae and vitreous.

  2. Validation techniques for fault emulation of SRAM-based FPGAs

    DOE PAGES

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation system does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  3. Validation techniques for fault emulation of SRAM-based FPGAs

    SciTech Connect

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation system does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  4. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    PubMed Central

    Parkash, Om; Hanim Shueb, Rafidah

    2015-01-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, the clinical features of dengue are indistinguishable from those of other infectious diseases such as malaria, chikungunya, rickettsiosis and leptospirosis. Therefore, a prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phases and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  5. Mars laser altimeter based on a single photon ranging technique

    NASA Technical Reports Server (NTRS)

    Prochazka, Ivan; Hamal, Karel; Sopko, B.; Pershin, S.

    1993-01-01

    The Mars 94/96 Mission will carry, among other things, the balloon probe experiment. The balloon, with the scientific cargo in the gondola underneath, will drift in the Mars atmosphere; its altitude will range from zero at night up to 5 km at noon. The accurate gondola altitude will be determined by an altimeter. As the balloon gondola mass is strictly limited, the altimeter's total mass and power consumption are critical; the maximum allowed is a few hundred grams and a few tens of milliwatts of average power consumption. We proposed, designed, and constructed a laser altimeter based on the single photon ranging technique. Topics covered include the following: principle of operation, altimeter construction, and ground tests.
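
    A hedged illustration of the single photon ranging principle described above: photon return times accumulated over many shots are histogrammed, the peak bin gives the round-trip delay, and the altitude follows from h = c*t/2. The photon arrival data, detector jitter and range gate below are simulated assumptions, not instrument values.

```python
import numpy as np

C = 299_792_458.0                      # speed of light, m/s
true_altitude_m = 3200.0
true_delay_s = 2 * true_altitude_m / C

rng = np.random.default_rng(1)
n_shots = 5000
# A fraction of shots return signal photons clustered around the true round-trip
# delay (~1 ns detector jitter); the rest are background noise in a 60 us gate.
signal = rng.normal(true_delay_s, 1e-9, size=int(0.2 * n_shots))
noise = rng.uniform(0.0, 60e-6, size=int(0.8 * n_shots))
arrivals = np.concatenate([signal, noise])

counts, edges = np.histogram(arrivals, bins=2000, range=(0.0, 60e-6))
peak = np.argmax(counts)
peak_delay = 0.5 * (edges[peak] + edges[peak + 1])
print(f"estimated altitude: {C * peak_delay / 2:.1f} m (true: {true_altitude_m} m)")
```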

  6. Protein elasticity probed with two synchrotron-based techniques.

    PubMed

    Leu, Bogdan M; Alatas, Ahmet; Sinn, Harald; Alp, E Ercan; Said, Ayman H; Yavaş, Hasan; Zhao, Jiyong; Sage, J Timothy; Sturhahn, Wolfgang

    2010-02-28

    Compressibility characterizes three interconnecting properties of a protein: dynamics, structure, and function. The compressibility values available in the literature for the electron-carrying protein cytochrome c, as well as for other proteins, vary considerably. Here, we apply two synchrotron-based techniques--nuclear resonance vibrational spectroscopy and inelastic x-ray scattering--to measure the adiabatic compressibility of this protein. This is the first report of the compressibility of any material measured with this method. Unlike the methods previously used, this novel approach probes the protein globally, at ambient pressure, does not require the separation of protein and solvent contributions to the total compressibility, and uses samples that contain the heme iron, as in the native state. We show, by comparing our results with molecular dynamics predictions, that the compressibility is almost independent of temperature. We discuss potential applications of this method to other materials beyond proteins.

  7. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques.

    PubMed

    Parkash, Om; Shueb, Rafidah Hanim

    2015-10-19

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, the clinical features of dengue are indistinguishable from those of other infectious diseases such as malaria, chikungunya, rickettsiosis and leptospirosis. Therefore, a prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phases and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed.

  8. Protein elasticity probed with two synchrotron-based techniques.

    SciTech Connect

    Leu, B. M.; Alatas, A.; Sinn, H.; Alp, E. E.; Said, A.; Yavas, H.; Zhao, J.; Sage, J. T.; Sturhahn, W.; X-Ray Science Division; Hasylab; Northeastern Univ.

    2010-02-25

    Compressibility characterizes three interconnecting properties of a protein: dynamics, structure, and function. The compressibility values available in the literature for the electron-carrying protein cytochrome c, as well as for other proteins, vary considerably. Here, we apply two synchrotron-based techniques - nuclear resonance vibrational spectroscopy and inelastic x-ray scattering - to measure the adiabatic compressibility of this protein. This is the first report of the compressibility of any material measured with this method. Unlike the methods previously used, this novel approach probes the protein globally, at ambient pressure, does not require the separation of protein and solvent contributions to the total compressibility, and uses samples that contain the heme iron, as in the native state. We show, by comparing our results with molecular dynamics predictions, that the compressibility is almost independent of temperature. We discuss potential applications of this method to other materials beyond proteins.

  9. Phase-based cell imaging techniques for microbeam irradiations

    NASA Astrophysics Data System (ADS)

    Ross, G. J.; Bigelow, A. W.; Randers–Pehrson, G.; Peng, C. C.; Brenner, D. J.

    2005-12-01

    The microbeam facility at Columbia University is expanding current protocols for single-particle, single-cell irradiations, so experimenters can locate and irradiate the nuclei and cytoplasm of unstained cells. The ion beamline is located directly under the dish; therefore, any new techniques must use reflection microscopy. Two approaches are being integrated, and neither requires the removal of the cell growth medium prior to irradiation. A novel immersion-based Mirau interferometry lens, which uses low-coherence light sources to inhibit unwanted fringing, is under design. The process requires a vertical stage motion precision of tens of nanometers or better, which will be accomplished with our custom high-precision z-stage. Quantitative phase microscopy is under testing, also using the z-stage. Future plans include optimization of software routines to decrease the time between irradiations. Both methods will be compared further with the automated location routines which use nuclear and cytoplasm stains.

  10. A borax fusion technique for quantitative X-ray fluorescence analysis.

    PubMed

    Van Willigen, J H; Kruidhof, H; Dahmen, E A

    1971-04-01

    A borax fusion technique to cast glass discs for quantitative X-ray analysis is described in detail. The method is based on the "nonwetting" properties of a Pt/Au alloy towards molten borax, on the favourable composition of the flux and finally on the favourable form of the casting mould. The critical points of the technique are stressed, resulting in a method which could be carried out successfully by inexperienced workers. In general the method compares favourably in speed and accuracy with wet-chemical methods.

  11. Structural break detection method based on the Adaptive Regression Splines technique

    NASA Astrophysics Data System (ADS)

    Kucharczyk, Daniel; Wyłomańska, Agnieszka; Zimroz, Radosław

    2017-04-01

    For many real data sets, a long-term observation consists of different processes that coexist or occur one after another. These processes very often exhibit different statistical properties, so the observed data should be segmented before further analysis. This problem arises in many applications, and new segmentation techniques have therefore appeared in the literature in recent years. In this paper we propose a new method of time series segmentation, i.e. extraction from the analysed vector of observations of homogeneous parts with similar behaviour. The method is based on the absolute deviation about the median of the signal and extends previously proposed techniques that are also based on simple statistics. We introduce a structural break point detection method based on the Adaptive Regression Splines technique, a form of regression analysis, and we also propose a statistical test for distinguishing between different regimes. First, the methodology is applied to simulated signals with different distributions to show the effectiveness of the new technique. Next, in the application part, we analyse a real data set representing the vibration signal from a heavy-duty crusher used in a mineral processing plant.
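
    A minimal sketch of the underlying idea, under stated simplifications: the cumulative absolute deviation about the global median changes slope when the scale of the signal changes, and a single break point can be located by brute-force fitting a two-segment linear model to that cumulative statistic. This is a plain stand-in for illustration, not the authors' Adaptive Regression Splines algorithm or their statistical test.

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated two-regime signal: low variance followed by high variance.
x = np.concatenate([rng.normal(0, 1.0, 600), rng.normal(0, 4.0, 400)])

c = np.cumsum(np.abs(x - np.median(x)))      # cumulative deviation about the median
n = len(c)
t = np.arange(n)

def two_segment_sse(knot):
    """Total squared error of two independent straight-line fits split at knot."""
    sse = 0.0
    for sl in (slice(0, knot), slice(knot, n)):
        coef = np.polyfit(t[sl], c[sl], 1)
        sse += np.sum((c[sl] - np.polyval(coef, t[sl])) ** 2)
    return sse

breakpoint_idx = min(range(20, n - 20), key=two_segment_sse)
print(f"estimated structural break near sample {breakpoint_idx} (true: 600)")
```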

  12. Optical transmission testing based on asynchronous sampling techniques

    NASA Astrophysics Data System (ADS)

    Mrozek, T.; Perlicki, K.; Wilczewski, G.

    2016-09-01

    This paper presents a method for analysing images obtained with the Asynchronous Delay Tap Sampling technique, which is used for simultaneous monitoring of several phenomena in the physical layer of an optical network. The method visualizes results in the form of an optical signal's waveform (characteristics depicting phase portraits). The shape of the waveform changes depending on the specific phenomenon being observed (i.e., chromatic dispersion, polarization mode dispersion, or ASE noise). The waveforms presented here were acquired with the OptSim 4.0 simulation package. After simulation testing, the numerical data were transformed into images, which were then analysed using the authors' custom algorithms. These algorithms apply various pixel operations and create a report characterizing each image. Each report lists the number of black pixels present in each image segment. The reports for the original and the impaired signals are then compared, and a differential report is created consisting of a "binary key" that shows the increase in the number of pixels in each segment. The ultimate aim of this work is to find the correlation between the generated binary keys and the observed phenomenon, allowing identification of the type of interference occurring; determining the respective magnitudes of the interference remains for the further course of the work. The work presented here delivers the first objective - the ability to recognize interference.
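
    A hedged sketch of the segment-report idea: each phase-portrait image is split into a grid, black pixels are counted per segment, and the reference and impaired reports are differenced into a "binary key" marking segments where the count grew. The images here are synthetic binary arrays and the grid size is an assumption, not the authors' settings.

```python
import numpy as np

def segment_report(binary_img, grid=(8, 8)):
    """Count black (True) pixels in each cell of a grid partition of the image."""
    rows = np.array_split(binary_img, grid[0], axis=0)
    return np.array([[int(cell.sum()) for cell in np.array_split(r, grid[1], axis=1)]
                     for r in rows])

rng = np.random.default_rng(3)
reference = rng.random((256, 256)) < 0.05                 # sparse black pixels
impaired = reference | (rng.random((256, 256)) < 0.02)    # impairment adds pixels

diff = segment_report(impaired) - segment_report(reference)
binary_key = (diff > 0).astype(int)       # 1 where the black-pixel count increased
print(binary_key)
```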

  13. Comparison of gas chromatographic hyphenated techniques for mercury speciation analysis.

    PubMed

    Nevado, J J Berzas; Martín-Doimeadios, R C Rodríguez; Krupp, E M; Bernardo, F J Guzmán; Fariñas, N Rodríguez; Moreno, M Jiménez; Wallace, D; Ropero, M J Patiño

    2011-07-15

    In this study, we evaluate advantages and disadvantages of three hyphenated techniques for mercury speciation analysis in different sample matrices using gas chromatography (GC) with mass spectrometry (GC-MS), inductively coupled plasma mass spectrometry (GC-ICP-MS) and pyrolysis atomic fluorescence (GC-pyro-AFS) detection. Aqueous ethylation with NaBEt(4) was required in all cases. All systems were validated with respect to precision, with repeatability and reproducibility <5% RSD, confirmed by the Snedecor F-test. All methods proved to be robust according to a Plackett-Burman design for 7 factors and 15 experiments, and calculations were carried out using the procedures described by Youden and Steiner. In order to evaluate accuracy, certified reference materials (DORM-2 and DOLT-3) were analyzed after closed-vessel microwave extraction with tetramethylammonium hydroxide (TMAH). No statistically significant differences were found from the certified values (p=0.05). The suitability for water sample analysis with different organic matter and chloride contents was evaluated by recovery experiments in synthetic spiked waters. Absolute detection and quantification limits were in the range of 2-6 pg for GC-pyro-AFS, 1-4 pg for GC-MS, with 0.05-0.21 pg for GC-ICP-MS showing the best limits of detection for the three systems employed. However, all systems are sufficiently sensitive for mercury speciation in environmental samples, with GC-MS and GC-ICP-MS offering isotope analysis capabilities for the use of species-specific isotope dilution analysis, and GC-pyro-AFS being the most cost-effective alternative.

  14. Improved Tandem Measurement Techniques for Aerosol Particle Analysis

    NASA Astrophysics Data System (ADS)

    Rawat, Vivek Kumar

    Non-spherical, chemically inhomogeneous (complex) nanoparticles are encountered in a number of natural and engineered environments, including combustion systems (which produce highly non-spherical aggregates), reactors used in gas-phase materials synthesis of doped or multicomponent materials, and in ambient air. These nanoparticles are often highly diverse in size, composition and shape, and hence require determination of property distribution functions for accurate characterization. This thesis focuses on development of tandem mobility-mass measurement techniques coupled with appropriate data inversion routines to facilitate measurement of two dimensional size-mass distribution functions while correcting for the non-idealities of the instruments. Chapter 1 provides the detailed background and motivation for the studies performed in this thesis. In chapter 2, the development of an inversion routine is described which is employed to determine two dimensional size-mass distribution functions from Differential Mobility Analyzer-Aerosol Particle Mass analyzer tandem measurements. Chapter 3 demonstrates the application of the two dimensional distribution function to compute the cumulative mass distribution function and also evaluates the validity of this technique by comparing the calculated total mass concentrations to measured values for a variety of aerosols. In Chapter 4, this tandem measurement technique with the inversion routine is employed to analyze colloidal suspensions. Chapter 5 focuses on application of a transverse modulation ion mobility spectrometer coupled with a mass spectrometer to study the effect of vapor dopants on the mobility shifts of sub 2 nm peptide ion clusters. These mobility shifts are then compared to models based on vapor uptake theories. Finally, in Chapter 6, a conclusion of all the studies performed in this thesis is provided and future avenues of research are discussed.

  15. Novel failure analysis techniques using photon probing with a scanning optical microscope

    SciTech Connect

    Cole, E.I. Jr.; Soden, J.M.; Rife, J.L.; Barton, D.L.; Henderson, C.L.

    1993-12-31

    Three new failure analysis techniques for integrated circuits (ICs) have been developed using localized photon probing with a scanning optical microscope (SOM). The first two are light-induced voltage alteration (LIVA) imaging techniques that (1) localize open-circuited and damaged junctions and (2) image transistor logic states. The third technique uses the SOM to control logic states optically from the IC backside. LIVA images are produced by monitoring the voltage fluctuations of a constant current power supply as a laser beam is scanned over the IC. High selectivity for localizing defects has been demonstrated using the LIVA approach. Logic state mapping results, similar to previous work using biased optical beam induced current (OBIC) and laser probing approaches have also been produced using LIVA. Application of the two LIVA based techniques to backside failure analysis has been demonstrated using an infrared laser source. Optical logic state control is based upon earlier work examining transistor response to photon injection. The physics of each method and their applications for failure analysis are described.

  16. Chemometric experimental design based optimization techniques in capillary electrophoresis: a critical review of modern applications.

    PubMed

    Hanrahan, Grady; Montes, Ruthy; Gomez, Frank A

    2008-01-01

    A critical review of recent developments in the use of chemometric experimental design based optimization techniques in capillary electrophoresis applications is presented. Current advances have led to enhanced separation capabilities of a wide range of analytes in such areas as biological, environmental, food technology, pharmaceutical, and medical analysis. Significant developments in design, detection methodology and applications from the last 5 years (2002-2007) are reported. Furthermore, future perspectives in the use of chemometric methodology in capillary electrophoresis are considered.

  17. Hybrid Analytical Technique for Nonlinear Vibration Analysis of Thin-Walled Beams

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Hadian, Jafar M.; Andersen, Carl M.

    1993-01-01

    A two-step hybrid analytical technique is presented for the nonlinear vibration analysis of thin-walled beams. The first step involves the generation of various-order perturbation functions using the Lindstedt-Poincare perturbation technique. The second step consists of using the perturbation functions as coordinate (or approximation) functions and then computing both the amplitudes of these functions and the nonlinear frequency of vibration via a direct variational procedure. The analytical formulation is based on a form of the geometrically nonlinear beam theory with the effects of in-plane inertia, rotatory inertia, and transverse shear deformation included. The effectiveness of the proposed technique is demonstrated by means of a numerical example of a thin-walled beam with a doubly symmetric I-section. The solutions obtained using a single-spatial mode were compared with those obtained using multiple-spatial modes. The standard of comparison was taken to be the frequencies obtained by the direct integration/fast Fourier transform (FFT) technique. The nonlinear frequencies obtained by the hybrid technique were shown to converge to the corresponding ones obtained by the direct integration/fast Fourier transform (FFT) technique well beyond the range of applicability of the perturbation technique. The frequencies and total strain energy of the beam were overestimated by using a single-spatial mode.
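
    As a hedged illustration of the comparison strategy described above, the sketch below applies a first-order Lindstedt-Poincare estimate of the nonlinear frequency to the classical Duffing oscillator (not the thin-walled-beam model of the paper) and checks it against direct time integration followed by an FFT; all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

w0, eps, A = 1.0, 0.2, 1.0                 # x'' + w0^2 x + eps x^3 = 0, x(0) = A
w_lp = w0 * (1.0 + 3.0 * eps * A**2 / (8.0 * w0**2))   # first-order Lindstedt-Poincare

def rhs(t, y):
    x, v = y
    return [v, -w0**2 * x - eps * x**3]

T, n = 400.0, 2**15
t = np.linspace(0.0, T, n)
sol = solve_ivp(rhs, (0.0, T), [A, 0.0], t_eval=t, rtol=1e-9, atol=1e-9)

spec = np.abs(np.fft.rfft(sol.y[0]))
freqs = np.fft.rfftfreq(n, d=t[1] - t[0])
w_fft = 2 * np.pi * freqs[np.argmax(spec[1:]) + 1]      # skip the DC bin

print(f"omega (perturbation)      = {w_lp:.4f} rad/s")
print(f"omega (integration + FFT) = {w_fft:.4f} rad/s")
```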

  18. State of the art in feedstuff analysis: a technique-oriented perspective.

    PubMed

    Cheli, Federica; Battaglia, Debora; Pinotti, Luciano; Baldi, Antonella

    2012-09-26

    The need for global feed supply traceability, the high-throughput testing demands of the feed industry, and regulatory enforcement drive the need for feed analysis and make the control and evaluation of feed quality, safety, and functional properties extremely complex, all of which contributes to the very high number of analyses that must be performed. Feed analysis, with respect to animal nutritional requirements, health, reproduction, and production, requires a multi-analytical approach. In addition to standard methods of chemical analysis, new methods for evaluation of feed composition and functional properties, authenticity, and safety have been developed. Requirements for new analytical methods emphasize performance, sensitivity, reliability, speed, simplified use, low cost for high volume, and routine assays. This review provides an overview of the most used and promising methods for feed analysis. The review is intentionally focused on the following techniques: classical chemical analysis; in situ and in vitro methods; analytical techniques coupled with chemometric tools (NIR and sensors); and cell-based bioassays. This review describes both the potential and limitations of each technique and discusses the challenges that need to be overcome to obtain validated and standardized methods of analysis for a complete and global feed evaluation and characterization.

  19. Sensitivity-analysis techniques: self-teaching curriculum

    SciTech Connect

    Iman, R.L.; Conover, W.J.

    1982-06-01

    This self teaching curriculum on sensitivity analysis techniques consists of three parts: (1) Use of the Latin Hypercube Sampling Program (Iman, Davenport and Ziegler, Latin Hypercube Sampling (Program User's Guide), SAND79-1473, January 1980); (2) Use of the Stepwise Regression Program (Iman, et al., Stepwise Regression with PRESS and Rank Regression (Program User's Guide) SAND79-1472, January 1980); and (3) Application of the procedures to sensitivity and uncertainty analyses of the groundwater transport model MWFT/DVM (Campbell, Iman and Reeves, Risk Methodology for Geologic Disposal of Radioactive Waste - Transport Model Sensitivity Analysis; SAND80-0644, NUREG/CR-1377, June 1980: Campbell, Longsine, and Reeves, The Distributed Velocity Method of Solving the Convective-Dispersion Equation, SAND80-0717, NUREG/CR-1376, July 1980). This curriculum is one in a series developed by Sandia National Laboratories for transfer of the capability to use the technology developed under the NRC funded High Level Waste Methodology Development Program.
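
    A hedged, modern stand-in for the workflow taught by the curriculum: draw a Latin Hypercube sample of the input space, run a model, and rank the inputs with a rank-based sensitivity measure. The toy model, input names and bounds are invented for illustration and use current scipy tooling rather than the SAND-report programs cited above.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

sampler = qmc.LatinHypercube(d=3, seed=42)
unit_sample = sampler.random(n=200)
lo, hi = [0.1, 1.0, 0.0], [1.0, 10.0, 5.0]        # hypothetical input ranges
X = qmc.scale(unit_sample, lo, hi)

def model(x):
    """Toy response: strongly driven by x0, weakly by x2, hardly by x1."""
    return np.exp(2.0 * x[:, 0]) + 0.1 * x[:, 2] + 0.01 * x[:, 1]

y = model(X)
for i, name in enumerate(["x0", "x1", "x2"]):
    rho, _ = spearmanr(X[:, i], y)
    print(f"Spearman rank correlation of {name} with the output: {rho:+.2f}")
```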

  20. Pattern Recognition Software and Techniques for Biological Image Analysis

    PubMed Central

    Shamir, Lior; Delaney, John D.; Orlov, Nikita; Eckley, D. Mark; Goldberg, Ilya G.

    2010-01-01

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays. PMID:21124870

  1. Pattern recognition software and techniques for biological image analysis.

    PubMed

    Shamir, Lior; Delaney, John D; Orlov, Nikita; Eckley, D Mark; Goldberg, Ilya G

    2010-11-24

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  2. Ares Launch Vehicle Transonic Buffet Testing and Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Piatak, David J.; Sekula, Martin K.; Rausch, Russ D.

    2010-01-01

    It is necessary to define the launch vehicle buffet loads to ensure that structural components and vehicle subsystems possess adequate strength, stress, and fatigue margins when the vehicle structural dynamic response to buffet forcing functions is considered. In order to obtain these forcing functions, the accepted method is to perform wind-tunnel testing of a rigid model instrumented with hundreds of unsteady pressure transducers designed to measure the buffet environment across the desired frequency range. The buffet wind-tunnel test program for the Ares Crew Launch Vehicle employed 3.5 percent scale rigid models of the Ares I and Ares I-X launch vehicles instrumented with 256 unsteady pressure transducers each. These models were tested at transonic conditions at the Transonic Dynamics Tunnel at NASA Langley Research Center. The ultimate deliverables of the Ares buffet test program are buffet forcing functions (BFFs) derived from integrating the measured fluctuating pressures on the rigid wind-tunnel models. These BFFs are then used as input to a multi-mode structural analysis to determine the vehicle response to buffet and the resulting buffet loads and accelerations. This paper discusses the development of the Ares I and I-X rigid buffet model test programs from the standpoint of model design, instrumentation system design, test implementation, data analysis techniques to yield final products, and presents normalized sectional buffet forcing function root-mean-squared levels.

  3. Spatiotemporal analysis of olive flowering using geostatistical techniques.

    PubMed

    Rojo, Jesús; Pérez-Badia, Rosa

    2015-02-01

    Analysis of flowering patterns in the olive (Olea europaea L.) is of considerable agricultural and ecological interest, and also provides valuable information for allergy sufferers, enabling identification of the major sources of airborne pollen at any given moment by interpreting the aerobiological data recorded in pollen traps. The present spatiotemporal analysis of olive flowering in central Spain combined geostatistical techniques with the application of a Geographic Information System, and compared results for flowering intensity with airborne pollen records. The results were used to obtain continuous phenological maps which describe the pattern of succession of olive flowering. The results also show that, although the highest airborne olive-pollen counts were recorded during the greatest flowering intensity of the groves closest to the pollen trap, the counts recorded at the start of the pollen season were not linked to local olive groves, which had not yet begun to flower. To detect the remote sources of olive pollen, several episodes of pollen recorded before the local flowering season were analysed using a HYSPLIT trajectory model, and the findings showed that western, southern and southwestern winds transported pollen grains into the study area from earlier-flowering groves located outside the territory.

  4. Skull base tumours part I: imaging technique, anatomy and anterior skull base tumours.

    PubMed

    Borges, Alexandra

    2008-06-01

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have contributed largely to improving the prognosis and lessening the morbidity and mortality of patients with skull base tumours, and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging has become indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review focuses on advances in imaging technique, their contribution to patient management, and the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division taking into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed.

  5. Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1981-04-01

    [Scanned report documentation page; only fragments are legible.] Report ETL-0258, "Knowledge-Based Image Analysis", by George C. Stockman, Barbara A. Lambird, David Lavine and Laveen N. Kanal. Keywords: extraction, verification, region classification, pattern recognition, image analysis.

  6. Tools and Techniques for Wt1-Based Lineage Tracing.

    PubMed

    Wilm, Bettina; Muñoz-Chapuli, Ramon

    2016-01-01

    The spatiotemporal expression pattern of Wt1 has been extensively studied in a number of animal models to establish its function and the developmental fate of the cells expressing this gene. In this chapter, we review the available animal models for Wt1-expressing cell lineage analysis, including direct Wt1 expression reporters and systems for permanent Wt1 lineage tracing. We describe the presently used constitutive or inducible genetic lineage tracing approaches based on the Cre/loxP system utilizing Cre recombinase expression under control of a Wt1 promoter.To make these systems accessible, we provide laboratory protocols that include dissection and processing of the tissues for immunofluorescence and histopathological analysis of the lineage-labeled Wt1-derived cells within the embryo/tissue context.

  7. The effects of processing techniques on magnesium-based composite

    NASA Astrophysics Data System (ADS)

    Rodzi, Siti Nur Hazwani Mohamad; Zuhailawati, Hussain

    2016-12-01

    The aim of this study is to investigate the effect of processing techniques on the densification, hardness and compressive strength of a Mg alloy and a Mg-based composite for biomaterial applications. The control sample (pure Mg) and the Mg-based composite (Mg-Zn/HAp) were fabricated through a mechanical alloying process using a high-energy planetary mill, whilst another Mg-Zn/HAp composite was fabricated through double-step processing (the matrix Mg-Zn alloy was fabricated in the planetary mill, and HAp was subsequently dispersed by roll mill). The as-milled powder was then consolidated by cold pressing into 10 mm diameter pellets under 400 MPa compaction pressure before being sintered at 300 °C for 1 hour under flowing argon. The densification of the sintered pellets was then determined by the Archimedes principle. Mechanical properties of the sintered pellets were characterized by microhardness and compression tests. The results show that the density of the pellets was significantly increased by the addition of HAp, but the highest density was observed when the sample was fabricated through double-step processing (1.8046 g/cm3). Slight increases in hardness and ultimate compressive strength were observed for the Mg-Zn/HAp composite fabricated through double-step processing (58.09 HV, 132.19 MPa), compared with the Mg-Zn/HAp produced through single-step processing (47.18 HV, 122.49 MPa).

  8. Design of OFDM radar pulses using genetic algorithm based techniques

    NASA Astrophysics Data System (ADS)

    Lellouch, Gabriel; Mishra, Amit Kumar; Inggs, Michael

    2016-08-01

    The merit of evolutionary algorithms (EA) for solving convex optimization problems is widely acknowledged. In this paper, a genetic algorithm (GA) optimization based waveform design framework is used to improve the features of radar pulses relying on the orthogonal frequency division multiplexing (OFDM) structure. Our optimization techniques focus on finding optimal phase code sequences for the OFDM signal. Several optimality criteria are used since we consider two different radar processing solutions which call either for single- or multiple-objective optimizations. When the single-objective minimization of the so-called peak-to-mean envelope power ratio (PMEPR) is tackled, we compare our findings with existing methods and emphasize the merit of our approach. In the scope of the two-objective optimization, we first address PMEPR and the peak-to-sidelobe level ratio (PSLR) and show that our approach based on the non-dominated sorting genetic algorithm-II (NSGA-II) provides design solutions with noticeable improvements as opposed to random sets of phase codes. We then look at another case of interest where the objective functions are two measures of the sidelobe level, namely PSLR and the integrated-sidelobe level ratio (ISLR), and propose to modify the NSGA-II to include a constraint on the PMEPR instead. In the last part, we illustrate via a case study how our encoding solution makes it possible to minimize the single-objective PMEPR while enabling a target detection enhancement strategy when the SNR metric is chosen for the detection framework.
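
    A minimal sketch of the single-objective problem mentioned above, assuming unit-amplitude subcarriers: the peak-to-mean envelope power ratio (PMEPR) of a phase-coded OFDM pulse is evaluated and reduced with a very small genetic algorithm (truncation selection, one-point crossover, random mutation). This is an illustration of the optimisation problem only, not the paper's NSGA-II setup, and all sizes and rates are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(7)
N, oversample = 32, 8                             # subcarriers, time oversampling
t = np.arange(N * oversample) / (N * oversample)  # one symbol period

def pmepr(phases):
    """PMEPR of s(t) = sum_k exp(j*(2*pi*k*t + phi_k)) over one symbol period."""
    s = np.exp(1j * (2 * np.pi * np.outer(np.arange(N), t) + phases[:, None])).sum(axis=0)
    power = np.abs(s) ** 2
    return power.max() / power.mean()

pop = rng.uniform(0, 2 * np.pi, size=(40, N))     # initial population of phase codes
for generation in range(200):
    fitness = np.array([pmepr(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:20]]       # truncation selection
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(20)], parents[rng.integers(20)]
        cut = rng.integers(1, N)
        child = np.concatenate([a[:cut], b[cut:]])          # one-point crossover
        mutate = rng.random(N) < 0.05                       # 5% mutation rate
        child[mutate] = rng.uniform(0, 2 * np.pi, mutate.sum())
        children.append(child)
    pop = np.vstack([parents, np.array(children)])

best = min(pop, key=pmepr)
print(f"best PMEPR after the GA: {pmepr(best):.2f}")
```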

  9. Parameter tuning of PVD process based on artificial intelligence technique

    NASA Astrophysics Data System (ADS)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Due to its previous adaptation to similar optimization problems, a genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The optimal parameter combination obtained from the GA is expected to produce the desired zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations. The results from the computational experiment were then compared with the actual results from the laboratory experiment. Based on the comparison, the GA proved reliable for optimizing the parameter combination before tuning the RF magnetron sputtering machine itself. To verify the GA results, the algorithm was also compared with other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that the GA was reliable in solving this RF magnetron sputtering process parameter tuning problem, and it showed better accuracy in the optimization based on the fitness evaluation.

  10. Development of carbon fiber-based piezoresistive linear sensing technique

    NASA Astrophysics Data System (ADS)

    Yang, Caiqian; Wu, Zhishen; Huang, Huang

    2009-03-01

    In this paper, the development of a carbon fiber-based piezoresistive linear sensing technique and its application in civil engineering structures are studied and summarized. The sensing mechanism is based on the electrical conductivity and piezoresistivity of different types of carbon fibers. First, the influence of signal current level and temperature on the sensing properties is studied in order to select a suitable sensing current. Then, the linear temperature and strain sensing feasibility of different types of carbon fibers is addressed and discussed. Finally, the application of this kind of sensor to monitoring the health of reinforced concrete (RC) and prestressed concrete (PC) structures is studied. Good linearity of the fractional change in electrical resistance (ΔR/R0) with both strain and temperature is demonstrated. The ΔR/R0-strain and ΔR/R0-temperature curves of CFRP/HCFRP sensors can be fitted well by a straight line, with correlation coefficients larger than 0.978. All of this reveals that carbon fiber reinforced polymer (CFRP) can be used as both a piezoresistive linear strain sensor and a temperature sensor.
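
    A hedged sketch of the linear calibration reported above: fit the fractional resistance change ΔR/R0 against strain with a straight line and report the correlation coefficient. The strain range, gauge factor and noise level are invented for illustration, not measurements from the study.

```python
import numpy as np

rng = np.random.default_rng(5)
strain = np.linspace(0.0, 4000e-6, 20)                          # strain (m/m)
dR_over_R0 = 1.8 * strain + rng.normal(0, 5e-5, strain.size)    # assumed gauge factor ~1.8

slope, intercept = np.polyfit(strain, dR_over_R0, 1)
r = np.corrcoef(strain, dR_over_R0)[0, 1]
print(f"gauge factor (slope) = {slope:.2f}, correlation coefficient r = {r:.3f}")
```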

  11. [Evidence-based surgical techniques for caesarean section].

    PubMed

    Aabakke, Anna J M; Secher, Niels Jørgen; Krebs, Lone

    2014-02-10

    Caesarean section (CS) is a common surgical procedure, and in Denmark 21% of deliveries are by CS. There is an increasing amount of scientific evidence to support the different surgical techniques used at CS. This article reviews the literature regarding CS techniques. There is still a lack of evidence, especially about the long-term consequences of the surgical techniques.

  12. Integration of geological remote-sensing techniques in subsurface analysis

    USGS Publications Warehouse

    Taranik, James V.; Trautwein, Charles M.

    1976-01-01

    Geological remote sensing is defined as the study of the Earth utilizing electromagnetic radiation which is either reflected or emitted from its surface in wavelengths ranging from 0.3 micrometre to 3 metres. The natural surface of the Earth is composed of a diversified combination of surface cover types, and geologists must understand the characteristics of surface cover types to successfully evaluate remotely-sensed data. In some areas landscape surface cover changes throughout the year, and analysis of imagery acquired at different times of year can yield additional geological information. Integration of different scales of analysis allows landscape features to be effectively interpreted. Interpretation of the static elements displayed on imagery is referred to as an image interpretation. Image interpretation is dependent upon: (1) the geologist's understanding of the fundamental aspects of image formation, and (2) his ability to detect, delineate, and classify image radiometric data; recognize radiometric patterns; and identify landscape surface characteristics as expressed on imagery. A geologic interpretation integrates surface characteristics of the landscape with subsurface geologic relationships. Development of a geologic interpretation from imagery is dependent upon: (1) the geologist's ability to interpret geomorphic processes from their static surface expression as landscape characteristics on imagery, (2) his ability to conceptualize the dynamic processes responsible for the evolution of interpreted geologic relationships (his ability to develop geologic models). The integration of geologic remote-sensing techniques in subsurface analysis is illustrated by development of an exploration model for ground water in the Tucson area of Arizona, and by the development of an exploration model for mineralization in southwest Idaho.

  13. Plasma-based ambient mass spectrometry techniques: The current status and future prospective.

    PubMed

    Ding, Xuelu; Duan, Yixiang

    2015-01-01

    Plasma-based ambient mass spectrometry is emerging as a frontier technology for direct sample analysis that employs a low-energy plasma as the ionization reagent. The versatile sources of ambient mass spectrometry (MS) can be classified according to the plasma formation approach: corona discharge, glow discharge, dielectric barrier discharge, and microwave-induced discharge. These techniques allow pretreatment-free detection of samples, ranging from biological materials (e.g., flies, bacteria, plants, tissues, peptides, metabolites, and lipids) to pharmaceuticals, foodstuffs, polymers, chemical warfare reagents, and daily-use chemicals. In most cases, plasma-based ambient MS performs well as a qualitative tool and as an analyzer for semi-quantitation. Herein, we provide an overview of the key concepts, mechanisms, and applications of plasma-based ambient MS techniques, and discuss the challenges and outlook.

  14. A dynamic mechanical analysis technique for porous media

    PubMed Central

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
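
    For contrast with the finite-element poroelastic inversion described above, the sketch below performs the conventional viscoelastic DMA reduction on simulated signals: stress and strain are projected onto sin/cos at the drive frequency to obtain amplitudes and the phase lag, from which the storage and loss moduli follow. The drive frequency, amplitudes and phase lag are invented values.

```python
import numpy as np

f_drive, fs, T = 5.0, 1000.0, 4.0              # Hz, Hz, s (20 full cycles)
t = np.arange(0.0, T, 1.0 / fs)
w = 2 * np.pi * f_drive

strain = 1e-3 * np.sin(w * t)                  # simulated strain
stress = 2.0e4 * np.sin(w * t + 0.15)          # Pa, simulated stress with 0.15 rad lag

def amp_phase(x):
    """Lock-in style amplitude and phase estimate at the drive frequency."""
    i = 2.0 / len(x) * np.sum(x * np.sin(w * t))
    q = 2.0 / len(x) * np.sum(x * np.cos(w * t))
    return np.hypot(i, q), np.arctan2(q, i)

a_strain, p_strain = amp_phase(strain)
a_stress, p_stress = amp_phase(stress)
delta = p_stress - p_strain
modulus = a_stress / a_strain
print(f"|E*| = {modulus:.3e} Pa, E' = {modulus * np.cos(delta):.3e} Pa, "
      f"E'' = {modulus * np.sin(delta):.3e} Pa, tan(delta) = {np.tan(delta):.3f}")
```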

  15. A Simulated Comparison of Level-1b GRACE Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Andrews, Stuart; Moore, Philip; King, Matt

    2014-05-01

    GRACE estimates of temporal mass anomalies have been obtained using a number of different approaches, including conventional spherical harmonic analysis with a standard Gaussian smoothing filter and the mascon approach, which applies a constraint matrix between mascon parameters that share geophysical similarities. Temporal gravity fields are frequently produced by different groups and obtained using different codes and algorithms, making it hard to directly compare any subsequent mass flux analysis. It is therefore important that an assessment of the different methodologies is undertaken to provide users with an understanding of the errors and to assess the ability of each technique to resolve basin-level mass changes at a variety of spatial scales. In this study we undertake a comparison of solutions generated through the estimation of mascon and spherical harmonic coefficients. Simulations provide an accurate assessment and quantify the capability of each technique to resolve basin-level mass changes at a variety of spatial scales while showing how the methodologies handle the noise inherent at higher degree and order. We will present results of our simulations and show how masses leak into their surrounding region through the GRACE KBRR residuals. Through a simulated recovery of a GLDAS anomaly with added noise in the form of 'stripes' we will show the advantage of the mascon solution over a spherical harmonic recovery. The study is subsequently extended to simulate the recovery of an Antarctic mass signal, validating the use of the mascon methodology in polar regions. We will show how the addition of a constraint between mascon parameters that share geophysical similarities results in a reduction of the signal lost at all degrees and an improvement in the recovered signal.

  16. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur on large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge in ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Therefore, finding appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  17. Photonic analog-to-digital conversion based on oversampling techniques

    NASA Astrophysics Data System (ADS)

    Shoop, Barry L.; Das, Pankaj K.; Ressler, Eugene K., Jr.; Talty, Timothy J.

    2000-07-01

    A novel photonic approach to analog-to-digital (A/D) conversion based on temporal and spatial oversampling techniques in conjunction with a smart pixel hardware implementation of a neural algorithm is described. In this approach, the input signal is first sampled at a rate higher than that required by the Nyquist criterion and then presented spatially as the input to the 2D error diffusion neural network consisting of M X N pixels. The neural network processes the input oversampled analog image and produces an M X N pixel binary output image which is an optimum representation of the input analog signal. Upon convergence, the neural network minimizes an energy function representing the frequency-weighted squared error between the input analog image and the output halftoned image. Decimation and low-pass filtering techniques, common to oversampling A/D converters, digitally sum and average the M X N pixel output binary image using high-speed digital electronic circuitry. By employing a 2D smart pixel neural approach to oversampling A/D conversion, each pixel constitutes a simple oversampling modulator thereby producing a distributed A/D architecture. Spectral noise shaping across the array diffuses quantization error thereby improving the signal-to-noise ratio performance. Here, each quantizer within the network is embedded in a fully-connected, distributed mesh feedback loop which spectrally shapes the overall quantization noise significantly reducing the effects of component mismatch typically associated with parallel or channelized A/D approaches. The 2D neural array provides higher aggregate bit rates which can extend the useful bandwidth of oversampling converters.
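
    A hedged, one-dimensional scalar analogue of the oversampled error-diffusion idea (not the 2D smart-pixel neural network itself): a first-order error-feedback modulator quantises an oversampled signal to one bit, and block averaging (crude decimation) recovers a multi-bit estimate. The signal frequency, oversampling ratio and block filter are illustrative choices.

```python
import numpy as np

fs_signal, osr = 1_000, 64                  # Nyquist-rate sampling and oversampling ratio
fs = fs_signal * osr
t = np.arange(0.0, 0.05, 1.0 / fs)
x = 0.6 * np.sin(2 * np.pi * 50 * t)        # slowly varying analogue input in [-1, 1]

bits = np.empty_like(x)
err = 0.0
for n, sample in enumerate(x):
    u = sample + err                        # diffuse the previous quantisation error
    bits[n] = 1.0 if u >= 0 else -1.0       # one-bit quantiser
    err = u - bits[n]                       # error fed back to the next sample

# Decimation: average blocks of `osr` one-bit samples back to the Nyquist rate.
m = len(bits) // osr * osr
decimated = bits[:m].reshape(-1, osr).mean(axis=1)
reference = x[:m].reshape(-1, osr).mean(axis=1)
rms = np.sqrt(np.mean((decimated - reference) ** 2))
print(f"RMS reconstruction error after block averaging: {rms:.4f}")
```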

  18. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    SciTech Connect

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-07-18

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or large numbers of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples as well as mass loading, flow rate and resin performance is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of a possible synergistic effect in the presence of iron.

  19. Subcellular chemical and morphological analysis by stimulated Raman scattering microscopy and image analysis techniques

    PubMed Central

    D’Arco, Annalisa; Brancati, Nadia; Ferrara, Maria Antonietta; Indolfi, Maurizio; Frucci, Maria; Sirleto, Luigi

    2016-01-01

    The visualization of heterogeneous morphology, segmentation and quantification of image features are crucial points for nonlinear optics microscopy applications, spanning from imaging of living cells or tissues to biomedical diagnostics. In this paper, a methodology combining stimulated Raman scattering microscopy and image analysis techniques is presented. The basic idea is to join the potential of the vibrational contrast of stimulated Raman scattering with the strength of image analysis techniques in order to delineate subcellular morphology with chemical specificity. Validation tests on label-free imaging of polystyrene beads and of adipocyte cells are reported and discussed. PMID:27231626

  20. Applications of Advanced, Waveform Based AE Techniques for Testing Composite Materials

    NASA Technical Reports Server (NTRS)

    Prosser, William H.

    1996-01-01

    Advanced, waveform based acoustic emission (AE) techniques have been previously used to evaluate damage progression in laboratory tests of composite coupons. In these tests, broad band, high fidelity acoustic sensors were used to detect signals which were then digitized and stored for analysis. Analysis techniques were based on plate mode wave propagation characteristics. This approach, more recently referred to as Modal AE, provides an enhanced capability to discriminate and eliminate noise signals from those generated by damage mechanisms. This technique also allows much more precise source location than conventional, threshold crossing arrival time determination techniques. To apply Modal AE concepts to the interpretation of AE on larger composite structures, the effects of wave propagation over larger distances and through structural complexities must be well characterized and understood. In this research, measurements were made of the attenuation of the extensional and flexural plate mode components of broad band simulated AE signals in large composite panels. As these materials have applications in a cryogenic environment, the effects of cryogenic insulation on the attenuation of plate mode AE signals were also documented.

  1. Handheld underwater 3D sensor based on fringe projection technique

    NASA Astrophysics Data System (ADS)

    Bräuer-Burchardt, Christian; Heinze, Matthias; Schmidt, Ingo; Meng, Lichun; Ramm, Roland; Kühmstedt, Peter; Notni, Gunther

    2015-05-01

    A new, handheld 3D surface scanner was developed especially for underwater use down to a diving depth of about 40 meters. Additionally, the sensor is suitable for outdoor use in bad weather conditions such as splashing water, wind, and poor illumination. The optical components of the sensor are two cameras and one projector. The measurement field is about 250 mm x 200 mm. The depth resolution is about 50 μm and the lateral resolution is approximately 150 μm. The weight of the scanner is about 10 kg. The housing was produced from synthetic powder using a 3D printing technique. The measurement time for one scan is between one-third and one-half of a second. The computer for measurement control and data analysis is integrated into the housing of the scanner. A display on the backside presents the results of each measurement graphically for real-time evaluation by the user during recording of the measurement data.

  2. Exploring techniques for vision based human activity recognition: methods, systems, and evaluation.

    PubMed

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-25

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activity, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation towards the performance of human activity recognition.

  3. [Statistical study of the wavelet-based lossy medical image compression technique].

    PubMed

    Puniene, Jūrate; Navickas, Ramūnas; Punys, Vytenis; Jurkevicius, Renaldas

    2002-01-01

    Medical digital images contain informational redundancy. Both the amount of memory needed for image storage and the transmission time could be reduced if image compression techniques are applied. These techniques fall into two groups: lossless (compression ratio not exceeding about 3:1) and lossy. The compression ratio of lossy techniques depends on the visibility of distortions; it is a variable parameter and can exceed 20:1. A compression study was performed to evaluate compression schemes based on the wavelet transform. The goal was to develop a set of recommendations for an acceptable compression ratio for different medical image modalities: ultrasound cardiac images and X-ray angiographic images. The acceptable image quality after compression was evaluated by physicians. Statistical analysis of the evaluation results was used to form a set of recommendations.
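
    A generic wavelet-based lossy compression scheme of the kind evaluated in such studies can be sketched as decomposition, coefficient thresholding, and reconstruction. The example below uses PyWavelets with an arbitrary wavelet and retention fraction; it illustrates the approach, not the pipeline used in the study.

```python
# Hedged sketch: lossy compression by wavelet decomposition, coefficient
# thresholding, and reconstruction. Requires NumPy and PyWavelets.
import numpy as np
import pywt

def compress(image, wavelet="bior4.4", level=3, keep_fraction=0.05):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    # Keep only the largest `keep_fraction` of coefficients by magnitude.
    thresh = np.quantile(np.abs(arr), 1.0 - keep_fraction)
    arr_thresh = np.where(np.abs(arr) >= thresh, arr, 0.0)
    kept = np.count_nonzero(arr_thresh)
    recon = pywt.waverec2(
        pywt.array_to_coeffs(arr_thresh, slices, output_format="wavedec2"),
        wavelet)
    return recon, arr.size / max(kept, 1)   # reconstruction, crude compression ratio

image = np.random.rand(256, 256)            # stand-in for an ultrasound frame
recon, ratio = compress(image)
print(f"kept ~{100 / ratio:.1f}% of coefficients, ratio ~{ratio:.0f}:1")
```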

  4. Dimensional Changes of Acrylic Resin Denture Bases: Conventional Versus Injection-Molding Technique

    PubMed Central

    Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam

    2014-01-01

    Objective: Acrylic resin denture bases undergo dimensional changes during polymerization. Injection molding techniques are reported to reduce these changes and thereby improve physical properties of denture bases. The aim of this study was to compare dimensional changes of specimens processed by conventional and injection-molding techniques. Materials and Methods: SR-Ivocap Triplex Hot resin was used for conventional pressure-packed and SR-Ivocap High Impact was used for injection-molding techniques. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. Results: After each water storage period, the acrylic specimens produced by injection exhibited less dimensional changes compared to those produced by the conventional technique. Curing shrinkage was compensated by water sorption with an increase in water storage time decreasing dimensional changes. Conclusion: Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used and SR-Ivocap injection procedure exhibited higher dimensional accuracy compared to conventional molding. PMID:25584050

  5. Damage Detection and Analysis in CFRPs Using Acoustic Emission Technique

    NASA Astrophysics Data System (ADS)

    Whitlow, Travis Laron

    Real time monitoring of damage is an important aspect of life management of critical structures. Acoustic emission (AE) techniques allow for measurement and assessment of damage in real time. Acoustic emission parameters such as signal amplitude and duration were monitored during the loading sequences. Criteria that can indicate the onset of critical damage to the structure were developed. Tracking the damage as it happens gives a better picture of the failure evolution and allows a more accurate determination of structural life. The main challenge is distinguishing between legitimate damage signals and "false positives" which are unrelated to damage growth. Such false positives can be related to electrical noise, friction, or mechanical vibrations. This research focuses on monitoring signals of damage growth in carbon fiber reinforced polymers (CFRPs) and separating the relevant signals from the false ones. In this dissertation, acoustic emission signals from CFRP specimens were experimentally recorded and analyzed. The objectives of this work are: (1) perform static and fatigue loading of CFRP composite specimens and measure the associated AE signals, (2) accurately determine the AE parameters (energy, frequency, duration, etc.) of signals generated during failure of such specimens, and (3) use fiber optic sensors to monitor the strain distribution of the damage zone and relate changes in strain measurements to AE data.
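
    Extracting hit parameters such as duration, energy, and peak frequency from a digitized AE transient is a common first step in this kind of analysis. The sketch below uses an illustrative threshold and sampling rate, not values from the experiments described above.

```python
# Hedged sketch: basic AE hit parameters (duration, energy, peak frequency)
# from a digitized transient. Threshold and sampling rate are illustrative.
import numpy as np

def ae_parameters(signal, fs, threshold):
    above = np.flatnonzero(np.abs(signal) > threshold)
    if above.size == 0:
        return None
    start, end = above[0], above[-1]
    hit = signal[start:end + 1]
    duration = (end - start) / fs                 # seconds
    energy = np.sum(hit ** 2) / fs                # uncalibrated V^2*s
    spectrum = np.abs(np.fft.rfft(hit))
    freqs = np.fft.rfftfreq(hit.size, d=1.0 / fs)
    return {"duration_s": duration, "energy": energy,
            "peak_freq_hz": freqs[np.argmax(spectrum)]}

fs = 5e6                                          # 5 MHz sampling, illustrative
t = np.arange(0, 2e-4, 1 / fs)
sig = np.exp(-t * 2e4) * np.sin(2 * np.pi * 150e3 * t)   # synthetic decaying burst
print(ae_parameters(sig, fs, threshold=0.05))
```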

  6. An evaluation of wind turbine blade cross section analysis techniques.

    SciTech Connect

    Paquette, Joshua A.; Griffith, Daniel Todd; Laird, Daniel L.; Resor, Brian Ray

    2010-03-01

    The blades of a modern wind turbine are critical components central to capturing and transmitting most of the load experienced by the system. They are complex structural items composed of many layers of fiber and resin composite material and, typically, one or more shear webs. Large turbine blades being developed today are beyond the point where the trial-and-error design of the past is effective, and design for reliability is extremely important. Section analysis tools are used to reduce the three-dimensional continuum blade structure to a simpler beam representation for use in system response calculations to support full system design and certification. One model simplification approach is to analyze the two-dimensional blade cross sections to determine the properties for the beam. Another technique is to determine beam properties using static deflections of a full three-dimensional finite element model of a blade. This paper provides insight into discrepancies observed in outputs from each approach. Simple two-dimensional geometries and three-dimensional blade models are analyzed in this investigation. Finally, a subset of computational and experimental section properties for a full turbine blade are compared.
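
    For the static-deflection route mentioned above, a simple instance is backing out an equivalent bending stiffness from the tip deflection of a cantilevered model. The numbers in the sketch below are illustrative, not blade data.

```python
# Hedged sketch: equivalent bending stiffness from a cantilever tip deflection,
# one simple instance of deriving beam properties from 3D static deflections.
# Numbers are illustrative.
length_m = 9.0           # cantilever span
tip_load_n = 1000.0      # applied tip load
tip_deflection_m = 0.12  # deflection taken from the 3D finite element solution

# Euler-Bernoulli cantilever with a tip load: delta = P * L^3 / (3 * E * I)
ei_equivalent = tip_load_n * length_m ** 3 / (3.0 * tip_deflection_m)
print(f"Equivalent EI = {ei_equivalent:.3e} N*m^2")
```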

  7. Probability techniques for reliability analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Wetherhold, Robert C.; Ucci, Anthony M.

    1994-01-01

    Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first order, second moment FPI methods; second order, second moment FPI methods; the simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservatism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
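
    The simple Monte Carlo approach referred to above estimates reliability as the fraction of samples for which the limit-state function stays positive. The sketch below assumes normally distributed strength and stress as stand-ins for a composite lamina failure criterion; the distributions are illustrative.

```python
# Hedged sketch: simple Monte Carlo estimate of the failure probability
# P[g(X) < 0] for a limit-state function g. Distributions are assumed.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
strength = rng.normal(loc=500.0, scale=40.0, size=n)   # MPa, assumed distribution
stress = rng.normal(loc=380.0, scale=35.0, size=n)     # MPa, assumed distribution

g = strength - stress                 # failure surface: g < 0 means failure
p_fail = np.mean(g < 0.0)
reliability = 1.0 - p_fail
std_err = np.sqrt(p_fail * (1.0 - p_fail) / n)
print(f"reliability ~ {reliability:.4f} (MC std. err. {std_err:.1e})")
```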

  8. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

    The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling of solid/shell connections (joints), which are significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their application. Techniques currently used in practical applications were tested, in particular to determine which technique is best suited to the computer-aided design (CAD) environment. Some basic considerations regarding each technique are also discussed. Based on the results, suggestions are given for obtaining reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  9. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Sklute, Elizabeth; Dyare, Melinda D

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
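
    In practice, the PLS calibration and PCA classification steps described above map directly onto standard multivariate tools. The sketch below uses scikit-learn on random placeholder arrays standing in for LIBS spectra and reference compositions.

```python
# Hedged sketch: PLS calibration and PCA scores on a spectral matrix, in the
# spirit of multivariate LIBS analysis. Arrays are random placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
spectra = rng.random((18, 2048))        # 18 samples x 2048 spectral channels
sio2_wt = rng.uniform(40, 75, size=18)  # placeholder reference SiO2 content, wt%

pls = PLSRegression(n_components=5).fit(spectra, sio2_wt)
predicted = pls.predict(spectra).ravel()        # calibration-model predictions

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)             # sample scores for classification/plotting
print(predicted[:3], pca.explained_variance_ratio_)
```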

  10. Analysis of compressive fracture in rock using statistical techniques

    SciTech Connect

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.
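
    The percolation-theory bookkeeping mentioned above can be illustrated by labeling clusters of cracked sites on a square lattice. The sketch below uses a simple random strength threshold as the cracking rule, not the stress-redistribution model of the paper.

```python
# Hedged sketch: labeling clusters of cracked sites on a square lattice.
# The cracking rule here is a random threshold, purely for illustration.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
site_strength = rng.random((200, 200))
cracked = site_strength < 0.45                      # sites that have cracked

labels, n_clusters = ndimage.label(cracked)         # 4-connected clusters
sizes = np.bincount(labels.ravel())[1:]             # cluster sizes, background dropped
spanning = bool(set(labels[0]) & set(labels[-1]) - {0})  # cluster spans top to bottom?
print(n_clusters, sizes.max(), spanning)
```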

  11. Biofunctionalization of Si nanowires using a solution based technique

    NASA Astrophysics Data System (ADS)

    Williams, Elissa H.; Davydov, Albert V.; Oleshko, Vladimir P.; Lin, Nancy J.; Steffens, Kristen L.; Manocchi, Amy K.; Krylyuk, Sergiy; Rao, Mulpuri V.; Schreifels, John A.

    2012-10-01

    Here we present a solution based functionalization technique for streptavidin (SA) protein conjugation to silicon nanowires (Si NWs). Si NWs, with a diameter of 110 nm to 130 nm and a length of 5 μm to 10 μm, were functionalized with 3-aminopropyltriethoxysilane (APTES) followed by biotin for the selective attachment of SA. High-resolution transmission electron microscopy (HRTEM) and atomic force microscopy (AFM) showed that the Si NWs were conformally coated with 20 nm to 30 nm thick APTES, biotin, and SA layers upon functionalization. Successful attachment of each bio/organic layer was confirmed by X-ray photoelectron spectroscopy (XPS) and fluorescence microscopy. Fluorescence microscopy also demonstrated that there was an undesirable non-specific binding of the SA protein as well as a control protein, bovine serum albumin (BSA), to the APTES-coated Si NWs. However, inhibition of BSA binding and enhancement of SA binding were achieved following the biotinylation step. The biofunctionalized Si NWs show potential as label-free biosensing platforms for the specific and selective detection of biomolecules.

  12. Research on technique of wavefront retrieval based on Foucault test

    NASA Astrophysics Data System (ADS)

    Yuan, Lvjun; Wu, Zhonghua

    2010-05-01

    During fine grinding of the best-fit sphere and the initial stage of polishing, the surface error of large-aperture aspheric mirrors is too large to test using a common interferometer. The Foucault test is widely used in fabricating large-aperture mirrors. However, the optical path is seriously disturbed by air turbulence, and changes in the light and dark zones cannot be identified, which often impairs the operator's judgment and leads to mistakes in diagnosing the surface error of the whole mirror. To solve this problem, this research presents wavefront retrieval based on the Foucault test through digital image processing and quantitative calculation. First, a representative Foucault image is obtained by collecting a series of images with a CCD and averaging them to suppress air turbulence. Second, gray values are converted into surface error values through principle derivation, mathematical modeling, and software programming. Third, the linear deviation introduced by defocus is removed by the least-squares method to obtain the real surface error. Finally, from the real surface error, the wavefront map, the gray contour map, and the corresponding pseudo-color contour map are plotted. The experimental results indicate that the three-dimensional wavefront map and the two-dimensional contour map accurately and intuitively show the surface error over the whole mirror under test, and they help in grasping the surface error as a whole. The technique can be used to guide the fabrication of large-aperture, long-focal-length mirrors during grinding and the initial stage of polishing the aspheric surface, greatly improving fabrication efficiency and precision.
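
    The defocus-removal step described above can be illustrated as a least-squares fit and subtraction of a linear term over the aperture. The sketch below operates on a synthetic surface-error map.

```python
# Hedged sketch: removing the linear (defocus/tilt) term from a surface-error
# map by least squares. The map used here is synthetic.
import numpy as np

ny, nx = 128, 128
y, x = np.mgrid[0:ny, 0:nx]
surface = 0.002 * x + 0.001 * y + 0.05 * np.sin(x / 9.0)   # synthetic error map

# Fit z = a*x + b*y + c over the aperture and subtract it.
a_mat = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
coeffs, *_ = np.linalg.lstsq(a_mat, surface.ravel(), rcond=None)
residual = surface - (a_mat @ coeffs).reshape(surface.shape)  # "real" surface error
print(coeffs, residual.std())
```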

  13. Age estimation based on Kvaal's technique using digital panoramic radiographs

    PubMed Central

    Mittal, Samta; Nagendrareddy, Suma Gundareddy; Sharma, Manisha Lakhanpal; Agnihotri, Poornapragna; Chaudhary, Sunil; Dhillon, Manu

    2016-01-01

    Introduction: Age estimation is important for administrative and ethical reasons and also because of legal consequences. Dental pulp undergoes regression in size with increasing age due to secondary dentin deposition and can be used as a parameter of age estimation even beyond 25 years of age. Kvaal et al. developed a method for chronological age estimation based on the pulp size using periapical dental radiographs. There is a need for testing this method of age estimation in the Indian population using simple tools like digital imaging on living individuals, without requiring extraction of teeth. Aims and Objectives: Estimation of the chronological age of subjects by Kvaal's method using digital panoramic radiographs, and testing the validity of the regression equations given by Kvaal et al. Materials and Methods: The study sample included a total of 152 subjects in the age group of 14-60 years. Measurements were performed on standardized digital panoramic radiographs based on Kvaal's method. Different regression formulae were derived and the age was assessed. The estimated age was then compared with the actual age of the patient using Student's t-test. Results: No significant difference between the mean of the chronological age and the estimated age was observed. However, the mean age estimated using the regression equations given previously by Kvaal et al. significantly underestimated the chronological age in the present study sample. Conclusion: The results support the feasibility of this technique when regression equations are calculated directly from digital panoramic radiographs. However, they argue against applying the same regression equations given by Kvaal et al. to this study population. PMID:27555738
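
    Deriving new regression formulae for a study sample, as done above, reduces to fitting a linear model of age on the measured ratios. The sketch below uses random placeholder ratios; the predictors only loosely follow the Kvaal-style measurements and are not data from the study.

```python
# Hedged sketch: fitting a linear age-estimation model to pulp/tooth ratio
# measurements. The ratios and ages are random placeholders, not study data.
import numpy as np

rng = np.random.default_rng(3)
n = 152
pulp_ratios = rng.uniform(0.2, 0.6, size=(n, 2))          # e.g. width and length ratios
true_age = 60 - 70 * pulp_ratios.mean(axis=1) + rng.normal(0, 4, n)

x_design = np.column_stack([np.ones(n), pulp_ratios])      # intercept + predictors
coeffs, *_ = np.linalg.lstsq(x_design, true_age, rcond=None)
estimated_age = x_design @ coeffs
print(coeffs, np.mean(np.abs(estimated_age - true_age)))   # mean absolute error
```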

  14. Novel technique: a pupillometer-based objective chromatic perimetry

    NASA Astrophysics Data System (ADS)

    Rotenstreich, Ygal; Skaat, Alon; Sher, Ifat; Kolker, Andru; Rosenfeld, Elkana; Melamed, Shlomo; Belkin, Michael

    2014-02-01

    Evaluation of the visual field (VF) is important for clinical diagnosis and patient monitoring. Current VF methods are subjective and require patient cooperation. Here we developed a novel objective perimetry technique based on the pupil response (PR) to multifocal chromatic stimuli in normal subjects and in patients with glaucoma and retinitis pigmentosa (RP). A computerized infrared video pupillometer was used to record PR to short- and long-wavelength stimuli (peak 485 nm and 620 nm, respectively) at light intensities of 15-100 cd-s/m2 at thirteen different points of the VF. The RP study included 30 eyes of 16 patients and 20 eyes of 12 healthy participants. The glaucoma study included 22 eyes of 11 patients and 38 eyes of 19 healthy participants. Significantly reduced PR was observed in RP patients in response to short-wavelength stimuli at 40 cd-s/m2 in nearly all perimetric locations (P<0.05). By contrast, RP patients demonstrated nearly normal PR to long-wavelength stimuli in the majority of perimetric locations. The glaucoma group showed significantly reduced PR to long- and short-wavelength stimuli at high intensity in all perimetric locations (P<0.05). The PR of glaucoma patients was significantly lower than normal in response to short-wavelength stimuli at low intensity, mostly in the central and 20° locations (P<0.05). This study demonstrates the feasibility of using pupillometer-based chromatic perimetry for objective assessment of VF defects, retinal function, and optic nerve damage in patients with retinal dystrophies and glaucoma. Furthermore, this method may be used to distinguish between the damaged cells underlying the VF defect.

  15. Application of activation techniques to biological analysis. [813 references]

    SciTech Connect

    Bowen, H.J.M.

    1981-12-01

    Applications of activation analysis in the biological sciences are reviewed for the period of 1970 to 1979. The stages and characteristics of activation analysis are described, and its advantages and disadvantages enumerated. Most applications involve activation by thermal neutrons followed by either radiochemical or instrumental determination. Relatively little use has been made of activation by fast neutrons, photons, or charged particles. In vivo analyses are included, but those based on prompt gamma or x-ray emission are not. Major applications include studies of reference materials, and the elemental analysis of plants, marine biota, animal and human tissues, diets, and excreta. Relatively little use of it has been made in biochemistry, microbiology, and entomology, but it has become important in toxicology and environmental science. The elements most often determined are Ag, As, Au, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, Fe, Hg, I, K, Mn, Mo, Na, Rb, Sb, Sc, Se, and Zn, while few or no determinations of B, Be, Bi, Ga, Gd, Ge, H, In, Ir, Li, Nd, Os, Pd, Pr, Pt, Re, Rh, Ru, Te, Tl, or Y have been made in biological materials.

  16. Weighted graph based ordering techniques for preconditioned conjugate gradient methods

    NASA Technical Reports Server (NTRS)

    Clift, Simon S.; Tang, Wei-Pai

    1994-01-01

    We describe the basis of a matrix ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDE's. Several new matrix ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDE's, and compared with other matrix ordering techniques. A variation of RCM is shown to generally improve the quality of incomplete factorization preconditioners.
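
    The general workflow of reordering a sparse matrix before building an incomplete-factorization preconditioner for conjugate gradients can be sketched with standard SciPy tools, as below. This illustrates the RCM-plus-ILU pipeline on an illustrative anisotropic test matrix; it does not implement the weighted-graph heuristics proposed in the paper.

```python
# Hedged sketch: reverse Cuthill-McKee reordering followed by an incomplete-LU
# preconditioned CG solve on an illustrative anisotropic test matrix.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee
from scipy.sparse.linalg import LinearOperator, cg, spilu

# Simple symmetric, diagonally dominant test matrix on an n x n grid with a
# strong coupling (100x) in one direction, standing in for an anisotropic PDE.
n = 40
main = 2.0 * (1.0 + 100.0) * np.ones(n * n)
a = sp.diags([main, -np.ones(n * n - 1), -np.ones(n * n - 1),
              -100.0 * np.ones(n * n - n), -100.0 * np.ones(n * n - n)],
             [0, 1, -1, n, -n], format="csc")

perm = reverse_cuthill_mckee(a.tocsr(), symmetric_mode=True)
a_perm = a.tocsr()[perm, :][:, perm].tocsc()

ilu = spilu(a_perm, drop_tol=1e-4)                  # incomplete factorization
prec = LinearOperator(a_perm.shape, matvec=ilu.solve)

b = np.ones(n * n)
x_perm, info = cg(a_perm, b[perm], M=prec)
x = np.empty_like(x_perm)
x[perm] = x_perm                                    # undo the permutation
print(info)                                         # 0 indicates convergence
```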

  17. Region segmentation techniques for object-based image compression: a review

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.; Ritter, Gerhard X.

    2004-10-01

    Image compression based on transform coding appears to be approaching an asymptotic bit rate limit for application-specific distortion levels. However, a newer compression technology, called object-based compression (OBC), promises improved rate-distortion performance at higher compression ratios. OBC involves segmentation of image regions, followed by efficient encoding of each region's content and boundary. Advantages of OBC include efficient representation of commonly occurring textures and shapes in terms of pointers into a compact codebook of region contents and boundary primitives. This facilitates fast decompression via substitution, at the cost of codebook search in the compression step. Segmentation cost and error are significant disadvantages in current OBC implementations. Several innovative techniques have been developed for region segmentation, including (a) moment-based analysis, (b) texture representation in terms of a syntactic grammar, and (c) transform coding approaches such as the wavelet-based compression used in MPEG-7 or JPEG-2000. Region-based characterization with variance templates is better understood, but lacks the locality of wavelet representations. In practice, tradeoffs are made between representational fidelity, computational cost, and storage requirements. This paper overviews current techniques for automatic region segmentation and representation, especially those that employ wavelet classification and region growing techniques. The implementation discussion focuses on complexity measures and performance metrics such as segmentation error and computational cost.
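
    Of the segmentation strategies surveyed above, seeded region growing is among the simplest to illustrate. The sketch below grows a region by breadth-first search while neighboring pixels stay within a tolerance of the running region mean; the image and parameters are synthetic.

```python
# Hedged sketch: seeded region growing on intensity similarity, one of the
# simpler segmentation strategies. Image and tolerance are synthetic choices.
import numpy as np
from collections import deque

def region_grow(image, seed, tol=0.1):
    grown = np.zeros(image.shape, dtype=bool)
    grown[seed] = True
    total, count = float(image[seed]), 1
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < image.shape[0] and 0 <= cc < image.shape[1]
                    and not grown[rr, cc]
                    and abs(image[rr, cc] - total / count) <= tol):
                grown[rr, cc] = True
                total += float(image[rr, cc])
                count += 1
                queue.append((rr, cc))
    return grown

img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0                          # synthetic bright square
mask = region_grow(img, seed=(30, 30), tol=0.2)
print(mask.sum())                                # ~400 pixels inside the square
```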

  18. [The applications for Fourier transform infrared spectrum analysis technique in preventive medicine field].

    PubMed

    Yang, Jiao-lan; Luo, Tian

    2002-08-01

    This paper reviews the applications of Fourier transform infrared (FTIR) spectrum analysis techniques in the preventive medicine field from four aspects: environmental pollution, life science, the latest infrared analysis methods, and near-infrared analysis techniques. In the environmental pollution field, it mainly describes the advantages, limitations, and solutions associated with the combined application of gas chromatography and Fourier transform infrared spectroscopy. In the life science field, it describes the application of FTIR spectrum analysis to protein secondary structure, membrane proteins, phospholipids, nucleic acids, cells, and tissues. In addition, it introduces a few of the latest infrared analysis methods and the applications of near-infrared spectrum analysis in food, cosmetics, and drugs.

  19. Image-Based Techniques for Digitizing Environments and Artifacts

    DTIC Science & Technology

    2003-01-01

    model a bumpy wall as a flat surface, and the computer will compute the relief. This technique was employed in modeling the West façade of the gothic ... particularly well on objects with close to Lambertian reflectance properties such as aged marble sculptures. However, these existing techniques for

  20. On the rotation of teleseismic seismograms based on the receiver function technique

    NASA Astrophysics Data System (ADS)

    Wilde-Piórko, M.; Grycuk, M.; Polkowski, M.; Grad, M.

    2017-01-01

    The receiver function (RF) technique is a well-established method to investigate crustal and upper mantle structures based on three-component seismograms of teleseismic events. In the present study, we propose a modified automatic procedure to determine the back azimuth and polarization angles of a teleseismic event based on the RF technique. The method is tested on recordings from 3 permanent and 3 temporary broadband seismic stations located in the vicinity of Poland. Additionally, an analysis of Rayleigh wave polarization is conducted to show that the new procedure is not sensitive to incorrect seismometer orientation. Synthetic modelling of RF by a modified ray-tracing method for 2.5D models beneath each seismic station down to a depth of 60 km is performed to show the effectiveness of the proposed method in calculating RF for a complex structure with dipping layers.
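
    The basic operation underlying the back-azimuth search described above is the rotation of the horizontal components from north/east to radial/transverse. The sketch below follows the common seismological sign convention; the traces are synthetic.

```python
# Hedged sketch: rotating horizontal seismogram components from N/E to
# radial/transverse for a given back azimuth. Traces are synthetic.
import numpy as np

def rotate_ne_to_rt(north, east, back_azimuth_deg):
    """Standard convention: radial positive away from the source direction."""
    phi = np.radians(back_azimuth_deg)
    radial = -north * np.cos(phi) - east * np.sin(phi)
    transverse = north * np.sin(phi) - east * np.cos(phi)
    return radial, transverse

t = np.linspace(0, 60, 6000)                     # 100 Hz sampling, illustrative
north = np.sin(2 * np.pi * 0.2 * t)
east = 0.5 * np.cos(2 * np.pi * 0.2 * t)
radial, transverse = rotate_ne_to_rt(north, east, back_azimuth_deg=135.0)
```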