Science.gov

Sample records for analysis technique based

  1. Rapid Disaster Analysis based on SAR Techniques

    NASA Astrophysics Data System (ADS)

    Yang, C. H.; Soergel, U.

    2015-03-01

    Due to its all-day, all-weather capability, spaceborne SAR is a valuable means for rapid mapping during and after a disaster. In this paper, three change detection techniques based on SAR data are discussed: (1) initial coarse change detection, (2) flooded-area detection, and (3) linear-feature change detection. The 2011 Tohoku earthquake and tsunami, in which the two events combine into a complex scenario, is used as the case study. In (1), pre- and post-event TerraSAR-X images are accurately coregistered to produce a false-color image. Such an image provides a quick, rough overview of potential changes, which is useful for initial decision making and identifies areas worth analysing in more depth. In (2), the post-event TerraSAR-X image is used to extract the flooded area by morphological approaches. In (3), we are interested in detecting changes of linear shape as indicators of modified man-made objects. Simple approaches, e.g. thresholding, extract pixel-based changes in the difference image, but in this manner many irrelevant changes (e.g., farming activity, speckle) are highlighted too. In this study, Curvelet filtering is applied to the difference image not only to suppress false alarms but also to enhance change signals of linear-feature form (e.g. buildings) in settlements. Afterwards, thresholding is conducted to extract linear-shaped changed areas. These three techniques are designed to be simple and applicable in timely disaster analysis. All are validated against the change map produced by the Center for Satellite Based Crisis Information at DLR.
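
    The coarse-detection and thresholding steps reduce to a few lines of array arithmetic. Below is a minimal, hypothetical Python/NumPy sketch (the band assignment, percentile stretch, and 3 dB threshold are illustrative choices, and the paper's Curvelet filtering step is only indicated by a comment):

        import numpy as np

        def false_color_composite(pre_db, post_db):
            # Red = pre-event, green/blue = post-event amplitude (dB):
            # backscatter decreases appear red, increases appear cyan.
            def stretch(a):
                lo, hi = np.percentile(a, [2, 98])
                return np.clip((a - lo) / (hi - lo), 0.0, 1.0)
            return np.dstack([stretch(pre_db), stretch(post_db), stretch(post_db)])

        def change_mask(pre_db, post_db, thresh_db=3.0):
            # Pixel-based change via the (log-ratio) difference image.
            diff = post_db - pre_db
            # The paper filters `diff` with Curvelets here to suppress
            # speckle false alarms before thresholding; omitted in this sketch.
            return np.abs(diff) > thresh_db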

  2. New Flutter Analysis Technique for CFD-based Unsteady Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Jutte, Christine V.

    2009-01-01

    This paper presents a flutter analysis technique for the transonic flight regime. The technique uses an iterative approach to determine the critical dynamic pressure for a given Mach number. Unlike other CFD-based flutter analysis methods, each iteration solves for the critical dynamic pressure and uses this value in subsequent iterations until the value converges. This process reduces the number of iterations required to determine the critical dynamic pressure. To improve the accuracy of the analysis, the technique employs a known structural model, leaving only the aerodynamic model as the unknown. The aerodynamic model is estimated using unsteady aeroelastic CFD analysis combined with a parameter estimation routine. The technique executes as follows. The known structural model is represented as a finite element model. Modal analysis determines the frequencies and mode shapes for the structural model. At a given Mach number and dynamic pressure, the unsteady CFD analysis is performed. The output time history of the surface pressure is converted to a nodal aerodynamic force vector, and the forces are then normalized by the given dynamic pressure. ERA, a multi-input multi-output parameter estimation software package, estimates the aerodynamic model from the time histories of nodal aerodynamic forces and structural deformations. The critical dynamic pressure is then calculated using the known structural model and the estimated aerodynamic model, and this output is used as the dynamic pressure in subsequent iterations until the critical dynamic pressure converges. This technique is demonstrated on the Aerostructures Test Wing-2 model at NASA's Dryden Flight Research Center.

  3. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    NASA Astrophysics Data System (ADS)

    Singh Duksh, Yograj; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-05-01

    An equivalent electrical circuit model of bundled single-walled carbon nanotube (SWCNT) based distributed RLC interconnects is employed for crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite-difference time-domain (FDTD) technique for estimating voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used to estimate crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and SPICE simulations are carried out at the 32 nm technology node for global interconnects. The analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results: the crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE.
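
    For readers unfamiliar with the method, a minimal leapfrog FDTD update for a two-conductor lossy coupled line (the telegrapher's equations with per-unit-length matrices L, C, R) might look like the Python sketch below. All parameter values are illustrative placeholders, not SWCNT-bundle values, the terminations are deliberately crude, and the time step obeys the Courant condition mentioned in the abstract:

        import numpy as np

        # Illustrative per-unit-length matrices for two coupled conductors.
        L = np.array([[2.4e-7, 0.6e-7],
                      [0.6e-7, 2.4e-7]])      # inductance, H/m
        C = np.array([[1.1e-10, -0.2e-10],
                      [-0.2e-10, 1.1e-10]])   # capacitance, F/m
        R = np.diag([5e3, 5e3])               # resistance, ohm/m (lossy line)

        nz, length = 200, 1e-3                # cells, line length (m)
        dz = length / nz
        # Courant condition: dt bounded by the fastest propagating mode.
        v_max = 1.0 / np.sqrt(np.linalg.eigvals(L @ C).real.min())
        dt = 0.9 * dz / v_max

        V = np.zeros((nz + 1, 2))             # node voltages (aggressor, victim)
        I = np.zeros((nz, 2))                 # staggered branch currents
        A = np.linalg.inv(L / dt + R / 2)     # trapezoidal treatment of loss
        B = L / dt - R / 2
        Cinv = np.linalg.inv(C)

        for n in range(1500):
            t = n * dt
            # C dV/dt = -dI/dz at interior nodes
            V[1:-1] -= (dt / dz) * (I[1:] - I[:-1]) @ Cinv.T
            V[-1] += (dt / dz) * I[-1] @ Cinv.T    # open-circuit far end
            V[0] = [min(t / 2e-12, 1.0), 0.0]      # ramped step drives line 1
            # (L/dt + R/2) I_new = (L/dt - R/2) I_old - dV/dz
            I = (I @ B.T - (V[1:] - V[:-1]) / dz) @ A.T

        far_end_crosstalk = V[-1, 1]          # induced voltage on the victim line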

  4. Image analysis techniques associated with automatic data base generation.

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  5. Improved mesh based photon sampling techniques for neutron activation analysis

    SciTech Connect

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-07-01

    The design of fusion power systems requires analysis of the neutron activation of large, complex volumes, and of the resulting particles emitted from these volumes. Structured mesh-based discretization allows for improved modeling of these activation analysis problems. Finer discretization, however, results in large computational costs, which motivates the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends relating the improvements to source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling, particularly with conformal unstructured meshes, where the uniform sampling approach cannot be applied.
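
    The alias method itself is compact. The sketch below is a hypothetical NumPy version of Walker's alias tables (setup is O(n); each draw costs one uniform integer, one uniform float, and a table lookup, hence O(1)); the voxel strengths are made-up numbers standing in for per-voxel photon emission rates:

        import numpy as np

        def build_alias(p):
            # Walker alias tables for a discrete distribution p (O(n) setup).
            n = len(p)
            prob = np.asarray(p, dtype=float) * n / np.sum(p)
            alias = np.zeros(n, dtype=int)
            small = [i for i in range(n) if prob[i] < 1.0]
            large = [i for i in range(n) if prob[i] >= 1.0]
            while small and large:
                s, l = small.pop(), large.pop()
                alias[s] = l
                prob[l] -= 1.0 - prob[s]
                (small if prob[l] < 1.0 else large).append(l)
            return prob, alias

        def sample(prob, alias, rng):
            # O(1) draw: pick a column, then accept it or take its alias.
            i = rng.integers(len(prob))
            return i if rng.random() < prob[i] else alias[i]

        rng = np.random.default_rng(0)
        source_strengths = [0.1, 0.4, 0.3, 0.2]   # hypothetical per-voxel rates
        prob, alias = build_alias(source_strengths)
        draws = [sample(prob, alias, rng) for _ in range(100000)]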

  6. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture-analysis-based method for diagnosing basal cell carcinoma (BCC) skin cancer using optical images taken from suspicious skin regions. We first extracted run-length matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a multi-layer perceptron (MLP) classifier to classify the images into BCC or normal cases. Experiments showed that detecting BCC cancer from optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.
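
    A minimal version of this pipeline can be assembled from scikit-image and scikit-learn. The sketch below is hypothetical: random patches stand in for the optical images, and four GLCM properties stand in for the paper's full Haralick/run-length feature set and feature selection step:

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops
        from sklearn.neural_network import MLPClassifier

        def glcm_features(img):
            # A few GLCM (Haralick-type) texture features of an 8-bit patch.
            glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                                levels=256, symmetric=True, normed=True)
            return np.array([graycoprops(glcm, p).mean()
                             for p in ("contrast", "homogeneity",
                                       "energy", "correlation")])

        # Hypothetical data: image patches with 0/1 labels (normal / BCC).
        rng = np.random.default_rng(0)
        patches = [rng.integers(0, 256, (64, 64), dtype=np.uint8)
                   for _ in range(40)]
        labels = rng.integers(0, 2, 40)

        X = np.array([glcm_features(p) for p in patches])
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
        clf.fit(X, labels)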

  7. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    NASA Astrophysics Data System (ADS)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activity. One prominent approach to drift measurement relies on instruments such as the ionosonde. Ionosonde drift estimation depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the signal's propagation path between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even over Europe, is not dense enough to produce a global or continental drift map. To overcome these difficulties, we propose a technique for ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. The result is a close-to-real electron density profile in which ray tracing can be performed. These profiles can be constructed periodically, with a period as short as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of interest. We test our technique by comparing the results to drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and the joint TUBITAK 114E092 and AS CR14/001 projects.

  8. GC-Based Techniques for Breath Analysis: Current Status, Challenges, and Prospects.

    PubMed

    Xu, Mingjun; Tang, Zhentao; Duan, Yixiang; Liu, Yong

    2016-07-01

    Breath analysis is a noninvasive diagnostic method that profiles a person's physical state through the volatile organic compounds in the breath. It has huge potential in the field of disease diagnosis. To open opportunities for practical applications, various GC-based techniques have been investigated for on-line breath analysis, since GC is the preferred technique for separating gas mixtures. This article reviews the development of breath analysis and of GC-based techniques in basic breath research, covering sampling methods, preconcentration methods, conventional GC-based techniques, and newly developed GC techniques for breath analysis. The combination of GC with newly developed detection techniques takes advantage of the virtues of each. In addition, portable GC and micro GC are poised to become field GC-based techniques in breath analysis. Challenges faced by GC-based techniques for breath analysis are discussed candidly; effective cooperation of experts from different fields is urgently needed to advance the development of breath analysis. PMID:26529095

  9. A new variance-based global sensitivity analysis technique

    NASA Astrophysics Data System (ADS)

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2013-11-01

    A new set of variance-based sensitivity indices, called W-indices, is proposed. As with Sobol's indices, both main and total effect indices are defined. The W-main effect indices measure the average reduction of model output variance when the ranges of a set of inputs are reduced, and the total effect indices quantify the average residual variance when the ranges of the remaining inputs are reduced. Geometrical interpretations show that the W-indices capture the full information of the variance ratio function, whereas Sobol's indices reflect only the marginal information. The double-loop-repeated-set Monte Carlo (MC) procedure (denoted DLRS MC), the double-loop-single-set MC procedure (denoted DLSS MC) and a model emulation procedure are then introduced for estimating the W-indices. It is shown that the DLRS MC procedure is suitable for computing all the W-indices despite its high computational cost. The DLSS MC procedure is computationally efficient but applicable only to low-order indices. Model emulation is able to estimate all the W-indices at low computational cost, provided the model behavior is correctly captured by the emulator. The Ishigami function, a modified Sobol function and two engineering models are used to compare the W-indices with Sobol's indices and to verify the efficiency and convergence of the three numerical methods. Results show that, even for an additive model, the W-total effect index of one input may be significantly larger than its W-main effect index. This indicates that interaction effects may exist among the inputs of an additive model when their distribution ranges are reduced.
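
    To make the double-loop structure concrete, here is a hypothetical NumPy sketch of the classical double-loop Monte Carlo estimate of a variance-based main effect on the Ishigami function mentioned in the abstract. It computes the Sobol-type index Var(E[Y|x1])/Var(Y) by fixing x1 in the outer loop; the W-indices differ in that the outer loop would shrink the range of x1 rather than fix it at a point:

        import numpy as np

        def ishigami(x, a=7.0, b=0.1):
            return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
                    + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

        rng = np.random.default_rng(0)
        lo, hi = -np.pi, np.pi

        # Total output variance from one large sample.
        X = rng.uniform(lo, hi, size=(100000, 3))
        var_total = ishigami(X).var()

        # Double-loop MC: outer loop over x1, inner loop over the rest.
        cond_means = []
        for _ in range(500):
            x1 = rng.uniform(lo, hi)
            Xin = rng.uniform(lo, hi, size=(500, 3))
            Xin[:, 0] = x1                    # fix x1, vary x2, x3
            cond_means.append(ishigami(Xin).mean())
        S1 = np.var(cond_means) / var_total   # ~0.31 for the Ishigami function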

  10. Surface analysis of cast aluminum by means of artificial vision and AI-based techniques

    NASA Astrophysics Data System (ADS)

    Platero, Carlos; Fernandez, Carlos; Campoy, Pascual; Aracil, Rafael

    1996-02-01

    An architecture for the surface analysis of continuously cast aluminum strip is described. The data volume to be processed has driven the development of a highly parallel architecture for high-speed image processing. A lighting system especially suited to enhancing defects on metallic surfaces has been developed. Special effort has been put into the design of the defect detection algorithm to reach two main objectives: robustness and low processing time. These goals have been achieved by combining local analysis with data interpretation based on syntactical analysis, which has allowed us to avoid morphological analysis. Defect classification is accomplished by means of rule-based systems along with data-based classifiers. The use of clustering techniques is discussed: partitions of R^n are performed by self-organizing maps (SOM), and divergence methods reduce the feature vector applied to the data-based classifiers. The combination of these techniques inside a hybrid system leads to near 100% classification success.

  11. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    NASA Astrophysics Data System (ADS)

    Festa, G.; Pietropaolo, A.; Grazzi, F.; Sutton, L. F.; Scherillo, A.; Bognetti, L.; Bini, A.; Barzagli, E.; Schooneveld, E.; Andreani, C.

    2013-09-01

    A metallic 19th-century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation of materials. The aim of this study is to show the potential of the approach, using multiple, integrated neutron-based techniques, for musical instruments. Such samples, within the broad scenario of cultural heritage, constitute an exciting research field and form an interesting link between disciplines as different as nuclear physics, metallurgy and acoustics.

  12. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-10-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  13. A damage identification technique based on embedded sensitivity analysis and optimization processes

    NASA Astrophysics Data System (ADS)

    Yang, Chulho; Adams, Douglas E.

    2014-07-01

    A vibration-based structural damage identification method using embedded sensitivity functions and optimization algorithms is discussed in this work. The embedded sensitivity technique requires only measured or calculated frequency response functions to obtain the sensitivity of system responses to each component parameter, so it can be used effectively in the damage identification process. Optimization techniques are used to minimize the difference between the measured frequency response functions of the damaged structure and those calculated from the baseline system using embedded sensitivity functions. The amount of damage can be quantified directly in engineering units as changes in stiffness, damping, or mass. Various factors in the optimization process and structural dynamics are studied to enhance the performance and robustness of the damage identification process. This study shows that the proposed technique can identify damage with an estimation error of less than 2 percent.

  14. COHN analysis: Body composition measurements based on the associated particle imaging and prompt-gamma neutron activation analysis techniques

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The measurement of the body's carbon (C), oxygen (O), hydrogen (H), and nitrogen (N) content can be used to calculate the relative amounts of fat, protein, and water. A system based on prompt-gamma neutron activation analysis (PGNAA), coupled with the associated particle imaging (API) technique, is...

  15. Sex-based differences in lifting technique under increasing load conditions: A principal component analysis.

    PubMed

    Sheppard, P S; Stevenson, J M; Graham, R B

    2016-05-01

    The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopted a similar lifting technique. In addition, as load increased, male and female participants changed their lifting technique in a similar manner. PMID:26851478
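
    For readers unfamiliar with waveform-level PCA, the following hypothetical scikit-learn sketch mirrors the shape of this analysis. Random arrays stand in for the time-normalized joint-angle waveforms, and the 0.90 argument implements the 90% trace (explained-variance) criterion:

        import numpy as np
        from sklearn.decomposition import PCA

        # One row per lift trial; columns are a joint-angle waveform
        # resampled to a fixed number of time points (invented shapes).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(375, 101))   # 25 participants x 15 trials

        # Retain enough components to explain 90% of the variance.
        pca = PCA(n_components=0.90)
        scores = pca.fit_transform(X)

        # Single-component reconstruction: the waveforms PC1 alone implies.
        pc1_only = np.outer(scores[:, 0], pca.components_[0]) + pca.mean_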

  16. Validity of content-based techniques to distinguish true and fabricated statements: A meta-analysis.

    PubMed

    Oberlader, Verena A; Naefgen, Christoph; Koppehele-Gossel, Judith; Quinten, Laura; Banse, Rainer; Schmidt, Alexander F

    2016-08-01

    Within the scope of judicial decisions, approaches to distinguishing between true and fabricated statements have been of particular importance since ancient times. Although methods focusing on "prototypical" deceptive behavior (e.g., psychophysiological phenomena, nonverbal cues) have largely been rejected with regard to validity, content-based techniques constitute a promising approach and are well established within the applied forensic context. The basic idea of this approach is that experience-based and nonexperience-based statements differ in their content-related quality. In order to test the validity of the most prominent content-based techniques, criteria-based content analysis (CBCA) and reality monitoring (RM), we conducted a comprehensive meta-analysis of English- and German-language studies. Based on a variety of decision criteria, 56 studies were included, revealing an overall effect size of g = 1.03 (95% confidence interval [0.78, 1.27], Q = 420.06, p < .001, I² = 92.48%, N = 3,429). There was no significant difference in the effectiveness of CBCA and RM. Additionally, we investigated a number of moderator variables, such as characteristics of participants, statements, and judgment procedures, as well as general study characteristics. Results showed that the application of all CBCA criteria outperformed any incomplete CBCA criteria set. Furthermore, statement classification based on discriminant functions revealed higher discrimination rates than decisions based on sum scores. Finally, unpublished studies showed higher effect sizes than studies published in peer-reviewed journals. All results are discussed in terms of their significance for future research (e.g., developing standardized decision rules) and practical application (e.g., user training, applying the complete criteria set). PMID:27149290

  17. Advanced SuperDARN meteor wind observations based on raw time series analysis technique

    NASA Astrophysics Data System (ADS)

    Tsutsumi, M.; Yukimatu, A. S.; Holdsworth, D. A.; Lester, M.

    2009-04-01

    The meteor observation technique based on SuperDARN raw time series analysis has been upgraded. This technique extracts meteor information as a by-product and does not degrade the quality of normal SuperDARN operations. In the upgrade, the radar operating system (RADOPS) was modified so that it can oversample every 15 km during normal operations, which have a range resolution of 45 km. As an alternative method for better range determination, a frequency domain interferometry (FDI) capability was also coded into RADOPS, where the operating radio frequency can be changed every pulse sequence. Test observations were conducted using the CUTLASS Iceland East and Finland radars, with oversampling and FDI operation (two frequencies separated by 3 kHz) carried out simultaneously. Meteor ranges obtained with the two ranging techniques agreed very well. The ranges were then combined with the interferometer data to estimate meteor echo reflection heights. Although some ambiguities remained in the arrival angles of echoes because of the rather long antenna spacing of the interferometers, the heights and arrival angles of most meteor echoes were determined more accurately than before. Wind velocities were successfully estimated over the height range of 84 to 110 km. The FDI technique developed here can be further applied to common SuperDARN operations, and study of the fine horizontal structure of F-region plasma irregularities is expected in the future.

  18. Tissue deformation analysis using a laser based digital image correlation technique.

    PubMed

    Kerl, Johannes; Parittotokkaporn, Tassanai; Frasson, Luca; Oldfield, Matthew; Rodriguez y Baena, Ferdinando; Beyrau, Frank

    2012-02-01

    A laser-based technique for planar, time-resolved measurement of tissue deformation in transparent biomedical materials with high spatial resolution is developed. The approach is based on monitoring the displacement of micrometer-sized particles embedded in a semi-transparent sample as it is deformed by some form of external loading. The particles are illuminated in a plane inside the tissue material by a thin laser light sheet, and the pattern is continuously recorded by a digital camera. Image analysis yields the locally and temporally resolved sample deformation in the measurement plane without the need for any in situ measurement hardware. The applicability of the method for determining tissue deformation and material strain during the insertion of a needle probe into a soft material sample is demonstrated by means of an in vitro trial on gelatin. PMID:22301185

  19. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    NASA Astrophysics Data System (ADS)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served for centuries as an important primary approach to diagnosing cardiovascular diseases (CVDs). Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction has developed explosively. Yet most existing HS feature extraction methods adopt acoustic or time-frequency features that exhibit a poor relationship with diagnostic information, restricting the performance of further interpretation and analysis. Tackling this bottleneck, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain rich pathological information and are regarded as the first indications of pathological changes of the heart valves. Using the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five kinds of abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
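
    As a rough illustration of the envelope step, here is a hypothetical NumPy/PyWavelets sketch: a DWT-based sub-band selection followed by a normalized Shannon energy envelope. The wavelet, decomposition level, and which detail sub-bands to keep are illustrative guesses, not the paper's settings:

        import numpy as np
        import pywt

        def murmur_band(x, wavelet="db6", level=5, keep=(3, 4)):
            # Keep only selected DWT detail sub-bands (murmur-dominated
            # bands are signal-dependent; (3, 4) is an invented choice).
            coeffs = pywt.wavedec(x, wavelet, level=level)
            coeffs[0] = np.zeros_like(coeffs[0])     # drop approximation
            for i in range(1, len(coeffs)):
                if i not in keep:
                    coeffs[i] = np.zeros_like(coeffs[i])
            return pywt.waverec(coeffs, wavelet)

        def shannon_envelope(x, frame=50):
            # Normalized Shannon energy, smoothed by a moving average.
            x = x / (np.abs(x).max() + 1e-12)
            e = -x ** 2 * np.log(x ** 2 + 1e-12)
            return np.convolve(e, np.ones(frame) / frame, mode="same")

        # Hypothetical phonocardiogram stand-in (random noise for demo).
        pcg = np.random.default_rng(0).normal(size=4096)
        env = shannon_envelope(murmur_band(pcg))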

  20. Applications of synchrotron-based micro-imaging techniques for the analysis of Cultural Heritage materials

    SciTech Connect

    Cotte, Marine; Chilida, Javier; Walter, Philippe; Taniguchi, Yoko; Susini, Jean

    2009-01-29

    The analysis of cultural heritage objects is often technically challenging. When analyzing micro-fragments, the amount of matter is usually very small, requiring sensitive techniques. These samples, in particular painting fragments, may present multi-layered structures with layer thicknesses of ~10 μm. This favors micro-imaging techniques with a good lateral resolution (about one micrometer), which permit the discriminative study of each layer. Besides, such samples are usually chemically very complex, being made of mineral and organic matter, amorphous and crystallized phases, and major and minor elements. Accordingly, a multi-modal approach is generally essential to resolve the chemical complexity of such hybrid materials. Different examples are given to illustrate the various possibilities of synchrotron-based micro-imaging techniques, such as micro X-ray diffraction, micro X-ray fluorescence, micro X-ray absorption spectroscopy and micro FTIR spectroscopy. The focus is on paintings, but the whole range of museum objects (from soft matter like paper or wood to hard matter like metal and glass) is also considered.

  1. Subdivision based isogeometric analysis technique for electric field integral equations for simply connected structures

    NASA Astrophysics Data System (ADS)

    Li, Jie; Dault, Daniel; Liu, Beibei; Tong, Yiying; Shanker, Balasubramaniam

    2016-08-01

    The analysis of electromagnetic scattering has long been performed on a discrete representation of the geometry. This representation is typically continuous but not differentiable. The need to define physical quantities on this geometric representation has led to the development of sets of basis functions that must satisfy constraints at the boundaries of the elements/tessellations (viz., continuity of normal or tangential components across element boundaries). For electromagnetics, these constraints result in curl- or div-conforming basis sets. The geometric representation used for analysis is in stark contrast with that used for design, wherein the surface representation is differentiable to higher order. Using this representation for both the geometry and the physics on the geometry has several advantages, as elucidated in Hughes et al. (2005) [7]. Until now, the bulk of the literature on isogeometric methods has been limited to solid mechanics, with some effort to create NURBS-based basis functions for electromagnetic analysis. In this paper, we present the first complete isogeometric solution methodology for the electric field integral equation as applied to simply connected structures. The paper proceeds systematically from surface representation using subdivision, through the definition of vector basis functions on this surface, to fidelity in the solution of integral equations. We also present techniques to stabilize the solution at low frequencies and to impose a Calderón preconditioner. The results presented serve to validate the proposed approach as well as demonstrate some of its capabilities.

  2. A novel mesh processing based technique for 3D plant analysis

    PubMed Central

    2012-01-01

    Background In recent years, imaging-based, automated, non-invasive, and non-destructive high-throughput plant phenotyping platforms have become popular tools for plant biology, underpinning the field of plant phenomics. Such platforms acquire and record large amounts of raw data that must be accurately and robustly calibrated, reconstructed, and analysed, requiring the development of sophisticated image understanding and quantification algorithms. The raw data can be processed in different ways, and the past few years have seen the emergence of two main approaches: 2D image processing and 3D mesh processing algorithms. Direct image quantification methods (usually 2D) dominate the current literature owing to their comparative simplicity. However, 3D mesh analysis offers tremendous potential for accurately estimating specific morphological features cross-sectionally and monitoring them over time. Results In this paper, we present a novel 3D mesh based technique developed for temporal high-throughput plant phenomics and perform initial tests for the analysis of Gossypium hirsutum vegetative growth. Based on plant meshes previously reconstructed from multi-view images, the methodology involves several stages, including morphological mesh segmentation, phenotypic parameter estimation, and tracking of plant organs over time. The initial study focuses on presenting and validating the accuracy of the methodology on dicotyledons such as cotton, but we believe the approach will be more broadly applicable. This study involved applying our technique to a set of six Gossypium hirsutum (cotton) plants studied over four time-points. Manual measurements, performed for each plant at every time-point, were used to assess the accuracy of our pipeline and quantify the error on the morphological parameters estimated. Conclusion By directly comparing our automated mesh based quantitative data with manual measurements of individual stem height, leaf width and leaf length, we obtained the mean

  3. Digital Instrumentation and Control Failure Events Derivation and Analysis by Frame-Based Technique

    SciTech Connect

    Hui-Wen Huang; Chunkuan Shih; Swu Yih; Yen-Chang Tzeng; Ming-Huei Chen

    2006-07-01

    A frame-based technique, comprising physical, logical, and cognitive frames, was adopted to derive and analyze digital I&C failure events for a generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow; hence, in addition to transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I&C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I&C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I&C software failure events. The case study of this research includes (1) software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I&C software failure events derived from actual non-ABWR digital I&C software failure events reported to the LER system of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status were successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  4. An Evaluation of Microcomputer-Based Strain Analysis Techniques on Meteoritic Chondrules

    NASA Astrophysics Data System (ADS)

    Hill, H. G. M.

    1995-09-01

    Introduction: Chondrule flattening and distinct foliation are preserved in certain chondrites [1] and have been interpreted, by some, as evidence of shock-induced pressure through hypervelocity impacts on parent bodies [2]. Recently, mean aspect ratios of naturally and artificially shocked chondrules in the Allende (CV3) chondrite have been correlated with shock intensity [3] using established shock stage criteria [4]. Clearly, quantification of chondrule deformation and appropriate petrographic criteria can be useful tools for constraining parent body shock history and, possibly, post-shock heating [3]. Here, strain analysis techniques (R_f/φ and Fry) normally employed in structural geology have been adapted and evaluated [5] for measuring mean chondrule strain and orientation. In addition, the possible use of such strain data for partial shock stage classification is considered. R_f/φ and Fry Analysis: The relationship between displacement and shape changes in rocks is known as strain [6]; one assumes that an initial circle with unit radius is deformed into an ellipse, the finite strain ellipse (R_f). The strain ratio (R_s) is an expression of the change of shape. The orientation of the strain ellipse (φ) is the angle subtended between the semi-major axis and the direction of a fixed point of reference. Generally, the log mean R_f approximates R_s and, therefore, the approximation R_f = R_s is valid. For chondrules this is reasonable, as they were originally molten, or partially molten, droplets [7]. Fry's 'center-to-center' geological strain analysis technique [8] is based on the principle that the distribution of particle centers in rocks can sometimes be used to determine the state of finite strain (R_f). Experimental Techniques: The Bovedy (L3) chondrite was chosen for investigation as it contains abundant, oriented, elliptical chondrules [5]. The hardware employed consisted of a Macintosh microcomputer and a flat-bed scanner. Chondrule outlines, obtained
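
    The arithmetic behind the R_f/φ summary statistics is simple enough to sketch. Below is a hypothetical NumPy fragment (made-up aspect ratios and orientations) computing the log-mean R_f used as the R_s estimate, and a mean φ taken on doubled angles because ellipse orientations are axial (φ and φ + 180° denote the same axis):

        import numpy as np

        # Chondrule outlines reduced to best-fit ellipses: aspect ratios
        # Rf = a/b and orientations phi (invented demonstration values).
        Rf = np.array([1.8, 2.1, 1.6, 2.4, 1.9, 2.2])
        phi = np.radians([12.0, 8.0, 15.0, 10.0, 9.0, 14.0])

        # Log-mean aspect ratio approximates the strain ratio Rs (Rf ~ Rs).
        Rs_est = np.exp(np.mean(np.log(Rf)))

        # Mean orientation via doubled angles (axial data).
        phi_mean = 0.5 * np.degrees(
            np.arctan2(np.mean(np.sin(2 * phi)), np.mean(np.cos(2 * phi))))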

  5. Microfluidic assay-based optical measurement techniques for cell analysis: A review of recent progress.

    PubMed

    Choi, Jong-Ryul; Song, Hyerin; Sung, Jong Hwan; Kim, Donghyun; Kim, Kyujung

    2016-03-15

    Since the early 2000s, microfluidic cell culture systems have attracted significant attention as a promising alternative to conventional cell culture methods, raising the importance of designing efficient detection systems that can analyze cell behavior on a chip in real time. For this reason, various measurement techniques for microfluidic devices have been developed alongside microfluidic assays for high-throughput screening and the mimicking of in vivo conditions. In this review, we discuss optical measurement techniques for microfluidic assays. First, the recent development of fluorescence- and absorbance-based optical measurement systems is described. Next, advanced optical detection systems are introduced with respect to three emphases: 1) optimization for long-term, real-time, and in situ measurements; 2) performance improvements; and 3) multimodal analysis conjugations. Moreover, we present future prospects for optical detection systems, following the development of complex, multi-dimensional microfluidic cell culture assays that mimic in vivo tissues, organs, and human systems. PMID:26409023

  6. Performance analysis of compressive ghost imaging based on different signal reconstruction techniques.

    PubMed

    Kang, Yan; Yao, Yin-Ping; Kang, Zhi-Hua; Ma, Lin; Zhang, Tong-Yi

    2015-06-01

    We present different signal reconstruction techniques for the implementation of compressive ghost imaging (CGI). The techniques are validated on data collected from a ghost imaging experimental system with pseudothermal light. Experimental results show that the technique based on total variation minimization gives a high-quality reconstruction of the imaged object with less time consumption. The performance differences among these reconstruction techniques and their parameter settings are also analyzed. The conclusions thus offer valuable information to promote the implementation of CGI in real applications. PMID:26367039

  7. An Overview of Micromechanics-Based Techniques for the Analysis of Microstructural Randomness in Functionally Graded Materials

    SciTech Connect

    Ferrante, Fernando J.; Brady, Lori L. Graham; Acton, Katherine; Arwade, Sanjay R.

    2008-02-15

    A review of current research efforts to develop micromechanics-based techniques for the study of microstructural randomness in functionally graded materials is presented, along with a framework developed by the authors that combines stochastic simulation of statistically inhomogeneous samples with a windowing approach coupled to micromechanical homogenization. The methodology is illustrated through the analysis of one sample coupled with finite element modeling.

  8. Prioritization of sub-watersheds based on morphometric analysis using geospatial technique in Piperiya watershed, India

    NASA Astrophysics Data System (ADS)

    Chandniha, Surendra Kumar; Kansal, Mitthan Lal

    2014-11-01

    The hydrological investigation and behavior of a watershed depend upon the geo-morphometric characteristics of its catchment. Morphometric analysis is commonly used to develop regional hydrological models of ungauged watersheds. A critical evaluation and assessment of geo-morphometric parameters has been carried out, and prioritization of the Piperiya watershed has been evaluated through linear, areal and relief aspects. Morphometric analysis has been attempted to prioritize nine sub-watersheds of the Piperiya watershed in the Hasdeo river basin, a tributary of the Mahanadi. Sub-watersheds are delineated with ArcMap 9.3 software from a digital elevation model (DEM). Drainage parameters such as stream order, stream length, stream frequency, drainage density, texture ratio, form factor, circulatory ratio, elongation ratio, bifurcation ratio and compactness ratio have been calculated separately for each sub-watershed using remote sensing (RS) and geospatial techniques. Finally, a prioritized score based on the morphometric behavior of each sub-watershed is assigned, and consolidated scores are estimated to identify the most sensitive parameters. The analysis reveals that stream order varies from 1 to 5, with first-order streams covering the maximum area, about 87.7%. The total number of stream segments of all orders is 1,264 in the watershed. The study emphasizes prioritization of the sub-watersheds on the basis of morphometric analysis, with the final score of all nine sub-watersheds assigned according to erosion threat. The sub-watershed with the lowest compound parameter value was assigned the highest priority, and the sub-watersheds have been categorized into three priority classes, high (4.1-4.7), medium (4.8-5.3) and low (>5.4), on the basis of their maximum (6.0) and minimum (4.1) prioritized scores.
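
    The compound-parameter prioritization is essentially a ranking average, which the following hypothetical pandas sketch illustrates. The three parameters and all values are invented; the real study averages ranks over its full set of linear, areal and relief parameters:

        import pandas as pd

        # Invented morphometric parameters for nine sub-watersheds.
        df = pd.DataFrame({
            "drainage_density":  [1.9, 2.4, 2.1, 1.7, 2.6, 2.0, 1.8, 2.3, 2.2],
            "bifurcation_ratio": [3.1, 4.2, 3.6, 2.9, 4.5, 3.3, 3.0, 4.0, 3.8],
            "form_factor":       [0.42, 0.31, 0.38, 0.45, 0.28,
                                  0.40, 0.44, 0.33, 0.36],
        }, index=[f"SW{i}" for i in range(1, 10)])

        # Erodibility-linked parameters rank high-to-low; shape
        # parameters such as form factor rank low-to-high.
        ranks = pd.DataFrame({
            "drainage_density":  df["drainage_density"].rank(ascending=False),
            "bifurcation_ratio": df["bifurcation_ratio"].rank(ascending=False),
            "form_factor":       df["form_factor"].rank(ascending=True),
        })
        df["compound_parameter"] = ranks.mean(axis=1)  # lower = higher priority
        priority = df.sort_values("compound_parameter")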

  9. Operational modal analysis via image based technique of very flexible space structures

    NASA Astrophysics Data System (ADS)

    Sabatini, Marco; Gasbarri, Paolo; Palmerini, Giovanni B.; Monti, Riccardo

    2013-08-01

    Vibration is one of the most important topics in the engineering design of flexible structures. Its importance increases when a very flexible system is considered, as is often the case for space structures. To identify the modal characteristics, in terms of natural frequencies and the relevant modal parameters, ground tests are performed. However, these parameters can vary under the operating conditions of the system. To continuously monitor the modal characteristics during the satellite lifetime, an operational modal analysis is mandatory. This kind of analysis is usually performed using classical accelerometers or strain gauges and properly analyzing the acquired output. In this paper, a different, image-based approach to vibration data acquisition is adopted. A free-flying platform is used to simulate a flexible satellite; the problem is further complicated by the fact that the overall system, consisting of a highly rigid bus and very flexible panels, must be modeled as a multibody system. In the experimental campaign, a camera placed on the bus is used to identify the eigenfrequencies of the vibrating structure; thin aluminum plates simulate the very flexible solar panels. The structure is excited by a hammer or studied during a fast attitude maneuver. The experimental results are investigated and compared with numerical simulations obtained via FEM-multibody software, and the relevant results are presented and discussed.

  10. Instanton-based techniques for analysis and reduction of error floor of LDPC codes

    SciTech Connect

    Chertkov, Michael; Chilappagari, Shashi K; Stepanov, Mikhail G; Vasic, Bane

    2008-01-01

    We describe a family of instanton-based optimization methods developed recently for the analysis of the error floors of low-density parity-check (LDPC) codes. Instantons are the most probable configurations of the channel noise that result in decoding failures. We show that the general idea and the respective optimization technique are broadly applicable to a variety of channels, discrete or continuous, and a variety of sub-optimal decoders. Specifically, we consider iterative belief propagation (BP) decoders, Gallager-type decoders, and linear programming (LP) decoders operating over the additive white Gaussian noise channel (AWGNC) and the binary symmetric channel (BSC). The instanton analysis suggests that the underlying topological structures of the most probable instantons of the same code under different channels and decoders are related to each other. Armed with this understanding of the graphical structure of the instanton and its relation to decoding failures, we suggest a method to construct codes whose Tanner graphs are free of these structures and thus have less significant error floors.

  11. Polyspectral signal analysis techniques for condition based maintenance of helicopter drive-train system

    NASA Astrophysics Data System (ADS)

    Hassan Mohammed, Mohammed Ahmed

    For efficient maintenance of a diverse fleet of aircraft and rotorcraft, effective condition-based maintenance (CBM) must be established based on vibration signals monitored from rotating components. In this dissertation, we present the theory and applications of polyspectral signal processing techniques for condition monitoring of critical components in the AH-64D helicopter tail rotor drive train. Currently available vibration-monitoring tools are mostly built around auto- and cross-power spectral analysis, which has limited performance in detecting frequency correlations higher than second order. Studying higher-order correlations and their Fourier transforms, the higher-order spectra, provides more information about the vibration signals, which helps in building more accurate diagnostic models of the mechanical system. Based on higher-order spectral analysis, different signal processing techniques are developed to assess the health condition of critical rotating components in the AH-64D drive train. Based on the cross-bispectrum, a quadratic nonlinear transfer function is presented to model second-order nonlinearity in a drive shaft running between the two hanger bearings. The quadratic-nonlinearity coupling coefficient between frequency harmonics of the rotating shaft is then used as a condition metric to study different seeded shaft faults against a baseline case, namely shaft misalignment, shaft imbalance, and combined shaft misalignment and imbalance. The proposed quadratic-nonlinearity metric distinguishes the four studied shaft settings better than the conventional linear coupling based on the cross-power spectrum. We also develop a new concept, the quadratic-nonlinearity power-index spectrum QNLPI(f), usable for signal detection and classification and based on the bicoherence spectrum. The proposed QNLPI(f) is derived as a projection of the three-dimensional bicoherence spectrum into a two-dimensional spectrum that
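
    The bicoherence underlying these metrics can be estimated by segment averaging. The hypothetical NumPy sketch below computes a normalized auto-bicoherence b(f1, f2), which approaches 1 where components at f1, f2 and f1 + f2 are quadratically phase-coupled; the segment length, window, and normalization follow one common convention among several:

        import numpy as np

        def bicoherence(x, nfft=256, step=128):
            # Segment-averaged bicoherence estimate of a 1-D real signal.
            win = np.hanning(nfft)
            segs = [x[i:i + nfft] for i in range(0, len(x) - nfft + 1, step)]
            m = (nfft // 2 + 1) // 2      # keep f1 + f2 inside the spectrum
            idx = np.add.outer(np.arange(m), np.arange(m))
            num, d12, d3 = 0.0, 0.0, 0.0
            for s in segs:
                X = np.fft.rfft(win * (s - s.mean()))
                prod = np.outer(X[:m], X[:m])        # X(f1) * X(f2)
                num = num + prod * np.conj(X[idx])   # triple product
                d12 = d12 + np.abs(prod) ** 2
                d3 = d3 + np.abs(X[idx]) ** 2
            return np.abs(num) / np.sqrt(d12 * d3 + 1e-30)

        # Phase-coupled test signal: components at f1, f2 and f1 + f2.
        t = np.arange(8192) / 1024.0
        x = (np.cos(2 * np.pi * 80 * t) + np.cos(2 * np.pi * 120 * t)
             + 0.5 * np.cos(2 * np.pi * 200 * t)
             + 0.1 * np.random.default_rng(0).normal(size=t.size))
        b = bicoherence(x)   # peak expected at the (80 Hz, 120 Hz) bin pair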

  12. Error analysis of a 3D imaging system based on fringe projection technique

    NASA Astrophysics Data System (ADS)

    Zhang, Zonghua; Dai, Jie

    2013-12-01

    In the past few years, optical metrology has found numerous applications in scientific and commercial fields owing to its non-contact nature. One of the most popular methods is the measurement of 3D surfaces based on fringe projection techniques, thanks to the advantages of non-contact operation, full-field and fast acquisition, and automatic data processing. In surface profilometry using a digital light processing (DLP) projector, many factors affect the accuracy of the 3D measurement, yet no previous work gives a complete error analysis of such a 3D imaging system. This paper analyzes possible error sources of a 3D imaging system, for example, the nonlinear response of the CCD camera and DLP projector, sampling error of the sinusoidal fringe pattern, variation of ambient light, and marker extraction during calibration. These error sources are simulated in a software environment to demonstrate their effects on measurement, and compensation methods are proposed to give highly accurate shape data. Experiments were conducted to evaluate the effects of these error sources on 3D shape measurement. Experimental results and performance evaluation show that these errors have a great effect on the measured 3D shape and that compensating for them is necessary for accurate measurement.

  13. Performance Analysis of SAC Optical PPM-CDMA System-Based Interference Rejection Technique

    NASA Astrophysics Data System (ADS)

    Alsowaidi, N.; Eltaif, Tawfig; Mokhtar, M. R.

    2016-03-01

    In this paper, we theoretically analyse an optical code-division multiple-access (OCDMA) system based on successive interference cancellation (SIC) using pulse position modulation (PPM), considering the interference between users, imperfect cancellation during the cancellation process, and receiver noise. A spectral amplitude coding (SAC) scheme is used to suppress the overlap between users and to reduce the effect of receiver noise. The theoretical analysis of the multiple-access interference (MAI)-limited performance of this approach indicates the influence of the M-ary PPM size on the OCDMA system: performance improves with increasing M. It was found that the SIC/SAC-OCDMA system using the PPM technique, with modified prime (MPR) codes as the signature sequences, offers a significant improvement over the system without cancellation; it can support up to 103 users at the benchmark bit error rate (BER) of 10^-9 with prime number p = 11, while the system without the cancellation scheme can support only up to 52 users.

  14. Spatio-temporal analysis of discharge regimes based on hydrograph classification techniques in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Chen, Xiaofei; Bloeschl, Guenter; Blaschke, Alfred Paul; Silasari, Rasmiaditya; Exner-Kittridge, Mike

    2016-04-01

    Stream discharge and groundwater hydrographs integrate the spatial and temporal variations of small-scale hydrological responses. Characterizing the discharge response regime of drained farmland is essential for irrigation strategies and hydrologic modeling. For agricultural basins in particular, diurnal hydrographs of drainage discharge have been investigated to infer drainage processes across varying magnitudes. To explore the variability of discharge responses, we developed an objective method to characterize and classify discharge hydrographs based on their magnitude and time-series features. Cluster analysis (hierarchical k-means) and principal component analysis are applied to the event characteristics of discharge time series and groundwater-level hydrographs, tested on 8 different discharge and 18 groundwater-level records. Reflecting the variability of rainfall activity, system location, discharge regime and pre-event soil moisture in the catchment, three main clusters of discharge hydrographs are identified. The results show that: (1) the hydrographs of these drainage discharges had similar shapes but different magnitudes for individual rainstorms, a similarity also seen in the overland flow discharge and the spring system; (2) within each cluster the shapes remain similar, but the rising slopes differ with antecedent wetness and rainfall accumulation, while the differences in regression slope can be explained by system location and discharge area; and (3) surface water maintains a close proportional relation with soil moisture throughout the year, whereas the outflow of tile drainage systems is directly proportional to soil moisture, and inversely related to groundwater levels, only after soil moisture exceeds a certain threshold. Finally, we discussed the potential application of hydrograph classification in a wider range of

  15. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

    Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality control and prediction for the GMAW process. Existing approaches to on-line welding quality control and prediction have several disadvantages, such as high cost, low efficiency, complexity, and strong sensitivity to the environment. An enhanced, efficient technique for evaluating welding faults, based on the Mahalanobis distance (MD) and the normal distribution, is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as the Euclidean distance because the covariance matrix used for calculating MD takes into account correlations and scaling in the data. The values of MD obtained from the welding current and arc voltage are assumed to follow a normal distribution, with two parameters: the mean µ and standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD in the range from zero to µ + 3σ are regarded as "good". Two experiments, which involve changing the flow of shielding gas and smearing paint on the surface of the substrate, are conducted to verify the sensitivity of the proposed evaluation technique and the feasibility of the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality control and prediction, which is of great importance for the design of novel weld quality detection equipment.
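
    The MD-plus-threshold rule is easy to prototype. The hypothetical NumPy sketch below builds the covariance from a reference set of "good" welds (simulated two-feature samples stand in for welding current and arc voltage statistics) and applies the µ + 3σ acceptance rule described in the abstract:

        import numpy as np

        # Reference ("good") welds: rows = samples, columns = features,
        # e.g. [mean current (A), mean arc voltage (V)] - invented values.
        ref = np.random.default_rng(1).normal([220.0, 24.0], [3.0, 0.8],
                                              size=(500, 2))
        mu_ref = ref.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

        def mahalanobis(x):
            d = x - mu_ref
            return float(np.sqrt(d @ cov_inv @ d))

        # MD over the reference set is assumed ~normal; accept up to
        # mean + 3 sigma, following the WQT rule in the abstract.
        md_ref = np.array([mahalanobis(x) for x in ref])
        threshold = md_ref.mean() + 3.0 * md_ref.std()

        def is_good(x):
            return mahalanobis(x) <= threshold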

  16. Synchrotron-Based Microspectroscopic Analysis of Molecular and Biopolymer Structures Using Multivariate Techniques and Advanced Multi-Components Modeling

    SciTech Connect

    Yu, P.

    2008-01-01

    Recently, an advanced synchrotron-radiation-based bioanalytical technique (SR-FTIRM) has been applied as a novel, non-invasive analysis tool to study molecular, functional-group and biopolymer chemistry, nutrient make-up and structural conformation in biomaterials. This synchrotron technique, taking advantage of bright synchrotron light (millions of times brighter than sunlight), is capable of exploring biomaterials at the molecular and cellular levels. However, the SR-FTIRM technique usually yields a large volume of molecular spectral data. The objective of this article is to illustrate the use of two multivariate statistical techniques, (1) agglomerative hierarchical cluster analysis (AHCA) and (2) principal component analysis (PCA), and two advanced multi-component modeling methods, (1) Gaussian and (2) Lorentzian multi-component peak modeling, for the molecular spectral analysis of bio-tissues. The studies indicated that the two multivariate analyses (AHCA, PCA) are able to create molecular spectral groupings by utilizing not just one intensity or frequency point of a molecular spectrum but the entire spectral information. Gaussian and Lorentzian modeling techniques are able to quantify the spectral component peaks of molecular structures, functional groups and biopolymers. By applying these multivariate techniques together with Gaussian and Lorentzian modeling, inherent molecular structures, functional groups and biopolymer conformations between and among biological samples can be quantified, discriminated and classified with great efficiency.

  17. Accuracy of an approximate static structural analysis technique based on stiffness matrix eigenmodes

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Hajela, P.

    1979-01-01

    Use of the stiffness matrix eigenmodes, instead of the vibration eigenmodes, as generalized coordinates is proposed for condensation of the static load-deflection equations in the finite element stiffness method. The modes are selected by strain energy criteria, and the resulting fast, approximate analysis technique is evaluated by application to idealized built-up wings and a fuselage segment. The best results obtained are a two-order-of-magnitude reduction in the number of degrees of freedom for a high-aspect-ratio wing, with less than one percent error in the prediction of the largest displacement.
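
    A small sketch of the condensation idea under assumed data: a stand-in symmetric positive-definite stiffness matrix K is reduced using its own softest eigenmodes (a simple proxy for the strain energy selection criterion, since the low-stiffness modes dominate the static response) and the reduced solution is compared with the full one.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
A = rng.normal(size=(n, n))
K = A @ A.T + n * np.eye(n)        # stand-in SPD stiffness matrix
f = rng.normal(size=n)             # static load vector

# Eigenmodes of K itself (not vibration modes); small eigenvalues = soft modes
vals, vecs = np.linalg.eigh(K)     # eigh returns eigenvalues in ascending order

m = 20                             # keep the m softest modes
Phi = vecs[:, :m]

# Reduced equations: (Phi^T K Phi) q = Phi^T f, then expand u ~ Phi q.
# For exact eigenmodes Phi^T K Phi is diagonal, so the solve is trivial.
q = (Phi.T @ f) / vals[:m]
u_approx = Phi @ q

u_exact = np.linalg.solve(K, f)
print(np.abs(u_approx - u_exact).max() / np.abs(u_exact).max())
```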

  18. Wilcoxon signed-rank-based technique for the pulse-shape analysis of HPGe detectors

    NASA Astrophysics Data System (ADS)

    Martín, S.; Quintana, B.; Barrientos, D.

    2016-07-01

    The characterization of the electric response of segmented-contact high-purity germanium detectors requires scanning systems capable of accurately associating each pulse with the position of the interaction that generated it. This process requires an algorithm sensitive to changes above the electronic noise in the pulse shapes produced at different positions, depending on the resolution of the Ge crystal. In this work, a pulse-shape comparison technique based on the Wilcoxon signed-rank test has been developed. It provides a method to distinguish pulses coming from different interaction points in the germanium crystal. Therefore, this technique is a necessary step for building a reliable pulse-shape database that can be used later for the determination of the position of interaction for γ-ray tracking spectrometry devices such as AGATA, GRETA or GERDA. The method was validated by comparison with a χ2 test using simulated and experimental pulses corresponding to a Broad Energy germanium detector (BEGe).
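
    A hedged sketch of the core comparison step using SciPy's Wilcoxon signed-rank test on synthetic pulses; the pulse model, noise level, and significance threshold are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import wilcoxon

t = np.linspace(0, 1, 256)
rng = np.random.default_rng(3)

def pulse(rise):
    """Toy charge pulse with a given rise time plus electronic noise."""
    return 1.0 - np.exp(-t / rise) + rng.normal(0, 0.005, t.size)

reference = pulse(0.10)          # pulse from a known interaction position
candidate = pulse(0.12)          # pulse from a (possibly) different position

# Signed-rank test on the paired sample-by-sample differences
stat, p_value = wilcoxon(reference - candidate)
same_position = p_value > 0.01   # assumed significance threshold
print(p_value, same_position)
```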

  19. Fluorous affinity-based separation techniques for the analysis of biogenic and related molecules.

    PubMed

    Hayama, Tadashi; Yoshida, Hideyuki; Yamaguchi, Masatoshi; Nohta, Hitoshi

    2014-12-01

    Perfluoroalkyl-containing compounds have a unique 'fluorous' property that refers to the remarkably specific affinity they share. Fluorous compounds can be easily isolated from non-fluorous species on the perfluoroalkyl-functionalized stationary phases used in fluorous solid-phase extraction and fluorous liquid chromatography by means of fluorous-fluorous interactions (fluorophilicity). Recently, this unique specificity has been applied to the highly selective enrichment and analysis of different classes of biogenic and related compounds in complex samples. Because biogenic compounds are generally not 'fluorous', they must be derivatized with an appropriate perfluoroalkyl-group-containing reagent in order to exploit the fluorous interaction. In this review, we introduce applications of fluorous affinity techniques, including derivatization methods, to the analysis of biogenic samples. PMID:24865313

  20. Aerodynamic measurement techniques. [laser based diagnostic techniques

    NASA Technical Reports Server (NTRS)

    Hunter, W. W., Jr.

    1976-01-01

    The laser characteristics of intensity, monochromaticity, spatial coherence, and temporal coherence were exploited to advance laser-based diagnostic techniques for aerodynamics-related research. Two broad categories, visualization and optical measurements, were considered, and three techniques received significant attention: holography, laser velocimetry, and Raman scattering. Examples of quantitative laser velocimeter and Raman scattering measurements of velocity, temperature, and density indicated the potential of these nonintrusive techniques.

  1. Mass Spectrometry Based Imaging Techniques for Spatially Resolved Analysis of Molecules

    PubMed Central

    Matros, Andrea; Mock, Hans-Peter

    2013-01-01

    Higher plants are composed of a multitude of tissues with specific functions, reflected by distinct profiles for transcripts, proteins, and metabolites. Comprehensive analysis of metabolites and proteins has advanced tremendously within recent years, and this progress has been driven by the rapid development of sophisticated mass spectrometric techniques. In most current "omics" studies, analysis is performed on whole-organ or whole-plant extracts, leading to the loss of spatial information. Mass spectrometry imaging (MSI) techniques have opened a new avenue to obtain information on the spatial distribution of metabolites and of proteins. Pioneered in the field of medicine, the approaches are now applied to study the spatial profiles of molecules in plant systems. A range of different plant organs and tissues have been successfully analyzed by MSI, and patterns of various classes of metabolites from primary and secondary metabolism could be obtained. It can be envisaged that MSI approaches will substantially contribute to building spatially resolved biochemical networks. PMID:23626593

  2. Advanced NMR-based techniques for pore structure analysis of coal

    SciTech Connect

    Smith, D.M.

    1992-01-01

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. We believe that measurement of the NMR parameters of various gas-phase and adsorbed-phase NMR-active probes can provide the resolution to this problem. We will investigate the dependence of common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules and the pore surfaces in coals. These molecules have been selected for their chemical and physical properties. A special NMR probe will be constructed which will allow the concurrent measurement of NMR properties and adsorption uptake at a variety of temperatures. All samples will be subjected to a suite of "conventional" pore structure analyses. These include nitrogen adsorption at 77 K with BET analysis, CO2 and CH4 adsorption at 273 K with D-R (Dubinin-Radushkevich) analysis, helium pycnometry, and small angle X-ray scattering, as well as gas diffusion measurements.

  3. Development of EMD based signal improvement technique and its application to pulse shape analysis

    NASA Astrophysics Data System (ADS)

    Siwal, Davinder; Suyal, V.; Prasad, A.; Mandal, S.; Singh, R.

    2013-04-01

    A new signal improvement technique has been developed within the framework of the Empirical Mode Decomposition (EMD) method. It identifies signal noise from estimates of the correlation coefficient. These calculations are performed in both the frequency and the time domains of the signal, between the intrinsic mode functions (IMFs) and the given signal itself. Each of the fast-Fourier-transformed IMFs reflects a complete picture of the frequencies involved in the given signal. Therefore, the correlation curve obtained in the time domain can be used to identify the noise components. The proposed method has been applied to pulse-shape data from a liquid-scintillator-based neutron detector.
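
    A sketch of the selection step only, assuming the IMFs have already been computed by an EMD implementation: IMFs whose time- and frequency-domain correlations with the original signal both exceed a chosen threshold are kept and summed. The toy "IMFs" and the decision rule are stand-ins, not the paper's algorithm.

```python
import numpy as np

def select_imfs(signal, imfs, threshold=0.2):
    """Sum the IMFs whose time- and frequency-domain correlations with
    the signal both exceed the threshold (assumed decision rule)."""
    spec = np.abs(np.fft.rfft(signal))
    keep = []
    for imf in imfs:
        r_time = np.corrcoef(signal, imf)[0, 1]
        r_freq = np.corrcoef(spec, np.abs(np.fft.rfft(imf)))[0, 1]
        if min(r_time, r_freq) >= threshold:
            keep.append(imf)
    return np.sum(keep, axis=0)

# Toy data: a slow pulse plus high-frequency noise, with fake "IMFs"
t = np.linspace(0, 1, 512)
clean = np.exp(-((t - 0.3) / 0.05) ** 2)
noise = 0.05 * np.sin(2 * np.pi * 120 * t)
imfs = np.array([noise, clean])              # stand-ins for real EMD output
improved = select_imfs(clean + noise, imfs)
print(np.corrcoef(improved, clean)[0, 1])    # ~1.0: the noise IMF is rejected
```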

  4. Analysis to feature-based video stabilization/registration techniques within application of traffic data collection

    NASA Astrophysics Data System (ADS)

    Sadat, Mojtaba T.; Viti, Francesco

    2015-02-01

    Machine vision is rapidly gaining popularity in the field of Intelligent Transportation Systems. In particular, advantages are foreseen in the exploitation of Aerial Vehicles (AVs) for delivering a superior view of traffic phenomena. However, vibration on AVs makes it difficult to extract moving objects on the ground. To partly overcome this issue, image stabilization/registration procedures are adopted to correct and stitch multiple frames taken of the same scene but from different positions, angles, or sensors. In this study, we examine the impact of multiple feature-based techniques for stabilization, and we show that the SURF detector outperforms the others in terms of time efficiency and output similarity.
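
    Not the paper's pipeline, just a generic sketch of feature-based frame registration in OpenCV. ORB is used here because SURF (cv2.xfeatures2d.SURF_create) requires the contrib build; the flow is the same either way: detect features, match them, estimate a homography with RANSAC, and warp the frame onto the reference.

```python
import cv2
import numpy as np

def stabilize(reference, frame):
    # Detect and describe features in both frames
    orb = cv2.ORB_create(nfeatures=2000)
    k1, d1 = orb.detectAndCompute(reference, None)
    k2, d2 = orb.detectAndCompute(frame, None)

    # Brute-force matching with cross-checking; keep the best matches
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:200]

    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robust homography estimate, then warp the frame onto the reference
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    h, w = reference.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))
```

    Stabilizing a clip then amounts to calling stabilize(first_frame, frame) on each subsequent frame before running the moving-object extraction step.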

  5. Analysis of meteorological variables in the Australasian region using ground- and space-based GPS techniques

    NASA Astrophysics Data System (ADS)

    Kuleshov, Yuriy; Choy, Suelynn; Fu, Erjiang Frank; Chane-Ming, Fabrice; Liou, Yuei-An; Pavelyev, Alexander G.

    2016-07-01

    Results of an analysis of meteorological variables (temperature and moisture) in the Australasian region using global positioning system (GPS) radio occultation (RO) and GPS ground-based observations, verified with in situ radiosonde (RS) data, are presented. The potential of using ground-based GPS observations for retrieving column-integrated precipitable water vapour (PWV) over the Australian continent has been demonstrated using the Australian ground-based GPS reference station network. Using data from 15 ground-based GPS stations, the state of the atmosphere over Victoria during a significant weather event, the March 2010 Melbourne storm, has been investigated, and it has been shown that GPS observations have potential for monitoring the movement of a weather front with a sharp moisture contrast. Temperature and moisture variability in the atmosphere over various climatic regions (the Indian and Pacific Oceans, the Antarctic and Australia) has been examined using satellite-based GPS RO and in situ RS observations. Investigating recent atmospheric temperature trends over Antarctica, the time series of collocated GPS RO and RS data were examined, and strong cooling in the lower stratosphere and warming through the troposphere over Antarctica have been identified, in agreement with the outputs of climate models. With further expansion of the Global Navigation Satellite Systems (GNSS), it is expected that GNSS satellite- and ground-based measurements will be able to provide an order of magnitude more data, which in turn could significantly advance weather forecasting services, climate monitoring and analysis in the Australasian region.

  6. Novel Recognition Method of Blast Furnace Dust Composition by Multifeature Analysis Based on Comprehensive Image-Processing Techniques

    NASA Astrophysics Data System (ADS)

    Guo, Hongwei; Su, Buxin; Bai, Zhenlong; Zhang, Jianliang; Li, Xinyu

    2014-11-01

    The traditional artificial recognition methods for blast furnace dust composition have several disadvantages, including a great deal of information to process, complex operation, and low working efficiency. In this article, a multifeature analysis method based on comprehensive image-processing techniques was proposed to automatically recognize the blast furnace dust composition. First, artificial recognition and feature analysis, which included image preprocessing, Harris corner feature, Canny edge feature, and Ruffle feature analysis, were used to build the template image, so that any unknown dust digital image could be tested. Second, the coke, microvariation pulverized coal, vitric, ash, and iron components of the dust are distinguished according to their different ranges of values in the multifeature analysis. The method is valid for recognizing the blast furnace dust composition automatically; it is fast and has high recognition accuracy.

  8. A New Signal Processing Technique for Neutron Capture Cross Section Measurement Based on Pulse Width Analysis

    NASA Astrophysics Data System (ADS)

    Katabuchi, T.; Matsuhashi, T.; Terada, K.; Mizumoto, M.; Hirose, K.; Kimura, A.; Furutaka, K.; Hara, K. Y.; Harada, H.; Hori, J.; Igashira, M.; Kamiyama, T.; Kitatani, F.; Kino, K.; Kiyanagi, Y.; Koizumi, M.; Nakamura, S.; Oshima, M.; Toh, Y.

    2014-05-01

    A fast data acquisition method based on pulse width analysis was developed for γ-ray spectroscopy with an NaI(Tl) detector. The new method was tested in experiments with standard γ-ray sources and a pulsed neutron beam from a spallation neutron source. Pulse height spectra were successfully reconstructed from pulse width distributions by use of an energy calibration curve. The 197Au(n, γ)198Au cross section was measured by this method to test its viability. The obtained experimental cross section showed good agreement with a calculation using the resonance parameters of JENDL-4.0.

  9. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
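
    A schematic stand-in for the classification stage: 45-dimensional texture feature vectors (matching the feature count quoted above) fed to a distance-weighted k-NN in scikit-learn. The random features, labels, and k value are placeholders; real inputs would come from the texture-extraction step.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_images, n_features = 300, 45
X = rng.normal(size=(n_images, n_features))   # stand-in texture features
y = rng.integers(0, 3, n_images)              # e.g. CB / TCU / other

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Weighted k-NN: nearer neighbours count more in the vote
knn = KNeighborsClassifier(n_neighbors=7, weights="distance")
knn.fit(X_tr, y_tr)
print("accuracy:", knn.score(X_te, y_te))     # near chance on random data
```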

  10. AVR Microcontroller-based automated technique for analysis of DC motors

    NASA Astrophysics Data System (ADS)

    Kaur, P.; Chatterji, S.

    2014-01-01

    This paper provides essential information on the development of a 'dc motor test and analysis control card' using the AVR-series ATMega32 microcontroller. This card can be interfaced to a PC, calculates parameters such as motor losses and efficiency, and plots characteristics for dc motors. Presently, different tests and methods are available to evaluate motor parameters, but this paper discusses a single, universal, user-friendly automated set-up. It has been accomplished by designing data acquisition and SCR bridge-firing hardware based on the AVR ATMega32 microcontroller. This hardware has the capability to drive phase-controlled rectifiers and acquire real-time values of the current, voltage, temperature and speed of the motor. The various analyses feasible with the designed hardware are of immense importance for dc motor manufacturers and quality-sensitive users. Through this paper, the authors aim to provide details of this AVR-based hardware, which can be used for dc motor parameter analysis and also for motor control applications.

  11. Analysis and coding technique based on computational intelligence methods and image-understanding architecture

    NASA Astrophysics Data System (ADS)

    Kuvychko, Igor

    2000-05-01

    Human vision involves higher-level knowledge and top-down processes for resolving ambiguity and uncertainty in real images. Even very advanced low-level image processing cannot provide any advantage without a highly effective knowledge-representation and reasoning system, which is the core of the image-understanding problem. Methods of image analysis and coding are directly based on the methods of knowledge representation and processing. The article suggests such models and mechanisms in the form of a Spatial Turing Machine that, in place of symbols and tapes, works with hierarchical networks represented dually as discrete and continuous structures. Such networks are able to perform both graph and diagrammatic operations, which are the basis of intelligence. Computational intelligence methods provide the transformation of continuous image information into discrete structures, making it available for analysis. The article shows that symbols naturally emerge in such networks, giving the opportunity to use symbolic operations. Such a framework naturally combines methods of machine learning, classification and analogy with induction, deduction and other methods of higher-level reasoning. Based on these principles, the image-understanding system provides more flexible ways of handling ambiguity and uncertainty in real images and does not require supercomputers. This opens the way to new technologies in computer vision and image databases.

  12. Numerical analysis of radiation propagation in innovative volumetric receivers based on selective laser melting techniques

    NASA Astrophysics Data System (ADS)

    Alberti, Fabrizio; Santiago, Sergio; Roccabruna, Mattia; Luque, Salvador; Gonzalez-Aguilar, Jose; Crema, Luigi; Romero, Manuel

    2016-05-01

    Volumetric absorbers constitute one of the key elements for achieving high thermal conversion efficiencies in concentrating solar power plants. Regardless of the working fluid or thermodynamic cycle employed, design trends towards higher absorber output temperatures are widespread, leading to a general need for components with high solar absorptance, high conduction within the receiver material, high internal convection, low radiative and convective heat losses and high mechanical durability. In this context, the use of advanced manufacturing techniques, such as selective laser melting, has allowed for the fabrication of intricate geometries that are capable of fulfilling the previous requirements. This paper presents a parametric design and analysis of the optical performance of volumetric absorbers of variable porosity conducted by means of detailed numerical ray tracing simulations. Sections of variable macroscopic porosity along the absorber depth were constructed by the fractal growth of single-cell structures. Measures of performance analyzed include optical reflection losses from the absorber front and rear faces, penetration of radiation inside the absorber volume, and radiation absorption as a function of absorber depth. The effects of engineering design parameters such as absorber length and wall thickness, material reflectance and porosity distribution on the optical performance of absorbers are discussed, and general design guidelines are given.

  13. Analysis of RDSS positioning accuracy based on RNSS wide area differential technique

    NASA Astrophysics Data System (ADS)

    Xing, Nan; Su, RanRan; Zhou, JianHua; Hu, XiaoGong; Gong, XiuQiang; Liu, Li; He, Feng; Guo, Rui; Ren, Hui; Hu, GuangMing; Zhang, Lei

    2013-10-01

    The BeiDou Navigation Satellite System (BDS) provides a Radio Navigation Service System (RNSS) as well as a Radio Determination Service System (RDSS). RDSS users obtain positioning by responding to Master Control Center (MCC) inquiries via signals transmitted through a GEO satellite transponder. The positioning result can be calculated by the MCC with an elevation constraint. The primary error sources affecting RDSS positioning accuracy are the RDSS signal transceiver delay, the atmospheric transmission delay and the GEO satellite position error. During GEO orbit maneuvers, poor orbit forecast accuracy significantly degrades RDSS services. A real-time 3-D orbital correction method based on the wide-area differential technique is proposed to correct the orbital error. Observational results show that the method successfully improves positioning precision during orbital maneuvers, independently of the RDSS reference station, and the improvement can reach up to 50%. Accurate calibration of the RDSS signal transceiver delay and of the digital elevation map may play a critical role in high-precision RDSS positioning services.

  14. An Analysis Technique for Active Neutron Multiplicity Measurements Based on First Principles

    SciTech Connect

    Evans, Louise G; Goddard, Braden; Charlton, William S; Peerani, Paolo

    2012-08-13

    Passive neutron multiplicity counting is commonly used to quantify the total mass of plutonium in a sample, without prior knowledge of the sample geometry. However, passive neutron counting is less applicable to uranium measurements due to the low spontaneous fission rates of uranium. Active neutron multiplicity measurements are therefore used to determine the 235U mass in a sample. Unfortunately, there are still additional challenges to overcome for uranium measurements, such as the coupling of the active source and the uranium sample. Techniques such as the coupling method have been developed to help reduce the dependence of active uranium measurements on calibration curves, although they still require known standards of similar geometry. An advanced active neutron multiplicity measurement method is being developed by Texas A&M University, in collaboration with Los Alamos National Laboratory (LANL), in an attempt to overcome the calibration curve requirements. This method can be used to quantify the 235U mass in a sample containing uranium without using calibration curves. Furthermore, this method is based on existing detectors and nondestructive assay (NDA) systems, such as the LANL Epithermal Neutron Multiplicity Counter (ENMC). This method uses an inexpensive boron carbide liner to shield the uranium sample from thermal and epithermal neutrons while allowing fast neutrons to reach the sample. Due to the relatively low and constant energy-dependent fission and absorption cross-sections of the uranium isotopes at high neutron energies, fast neutrons can penetrate the sample without significant attenuation. Fast neutron interrogation therefore creates a homogeneous fission rate in the sample, allowing first-principles methods to be used to determine the 235U mass in the sample. This paper discusses the measurement method concept and development, including measurements and simulations performed to date, as well as the potential

  15. Cost Analysis of MRI Services in Iran: An Application of Activity Based Costing Technique

    PubMed Central

    Bayati, Mohsen; Mahboub Ahari, Alireza; Badakhshan, Abbas; Gholipour, Mahin; Joulaei, Hassan

    2015-01-01

    Background: The considerable development of MRI technology in diagnostic imaging, the high cost of MRI technology and controversial issues concerning official charges (tariffs) have been the main motivations for defining and implementing this study. Objectives: The present study aimed to calculate the unit cost of MRI services using activity-based costing (ABC) as a modern cost accounting system and to fairly compare the calculated unit costs with official charges (tariffs). Materials and Methods: We included both the direct and indirect costs of MRI services delivered in fiscal year 2011 in Shiraz Shahid Faghihi hospital. The direct allocation method was used for the distribution of overhead costs. We used a micro-costing approach to calculate the unit cost of all the different MRI services. Clinical cost data were retrieved from the hospital registration system. The straight-line method was used for depreciation cost estimation. To cope with uncertainty and to increase the robustness of the study results, the unit costs of 33 MRI services were calculated under two scenarios. Results: The total annual cost of the MRI activity center (AC) was calculated at USD 400,746 and USD 532,104 under the first and second scenarios, respectively. Ten percent of the total cost was allocated from supportive departments. The annual variable costs of the MRI center were calculated at USD 295,904. Capital costs of USD 104,842 and USD 236,200 resulted from the first and second scenarios, respectively. Existing tariffs for more than half of the MRI services were above the calculated costs. Conclusion: As a public hospital, there are considerable limitations in both the financial and administrative databases of Shahid Faghihi hospital. Labor cost has the greatest share of the total annual cost of Shahid Faghihi hospital. The gap between unit costs and tariffs implies that the claim for extra budget from health providers may not be relevant for all services delivered by the studied MRI center. With some adjustments, ABC could be implemented in MRI

  16. Experimental investigation of evanescence-based infrared biodetection technique for micro-total-analysis systems

    NASA Astrophysics Data System (ADS)

    Chandrasekaran, Arvind; Packirisamy, Muthukumaran

    2009-09-01

    The advent of microoptoelectromechanical systems (MOEMS) and their integration with other technologies such as microfluidics, microthermal devices, immunoproteomics, etc. has led to the concept of integrated micro-total-analysis systems (μTAS), or Lab-on-a-Chip, for chemical and biological applications. Recently, research and development of μTAS have attained a significant growth rate across several biodetection sciences, in situ medical diagnoses, and point-of-care testing applications. However, it is essential to develop suitable biophysical label-free detection methods for the success, reliability, and ease of use of μTAS. We proposed an infrared (IR)-based evanescent wave detection system on the silicon-on-insulator platform for biodetection with μTAS. The system operates on the principle of the bio-optical interaction that occurs due to the evanescence of light from the waveguide device. The feasibility of biodetection has been experimentally investigated through the detection of horseradish peroxidase upon its reaction with hydrogen peroxide.

  17. Nuclear spectroscopy pulse height analysis based on digital signal processing techniques

    SciTech Connect

    Simoes, J.B.; Simoes, P.C.P.S.; Correia, C.M.B.A.

    1995-08-01

    A digital approach to pulse height analysis is presented. It consists of digitizing the entire pulse using a flash analog-to-digital converter (ADC), with the pulse height estimated by a floating-point digital signal processor (DSP) as one parameter of a model best fitted to the pulse samples. The differential nonlinearity (DNL) is reduced by simultaneously adding to the pulse, prior to its digitization, two analog signals provided by a digital-to-analog converter (DAC). One of them is a small-amplitude dither signal used to eliminate a bias introduced by the fitting algorithm. The other, with large amplitude, corrects the ADC nonlinearities by a method similar to the well-known Gatti's sliding scale. The simulations carried out showed that, using a 12-bit flash ADC, a 14-bit DAC and a dedicated floating-point DSP performing a polynomial fit to the samples around the pulse peak, it is possible to process about 10,000 events per second, with a constant-height pulse dispersion of only 4 out of 8,192 channels and very good differential linearity. A prototype system based on the Texas Instruments floating-point DSP TMS320C31, built following the presented methodology, has already been tested and performed as expected.
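
    A rough sketch of the height-estimation idea on synthetic data: fit a low-order polynomial to the samples around the digitized pulse peak and take the model maximum as the pulse height. The window size, pulse shape, and noise level are assumptions, and the DNL-correction machinery (dither and sliding scale) is omitted.

```python
import numpy as np

def pulse_height(samples, half_window=4):
    """Estimate pulse height from a quadratic fit around the peak sample."""
    i = int(np.argmax(samples))
    lo, hi = i - half_window, i + half_window + 1
    x = np.arange(lo, hi)
    a, b, c = np.polyfit(x, samples[lo:hi], 2)   # y = a x^2 + b x + c
    x_peak = -b / (2.0 * a)                      # vertex of the parabola
    return a * x_peak**2 + b * x_peak + c

t = np.arange(64)
pulse = 1000.0 * np.exp(-0.5 * ((t - 30) / 6.0) ** 2)
noisy = pulse + np.random.default_rng(5).normal(0, 2.0, t.size)
print(pulse_height(noisy))                       # close to 1000
```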

  18. A new approach to the analysis of alpha spectra based on neural network techniques

    NASA Astrophysics Data System (ADS)

    Baeza, A.; Miranda, J.; Guillén, J.; Corbacho, J. A.; Pérez, R.

    2011-10-01

    The analysis of alpha spectra requires good radiochemical procedures in order to obtain well differentiated alpha peaks in the spectrum, and the easiest way to analyze them is by directly summing the counts obtained in the Regions of Interest (ROIs). However, the low-energy tails of the alpha peaks frequently make this simple approach unworkable because some peaks partially overlap. Many fitting procedures have been proposed to solve this problem, most of them based on semi-empirical mathematical functions that emulate the shape of a theoretical alpha peak. The main drawback of these methods is that the great number of fitting parameters used means that their physical meaning is obscure or completely lacking. We propose another approach—the application of an artificial neural network. Instead of fitting the experimental data to a mathematical function, the fit is carried out by an artificial neural network (ANN) that has previously been trained to model the shape of an alpha peak using as training patterns several polonium spectra obtained from actual samples analyzed in our laboratory. In this sense, the ANN is able to learn the shape of an actual alpha peak. We have designed such an ANN as a feed-forward multi-layer perceptron with supervised training based on a back-propagation algorithm. The fitting procedure is based on the experimental observables that are characteristic of alpha peaks—the number of counts of the maximum and several peak widths at different heights. Polonium isotope spectra were selected because the alpha peaks corresponding to 208Po, 209Po, and 210Po are monoenergetic and well separated. The uncertainties introduced by this fitting procedure were less than the counting uncertainties. This new approach was applied to the problem of resolving overlapping peaks. Firstly, a theoretical study was carried out by artificially overlapping alpha peaks from actual samples in order to test the ability of the ANN to resolve each peak. Then, the ANN

  19. Advanced NMR-based techniques for pore structure analysis of coal. Final project report

    SciTech Connect

    Smith, D.M.; Hua, D.W.

    1996-02-01

    During the 3 year term of the project, new methods have been developed for characterizing the pore structure of porous materials such as coals, carbons, and amorphous silica gels. In general, these techniques revolve around: (1) combining multiple techniques, such as small-angle x-ray scattering (SAXS) and adsorption of contrast-matched adsorbates, or 129Xe NMR and thermoporometry (the change in freezing point with pore size), (2) combining adsorption isotherms over several pressure ranges to obtain a more complete description of pore filling, or (3) applying NMR (129Xe, 14N2, 15N2) techniques with well-defined porous solids with pores in the large micropore size range (>1 nm).

  20. A meta-analysis of cognitive-based behaviour change techniques as interventions to improve medication adherence

    PubMed Central

    Easthall, Claire; Song, Fujian; Bhattacharya, Debi

    2013-01-01

    Objective To describe and evaluate the use of cognitive-based behaviour change techniques as interventions to improve medication adherence. Design Systematic review and meta-analysis of interventions to improve medication adherence. Data sources Search of the MEDLINE, EMBASE, PsycINFO, CINAHL and The Cochrane Library databases from the earliest year to April 2013 without language restriction. References of included studies were also screened to identify further relevant articles. Review methods We used predefined criteria to select randomised controlled trials describing a medication adherence intervention that used Motivational Interviewing (MI) or other cognitive-based techniques. Data were extracted and risk of bias was assessed by two independent reviewers. We conducted the meta-analysis using a random effects model and Hedges’ g as the measure of effect size. Results We included 26 studies (5216 participants) in the meta-analysis. Interventions most commonly used MI, but many used techniques such as aiming to increase the patient's confidence and sense of self-efficacy, encouraging support-seeking behaviours and challenging negative thoughts, which were not specifically categorised. Interventions were most commonly delivered from community-based settings by routine healthcare providers such as general practitioners and nurses. An effect size (95% CI) of 0.34 (0.23 to 0.46) was calculated and was statistically significant (p < 0.001). Heterogeneity was high with an I2 value of 68%. Adjustment for publication bias generated a more conservative estimate of summary effect size of 0.21 (0.08 to 0.33). The majority of subgroup analyses produced statistically non-significant results. Conclusions Cognitive-based behaviour change techniques are effective interventions eliciting improvements in medication adherence that are likely to be greater than the behavioural and educational interventions largely used in current practice. Subgroup analyses suggest that these
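
    A toy illustration, not the authors' code, of random-effects pooling with the DerSimonian-Laird estimator: the standard approach behind summary effect sizes and I² values like those reported above. The per-study effect sizes and variances below are invented.

```python
import numpy as np

g = np.array([0.50, 0.21, 0.35, 0.10, 0.45])     # per-study Hedges' g
v = np.array([0.02, 0.01, 0.03, 0.02, 0.04])     # per-study variances

w = 1.0 / v                                      # fixed-effect weights
g_fe = np.sum(w * g) / np.sum(w)

# Between-study variance (DerSimonian-Laird) and heterogeneity I^2
Q = np.sum(w * (g - g_fe) ** 2)
df = len(g) - 1
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)
i2 = max(0.0, (Q - df) / Q) * 100.0

w_re = 1.0 / (v + tau2)                          # random-effects weights
g_re = np.sum(w_re * g) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"g = {g_re:.2f} "
      f"(95% CI {g_re - 1.96 * se:.2f} to {g_re + 1.96 * se:.2f}), "
      f"I2 = {i2:.0f}%")
```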

  1. Novel Laser-Based Technique is Ideal for Real-Time Environmental Analysis

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 2005

    2005-01-01

    Ocean Optics offers laser-induced breakdown spectrometer systems (LIBS) that can be used to identify light to heavy metals in a variety of sample types and geometries in environmental analysis applications. LIBS are versatile, real-time, high-resolution analyzers for qualitative analysis, in less than one second, of every element in solids,…

  2. An alternative Shell inversion technique - Analysis and validation based on COSMIC and ionosonde data

    NASA Astrophysics Data System (ADS)

    Lin, Jian; Wu, Yun; Qiao, Xuejun; Zhou, Yiyan

    2012-01-01

    Multi-channel Global Positioning System (GPS) carrier phase signals, received by the six low Earth orbiting (LEO) satellites of the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) program, were used to undertake active limb sounding of the Earth's atmosphere and ionosphere via radio occultation. In ionospheric radio occultation (IRO) data processing, the standard Shell inversion technique (SIT), transformed from the traditional Abel inversion technique (AIT), is widely used and can retrieve good electron density profiles. In this paper, an alternative SIT method is proposed. The comparison between the different inversion techniques is discussed, taking advantage of the availability of COSMIC datasets. Moreover, the occultation results obtained from the SIT and the alternative SIT at 500 km and 800 km are compared with ionosonde measurements. The electron densities from the alternative SIT show excellent consistency with those from the SIT, with strong correlations over 0.996 and 0.999 at altitudes of 500 km and 800 km, respectively, and the peak electron densities (NmF2) from the alternative SIT are equivalent to those from the SIT, with correlation coefficients of 0.839 vs. 0.844 and 0.907 vs. 0.909 when compared with the ionosonde measurements. These results show that: (1) the NmF2 and hmF2 retrieved from the SIT and the alternative SIT are highly consistent, and in good agreement with those measured by ionosondes, and (2) no matter which inversion technique is used, the occultation results at the higher orbits (~800 km) are better than those at the lower orbits (~500 km).

  3. Advanced NMR-based techniques for pore structure analysis of coal

    SciTech Connect

    Smith, D.M.

    1992-01-01

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. We believe that measurement of the NMR parameters of various gas-phase and adsorbed-phase NMR-active probes can provide the resolution to this problem. We now have two suites of well-characterized microporous materials, including oxides (zeolites and silica gel) and activated carbons from our industrial partner, Air Products in Allentown, PA. Our current work may be divided into three areas: small-angle X-ray scattering (SAXS), adsorption, and NMR.

  4. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    NASA Astrophysics Data System (ADS)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-01

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  5. A new QMR-based technique for body composition analysis in infants

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accurate assessment and tracking of infant body composition is useful in evaluation of the amount and quality of weight gain, which can provide key information in both clinical and research settings. Thus, body composition analysis (BCA) results can be used to monitor and evaluate infant growth patt...

  6. FBGs cascade interrogation technique based on wavelength-to-delay mapping and KLT analysis

    NASA Astrophysics Data System (ADS)

    Hervás, J.; Barrera, D.; Fernández-Pousa, Carlos R.; Sales, S.

    2016-05-01

    The Karhunen-Loeve transform (KLT) is applied to the coarsely sampled impulse response generated by an FBG cascade in order to calculate the temperature change experienced by the FBGs. Thanks to a dispersive medium, the wavelength shift produced by the temperature change yields a delay shift in the sample generated by an FBG; this delay shift is recorded in the eigenvalues calculated by the KLT routine, allowing the temperature variation to be measured. Although the FBG samples are represented by only four points, a continuous temperature measurement can be performed thanks to the KLT algorithm. This amounts to a three-order-of-magnitude reduction in the number of points, giving the method low computational complexity. Simulations are performed to validate the interrogation technique and estimate its performance, and an experimental example is provided to demonstrate real operation.

  7. Scatterometry based 65nm node CDU analysis and prediction using novel reticle measurement technique

    NASA Astrophysics Data System (ADS)

    van Ingen Schenau, Koen; Vanoppen, Peter; van der Laan, Hans; Kiers, Ton; Janssen, Maurice

    2005-05-01

    Scatterometry was selected as the CD metrology for 65nm CDU system qualification. Because of the dominant reticle residuals component in the 65nm CD budget for dense lines, significant improvements in reticle CD metrology were required. SEM is an option but requires extensive measurements due to the scatterometry grating modules. Therefore a new technique was developed, called SERUM (Spot sensor Enabled Reticle Uniformity Measurements). It uses the on-board exposure system metrology sensors to measure transmission, which is converted to reticle CD. It has the advantage that an entire reticle is measured within two minutes with good repeatability. The reticle fingerprints correlate well with the SEM measurements. With the improvements in reticle CD metrology offered by SEM and SERUM, the reticle residuals component no longer dominates the 65nm budget for CDU system qualification.

  8. Application of the windowed-Fourier-transform-based fringe analysis technique for investigating temperature and concentration fields in fluids.

    PubMed

    Mohanan, Sharika; Srivastava, Atul

    2014-04-10

    The present work is concerned with the development and application of a novel fringe analysis technique based on the principles of the windowed Fourier transform (WFT) for the determination of temperature and concentration fields from interferometric images for a range of heat and mass transfer applications. Based on the extent of the noise level associated with the experimental data, the technique has been coupled with two different phase unwrapping methods for phase extraction: the Itoh algorithm and the quality-guided phase unwrapping technique. In order to generate the experimental data, a range of experiments has been carried out, including cooling of a vertical flat plate under free convection conditions, combustion of mono-propellant flames, and growth of organic as well as inorganic crystals from their aqueous solutions. The flat plate and combustion experiments are modeled as heat transfer applications wherein the interest is to determine the whole-field temperature distribution. Aqueous-solution-based crystal growth experiments are performed to simulate mass transfer phenomena, and the interest is to determine the two-dimensional solute concentration field around the growing crystal. A Mach-Zehnder interferometer has been employed to record the path-integrated quantity of interest (temperature and/or concentration) in the form of interferometric images in the experiments. The potential of the WFT method has also been demonstrated on numerically simulated phase data for varying noise levels, and the accuracy of phase extraction has been quantified in terms of root mean square errors. Three levels of noise, i.e., 0%, 10%, and 20%, have been considered. Results of the present study show that the WFT technique allows an accurate extraction of phase values that can subsequently be converted into two-dimensional temperature and/or concentration distribution fields. Moreover, since WFT is a local processing technique, speckle patterns and the inherent
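
    A much-simplified 1-D illustration of the windowed Fourier idea (real interferograms are processed in 2-D): at each window position the dominant spectral component (the "ridge") is located and its phase retained, after which the phase sequence is unwrapped. The signal model, window length, and noise level are all assumptions.

```python
import numpy as np

x = np.arange(1024)
true_phase = 0.05 * x + 5.0 * np.sin(2 * np.pi * x / 1024)
fringe = np.cos(true_phase) + np.random.default_rng(6).normal(0, 0.1, x.size)

win = 64
window = np.hanning(win)
phase = np.zeros(x.size - win)
for i in range(x.size - win):
    # Windowed Fourier transform of the local fringe segment
    spectrum = np.fft.rfft(fringe[i:i + win] * window)
    k = 1 + np.argmax(np.abs(spectrum[1:]))      # ridge bin, skipping DC
    phase[i] = np.angle(spectrum[k])             # wrapped local phase

unwrapped = np.unwrap(phase)                     # remove the 2*pi jumps
print(unwrapped[:5])
```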

  9. DATA ANALYSIS TECHNIQUES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Food scientists use standards and calibrations to relate the concentration of a compound of interest to the instrumental response. The techniques used include classical, single point, and inverse calibrations, as well as standard addition and internal standards. Several fundamental criteria -- sel...

  10. 2D Flood Modelling Using Advanced Terrain Analysis Techniques And A Fully Continuous DEM-Based Rainfall-Runoff Algorithm

    NASA Astrophysics Data System (ADS)

    Nardi, F.; Grimaldi, S.; Petroselli, A.

    2012-12-01

    Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and the advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic model optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series, which are routed along the channel using a two-dimensional hydraulic model for a detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during extreme hydrologic events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model avoids the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjectivity and uncertainty in standard flood-mapping methods. Selected case studies show the results and performance of the proposed procedure with respect to standard event-based approaches.

  11. [Analyzer Design of Atmospheric Particulate Matter's Concentration and Elemental Composition Based on β and X-Ray's Analysis Techniques].

    PubMed

    Ge, Liang-quan; Liu, He-fan; Zeng, Guo-qiang; Zhang, Qing-xian; Ren, Mao-qiang; Li, Dan; Gu, Yi; Luo, Yao-yao; Zhao, Jian-kun

    2016-03-01

    Monitoring atmospheric particulate matter requires real-time analysis of particulate concentrations and of the types and contents of the elements the particulates contain. An analyzer based on β-ray and X-ray analysis techniques is designed to meet those demands. Applying the β-ray attenuation law and the principle of energy-dispersive X-ray fluorescence analysis, the paper introduces the analyzer's overall design scheme, structure, FPGA circuit hardware and software. The analyzer can measure atmospheric particulate matter concentrations, elements and their contents by on-line analysis. Pure-element particle standard samples were prepared by deposition, and these standard samples were used to calibrate the analyzer. The analyzer can monitor atmospheric particulate matter concentrations (TSP, PM10 and PM2.5) and 30 kinds of elements and their contents. Comparing the analyzer's measurements with the Chengdu Environmental Protection Agency's particulate monitoring results, a high degree of consistency was obtained in an application in the eastern suburbs of Chengdu. Meanwhile, the analyzer is highly sensitive in monitoring particulates that contain heavy-metal elements (such as As, Hg, Cd, Cr and Pb). Technical performance testing shows characteristics such as continuous measurement, a low detection limit, quick analysis and ease of use. In conclusion, the analyzer can meet the demands for analyzing atmospheric particulate matter concentrations, elements and their contents in urban environmental monitoring. PMID:27400540
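
    A sketch of the measurement equation only (detector control and XRF analysis omitted): the β-ray attenuation law I = I0 * exp(-mu * m / A) yields the particulate mass deposited on the filter spot and hence the ambient concentration. The mass-absorption coefficient and geometry values below are illustrative, not the instrument's calibration.

```python
import math

def pm_concentration(I0, I, mu=0.25, spot_cm2=1.0, flow_m3=1.0):
    """
    I0, I    : beta counts through the clean and the loaded filter
    mu       : mass absorption coefficient, cm^2/mg (assumed value)
    spot_cm2 : filter spot area, cm^2
    flow_m3  : sampled air volume, m^3
    Returns the concentration in ug/m^3.
    """
    areal_mg_cm2 = math.log(I0 / I) / mu        # deposited mass per unit area
    mass_ug = areal_mg_cm2 * spot_cm2 * 1000.0  # mg -> ug
    return mass_ug / flow_m3

print(pm_concentration(I0=100000, I=98800))     # about 48 ug/m^3 with these inputs
```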

  12. Application of an ensemble technique based on singular spectrum analysis to daily rainfall forecasting.

    PubMed

    Baratta, Daniela; Cicioni, Giovambattista; Masulli, Francesco; Studer, Léonard

    2003-01-01

    In previous work, we proposed a constructive methodology for temporal data learning, supported by results and prescriptions related to the embedding theorem, and using singular spectrum analysis both to reduce the effects of possible discontinuities in the signal and to implement an efficient ensemble method. In this paper we present new results concerning the application of this approach to the forecasting of the individual rainfall intensity series collected by 135 stations distributed across the Tiber basin. The average RMS error of the obtained forecasts is less than 3 mm of rain. PMID:12672433
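
    A bare-bones singular-spectrum-analysis sketch of the preprocessing step described above: embed the series in a trajectory matrix, take the SVD, and reconstruct a smoothed series from the leading components. The window length, component count, and synthetic rainfall series are arbitrary choices.

```python
import numpy as np

def ssa_smooth(series, window=30, n_components=3):
    n = series.size
    k = n - window + 1
    # Trajectory (Hankel) matrix: each column is a lagged window of the series
    traj = np.column_stack([series[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]

    # Average the anti-diagonals to map the rank-reduced matrix back to a series
    out = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        out[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return out / counts

rain = np.random.default_rng(7).gamma(0.3, 5.0, 365)   # toy daily rainfall
print(ssa_smooth(rain)[:5])
```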

  13. Frontier-based techniques in measuring hospital efficiency in Iran: a systematic review and meta-regression analysis

    PubMed Central

    2013-01-01

    Background In recent years, there has been growing interest in measuring the efficiency of hospitals in Iran and several studies have been conducted on the topic. The main objective of this paper was to review studies in the field of hospital efficiency and examine the estimated technical efficiency (TE) of Iranian hospitals. Methods Persian and English databases were searched for studies related to measuring hospital efficiency in Iran. Ordinary least squares (OLS) regression models were applied for statistical analysis. The PRISMA guidelines were followed in the search process. Results A total of 43 efficiency scores from 29 studies were retrieved and used to approach the research question. Data envelopment analysis was the principal frontier efficiency method in the estimation of efficiency scores. The pooled estimate of mean TE was 0.846 (±0.134). There was a considerable variation in the efficiency scores between the different studies performed in Iran. There were no differences in efficiency scores between data envelopment analysis (DEA) and stochastic frontier analysis (SFA) techniques. The reviewed studies are generally similar and suffer from similar methodological deficiencies, such as no adjustment for case mix and quality of care differences. The results of OLS regression revealed that studies that included more variables and more heterogeneous hospitals generally reported higher TE. Larger sample size was associated with reporting lower TE. Conclusions The features of frontier-based techniques had a profound impact on the efficiency scores among Iranian hospital studies. These studies suffer from major methodological deficiencies and were of sub-optimal quality, limiting their validity and reliability. It is suggested that improving data collection and processing in Iranian hospital databases may have a substantial impact on promoting the quality of research in this field. PMID:23945011

  14. Model building techniques for analysis.

    SciTech Connect

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for the results of a design needs to be decreased to have an impact on overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  15. Mechanism analysis on biofouling detection based on optical fiber sensing technique

    NASA Astrophysics Data System (ADS)

    Ma, Huiping; Yuan, Feng; Liu, Yongmeng; Jiang, Xiuzhen

    2010-08-01

    Increasing attention is being paid to on-line monitoring of biofouling in industrial water process systems. Based on optical fiber sensing technology, a biofouling detection mechanism is put forward in this paper. With biofouling formation, optical characteristics, and the relation between light intensity and refractive index studied, a schematic diagram of the optical-fiber self-referencing detection system and a technological flowchart are presented. Immunity to electromagnetic interference and other influencing factors, by which the precision is greatly improved, is also a notable characteristic. The absorption spectrum of the fluid medium molecules is measured by infrared spectroscopy, and impurities are analyzed via the characteristic fingerprints of different liquids. Other pollutant sources can be identified by means of infrared spectra and artificial neural network (ANN) algorithms. The approach can be used in other fields such as mining, environmental protection, medical treatment, and the transportation of oil, gas and water.

  17. NEO fireball diversity: energetics-based entry modeling and analysis techniques

    NASA Astrophysics Data System (ADS)

    Revelle, Douglas O.

    2007-05-01

    Observations of fireballs reveal that a number of very different types of materials are routinely entering the atmosphere over a very large height range and corresponding mass and energy range. There are five well-known fireball groups. The compositions of these groups can be reliably deduced on a statistical basis based entirely on their observed end-heights in the atmosphere (Ceplecha and McCrosky, 1970; Wetherill and ReVelle, 1981). ReVelle (1983, 2001, 2002, 2005) has also reinterpreted these observations in terms of the properties of porous meteoroids, using the degree to which the observational data can be reproduced with a modern hypersonic aerodynamic entry dynamics approach for porous as well as homogeneous bodies. These data and modeled parameters include the standard properties of drag, deceleration, ablation and fragmentation, as well as, most recently, a model of the panchromatic luminous emission from the fireball during progressive atmospheric penetration. Using a recently developed bolide entry modeling code, ReVelle (2005) has systematically examined the behavior of meteoroids using their semi-well-known physical properties. In order to illustrate this, we have investigated a sampling of the possible extremes within the NEO bolide population: 1) Type I: Antarctic bolide of 2003, a "small" Aten asteroid; 2) Type I: Park Forest meteorite fall, March 27, 2003; 3) Type I: Mediterranean bolide, June 6, 2002; 4) Type II: Revelstoke meteorite fall, March 31, 1965 (with no luminosity data available); and 5) Type II/III: Tagish Lake meteorite fall, January 18, 2000 (with infrasonic data questionable?). In addition to the entry properties, each of these events (except possibly Tagish Lake) also generated mechanical acoustic-gravity waves that were subsequently detected following entry into the atmosphere. Since these waves can also be used to identify key physical properties of these unusual objects, we will also report on our ability to model such

  18. Semi-Automated Classification of Gray Scale Aerial Photographs using Geographic Object Based Image Analysis (GEOBIA) Technique

    NASA Astrophysics Data System (ADS)

    Harb Rabia, Ahmed; Terribile, Fabio

    2013-04-01

    Aerial photography is an important source of high resolution remotely sensed data. Before 1970, aerial photographs were the only remote sensing data source for land use and land cover classification. Using these old aerial photographs improves the final output of land use and land cover change detection. However, classic techniques of aerial photograph classification, like manual interpretation or on-screen digitization, require great experience, long processing time and vast effort. A new technique needs to be developed in order to reduce processing time and effort and to give better results. Geographic object based image analysis (GEOBIA) is a newly developed area of Geographic Information Science and remote sensing in which automatic segmentation of images into objects of similar spectral, temporal and spatial characteristics is undertaken. Unlike pixel-based techniques, GEOBIA deals with object properties such as texture, square fit, roundness and many others that can improve classification results. The GEOBIA technique can be divided into two main steps: segmentation and classification. The segmentation step groups adjacent pixels into objects of similar spectral and spatial characteristics; the classification step assigns classes to the generated objects based on the characteristics of each individual object. This study aimed to use the GEOBIA technique to develop a novel approach for land use and land cover classification of aerial photographs that saves time and effort and gives improved results. Aerial photographs from 1954 of Valle Telesina in Italy were used in this study. Images were rectified and georeferenced in ArcMap using topographic maps. Images were then processed in eCognition software to generate a land use and land cover map of 1954. A decision tree rule set was developed in eCognition to classify the images, and finally nine classes of general land use and land cover in the study area were recognized (forest, trees stripes, agricultural
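
    A minimal sketch of the two GEOBIA steps in Python (not the eCognition rule set used in the study): scikit-image's Felzenszwalb algorithm segments a synthetic gray image standing in for an aerial photograph, and illustrative threshold rules then classify each object from its properties.

        import numpy as np
        from skimage.measure import regionprops
        from skimage.segmentation import felzenszwalb

        # Synthetic stand-in for a gray-scale aerial photograph.
        rng = np.random.default_rng(1)
        photo = np.full((200, 200), 150.0)
        photo[:, :100] = 40.0                        # darker half
        photo += rng.normal(0.0, 5.0, photo.shape)

        # Step 1: segmentation into objects of similar characteristics.
        labels = felzenszwalb(photo, scale=200, sigma=2.0, min_size=100)

        # Step 2: rule-based classification of object properties
        # (thresholds are illustrative, not the study's decision tree).
        for region in regionprops(labels + 1, intensity_image=photo):
            if region.mean_intensity < 80:
                cls = 'forest'                       # dark objects
            elif region.eccentricity > 0.95:
                cls = 'trees stripes'                # elongated objects
            else:
                cls = 'agricultural'
            print(region.label, region.area, cls)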

  19. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample volume required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidic-related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell. PMID:26213918

  20. 2D wavelet-analysis-based calibration technique for flat-panel imaging detectors: application in cone beam volume CT

    NASA Astrophysics Data System (ADS)

    Tang, Xiangyang; Ning, Ruola; Yu, Rongfeng; Conover, David L.

    1999-05-01

    The application of the newly developed flat panel x-ray imaging detector in cone beam volume CT has attracted increasing interest recently. Due to an imperfect solid state array manufacturing process, however, defective elements, gain non-uniformity and offset image unavoidably exist in all kinds of flat panel x-ray imaging detectors, which cause severe streak and ring artifacts in a cone beam reconstruction image and severely degrade image quality. A calibration technique, in which the artifacts resulting from the defective elements, gain non-uniformity and offset image can be reduced significantly, is presented in this paper. The detection of defective elements is distinctively based upon two-dimensional (2D) wavelet analysis. Because of its inherent localizability in recognizing singularities or discontinuities, wavelet analysis possesses the capability of detecting defective elements over a rather large x-ray exposure range, e.g., 20% to approximately 60% of the dynamic range of the detector used. Three-dimensional (3D) images of a low-contrast CT phantom have been reconstructed from projection images acquired by a flat panel x-ray imaging detector with and without the calibration process applied. The artifacts caused individually by defective elements, gain non-uniformity and offset image have been separated and investigated in detail, and their correlation with each other has also been exposed explicitly. The investigation is reinforced by quantitative analysis of the signal to noise ratio (SNR) and the image uniformity of the cone beam reconstruction image. It has been demonstrated that the ring and streak artifacts resulting from the imperfect performance of a flat panel x-ray imaging detector can be reduced dramatically, and that the image qualities of a cone beam reconstruction image, such as contrast resolution and image uniformity, are improved significantly. Furthermore, with little modification, the calibration technique presented here is also applicable
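
    The defective-element detection step can be illustrated with a minimal sketch (simulated flat-field data and an ad hoc threshold, not the authors' implementation): because defects are localized singularities, they dominate the 2D wavelet detail coefficients.

        import numpy as np
        import pywt

        # Simulated flat-field exposure with one dead detector element.
        rng = np.random.default_rng(0)
        flat = rng.normal(1000.0, 5.0, (256, 256))
        flat[40, 80] = 0.0

        # One level of 2D wavelet decomposition; sum the detail bands.
        cA, (cH, cV, cD) = pywt.dwt2(flat, 'haar')
        detail = np.abs(cH) + np.abs(cV) + np.abs(cD)
        thresh = detail.mean() + 8.0 * detail.std()   # ad hoc threshold

        rows, cols = np.nonzero(detail > thresh)
        # Each coarse-grid hit maps back to a 2x2 block of elements.
        print(list(zip(2 * rows, 2 * cols)))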

  1. Technique based on LED multispectral imaging and multivariate analysis for monitoring the conservation state of the Dead Sea Scrolls.

    PubMed

    Marengo, Emilio; Manfredi, Marcello; Zerbinati, Orfeo; Robotti, Elisa; Mazzucco, Eleonora; Gosetti, Fabio; Bearman, Greg; France, Fenella; Shor, Pnina

    2011-09-01

    The aim of this project is the development of a noninvasive technique based on LED multispectral imaging (MSI) for monitoring the conservation state of the Dead Sea Scrolls (DSS) collection. It is well known that changes in the parchment reflectance drive the transition of the scrolls from legible to illegible. Capitalizing on this fact, we will use spectral imaging to detect changes in the reflectance before they become visible to the human eye. The technique uses multivariate analysis and statistical process control theory. The present study was carried out on a "sample" parchment of calfskin. The monitoring of the surface of a commercial modern parchment, aged consecutively for 2 h and 6 h at 80 °C and 50% relative humidity (ASTM), was performed at the Imaging Lab of the Library of Congress (Washington, DC, U.S.A.). MSI was carried out in the vis-NIR range up to 1 μm, with 13 bands whose bandwidths range from about 10 nm in the UV to 40 nm in the IR. Results showed that we could detect and locate changing pixels, on the basis of reflectance changes, after only a few "hours" of aging. PMID:21777009
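
    A minimal sketch of the process-control idea, assuming simulated 13-band cubes and an ad hoc control limit rather than the real imaging data: each pixel's spectrum is scored with a Hotelling T-squared statistic against the baseline session, and out-of-control pixels are flagged as changing.

        import numpy as np

        rng = np.random.default_rng(2)
        baseline = rng.normal(0.5, 0.02, (100, 100, 13))   # reference cube
        aged = baseline + rng.normal(0.0, 0.005, baseline.shape)
        aged[40:45, 60:65, :] -= 0.05                      # simulated damage

        X = baseline.reshape(-1, 13)
        mu = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))

        def t2(cube):
            # Per-pixel Hotelling T^2 against the baseline statistics.
            d = cube.reshape(-1, 13) - mu
            return np.einsum('ij,jk,ik->i', d, cov_inv, d)

        limit = np.percentile(t2(baseline), 99.9)          # control limit
        flagged = t2(aged) > limit
        print('changing pixels:', int(flagged.sum()))      # ~ the 5x5 patch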

  2. An expert diagnostic system based on neural networks and image analysis techniques in the field of automated cytogenetics.

    PubMed

    Beksaç, M S; Eskiizmirliler, S; Cakar, A N; Erkmen, A M; Dağdeviren, A; Lundsteen, C

    1996-03-01

    In this study, we introduce an expert system for intelligent chromosome recognition and classification based on artificial neural networks (ANN) and features obtained by automated image analysis techniques. A microscope equipped with a CCTV camera, integrated with an IBM-PC compatible computer environment including a frame grabber, is used for image data acquisition. Features of the chromosomes are obtained directly from the digital chromosome images. Two new algorithms for automated object detection and object skeletonizing constitute the basis of the feature extraction phase, which constructs the components of the input vector to the ANN part of the system. This first version of our intelligent diagnostic system uses a trained unsupervised neural network structure and an original rule-based classification algorithm to find a karyotyped form of randomly distributed chromosomes over a complete metaphase. We investigate the effects of network parameters on the classification performance and discuss the adaptability and flexibility of the neural system in order to reach a structure giving an output including information about both structural and numerical abnormalities. Moreover, the classification performances of the neural and rule-based systems are compared for each class of chromosome. PMID:8705397

  3. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
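
    A minimal illustration of the model-fitting techniques the standard covers, using hypothetical data rather than any NASA data set: linear, quadratic, and exponential trends each fit by least squares.

        import numpy as np

        t = np.arange(12, dtype=float)                    # e.g., months
        y = np.array([3.1, 3.4, 3.2, 3.9, 4.4, 4.3,
                      5.0, 5.6, 6.1, 6.4, 7.3, 7.9])      # observed metric

        lin = np.polyfit(t, y, 1)                # y ~ a*t + b
        quad = np.polyfit(t, y, 2)               # y ~ a*t^2 + b*t + c
        expo = np.polyfit(t, np.log(y), 1)       # log y ~ k*t + log A

        print('linear slope        :', lin[0])
        print('quadratic curvature :', quad[0])
        print('exponential rate k  :', expo[0])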

  4. Data Analysis Techniques at LHC

    SciTech Connect

    Boccali, Tommaso

    2005-10-12

    A review of the recent developments on data analysis techniques for the upcoming LHC experiments is presented, with the description of early tests ('Data Challenges'), which are being performed before the start-up, to validate the overall design.

  5. Graph-Based Symbolic Technique and Its Application in the Frequency Response Bound Analysis of Analog Integrated Circuits

    PubMed Central

    Tlelo-Cuautle, E.; Rodriguez-Chavez, S.; Palma-Rodriguez, A. A.

    2014-01-01

    A new graph-based symbolic technique (GBST) for deriving exact analytical expressions, such as the transfer function H(s) of an analog integrated circuit (IC), is introduced herein. The derived H(s) of a given analog IC is used to compute the frequency response bounds (maximum and minimum) associated with the magnitude and phase of H(s), subject to some ranges of process variational parameters, by performing nonlinear constrained optimization. Our simulations demonstrate the usefulness of the new GBST for deriving the exact symbolic expression for H(s), and the last section highlights the good agreement between the frequency response bounds computed by our variational analysis approach and traditional Monte Carlo simulations. As a conclusion, performing variational analysis using our proposed GBST for computing the frequency response bounds of analog ICs shows a gain in computing time of 100x for a differential circuit topology and 50x for a 3-stage amplifier, compared to traditional Monte Carlo simulations. PMID:25136650
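
    The flavor of the approach can be conveyed on a toy circuit (a sketch of the general idea, not the authors' GBST; a brute-force grid stands in for their nonlinear constrained optimization): derive H(s) symbolically, then bound the magnitude response over the variational parameter ranges.

        import numpy as np
        import sympy as sp

        s, R, C = sp.symbols('s R C', positive=True)
        H = 1 / (1 + s * R * C)                   # symbolic H(s): RC low-pass

        # Magnitude at 1 kHz as a function of the variational parameters.
        Hmag = sp.lambdify((R, C), sp.Abs(H.subs(s, sp.I * 2 * sp.pi * 1000)))

        # +/-10% process variation around nominal R = 1 kOhm, C = 100 nF.
        Rs = np.linspace(0.9e3, 1.1e3, 21)
        Cs = np.linspace(90e-9, 110e-9, 21)
        mags = np.array([[Hmag(r, c) for c in Cs] for r in Rs])
        print('magnitude bounds at 1 kHz:', mags.min(), mags.max())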

  6. A new technique for calculating reentry base heating. [analysis of laminar base flow field of two dimensional reentry body

    NASA Technical Reports Server (NTRS)

    Meng, J. C. S.

    1973-01-01

    The laminar base flow field of a two-dimensional reentry body has been studied by Telenin's method. The flow domain was divided into strips along the x-axis, and the flow variations were represented by Lagrange interpolation polynomials in the transformed vertical coordinate. The complete Navier-Stokes equations were used in the near wake region, and the boundary layer equations were applied elsewhere. The boundary conditions consisted of the flat plate thermal boundary layer in the forebody region and the near wake profile in the downstream region. The resulting two-point boundary value problem of 33 ordinary differential equations was then solved by the multiple shooting method. The detailed flow field and thermal environment in the base region are presented in the form of temperature contours, Mach number contours, velocity vectors, pressure distributions, and heat transfer coefficients on the base surface. The maximum heating rate was found on the centerline, and the two-dimensional stagnation point flow solution was adequate to estimate the maximum heating rate so long as the local Reynolds number could be obtained.

  7. Visualization and Analysis of Wireless Sensor Network Data for Smart Civil Structure Applications Based On Spatial Correlation Technique

    NASA Astrophysics Data System (ADS)

    Chowdhry, Bhawani Shankar; White, Neil M.; Jeswani, Jai Kumar; Dayo, Khalil; Rathi, Manorma

    2009-07-01

    Disasters affecting infrastructure, such as the 2001 earthquakes in India, 2005 in Pakistan, 2008 in China and the 2004 tsunami in Asia, provide a common need for intelligent buildings and smart civil structures. Now, imagine massive reductions in the time to get the infrastructure working again, real-time information on damage to buildings, massive reductions in the cost and time to certify that structures are undamaged and can still be operated, and reductions in the number of structures to be rebuilt (if they are known not to be damaged). Achieving these ideas would lead to huge, quantifiable, long-term savings for government and industry. Wireless sensor networks (WSNs) can be deployed in buildings to make any civil structure both smart and intelligent. WSNs have recently gained much attention in both the public and research communities because they are expected to bring a new paradigm to the interaction between humans, environment, and machines. This paper presents the deployment of WSN nodes in the Top Quality Centralized Instrumentation Centre (TQCIC). We created an ad hoc networking application to collect real-time data sensed from the nodes that were randomly distributed throughout the building. If the sensors are relocated, the application automatically reconfigures itself in light of the new routing topology. WSNs are event-based systems that rely on the collective effort of several micro-sensor nodes that continuously observe a physical phenomenon. WSN applications require spatially dense sensor deployment in order to achieve satisfactory coverage. The degree of spatial correlation increases with decreasing inter-node separation. Energy consumption is reduced dramatically by having only those sensor nodes with unique readings transmit their data. We report on an algorithm based on a spatial correlation technique that assures high QoS (in terms of SNR) of the network as well as proper utilization of energy, by suppressing redundant data transmission
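
    A toy version of the suppression idea (node positions, the sensed field, the 25 m correlation radius and the 0.5 degree tolerance are all assumptions for illustration, not the paper's algorithm): a node stays silent when a nearby node that is already transmitting reports an almost identical reading.

        import numpy as np

        rng = np.random.default_rng(3)
        pos = rng.uniform(0.0, 100.0, (30, 2))          # 30 nodes in a hall
        readings = 20.0 + 0.05 * pos[:, 0] + rng.normal(0.0, 0.1, 30)

        reporters = []
        RADIUS, TOL = 25.0, 0.5                         # assumed parameters
        for i in range(30):
            redundant = any(
                np.linalg.norm(pos[i] - pos[j]) < RADIUS
                and abs(readings[i] - readings[j]) < TOL
                for j in reporters)
            if not redundant:                           # unique reading
                reporters.append(i)
        print(f'{len(reporters)} of 30 nodes transmit')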

  8. Analysis of Land Covers over Northern Peninsular Malaysia by Using ALOS-PALSAR Data Based on Frequency-Based Contextual and Neural Network Classification Technique

    NASA Astrophysics Data System (ADS)

    Lim, H. S.; MatJafri, M. Z.; Abdullah, K.; Saleh, N. Mohd.

    2008-11-01

    Optical and microwave remote sensing data have been widely used in land cover and land use classification. Optical satellite remote sensing methods are often more appropriate but require cloud-free conditions for the data to be useful, especially in the Equatorial region, where cloud-free acquisitions can be rare, reducing these sensors' applicability to such studies. ALOS-PALSAR data, by contrast, can be acquired day and night irrespective of weather conditions. This paper presents a comparison between frequency-based contextual and neural network classification techniques using ALOS-PALSAR data for land cover assessment in Northern Peninsular Malaysia. The ALOS-PALSAR data acquired on 10 November 2006 were classified into vegetation, urban, water and other land features. Training areas in the PALSAR data were selected based on optical satellite imagery and classified using supervised classification methods. The best supervised classifier was chosen based on the highest overall accuracy and Kappa statistic. The results of this study point to the utility of ALOS-PALSAR data as an alternative data source for land cover classification in Peninsular Malaysia.

  9. Comparison of two headspace sampling techniques for the analysis of off-flavour volatiles from oat based products.

    PubMed

    Cognat, Claudine; Shepherd, Tom; Verrall, Susan R; Stewart, Derek

    2012-10-01

    Two different headspace sampling techniques were compared for the analysis of aroma volatiles from freshly produced and aged plain oatcakes. Solid phase microextraction (SPME) using a Carboxen-polydimethylsiloxane (PDMS) fibre and entrainment on Tenax TA within an adsorbent tube were used for the collection of volatiles. The effects of variation in the sampling method were also considered using SPME. The data obtained using both techniques were processed by multivariate statistical analysis (PCA). Both techniques showed similar capacities to discriminate between the samples at different ages. Discrimination between fresh and rancid samples could be made on the basis of changes in the relative abundances of 14-15 of the constituents in the volatile profiles. A significant effect on the detection level of volatile compounds was observed when samples were crushed and analysed by SPME-GC-MS, in comparison to the undisturbed product. The applicability and cost effectiveness of both methods were considered. PMID:25005987
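
    A compressed sketch of the PCA step, with simulated peak tables standing in for the GC-MS data: samples described by the relative abundances of their volatile constituents separate by age along the first principal component as the marker compounds rise.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(4)
        fresh = rng.normal(1.0, 0.1, (10, 15))      # 15 volatile peak areas
        aged = fresh.copy()
        aged[:, :5] += 0.8                          # rancidity markers rise

        X = np.vstack([fresh, aged])
        scores = PCA(n_components=2).fit_transform(X)
        print('fresh PC1:', scores[:10, 0].mean())
        print('aged  PC1:', scores[10:, 0].mean())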

  10. A preliminary structural analysis of space-base living quarters modules to verify a weight-estimating technique

    NASA Technical Reports Server (NTRS)

    Grissom, D. S.; Schneider, W. C.

    1971-01-01

    The determination of a baseline (minimum weight) design for the primary structure of the living quarters modules in an earth-orbiting space base was investigated. Although the design is preliminary in nature, the supporting analysis is sufficiently thorough to provide a reasonably accurate weight estimate of the major components that are considered to comprise the structural weight of the space base.

  11. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  12. Global precipitation estimates based on a technique for combining satellite-based estimates, rain gauge analysis, and NWP model precipitation information

    NASA Technical Reports Server (NTRS)

    Huffman, George J.; Adler, Robert F.; Rudolf, Bruno; Schneider, Udo; Keehn, Peter R.

    1995-01-01

    The 'satellite-gauge model' (SGM) technique is described for combining precipitation estimates from microwave satellite data, infrared satellite data, rain gauge analyses, and numerical weather prediction models into improved estimates of global precipitation. Throughout, monthly estimates on a 2.5 degrees x 2.5 degrees lat-long grid are employed. First, a multisatellite product is developed using a combination of low-orbit microwave and geosynchronous-orbit infrared data in the latitude range 40 degrees N - 40 degrees S (the adjusted geosynchronous precipitation index) and low-orbit microwave data alone at higher latitudes. Then the rain gauge analysis is brought in, weighting each field by its inverse relative error variance to produce a nearly global, observationally based precipitation estimate. To produce a complete global estimate, the numerical model results are used to fill data voids in the combined satellite-gauge estimate. Our sequential approach to combining estimates allows a user to select the multisatellite estimate, the satellite-gauge estimate, or the full SGM estimate (observationally based estimates plus the model information). The primary limitation in the method is imperfections in the estimation of relative error for the individual fields. The SGM results for one year of data (July 1987 to June 1988) show important differences from the individual estimates, including model estimates as well as climatological estimates. In general, the SGM results are drier in the subtropics than the model and climatological results, reflecting the relatively dry microwave estimates that dominate the SGM in oceanic regions.
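
    The core combination step, weighting each field by its inverse relative error variance and letting the model fill the remaining voids, can be written compactly. This is a sketch with made-up numbers; in the real technique the error variances are estimated per field.

        import numpy as np

        sat = np.array([[2.0, np.nan], [3.5, 4.0]])    # multisatellite field
        gauge = np.array([[2.4, 3.0], [np.nan, 4.4]])  # gauge analysis
        model = np.array([[2.2, 3.1], [3.4, 4.2]])     # NWP model estimate
        w_sat, w_gauge = 1 / 0.5, 1 / 0.2              # inverse error variances

        ok_s, ok_g = ~np.isnan(sat), ~np.isnan(gauge)
        num = (np.nan_to_num(sat) * w_sat * ok_s
               + np.nan_to_num(gauge) * w_gauge * ok_g)
        den = w_sat * ok_s + w_gauge * ok_g

        # Observationally based estimate where data exist, model in the voids.
        combined = np.where(den > 0, num / np.where(den > 0, den, 1.0), model)
        print(combined)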

  13. A single base extension technique for the analysis of known mutations utilizing capillary gel electrophoresis with electrochemical detection.

    PubMed

    Brazill, Sara A; Kuhr, Werner G

    2002-07-15

    A novel single nucleotide polymorphism (SNP) detection system is described in which the accuracy of DNA polymerase and advantages of electrochemical detection are demonstrated. A model SNP system is presented to illustrate the potential advantages in coupling the single base extension (SBE) technique to capillary gel electrophoresis (CGE) with electrochemical detection. An electrochemically labeled primer, with a ferrocene acetate covalently attached to its 5' end, is used in the extension reaction. When the Watson-Crick complementary ddNTP is added to the SBE reaction, the primer is extended by a single nucleotide. The reaction mixture is subsequently separated by CGE, and the ferrocene-tagged fragments are detected at the separation anode with sinusoidal voltammetry. This work demonstrates the first single base resolution separation of DNA coupled with electrochemical detection. The unextended primer (20-mer) and the 21-mer extension product are separated with a resolution of 0.8. PMID:12139049

  14. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    SciTech Connect

    Yonghua Zhang

    2002-05-27

    Sample preparation has been one of the major bottlenecks for many high throughput analyses. The purpose of this research was to develop new sample preparation and integration approaches for DNA sequencing, PCR-based DNA analysis, and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks into DNA capillary-array sequencers. Protocols for directly sequencing plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or a heating block. Upon heating, the plasmids were released while chromosomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR-based DNA analysis, the author demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical samples without purification. After PCR reactions using cheek cells, blood or HIV-1 gag DNA, the reaction mixtures were injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of purification of the DNA sample before or after the PCR reaction, will make this approach an

  15. Prospects and limitations for determining the parameters in physical-based regional landslide susceptibility model using back analysis technique

    NASA Astrophysics Data System (ADS)

    Dong, Jia-Jyun; Liu, Chia-Nan; Lin, Yan-Cheng; Chen, Ci-Ren

    2010-05-01

    Landslide susceptibility analysis is crucial from the viewpoint of hazard mitigation. Statistical and deterministic approaches are frequently adopted for landslide susceptibility analysis. Being based on physical models, deterministic approaches are superior to statistical approaches because they fully take the mechanical mechanisms into account. However, it is difficult to input the appropriate mechanical parameters (including strength and hydraulic properties) into a deterministic model. Back analysis is a promising way to calibrate the required parameters, though few studies have evaluated the performance of the back analysis approach. This research uses hypothetical cases (100 cells) to investigate the prospects and limitations of estimating the parameters of a deterministic model by using a back-analysis approach. Based on the assigned hydraulic and strength parameters, the corresponding safety factor and landslide inventory (cells with a safety factor less than 1), as well as the depth of the ground water table for each cell, were calculated using a deterministic model, TRIGRS. The landslide inventory derived from the forward calculation is then used to back-calculate the pre-assigned parameters. Two scenarios of back analysis approaches were examined in this research. The results reveal that the non-uniqueness of back-analyzed hydraulic and strength parameters is detrimental to the performance if only the landslide inventory is utilized to back-calculate the parameters. However, the performance of the back-calculation improves if the spatial and temporal variation of the ground water table is used to calibrate the hydraulic parameters first. Thereafter, multiple landslide inventories may help alleviate the non-uniqueness in back-calculating the hydraulic and strength parameters for a deterministic landslide susceptibility analysis at regional scale.
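
    The forward/back-analysis loop can be miniaturized as below (hypothetical cells and parameter ranges; TRIGRS itself also models transient infiltration, which is omitted here). An infinite-slope factor of safety is computed for 100 cells, cells with FS < 1 form the synthetic inventory, and a grid search then back-calculates cohesion and friction angle from that inventory alone, exposing the non-uniqueness discussed above.

        import numpy as np

        rng = np.random.default_rng(5)
        beta = np.deg2rad(rng.uniform(20.0, 45.0, 100))   # slope angles
        psi = rng.uniform(0.0, 1.5, 100)                  # pressure head (m)
        z, gam_s, gam_w = 2.0, 20.0, 9.81                 # depth, unit weights

        def fs(c, phi):
            # Infinite-slope factor of safety used by TRIGRS-type models.
            return (np.tan(phi) / np.tan(beta)
                    + (c - psi * gam_w * np.tan(phi))
                    / (gam_s * z * np.sin(beta) * np.cos(beta)))

        inventory = fs(8.0, np.deg2rad(30.0)) < 1.0       # "observed" slides

        # Back analysis: parameters that best reproduce the inventory.
        best = max(((c, p) for c in np.linspace(2.0, 15.0, 27)
                    for p in np.deg2rad(np.linspace(20.0, 40.0, 21))),
                   key=lambda cp: np.sum((fs(*cp) < 1.0) == inventory))
        print('back-analysed c =', round(best[0], 1),
              'phi =', round(np.rad2deg(best[1]), 1))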

  16. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  17. A novel fast and flexible technique of radical kinetic behaviour investigation based on pallet for plasma evaluation structure and numerical analysis

    NASA Astrophysics Data System (ADS)

    Malinowski, Arkadiusz; Takeuchi, Takuya; Chen, Shang; Suzuki, Toshiya; Ishikawa, Kenji; Sekine, Makoto; Hori, Masaru; Lukasiak, Lidia; Jakubowski, Andrzej

    2013-07-01

    This paper describes a new, fast, and case-independent technique for sticking coefficient (SC) estimation based on the pallet for plasma evaluation (PAPE) structure and numerical analysis. Our approach does not require a complicated structure, apparatus, or time-consuming measurements, but offers high reliability of data and high flexibility. Thermal analysis is also possible. This technique has been successfully applied to the estimation of the very low SC of hydrogen radicals on chemically amplified ArF 193 nm photoresist (the main goal of this study). The upper bound of our technique has been determined by investigating the SC of fluorine radicals on polysilicon (at elevated temperature). Sources of estimation error and ways of reducing it have also been discussed. Results of this study give an insight into the process kinetics; they are not only helpful for better process understanding but may additionally serve as parameters in the development of a phenomenological model for predictive modelling of etching for ultimate CMOS topography simulation.

  18. Applicability of neuro-fuzzy techniques in predicting ground-water vulnerability: a GIS-based sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Dixon, B.

    2005-07-01

    Modeling groundwater vulnerability reliably and cost effectively for non-point source (NPS) pollution at a regional scale remains a major challenge. In recent years, Geographic Information Systems (GIS), neural networks and fuzzy logic techniques have been used in several hydrological studies. However, few of these studies have undertaken an extensive sensitivity analysis. The overall objective of this research is to examine the sensitivity of neuro-fuzzy models used to predict groundwater vulnerability in a spatial context by integrating GIS and neuro-fuzzy techniques. The specific objectives are to assess the sensitivity of neuro-fuzzy models in a spatial domain using GIS by varying (i) the shape of the fuzzy sets, (ii) the number of fuzzy sets, and (iii) learning and validation parameters (including rule weights). The neuro-fuzzy models were developed using NEFCLASS-J software on a JAVA platform and were loosely integrated with a GIS. Four plausible parameters critical to transporting contaminants through the soil profile to the groundwater were included: soil hydrologic group, depth of the soil profile, soil structure (pedality points) of the A horizon, and land use. In order to validate the model predictions, coincidence reports were generated among model inputs, model predictions, and well/spring contamination data for NO 3-N. A total of 16 neuro-fuzzy models were developed for selected sub-basins of the Illinois River Watershed, AR. The sensitivity analysis showed that neuro-fuzzy models were sensitive to the shape of the fuzzy sets, the number of fuzzy sets, the nature of the rule weights, and the validation techniques used during the learning processes. Compared to bell-shaped and triangular-shaped membership functions, the neuro-fuzzy models with a trapezoidal membership function were the least sensitive to the various permutations and combinations of the learning and validation parameters. Overall, Models 11 and 8 showed relatively higher coincidence with well

  19. Extension of an Itô-based general approximation technique for random vibration of a BBW general hysteresis model part II: Non-Gaussian analysis

    NASA Astrophysics Data System (ADS)

    Davoodi, H.; Noori, M.

    1990-07-01

    The work presented in this paper constitutes the second phase of ongoing research aimed at developing mathematical models for representing the general hysteretic behavior of structures and approximation techniques for the computation and analysis of the response of hysteretic systems to random excitations. In this second part, the technique previously developed by the authors for the Gaussian response analysis of non-linear systems with general hysteretic behavior is extended to the non-Gaussian analysis of these systems. This approximation technique is based on the approach proposed independently by Ibrahim and Wu-Lin. In this work, up to fourth order moments of the response co-ordinates are obtained for the Bouc-Baber-Wen smooth hysteresis model. These higher order statistics have not previously been made available for general hysteresis models using existing approximation methods. Second order moments obtained for the model by this non-Gaussian closure scheme are compared with equivalent linearization and Gaussian closure results via Monte Carlo simulation (MCS). Higher order moments are compared with the simulation results. The study, performed for a wide range of degradation parameters and input power spectral density (PSD) levels, shows that the non-Gaussian responses obtained by this approach are in better agreement with the MCS results than the linearized and Gaussian ones. This approximation technique can provide information on higher order moments for general hysteretic systems. This information is valuable in random vibration and the reliability analysis of hysteretically yielding structures.
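
    For orientation, the Monte Carlo benchmark against which such closure schemes are compared can be sketched directly (a toy single-degree-of-freedom Bouc-Wen oscillator with assumed parameters, not the full degrading BBW model): simulate many white-noise response histories and average powers of the displacement to estimate second and fourth moments.

        import numpy as np

        rng = np.random.default_rng(6)
        dt, nsteps, nsamp = 0.005, 4000, 200
        A, beta_w, gam_w, n, alpha, zeta = 1.0, 0.5, 0.5, 1.0, 0.5, 0.05

        finals = []
        for _ in range(nsamp):
            x = v = zh = 0.0
            w = rng.normal(0.0, 1.0, nsteps) / np.sqrt(dt)  # white noise
            for k in range(nsteps):
                # Bouc-Wen hysteretic variable evolution.
                zdot = (A * v - beta_w * abs(v) * abs(zh) ** (n - 1) * zh
                        - gam_w * v * abs(zh) ** n)
                acc = w[k] - 2 * zeta * v - alpha * x - (1 - alpha) * zh
                x, v, zh = x + v * dt, v + acc * dt, zh + zdot * dt
            finals.append(x)

        xs = np.array(finals)
        print('E[x^2] ~', (xs ** 2).mean(), '  E[x^4] ~', (xs ** 4).mean())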

  20. Task-based strategy for optimized contrast enhanced breast imaging: Analysis of six imaging techniques for mammography and tomosynthesis

    SciTech Connect

    Ikejimba, Lynda C.; Kiarashi, Nooshin; Ghate, Sujata V.; Samei, Ehsan; Lo, Joseph Y.

    2014-06-15

    Purpose: The use of contrast agents in breast imaging has the capability of enhancing nodule detectability and providing physiological information. Accordingly, there has been a growing trend toward using iodine as a contrast medium in digital mammography (DM) and digital breast tomosynthesis (DBT). Widespread use raises concerns about the best way to use iodine in DM and DBT, and thus a comparison is necessary to evaluate typical iodine-enhanced imaging methods. This study used a task-based observer model to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: unsubtracted mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Methods: Imaging performance was characterized using a detectability index d′, derived from the system task transfer function (TTF), an imaging task, iodine signal difference, and the noise power spectrum (NPS). The task modeled a 10 mm diameter lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d′ was generated as a function of dose and iodine concentration. Results: For all iodine concentrations and doses, temporal subtraction techniques for mammography and tomosynthesis yielded the highest d′, while dual energy techniques for both modalities demonstrated the next best performance. Unsubtracted imaging resulted in the lowest d′ values for both modalities, with unsubtracted mammography performing the worst out of all six paradigms. Conclusions: At any dose, temporal subtraction imaging provides the greatest detectability, with temporally subtracted DBT performing the highest. The authors attribute the successful performance to excellent cancellation of
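
    The detectability index reduces to a pair of frequency integrals. A minimal non-prewhitening version is sketched below with analytic stand-ins for the task function W, the TTF, and the NPS; all three shapes are toy assumptions, whereas the study measured them from phantoms.

        import numpy as np

        f = np.linspace(0.01, 5.0, 500)          # spatial frequency, cyc/mm
        W = 8.6 * np.sinc(10.0 * f)              # task: 10 mm lesion (toy)
        TTF = np.exp(-f / 1.5)                   # task transfer function (toy)
        NPS = 1e-6 * (1.0 + 0.5 / f)             # noise power spectrum (toy)

        # Non-prewhitening observer: d' = int((W*TTF)^2) / sqrt(int((W*TTF)^2 * NPS))
        ring = 2.0 * np.pi * f                   # radially symmetric integrals
        signal = np.trapz((W * TTF) ** 2 * ring, f)
        noise = np.trapz((W * TTF) ** 2 * NPS * ring, f)
        print(f"d' = {signal / np.sqrt(noise):.1f}")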

  1. Photogrammetric Techniques for Road Surface Analysis

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface is of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are therefore widely carried out for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysing the characteristics of road texture and monitoring pavement behaviour. The second technique provides a dense 3D road model suitable for estimating macro-scale road parameters.

  2. A RT-based Technique for the Analysis and the Removal of Titan's Atmosphere by Cassini/VIMS-IR data

    NASA Astrophysics Data System (ADS)

    Sindoni, G.; Tosi, F.; Adriani, A.; Moriconi, M. L.; D'Aversa, E.; Grassi, D.; Oliva, F.; Dinelli, B. M.; Castelli, E.

    2015-12-01

    Since 2004, the Visual and Infrared Mapping Spectrometer (VIMS), together with the CIRS and UVIS spectrometers aboard the Cassini spacecraft, has provided insight into the atmospheres of Saturn and Titan through remote sensing observations. The presence of clouds and aerosols in Titan's dense atmosphere makes the analysis of the surface radiation a difficult task. For this purpose, an atmospheric radiative transfer (RT) model is required. The implementation of an RT code, which includes multiple scattering, in an inversion algorithm based on the Bayesian approach can provide strong constraints on both the surface albedo and the atmospheric composition. The application of the retrieval procedure we have developed to VIMS-IR spectra acquired in nadir or slant geometries allows us to retrieve the equivalent opacity of Titan's atmosphere in terms of variable aerosol and gaseous content. Thus, the separation of the atmospheric and surface contributions in the observed spectrum is possible. The atmospheric removal procedure was tested on the spectral range 1-2.2 μm of publicly available VIMS data covering the Ontario Lacus and Ligeia Mare regions. The retrieval of the accurate composition of Titan's atmosphere is a much more complex task. So far, the information about the vertical structure of the atmosphere from limb spectra was mostly derived under conditions where scattering could be neglected [1,2]. Indeed, since the very high aerosol load in the middle-low atmosphere produces strong scattering effects on the measured spectra, the analysis requires RT modeling that takes multiple scattering in a spherical-shell geometry into account. Therefore, an innovative method we are developing based on the Monte Carlo approach can provide important information about the vertical distribution of the aerosols and gases composing Titan's atmosphere. [1] Bellucci et al. (2009), Icarus 201(1), 198-216. [2] de Kok et al. (2007), Icarus 191(1), 223-235.

  3. Terahertz Wide-Angle Imaging and Analysis on Plane-wave Criteria Based on Inverse Synthetic Aperture Techniques

    NASA Astrophysics Data System (ADS)

    Gao, Jing Kun; Qin, Yu Liang; Deng, Bin; Wang, Hong Qiang; Li, Jin; Li, Xiang

    2016-04-01

    This paper presents two pieces of work on terahertz imaging applications. The first part addresses the problems that occur as the rotation angle increases. To compensate for the nonlinearity of terahertz radar systems, a calibration signal acquired from a bright target is always used. Generally, this compensation inserts an extra linear phase term in the intermediate frequency (IF) echo signal, which is not desirable in large-rotation-angle imaging applications. We carried out a detailed theoretical analysis of this problem, and a minimum entropy criterion was employed to estimate and compensate for the linear-phase errors. In the second part, the effects of spherical waves on terahertz inverse synthetic aperture imaging are analyzed. Analytic criteria for the plane-wave approximation were derived for different rotation angles. Experimental results of corner reflectors and an aircraft model based on a 330-GHz linear frequency-modulated continuous wave (LFMCW) radar system validated the necessity and effectiveness of the proposed compensation. Comparing the experimental images obtained under the plane-wave assumption and with spherical-wave correction also showed them to be highly consistent with the analytic criteria we derived.

  4. School Principals' Personal Constructs Regarding Technology: An Analysis Based on Decision-Making Grid Technique

    ERIC Educational Resources Information Center

    Bektas, Fatih

    2014-01-01

    This study aims to determine the similarities and differences between existing school principals' personal constructs of "ideal principal qualities" in terms of technology by means of the decision-making grid technique. The study has a phenomenological design, and the study group consists of 17 principals who have been serving at…

  5. Visual exploratory analysis of DCE-MRI data in breast cancer based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Meyer-Bäse, Anke; Lespinats, Sylvain; Steinbrücker, Frank; Saalbach, Axel; Schlossbauer, Thomas; Barbu, Adrian

    2009-04-01

    Visualization of multi-dimensional data sets is becoming a critical and significant area in modern medical image processing. To analyze such high-dimensional data, novel nonlinear embedding approaches are becoming increasingly important for showing dependencies among these data in a two- or three-dimensional space. This paper investigates the potential of novel nonlinear dimensional data reduction techniques and compares their results with proven nonlinear techniques when applied to the differentiation of malignant and benign lesions described by high-dimensional data sets arising from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). Two important visualization modalities in medical imaging are presented: the mapping onto a lower-dimensional data manifold and image fusion.
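
    A schematic comparison in the spirit of the paper (synthetic 64-dimensional lesion descriptors replace the DCE-MRI features, and Isomap stands in for the embedding methods under study): both a linear baseline and a nonlinear embedding map the data to two dimensions for visual inspection.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.manifold import Isomap

        rng = np.random.default_rng(10)
        benign = rng.normal(0.0, 1.0, (40, 64))      # toy kinetic features
        malignant = rng.normal(1.5, 1.0, (40, 64))
        X = np.vstack([benign, malignant])

        emb_linear = PCA(n_components=2).fit_transform(X)
        emb_nonlinear = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

        # Both yield 2D maps; class separation can then be inspected visually.
        print(emb_linear.shape, emb_nonlinear.shape)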

  6. Multicomponent analysis using established techniques

    NASA Astrophysics Data System (ADS)

    Dillehay, David L.

    1991-04-01

    Recent environmental concerns have greatly increased the need, application and scope of real-time continuous emission monitoring systems. New techniques like Fourier Transform Infrared have been applied with limited success for this application. However, the use of well-tried and established techniques (Gas Filter Correlation and Single Beam Dual Wavelength) combined with sophisticated microprocessor technology have produced reliable monitoring systems with increased measurement accuracy.

  7. Feature-Based Registration Techniques

    NASA Astrophysics Data System (ADS)

    Lorenz, Cristian; Klinder, Tobias; von Berg, Jens

    In contrast to intensity-based image registration, where a similarity measure is typically evaluated at each voxel location, feature-based registration works on a sparse set of image locations. Therefore, it needs an explicit interpolation step to supply a dense deformation field. In this chapter, the application of feature-based registration to pulmonary image registration, as well as hybrid methods combining feature-based with intensity-based registration, is discussed. In contrast to pure feature-based registration methods, hybrid methods are increasingly proposed in the pulmonary context and have the potential to outperform purely intensity-based registration methods. Available approaches are classified along the categories of feature type, correspondence definition, and interpolation type to finally achieve a dense deformation field.

  8. Application of spectral analysis techniques to the intercomparison of aerosol data - Part 4: Synthesized analysis of multisensor satellite and ground-based AOD measurements using combined maximum covariance analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Carlson, B. E.; Lacis, A. A.

    2014-08-01

    In this paper, we introduce the usage of a newly developed spectral decomposition technique - combined maximum covariance analysis (CMCA) - in the spatiotemporal comparison of four satellite data sets and ground-based observations of aerosol optical depth (AOD). This technique is based on commonly used principal component analysis (PCA) and maximum covariance analysis (MCA). By decomposing the cross-covariance matrix between the joint satellite data field and Aerosol Robotic Network (AERONET) station data, both parallel comparison across different satellite data sets and the evaluation of the satellite data against the AERONET measurements are simultaneously realized. We show that this new method not only confirms the seasonal and interannual variability of aerosol optical depth, aerosol-source regions and events represented by different satellite data sets, but also identifies the strengths and weaknesses of each data set in capturing the variability associated with sources, events or aerosol types. Furthermore, by examining the spread of the spatial modes of different satellite fields, regions with the largest uncertainties in aerosol observation are identified. We also present two regional case studies that respectively demonstrate the capability of the CMCA technique in assessing the representation of an extreme event in different data sets, and in evaluating the performance of different data sets on seasonal and interannual timescales. Global results indicate that different data sets agree qualitatively for major aerosol-source regions. Discrepancies are mostly found over the Sahel, India, eastern and southeastern Asia. Results for eastern Europe suggest that the intense wildfire event in Russia during summer 2010 was less well-represented by SeaWiFS (Sea-viewing Wide Field-of-view Sensor) and OMI (Ozone Monitoring Instrument), which might be due to misclassification of smoke plumes as clouds. Analysis for the Indian subcontinent shows that here SeaWiFS agrees
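
    The MCA core of the method amounts to an SVD of the cross-covariance between the joint satellite field and the station data. A stripped-down sketch with synthetic AOD anomalies follows; the dimensions and the seasonal signal are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        t = np.arange(120)                               # monthly, 10 years
        seasonal = np.sin(2.0 * np.pi * t / 12.0)
        sat = np.outer(seasonal, rng.uniform(0.1, 1.0, 400)) \
            + 0.05 * rng.normal(size=(120, 400))         # satellite AOD field
        stn = np.outer(seasonal, rng.uniform(0.1, 1.0, 30)) \
            + 0.05 * rng.normal(size=(120, 30))          # AERONET stations

        sat_a = sat - sat.mean(axis=0)                   # anomalies
        stn_a = stn - stn.mean(axis=0)
        C = sat_a.T @ stn_a / (len(t) - 1)               # cross-covariance

        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        frac = s ** 2 / np.sum(s ** 2)
        print('leading coupled mode explains', round(frac[0], 3),
              'of the squared covariance')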

  9. Robustness of reliability-growth analysis techniques

    NASA Astrophysics Data System (ADS)

    Ellis, Karen E.

    The author examines the robustness of techniques commonly applied to failure time data to determine whether the failure rate (1/mean-time-between-failures) is changing over time. The models examined are the Duane postulate, Crow-AMSAA (Army Materiel Systems Analysis Activity), and Kalman filtering (also referred to as dynamic linear modeling). Each has as a foundation the underlying premise of a failure rate that changes over time. The techniques seek to confirm or reject whether the failure rate is changing significantly, based on observed data. To compare the ability of each method to accomplish such a rejection or confirmation, a known failure time distribution is simulated, and then each model is applied and the results are compared.
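
    The Duane check is a log-log regression: if cumulative MTBF versus cumulative test time is linear with a positive slope, the failure rate is falling. A sketch with hypothetical failure times:

        import numpy as np

        t_fail = np.array([12.0, 40.0, 95.0, 180.0,
                           320.0, 530.0, 810.0, 1200.0])  # cumulative hours
        n = np.arange(1, len(t_fail) + 1)                 # failure count
        cum_mtbf = t_fail / n                             # cumulative MTBF

        slope, _ = np.polyfit(np.log(t_fail), np.log(cum_mtbf), 1)
        print(f'Duane growth slope = {slope:.2f}')        # > 0: growth present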

  10. A Three Corner Hat-based analysis of station position time series for the assessment of inter-technique precision at ITRF co-located sites

    NASA Astrophysics Data System (ADS)

    Abbondanza, C.; Chin, T. M.; Gross, R. S.; Heflin, M. B.; Hurst, K. J.; Parker, J. W.; Wu, X.; Altamimi, Z.

    2012-12-01

    Assessing the uncertainty in geodetic positioning is a crucial factor when combining independent space-geodetic solutions for the computation of the International Terrestrial Reference Frame (ITRF). ITRF is a combined product based on the stacking of VLBI, GPS, SLR and DORIS solutions and merging the single technique reference frames with terrestrial local tie measurements at co-located sites. In current ITRF realizations, the uncertainty evaluation of the four techniques relies on the analysis of the post-fit residuals, which are a by-product of the combination process. An alternative approach to the assessment of the inter-technique precision can be offered by a Three Corner Hat (TCH) analysis of the non-linear residual time series obtained at ITRF co-location sites as a by-product of the stacking procedure. Non-linear residuals of station position time series stemming from global networks of the four techniques can be modeled as a composition of periodic signals (commonly annual and semi-annual) and stochastic noise, typically characterized as a combination of flicker and white noise. Pair-wise differences of station position time series of at least three co-located instruments can be formed with the aim of removing the common geophysical signal and characterizing the inter-technique precision. The application of TCH relies on the hypothesis of absence of correlation between the error processes of the four techniques and assumes the stochastic noise to be Gaussian. While the hypothesis of statistical independence between the space-geodetic technique errors is amply verified, the assumption of pure white noise in the stochastic error processes appears more questionable. In fact, previous studies focused on geodetic positioning consistently showed that flicker noise generally prevails over white noise in the analysis of global network GPS time series, whereas in VLBI, SLR and DORIS time series Gaussian noise is predominant. In this investigation, TCH is applied
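
    For reference, the classical three-cornered-hat estimate is short enough to quote in full, here on simulated co-located series: the common signal cancels in the pairwise differences, and uncorrelated errors are assumed, exactly the hypothesis discussed above.

        import numpy as np

        rng = np.random.default_rng(8)
        common = np.cumsum(rng.normal(0.0, 0.1, 500))   # shared geophysics
        gps = common + rng.normal(0.0, 1.0, 500)        # technique noise 1
        vlbi = common + rng.normal(0.0, 2.0, 500)       # technique noise 2
        slr = common + rng.normal(0.0, 3.0, 500)        # technique noise 3

        s2 = lambda a, b: np.var(a - b)                 # difference variances
        var_gps = 0.5 * (s2(gps, vlbi) + s2(gps, slr) - s2(vlbi, slr))
        var_vlbi = 0.5 * (s2(gps, vlbi) + s2(vlbi, slr) - s2(gps, slr))
        var_slr = 0.5 * (s2(gps, slr) + s2(vlbi, slr) - s2(gps, vlbi))
        print('sigmas:', np.sqrt([var_gps, var_vlbi, var_slr]))  # ~ 1, 2, 3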

  11. Computer navigation vs conventional mechanical jig technique in hip resurfacing arthroplasty: a meta-analysis based on 7 studies.

    PubMed

    Liu, Hao; Li, Lin; Gao, Wei; Wang, Meilin; Ni, Chunhui

    2013-01-01

    Studies on the accuracy of femoral component placement in hip resurfacing arthroplasty with computer-assisted navigation have been inconsistent. This study aims to assess the functional outcomes of computer navigation in hip resurfacing arthroplasty by systematically reviewing and meta-analyzing the data, searched up to December 2011 in PubMed, MEDLINE, EMBASE, MetaMed, EBSCO HOST, and Google Scholar. In total, 197 articles about hip resurfacing arthroplasty were collected; finally, 7 articles met the inclusion criteria and were included in this meta-analysis (520 patients with 555 hip resurfacing arthroplasties). The odds ratio for the number of outliers was 0.155 (95% confidence interval, 0.048-0.498; P < .003). In conclusion, this meta-analysis suggests that the computer-assisted navigation system makes femoral component positioning in hip resurfacing arthroplasty easier and more precise. PMID:22771091

  12. [Development of Selective LC Analysis Method for Biogenic and Related Compounds Based on a Fluorous Affinity Technique].

    PubMed

    Hayama, Tadashi

    2015-01-01

    A separation-oriented derivatization method combined with LC has been developed for the selective analysis of biogenic and related compounds. In this method, we utilized a specific affinity between perfluoroalkyl-containing compounds, i.e., 'fluorous' compounds (fluorophilicity). Our strategy involves the derivatization of target analytes with perfluoroalkyl reagents, followed by selective retention of the derivatives with a perfluoroalkyl-modified stationary phase LC column. The perfluoroalkylated derivatives are strongly retained on the column owing to their fluorophilicity, whereas non-derivatized species, such as sample matrices, are hardly retained. Therefore, utilizing this derivatization method, target analytes can be determined selectively without interference from matrices. This method has been successfully applied to the LC analysis of some biogenic and related compounds in complex biological samples. PMID:26329550

  13. Comprehensive analysis of mitochondrial permeability transition pore activity in living cells using fluorescence-imaging-based techniques.

    PubMed

    Bonora, Massimo; Morganti, Claudia; Morciano, Giampaolo; Giorgi, Carlotta; Wieckowski, Mariusz R; Pinton, Paolo

    2016-06-01

    Mitochondrial permeability transition (mPT) refers to a sudden increase in the permeability of the inner mitochondrial membrane. Long-term studies of mPT revealed that this phenomenon has a critical role in multiple pathophysiological processes. mPT is mediated by the opening of a complex termed the mPT pore (mPTP), which is responsible for the osmotic influx of water into the mitochondrial matrix, resulting in swelling of mitochondria and dissipation of the mitochondrial membrane potential. Here we provide three independent optimized protocols for monitoring mPT in living cells: (i) measurement using a calcein-cobalt technique, (ii) measurement of the mPTP-dependent alteration of the mitochondrial membrane potential, and (iii) measurement of mitochondrial swelling. These procedures can easily be modified and adapted to different cell types. Cell culture and preparation of the samples are estimated to take ∼1 d for methods (i) and (ii), and ∼3 d for method (iii). The entire experiment, including analyses, takes ∼2 h. PMID:27172167

  14. Bone quality around bioactive silica-based coated stainless steel implants: analysis by micro-Raman, XRF and XAS techniques.

    PubMed

    Ballarre, Josefina; Desimone, Paula M; Chorro, Matthieu; Baca, Matías; Orellano, Juan Carlos; Ceré, Silvia M

    2013-11-01

    Surface modification of surgical stainless steel implants by sol gel coatings has been proposed as a tool to generate a surface that, besides being protective, could also create a "bioactive" interface to generate a natural bonding between the metal surface and the existing bone. The aim of this work is to analyze the quality and formation of bone around hybrid bioactive coatings containing glass-ceramic particles, made by the sol-gel process on 316L stainless steel used as a permanent implant, in terms of mineralization, calcium content and bone maturity, with micro-Raman, X-ray microfluorescence and X-ray absorption techniques. Uncoated implants seem to generate a thin bone layer at the beginning of the osseointegration process, but this layer then separates from the surface over time. The hybrid coatings without glass-ceramic particles generate new bone around implants, with a high concentration of Ca and P at the implant/tissue interface. This fact seems to be related to the presence of silica nanoparticles in the layer. The addition of bioactive particles promotes and enhances the bone quality, with a homogeneous Ca and P content and a low rate of beta carbonate substitution and crystallinity, similar to young and mechanically resistant bone. PMID:24076155

  15. Analysis of the nonlinear behavior of shear-Alfvén modes in tokamaks based on Hamiltonian mapping techniques

    SciTech Connect

    Briguglio, S. Vlad, G.; Fogaccia, G.; Di Troia, C.; Fusco, V.; Wang, X.; Zonca, F.

    2014-11-15

    We present a series of numerical simulation experiments set up to illustrate the fundamental physics processes underlying the nonlinear dynamics of Alfvénic modes resonantly excited by energetic particles in tokamak plasmas and of the ensuing energetic particle transports. These phenomena are investigated by following the evolution of a test particle population in the electromagnetic fields computed in self-consistent MHD-particle simulation performed by the HMGC code. Hamiltonian mapping techniques are used to extract and illustrate several features of wave-particle dynamics. The universal structure of resonant particle phase space near an isolated resonance is recovered and analyzed, showing that bounded orbits and untrapped trajectories, divided by the instantaneous separatrix, form phase space zonal structures, whose characteristic non-adiabatic evolution time is the same as the nonlinear time of the underlying fluctuations. Bounded orbits correspond to a net outward resonant particle flux, which produces a flattening and/or gradient inversion of the fast ion density profile around the peak of the linear wave-particle resonance. The connection of this phenomenon to the mode saturation is analyzed with reference to two different cases: a Toroidal Alfvén eigenmode in a low shear magnetic equilibrium and a weakly unstable energetic particle mode for stronger magnetic shear. It is shown that, in the former case, saturation is reached because of radial decoupling (resonant particle redistribution matching the mode radial width) and is characterized by a weak dependence of the mode amplitude on the growth rate. In the latter case, saturation is due to resonance detuning (resonant particle redistribution matching the resonance width) with a stronger dependence of the mode amplitude on the growth rate.

  16. Analysis of algebraic reconstruction technique for accurate imaging of gas temperature and concentration based on tunable diode laser absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Hui-Hui, Xia; Rui-Feng, Kan; Jian-Guo, Liu; Zhen-Yu, Xu; Ya-Bai, He

    2016-06-01

    An improved algebraic reconstruction technique (ART) combined with tunable diode laser absorption spectroscopy (TDLAS) is presented in this paper for determining the two-dimensional (2D) distribution of H2O concentration and temperature in a simulated combustion flame. The work simulates the reconstruction of spectroscopic measurements by a multi-view parallel-beam scanning geometry and analyzes the effect of the number of projection rays on reconstruction accuracy. Reconstruction quality improves markedly as the number of projection rays increases, up to about 180 rays for a 20 × 20 grid; beyond that point, additional rays have little influence on reconstruction accuracy. The temperature reconstructions are more accurate than the water vapor concentrations obtained by the traditional concentration calculation method. The study also proposes a way to reduce the error of the concentration reconstruction and greatly improve reconstruction quality, and the capability of this new method is evaluated using appropriate assessment parameters. With this approach, not only is the concentration reconstruction accuracy greatly improved, but a suitable parallel-beam arrangement is also put forward for high reconstruction accuracy and simple experimental validation. Finally, a bimodal structure of the combustion region is assumed to demonstrate the robustness and universality of the proposed method. Numerical investigation indicates that the proposed TDLAS tomographic algorithm is capable of recovering accurate temperature and concentration profiles and is expected to help resolve several key issues in practical combustion devices. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61205151), the National Key Scientific Instrument and Equipment Development Project of China (Grant
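
    A minimal sketch of the core ART update (Kaczmarz-style row projections) follows; the toy grid, random ray matrix, and relaxation factor are illustrative assumptions standing in for the paper's parallel-beam geometry.

    ```python
    # Algebraic reconstruction technique: cyclically project the current
    # estimate onto each ray equation A[i] @ x = b[i].
    import numpy as np

    def art(A, b, n_iter=200, relax=0.5):
        x = np.zeros(A.shape[1])
        row_norm2 = (A * A).sum(axis=1)
        for _ in range(n_iter):
            for i in range(A.shape[0]):
                if row_norm2[i] == 0.0:
                    continue
                residual = b[i] - A[i] @ x
                x += relax * residual / row_norm2[i] * A[i]
        return x

    # Toy 4x4 "flame" phantom and random ray matrix as a stand-in for the
    # parallel-beam projection geometry.
    rng = np.random.default_rng(0)
    phantom = rng.uniform(0.0, 1.0, 16)
    A = rng.uniform(0.0, 1.0, (24, 16))   # 24 rays through a 4x4 grid
    b = A @ phantom                       # noiseless line integrals
    rec = art(A, b)
    print("max reconstruction error:", np.abs(rec - phantom).max())
    ```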

  17. A dual resolution measurement based Monte Carlo simulation technique for detailed dose analysis of small volume organs in the skull base region

    NASA Astrophysics Data System (ADS)

    Yeh, Chi-Yuan; Tung, Chuan-Jung; Chao, Tsi-Chain; Lin, Mu-Han; Lee, Chung-Chi

    2014-11-01

    The purpose of this study was to examine the dose distribution of a skull base tumor and surrounding critical structures in response to high-dose intensity-modulated radiosurgery (IMRS) with Monte Carlo (MC) simulation using a dual resolution sandwich phantom. The measurement-based Monte Carlo (MBMC) method (Lin et al., 2009) was adopted for the study. The major components of the MBMC technique are (1) the BEAMnrc code for beam transport through the treatment head of a Varian 21EX linear accelerator, (2) the DOSXYZnrc code for patient dose simulation, and (3) an EPID-measured efficiency map that describes the non-uniform fluence distribution of the IMRS treatment beam. For the simulated case, five isocentric 6 MV photon beams were designed to deliver a total dose of 1200 cGy in two fractions to the skull base tumor. A sandwich phantom for the MBMC simulation was created based on the patient's CT scan of a skull base tumor [gross tumor volume (GTV) = 8.4 cm3] near the right 8th cranial nerve. The phantom, which consisted of a 1.2-cm-thick skull base region, had a voxel resolution of 0.05×0.05×0.1 cm3 and was sandwiched between 0.05×0.05×0.3 cm3 slices of a head phantom. A coarser 0.2×0.2×0.3 cm3 single resolution (SR) phantom was also created for comparison with the sandwich phantom. A particle history of 3×10^8 for each beam was used for simulations of both the SR and the sandwich phantoms to achieve a statistical uncertainty of <2%. Our study showed that the planning target volume (PTV) receiving at least 95% of the prescribed dose (VPTV95) was 96.9%, 96.7% and 99.9% for the TPS, SR, and sandwich phantom, respectively. The maximum and mean doses to large organs such as the PTV, brain stem, and parotid gland for the TPS, SR and sandwich MC simulations did not show any significant difference; however, significant dose differences were observed for very small structures like the right 8th cranial nerve, right cochlea, right malleus and right semicircular canal. Dose
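
    For context, the VPTV95 metric quoted above is simple to compute once a dose grid and a PTV mask exist. The sketch below uses synthetic arrays, not the study's phantom or MC output.

    ```python
    # V_PTV95: fraction of PTV voxels receiving at least 95% of the
    # prescribed dose, computed from a toy 3-D dose grid and mask.
    import numpy as np

    rng = np.random.default_rng(1)
    dose = rng.normal(1200.0, 40.0, size=(20, 20, 10))  # cGy, toy dose grid
    ptv_mask = np.zeros_like(dose, dtype=bool)
    ptv_mask[5:15, 5:15, 3:7] = True                    # toy PTV region

    prescribed = 1200.0                                 # cGy, from the abstract
    v95 = (dose[ptv_mask] >= 0.95 * prescribed).mean() * 100.0
    print(f"V_PTV95 = {v95:.1f}%")
    ```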

  18. Application of Electromigration Techniques in Environmental Analysis

    NASA Astrophysics Data System (ADS)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentrations of pollutants in the environment, together with the complexity of sample matrices, place strong demands on the detection capabilities of electromigration methods. Significant progress continues to be made in widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, and in improving the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is placed on pre-capillary and on-capillary chromatography- and electrophoresis-based concentration of analytes and on detection improvement.

  19. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin’ it REAL curriculum

    PubMed Central

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin' it REAL (kiR) substance use prevention curriculum. Each of the ten 40-45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers' delivery techniques (e.g. lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify a typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-oriented approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention. PMID:25274721
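
    As a hedged sketch of the method: latent profile analysis over continuous indicators is closely related to a Gaussian mixture model, so scikit-learn's GaussianMixture is used below as a stand-in, with synthetic ratings rather than the study's actual video codes.

    ```python
    # Latent-profile-style clustering of observer ratings via a Gaussian
    # mixture, with the number of profiles selected by BIC.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(2)
    # Columns: attentiveness, enthusiasm, positivity (toy 1-5 ratings).
    ratings = np.vstack([
        rng.normal([4.5, 4.5, 4.5], 0.3, (30, 3)),  # "holistic" teachers
        rng.normal([4.0, 2.0, 3.0], 0.3, (30, 3)),  # attentive, low enthusiasm
        rng.normal([2.0, 4.5, 3.0], 0.3, (30, 3)),  # enthusiastic lecturers
    ])

    bics = {k: GaussianMixture(k, random_state=0).fit(ratings).bic(ratings)
            for k in range(1, 6)}
    best_k = min(bics, key=bics.get)
    profiles = GaussianMixture(best_k, random_state=0).fit_predict(ratings)
    print("profiles selected by BIC:", best_k, np.bincount(profiles))
    ```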

  20. Wavelet analysis of an Ionospheric foF_2 parameter as a Precursor of Earthquakes using Ground based Techniques

    NASA Astrophysics Data System (ADS)

    Sonakia, Anjana; Gwal, Ashok Kumar; Sondhiya, Deepak Kumar; Kasde, Satish Kumar

    Wavelet analysis of the variations in the hourly mean value of the F2-layer critical frequency foF2 is performed in association with the occurrence of three earthquakes in New Zealand with magnitudes M > 6.0, depths h < 150 km, and distances from the vertical sounding station R < 1500 km. Data from the Christchurch sounding station are used, registered every hour during the years 1957-1990. It is shown that, on average, foF2 increases before the earthquakes. The aim of the present work is to show that foF2 increases significantly before New Zealand region earthquakes and to relate the behavior of foF2 and the observed modification of the mean foF2 frequency to the magnitudes of the earthquakes. Keywords: Earthquake precursor, Wavelet power spectrum, Scale-average wavelet power, Ionospheric Total Electron Content and foF2 parameter
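
    A minimal sketch of a wavelet power spectrum for an hourly foF2 series follows, using PyWavelets with a Morlet wavelet. The series is synthetic (a diurnal cycle plus a transient enhancement) standing in for the Christchurch ionosonde data.

    ```python
    # Continuous wavelet transform power of a toy foF2 series.
    import numpy as np
    import pywt

    hours = np.arange(24 * 60)                   # 60 days of hourly values
    foF2 = 8.0 + 2.0 * np.sin(2 * np.pi * hours / 24.0)
    foF2[1000:1100] += 1.5                       # toy pre-seismic increase

    scales = np.arange(1, 128)
    coeffs, freqs = pywt.cwt(foF2 - foF2.mean(), scales, "morl",
                             sampling_period=1.0)  # 1-hour sampling
    power = np.abs(coeffs) ** 2
    print("scale-averaged power around the anomaly:",
          round(power[:, 1000:1100].mean(), 3))
    print("scale-averaged background power:",
          round(power[:, 200:300].mean(), 3))
    ```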

  1. Application of python-based Abaqus preprocess and postprocess technique in analysis of gearbox vibration and noise reduction

    NASA Astrophysics Data System (ADS)

    Yi, Guilian; Sui, Yunkang; Du, Jiazheng

    2011-06-01

    To reduce vibration and noise, a damping layer and a constraint layer are usually bonded to the inner surface of a gearbox thin shell, and their thicknesses are the main parameters in the vibration and noise reduction design. The normal acceleration of a point on the gearbox surface is the main index reflecting the vibration and noise at that point, and the normal accelerations of different points reflect the degree of vibration and noise of the whole structure. The K-S function is adopted to aggregate many points' normal accelerations into a comprehensive index of the vibration characteristics of the whole structure, and the vibration acceleration level is adopted to measure the degree of vibration and noise. Secondary development of the Abaqus pre- and postprocessors, based on Python scripting, automatically modifies the model parameters, submits the job, and restarts the analysis, which avoids the tedious work of returning to Abaqus/CAE to modify and resubmit models and improves the speed of pre- and postprocessing and the computational efficiency.
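
    The K-S aggregation mentioned above can be sketched in a few lines; the rho value and acceleration samples below are illustrative, not from the paper.

    ```python
    # Kreisselmeier-Steinhauser (K-S) aggregation: fold many pointwise
    # responses into one smooth, conservative scalar index.
    import numpy as np

    def ks_aggregate(values, rho=50.0):
        """Smooth upper bound on max(values).

        KS = vmax + (1/rho) * ln(sum(exp(rho * (v - vmax)))), written in
        shifted form so it stays numerically stable for large rho.
        """
        v = np.asarray(values, dtype=float)
        vmax = v.max()
        return vmax + np.log(np.exp(rho * (v - vmax)).sum()) / rho

    normal_acc = np.array([0.8, 1.1, 0.95, 1.3, 0.7])  # m/s^2 at surface points
    print("max =", normal_acc.max(), " KS =", round(ks_aggregate(normal_acc), 4))
    ```

    As rho grows, the K-S value approaches the true maximum from above, which is why it works as a differentiable surrogate for a worst-point index in optimization.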

  2. Crude oil price forecasting based on hybridizing wavelet multiple linear regression model, particle swarm optimization techniques, and principal component analysis.

    PubMed

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelets and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first used to decompose the original time series into several subseries at different scales. Principal component analysis (PCA) is then used to process the subseries data in the MLR model, and particle swarm optimization (PSO) is used to select the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily West Texas Intermediate (WTI) crude oil market has been used as the case study. The time series prediction performance of the WMLR model is compared with MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series. PMID:24895666
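
    A hedged sketch of the wavelet-plus-regression idea follows: decompose the series with a discrete wavelet transform (pywt.wavedec implements Mallat's fast algorithm), then regress the next-day price on per-level reconstructions. The Daubechies wavelet choice, level, and synthetic series are illustrative; the PSO and PCA steps are omitted.

    ```python
    # Wavelet decomposition + multiple linear regression, one-step-ahead.
    import numpy as np
    import pywt
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    price = 80.0 + np.cumsum(rng.normal(0.0, 1.0, 512))  # toy WTI series

    coeffs = pywt.wavedec(price, "db4", level=3)
    features = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        features.append(pywt.waverec(keep, "db4")[: len(price)])
    X = np.column_stack(features)            # one regressor per wavelet level

    model = LinearRegression().fit(X[:-1], price[1:])
    print("in-sample R^2:", round(model.score(X[:-1], price[1:]), 4))
    ```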

  3. Crude Oil Price Forecasting Based on Hybridizing Wavelet Multiple Linear Regression Model, Particle Swarm Optimization Techniques, and Principal Component Analysis

    PubMed Central

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelets and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first used to decompose the original time series into several subseries at different scales. Principal component analysis (PCA) is then used to process the subseries data in the MLR model, and particle swarm optimization (PSO) is used to select the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily West Texas Intermediate (WTI) crude oil market has been used as the case study. The time series prediction performance of the WMLR model is compared with MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series. PMID:24895666

  4. Analysis of land-use/land-cover change in the Carpathian region based on remote sensing techniques

    NASA Astrophysics Data System (ADS)

    Dezsõ, Zs.; Bartholy, J.; Pongrácz, R.; Barcza, Z.

    2003-04-01

    Human activities result in significant environmental changes, and these complex feedback processes may cause dramatic changes in our everyday life. Among others, they include land-use and, consequently, land-cover changes. In studying such complex variables, full spatial coverage of the given area is one of the key requirements. The rapid development of satellite applications in different research topics has provided an excellent tool for building agricultural monitoring systems and improving our understanding of the complex links between air, water, and land, including vegetation. In the last few years, serious flood events occurred in the watershed of the river Tisza (both in Hungary and in Ukraine). One of the reasons for these floods is heavy precipitation in the region, which results in severe runoff because of significant land-use/land-cover change. In this analysis, both land-use change and Normalized Difference Vegetation Index (NDVI) values for the Carpathian Region have been statistically analysed for the last two decades. Remotely sensed datasets observed by NOAA and NASA satellites are available for this period, with a spatial resolution of 1 to 8 km. Tendencies in the change of natural and artificial land-cover types are investigated in the upper watershed of the river Tisza. According to our estimates, the forest area on the Ukrainian part of the watershed decreased by about 10% in the last decade. Possible reasons include regional effects of global climate change, deforestation in the region, etc.
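
    The NDVI used in this record is a one-line computation on red and near-infrared reflectances; the sketch below uses synthetic arrays standing in for the 1-8 km NOAA/NASA pixels, and the forest threshold is an illustrative assumption.

    ```python
    # NDVI = (NIR - RED) / (NIR + RED), then a crude forest-cover proxy.
    import numpy as np

    rng = np.random.default_rng(4)
    red = rng.uniform(0.03, 0.15, (100, 100))  # red-band reflectance
    nir = rng.uniform(0.20, 0.60, (100, 100))  # near-infrared reflectance

    ndvi = (nir - red) / (nir + red)
    forest_fraction = (ndvi > 0.6).mean()      # toy threshold, not calibrated
    print(f"mean NDVI = {ndvi.mean():.2f}, 'forest' fraction = {forest_fraction:.2%}")
    ```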

  5. Detailed fuel spray analysis techniques

    NASA Technical Reports Server (NTRS)

    Mularz, E. J.; Bosque, M. A.; Humenik, F. M.

    1983-01-01

    Detailed fuel spray analyses are a necessary input to the analytical modeling of the complex mixing and combustion processes which occur in advanced combustor systems. It is anticipated that by controlling fuel-air reaction conditions, combustor temperatures can be better controlled, leading to improved combustion system durability. Thus, a research program is underway to demonstrate the capability to measure liquid droplet size, velocity, and number density throughout a fuel spray and to utilize this measurement technique in laboratory benchmark experiments. The research activities from two contracts and one grant are described with results to date. The experiment to characterize fuel sprays is also described. These experiments and data should be useful for application to and validation of turbulent flow modeling to improve the design systems of future advanced technology engines.

  6. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analyses are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis; the differences among them lie in the manner in which this transformation is determined. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data will be presented.
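
    A generic principal-axis illustration of the spectral-matrix idea follows (it is not any one of the four published techniques): build the 3x3 cross-spectral matrix of a three-component signal in a narrow band and examine its eigenstructure, which separates the wave plane from noise.

    ```python
    # Band-averaged 3x3 spectral matrix and its eigenvalues for a toy
    # circularly polarized ULF wave in the x-y plane plus noise.
    import numpy as np

    rng = np.random.default_rng(5)
    fs, n = 10.0, 4096
    t = np.arange(n) / fs
    bx = np.cos(2 * np.pi * 0.5 * t) + 0.2 * rng.standard_normal(n)
    by = np.sin(2 * np.pi * 0.5 * t) + 0.2 * rng.standard_normal(n)
    bz = 0.2 * rng.standard_normal(n)

    X = np.fft.rfft(np.vstack([bx, by, bz]), axis=1)
    f = np.fft.rfftfreq(n, 1.0 / fs)
    k = np.argmin(np.abs(f - 0.5))          # bin nearest the wave frequency
    band = slice(k - 2, k + 3)              # small band average -> full rank
    S = np.einsum("ik,jk->ij", X[:, band], X[:, band].conj())  # spectral matrix
    eigvals = np.linalg.eigvalsh(S)[::-1]   # descending, real (S is Hermitian)
    print("normalized eigenvalues:", np.round(eigvals / eigvals[0], 3))
    ```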

  7. AI-based technique for tracking chains of discontinuous symbols and its application to the analysis of topographic maps

    NASA Astrophysics Data System (ADS)

    Mecocci, Alessandro; Lilla, Massimiliano

    1994-12-01

    Automatic digitization of topographic maps is an important task nowadays. Among the different elements of a topographic map, discontinuous lines represent important information. They are generally difficult to track because they show very large gaps and abrupt direction changes. In this paper an architecture that automates the digitization of discontinuous lines (dot-dot lines, dash-dot-dash lines, dash-asterisk lines, etc.) is presented. The tracking process must detect the elementary symbols and then concatenate these symbols into a meaningful chain that represents the line. The proposed architecture is composed of a common kernel, based on a suitable modification of the A* algorithm, that starts different auxiliary processes depending on the particular line to be tracked. Three auxiliary processes are considered: search strategy generation (SSG), which is responsible for the strategy used to scan the image pixels; low-level symbol detection (LSD), which decides whether a certain image region around the pixel selected by the SSG is an elementary symbol; and cost evaluation (CE), which gives the quality of each symbol with respect to the global course of the line. The whole system has been tested on a 1:50,000 map furnished by the Istituto Geografico Militare Italiano (IGMI). The results were very good for different types of discontinuous lines. Over the whole map (about 80 Mbytes of digitized data), 95% of the elementary symbols of the lines were correctly chained. The operator time required to correct misclassifications is a small part of the time needed to manually digitize the discontinuous lines.

  8. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.

  9. Innovative Techniques Simplify Vibration Analysis

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  10. A numerical comparison of sensitivity analysis techniques

    SciTech Connect

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequences of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analysis methodologies, but none as comprehensive as the current work.

  11. Comparing Techniques for Certified Static Analysis

    NASA Technical Reports Server (NTRS)

    Cachera, David; Pichardie, David

    2009-01-01

    A certified static analysis is an analysis whose semantic validity has been formally proved correct with a proof assistant. The recent increasing interest in using proof assistants for mechanizing programming language metatheory has given rise to several approaches for certification of static analysis. We propose a panorama of these techniques and compare their respective strengths and weaknesses.

  12. A New Microcell Technique for NMR Analysis.

    ERIC Educational Resources Information Center

    Yu, Sophia J.

    1987-01-01

    Describes a new laboratory technique for working with small samples of compounds used in nuclear magnetic resonance (NMR) analysis. Demonstrates how microcells can be constructed for each experiment and samples can be recycled. (TW)

  13. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  14. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  15. Exact geometry solid-shell element based on a sampling surfaces technique for 3D stress analysis of doubly-curved composite shells

    NASA Astrophysics Data System (ADS)

    Kulikov, G. M.; Mamontov, A. A.; Plotnikova, S. V.; Mamontov, S. A.

    2015-11-01

    A hybrid-mixed ANS four-node shell element based on the sampling surfaces (SaS) technique is developed. The SaS formulation is based on choosing, inside the nth layer, I_n not-equally-spaced SaS parallel to the middle surface of the shell, in order to introduce the displacements of these surfaces as basic shell variables. This choice of unknowns, with the consequent use of Lagrange polynomials of degree I_n - 1 in the thickness direction for each layer, permits the presentation of the layered shell formulation in a very compact form. The SaS are located inside each layer at Chebyshev polynomial nodes, which allows one to minimize uniformly the error due to the Lagrange interpolation. To implement efficient analytical integration throughout the element, the enhanced ANS method is employed. The proposed hybrid-mixed four-node shell element is based on the Hu-Washizu variational equation and exhibits superior performance in the case of coarse meshes. It could be useful for the 3D stress analysis of thick and thin doubly-curved shells, since the SaS formulation makes it possible to obtain numerical solutions with a prescribed accuracy that asymptotically approach the exact solutions of elasticity as the number of SaS tends to infinity.
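
    The Chebyshev-node placement and Lagrange interpolation at the heart of the SaS idea can be sketched directly; the through-thickness "displacement" profile below is a toy function, not a shell solution.

    ```python
    # Sampling surfaces at Chebyshev nodes through a layer's thickness,
    # with a Lagrange interpolant of degree I_n - 1 between them.
    import numpy as np

    def chebyshev_nodes(n, a=-1.0, b=1.0):
        """n Chebyshev nodes mapped to the thickness interval [a, b]."""
        k = np.arange(1, n + 1)
        x = np.cos((2 * k - 1) * np.pi / (2 * n))
        return 0.5 * (a + b) + 0.5 * (b - a) * x

    def lagrange_eval(xs, ys, z):
        """Evaluate the Lagrange interpolant through (xs, ys) at points z."""
        out = np.zeros_like(z)
        for i, xi in enumerate(xs):
            li = np.ones_like(z)
            for j, xj in enumerate(xs):
                if j != i:
                    li *= (z - xj) / (xi - xj)
            out += ys[i] * li
        return out

    In = 5                                  # sampling surfaces in this layer
    zs = chebyshev_nodes(In)                # thickness coordinates of the SaS
    ys = np.tanh(2.0 * zs)                  # toy through-thickness profile
    z = np.linspace(-1.0, 1.0, 201)
    err = np.abs(lagrange_eval(zs, ys, z) - np.tanh(2.0 * z)).max()
    print(f"max interpolation error with {In} SaS: {err:.2e}")
    ```

    Increasing In drives the error down rapidly, which mirrors the abstract's claim that solutions approach the exact elasticity solution as the number of SaS grows.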

  16. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. Active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication, based on both an analytical derivation and empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  17. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  18. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces, where manganese-oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  19. Emerging techniques for ultrasensitive protein analysis.

    PubMed

    Yang, Xiaolong; Tang, Yanan; Alt, Ryan R; Xie, Xiaoyu; Li, Feng

    2016-06-21

    Many important biomarkers for devastating diseases and biochemical processes are proteins present at ultralow levels. Traditional techniques, such as enzyme-linked immunosorbent assays (ELISA), mass spectrometry, and protein microarrays, are often not sensitive enough to detect proteins with concentrations below the picomolar level, thus requiring the development of analytical techniques with ultrahigh sensitivities. In this review, we highlight the recent advances in developing novel techniques, sensors, and assays for ultrasensitive protein analysis. Particular attention will be focused on three classes of signal generation and/or amplification mechanisms, including the uses of nanomaterials, nucleic acids, and digital platforms. PMID:26898911

  20. Quantifying sources of black carbon in western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    NASA Astrophysics Data System (ADS)

    Zhang, R.; Wang, H.; Hegg, D. A.; Qian, Y.; Doherty, S. J.; Dang, C.; Ma, P.-L.; Rasch, P. J.; Fu, Q.

    2015-11-01

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source-receptor relationships for atmospheric BC and its deposition to snow over western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over northwestern USA and western Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based positive matrix factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  1. Quantifying sources of black carbon in Western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    NASA Astrophysics Data System (ADS)

    Zhang, R.; Wang, H.; Hegg, D. A.; Qian, Y.; Doherty, S. J.; Dang, C.; Ma, P.-L.; Rasch, P. J.; Fu, Q.

    2015-05-01

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source-receptor relationships for atmospheric BC and its deposition to snow over Western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over the Northwest USA and West Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based Positive Matrix Factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  2. Quantifying sources of black carbon in Western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    DOE PAGESBeta

    Zhang, R.; Wang, H.; Hegg, D. A.; Qian, Y.; Doherty, S. J.; Dang, C.; Ma, P.-L.; Rasch, P. J.; Fu, Q.

    2015-05-04

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source-receptor relationships for atmospheric BC and its deposition to snow over Western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over the Northwest USA and West Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based Positive Matrix Factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  3. Quantifying sources of black carbon in western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    DOE PAGESBeta

    Zhang, R.; Wang, H.; Hegg, D. A.; Qian, Y.; Doherty, S. J.; Dang, C.; Ma, P.-L.; Rasch, P. J.; Fu, Q.

    2015-11-18

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source–receptor relationships for atmospheric BC and its deposition to snow over western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over northwestern USA and western Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based positive matrix factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  4. Quantifying sources of black carbon in Western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    SciTech Connect

    Zhang, Rudong; Wang, Hailong; Hegg, D. A.; Qian, Yun; Doherty, Sarah J.; Dang, Cheng; Ma, Po-Lun; Rasch, Philip J.; Fu, Qiang

    2015-11-18

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source-receptor relationships for atmospheric BC and its deposition to snow over Western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over the Northwest USA and West Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based Positive Matrix Factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  5. A review of sensitivity analysis techniques

    SciTech Connect

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of the sensitivity techniques utilizes partial differentiation, whereas the simplest approach requires varying parameter values one at a time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
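
    The simplest technique named in the review, one-at-a-time (OAT) variation, is easy to sketch; the toy model and parameter names below are illustrative assumptions, not from the paper.

    ```python
    # One-at-a-time sensitivity: perturb each parameter by +1% around a
    # baseline and report normalized sensitivity coefficients (dY/Y)/(dX/X).
    def model(params):
        """Toy dose model: release * uptake / dilution."""
        return params["release"] * params["uptake"] / params["dilution"]

    baseline = {"release": 2.0, "uptake": 0.5, "dilution": 10.0}
    y0 = model(baseline)

    for name in baseline:
        perturbed = dict(baseline)
        perturbed[name] *= 1.01                  # +1% one-at-a-time
        s = (model(perturbed) - y0) / y0 / 0.01  # normalized sensitivity
        print(f"{name:>8}: sensitivity = {s:+.2f}")
    ```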

  6. Autofluorescence based diagnostic techniques for oral cancer

    PubMed Central

    Balasubramaniam, A. Murali; Sriraman, Rajkumari; Sindhuja, P.; Mohideen, Khadijah; Parameswar, R. Arjun; Muhamed Haris, K. T.

    2015-01-01

    Oral cancer is one of the most common cancers worldwide. Despite various advancements in treatment modalities, oral cancer mortality remains high, particularly in developing countries like India. This is mainly due to delay in the diagnosis of oral cancer, which greatly reduces the prognosis of treatment and also increases morbidity and mortality rates. Early diagnosis plays a key role in the effective management of oral cancer, and a rapid diagnostic technique can greatly aid it. Nowadays many adjunctive oral cancer screening techniques are available for the early diagnosis of cancer. Among these, autofluorescence-based diagnostic techniques are rapidly emerging as a powerful tool. These techniques are broadly discussed in this review. PMID:26538880

  7. Gold analysis by the gamma absorption technique.

    PubMed

    Kurtoglu, Arzu; Tugrul, A Beril

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement. PMID:12485656
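
    The calibration-curve step described above is a simple linear fit and inversion; the attenuation values below are toy numbers, not the paper's measurements.

    ```python
    # Gamma-absorption calibration: fit measured mass attenuation near the
    # Au K-edge against known gold fractions, then read back an unknown.
    import numpy as np

    gold_fraction = np.array([0.375, 0.585, 0.750, 0.916, 1.000])  # known alloys
    mu_rho = np.array([2.10, 2.80, 3.35, 3.90, 4.18])  # cm^2/g, toy values

    slope, intercept = np.polyfit(gold_fraction, mu_rho, 1)

    mu_unknown = 3.10                           # measured on the unknown sample
    estimated = (mu_unknown - intercept) / slope
    print(f"estimated gold fraction: {estimated:.3f}")
    ```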

  8. Techniques for the Analysis of Human Movement.

    ERIC Educational Resources Information Center

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photopgraphic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  9. Aerosol particle analysis by Raman scattering technique

    SciTech Connect

    Fung, K.H.; Tang, I.N.

    1992-10-01

    Laser Raman spectroscopy is a very versatile tool for chemical characterization of micron-sized particles. Such particles are abundant in nature, and in numerous energy-related processes. In order to elucidate the formation mechanisms and understand the subsequent chemical transformation under a variety of reaction conditions, it is imperative to develop analytical measurement techniques for in situ monitoring of these suspended particles. In this report, we outline our recent work on spontaneous Raman, resonance Raman and non-linear Raman scattering as a novel technique for chemical analysis of aerosol particles as well as supersaturated solution droplets.

  10. Respiratory monitoring system based on the nasal pressure technique for the analysis of sleep breathing disorders: Reduction of static and dynamic errors, and comparisons with thermistors and pneumotachographs

    NASA Astrophysics Data System (ADS)

    Alves de Mesquita, Jayme; Lopes de Melo, Pedro

    2004-03-01

    Thermally sensitive devices—thermistors—have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices are not able to provide a good characterization of fast events, like hypopneas. The nasal pressure recording technique (NPR) has recently been suggested for quantifying airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) to describe the development of a flexible NPR device and (2) to evaluate the performance of this device compared with pneumotachographs (PNTs) and thermistors. After the design details are described, the system's static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p<0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations, and the results showed that the developed device's response was as good as that of the PNT and around 100 times faster (τ = 5.3 ms) than thermistors (τ = 512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples and confirm the good features achieved in the engineering tests. These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to a more accurate implementation of medical research projects and to improve the
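
    A short sketch of why the time constant matters: a brief 0.5 s flow transient (a stand-in for a fast respiratory event) is passed through first-order sensor models with the two time constants quoted above. The event shape and sample rate are illustrative.

    ```python
    # First-order sensor models with tau = 5.3 ms (NPR) vs. 512 ms (thermistor).
    import numpy as np

    fs = 1000.0                                # Hz
    t = np.arange(0.0, 10.0, 1.0 / fs)
    flow = np.ones_like(t)
    flow[(t > 3.0) & (t < 3.5)] = 0.4          # 60% flow reduction for 0.5 s

    def first_order(x, tau, fs):
        """Discrete first-order low-pass (sensor model), y' = (x - y) / tau."""
        alpha = 1.0 / (tau * fs + 1.0)
        y = np.empty_like(x)
        y[0] = x[0]
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y

    for tau in (0.0053, 0.512):
        dip = 1.0 - first_order(flow, tau, fs).min()
        print(f"tau = {tau * 1e3:6.1f} ms -> apparent flow reduction {dip:.1%}")
    ```

    The fast sensor reports nearly the full 60% reduction, while the slow one substantially underestimates it, which is the practical argument for NPR over thermistors for events of this duration.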

  11. Application of spectral analysis techniques to the intercomparison of aerosol data - Part 4: Combined maximum covariance analysis to bridge the gap between multi-sensor satellite retrievals and ground-based measurements

    NASA Astrophysics Data System (ADS)

    Li, J.; Carlson, B. E.; Lacis, A. A.

    2014-04-01

    The development of remote sensing techniques has greatly advanced our knowledge of atmospheric aerosols. Various satellite sensors and the associated retrieval algorithms all add to the information on global aerosol variability, while well-designed surface networks provide time series of highly accurate measurements at specific locations. In studying the variability of aerosol properties, aerosol climate effects, and constraining aerosol fields in climate models, it is essential to make the best use of all of the available information. In the previous three parts of this series, we demonstrated the usefulness of several spectral decomposition techniques in the analysis and comparison of temporal and spatial variability of aerosol optical depth using satellite and ground-based measurements. Specifically, Principal Component Analysis (PCA) successfully captures and isolates seasonal and interannual variability from different aerosol source regions, Maximum Covariance Analysis (MCA) provides a means to verify the variability in one satellite dataset against Aerosol Robotic Network (AERONET) data, and Combined Principal Component Analysis (CPCA) enables parallel comparison among multi-satellite, multi-sensor datasets. As the final part of the study, this paper introduces a novel technique that integrates both multi-sensor datasets and ground observations, and thus effectively bridges the gap between these two types of measurements. The Combined Maximum Covariance Analysis (CMCA) decomposes the cross-covariance matrix between the combined multi-sensor satellite data field and the AERONET station data. We show that this new method not only confirms the seasonal and interannual variability of aerosol optical depth, aerosol source regions and events represented by different satellite datasets, but also identifies the strengths and weaknesses of each dataset in capturing the variability associated with sources, events or aerosol types. Furthermore, by examining the spread of
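
    The core of (C)MCA is an SVD of a cross-covariance matrix between two anomaly fields; the sketch below uses synthetic satellite and station fields sharing one seasonal mode, not AERONET or satellite retrievals.

    ```python
    # Maximum covariance analysis via SVD of the cross-covariance matrix.
    import numpy as np

    rng = np.random.default_rng(6)
    nt, nx, ns = 120, 50, 8                  # months, grid points, stations
    shared = np.sin(2 * np.pi * np.arange(nt) / 12.0)  # shared seasonal signal
    sat = np.outer(shared, rng.standard_normal(nx)) + 0.3 * rng.standard_normal((nt, nx))
    stn = np.outer(shared, rng.standard_normal(ns)) + 0.3 * rng.standard_normal((nt, ns))

    sat -= sat.mean(axis=0)                  # remove time means (anomalies)
    stn -= stn.mean(axis=0)
    C = sat.T @ stn / (nt - 1)               # cross-covariance matrix
    U, s, Vt = np.linalg.svd(C, full_matrices=False)

    frac = s**2 / (s**2).sum()               # squared covariance fractions
    print("leading-mode squared covariance fraction:", round(frac[0], 3))
    print("expansion-coefficient correlation:",
          round(np.corrcoef(sat @ U[:, 0], stn @ Vt[0])[0, 1], 3))
    ```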

  12. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  13. Multiview video codec based on KTA techniques

    NASA Astrophysics Data System (ADS)

    Seo, Jungdong; Kim, Donghyun; Ryu, Seungchul; Sohn, Kwanghoon

    2011-03-01

    Multi-view video coding (MVC) is a video coding standard developed by MPEG and VCEG for multi-view video. It showed an average PSNR gain of 1.5 dB compared with view-independent coding by H.264/AVC. However, because the resolutions of multi-view video are getting higher for a more realistic 3D effect, a higher-performance video codec is needed. MVC adopted the hierarchical B-picture structure and inter-view prediction as core techniques. The hierarchical B-picture structure removes temporal redundancy, and inter-view prediction reduces inter-view redundancy through compensated prediction from the reconstructed neighboring views. Nevertheless, MVC has an inherent limitation in coding efficiency because it is based on H.264/AVC. To overcome this limit, an enhanced video codec for multi-view video based on the Key Technology Area (KTA) is proposed. KTA is a high-efficiency video codec developed by the Video Coding Experts Group (VCEG) to push coding efficiency beyond H.264/AVC, and its software showed better coding gain than H.264/AVC through additional coding techniques. These techniques and inter-view prediction are implemented in the proposed codec, which showed high coding gain compared with the view-independent coding result by KTA. The results indicate that inter-view prediction can achieve higher efficiency in a multi-view video codec based on a high-performance video codec such as HEVC.

  14. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
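
    Two of the analyses described above, interval RMS acceleration versus time and the power spectral density, can be sketched with standard tools; the accelerometer record below is synthetic (a 60 Hz vibration plus noise), not SAMS/OARE data.

    ```python
    # Interval RMS and Welch PSD of a toy accelerometer record.
    import numpy as np
    from scipy.signal import welch

    fs = 250.0                                 # Hz, sample rate
    t = np.arange(0.0, 60.0, 1.0 / fs)
    acc = (1e-3 * np.sin(2 * np.pi * 60.0 * t)
           + 1e-4 * np.random.default_rng(7).standard_normal(t.size))

    # Interval RMS: one value per 5-second window.
    win = int(5 * fs)
    rms = np.sqrt((acc[: t.size // win * win].reshape(-1, win) ** 2).mean(axis=1))
    print("interval RMS (first 3 windows):", np.round(rms[:3], 6))

    # Power spectral density via Welch's method.
    f, psd = welch(acc, fs=fs, nperseg=1024)
    print("peak frequency (Hz):", f[np.argmax(psd)])
    ```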

  15. Laser Remote Sensing: Velocimetry Based Techniques

    NASA Astrophysics Data System (ADS)

    Molebny, Vasyl; Steinvall, Ove

    Laser-based velocity measurement is an area of the field of remote sensing where the coherent properties of laser radiation are the most exposed. Much of the published literature deals with the theory and techniques of remote sensing. We restrict our discussion to current trends in this area, gathered from recent conferences and professional journals. Remote wind sensing and vibrometry are promising in their new scientific, industrial, military, and biomedical applications, including improving flight safety, precise weapon correction, non-contact mine detection, optimization of wind farm operation, object identification based on its vibration signature, fluid flow studies, and vibrometry-associated diagnosis.

  16. COMBINING A NEW 3-D SEISMIC S-WAVE PROPAGATION ANALYSIS FOR REMOTE FRACTURE DETECTION WITH A ROBUST SUBSURFACE MICROFRACTURE-BASED VERIFICATION TECHNIQUE

    SciTech Connect

    Bob Hardage; M.M. Backus; M.V. DeAngelo; R.J. Graebner; S.E. Laubach; Paul Murray

    2004-02-01

    Fractures within the producing reservoirs at McElroy Field could not be studied with the industry-provided 3C3D seismic data used as a cost-sharing contribution in this study. The signal-to-noise character of the converted-SV data across the targeted reservoirs in these contributed data was not adequate for interpreting azimuth-dependent data effects. After illustrating the low signal quality of the converted-SV data at McElroy Field, the seismic portion of this report abandons the McElroy study site and defers to 3C3D seismic data acquired across a different fractured carbonate reservoir system to illustrate how 3C3D seismic data can provide useful information about fracture systems. Using these latter data, we illustrate how fast-S and slow-S data effects can be analyzed in the prestack domain to recognize fracture azimuth, and then demonstrate how fast-S and slow-S data volumes can be analyzed in the poststack domain to estimate fracture intensity. In the geologic portion of the report, we analyze published regional stress data near McElroy Field and numerous formation microimager (FMI) logs acquired across McElroy to develop possible fracture models for the McElroy system. Regional stress data imply a fracture orientation different from the orientations observed in most of the FMI logs. This report culminates Phase 2 of the study, "Combining a New 3-D Seismic S-Wave Propagation Analysis for Remote Fracture Detection with a Robust Subsurface Microfracture-Based Verification Technique". Phase 3 will not be initiated because wells were to be drilled in Phase 3 of the project to verify the validity of fracture-orientation maps and fracture-intensity maps produced in Phase 2. Such maps cannot be made across McElroy Field because of the limitations of the available 3C3D seismic data at the depth level of the reservoir target.

  17. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
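
    A hedged sketch of the trade-off described above: a median (order-statistic) noise power estimate, which requires a sort, against a single-pass threshold-and-count estimate. The exponential noise model and threshold are illustrative assumptions, not HRMS system parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
power = rng.exponential(scale=2.0, size=100_000)   # noise power samples

# Order-statistic estimator: the median needs a sort, then scales to the
# mean of an exponential distribution (median = sigma * ln 2).
sigma_os = np.median(power) / np.log(2.0)

# Threshold-and-count: one pass, one comparison per sample. For an
# exponential, P(x > T) = exp(-T / sigma), so sigma = T / ln(n / count).
T = 3.0
count = np.count_nonzero(power > T)
sigma_tc = T / np.log(power.size / count)
print(sigma_os, sigma_tc)
```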

  18. Visual exploratory analysis of integrated chromosome 19 proteomic data derived from glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Meyer-Bäse, Uwe; Meyer-Bäse, Anke

    2015-05-01

    Chromosome 19 is known to be linked to neurodegeneration and many cancers. Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the presentation of novel approaches for the visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the expression patterns for chromosome 19 proteins.
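
    The authors' specific projection methods are not reproduced here; purely as a generic stand-in for nonlinear dimensionality reduction followed by clustering, this sketch embeds fabricated "expression" vectors with t-SNE and clusters the embedding with k-means.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Two fabricated "expression pattern" groups, 120 proteins each sample.
X = np.vstack([rng.normal(m, 1.0, size=(50, 120)) for m in (0.0, 3.0)])

emb = TSNE(n_components=2, perplexity=30, random_state=1).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(emb)
print(emb.shape, np.bincount(labels))
```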

  19. Further development of ultrasonic techniques for non-destructive evaluation based on Fourier analysis of signals from irregular and inhomogeneous structures

    NASA Technical Reports Server (NTRS)

    Miller, J. G.

    1979-01-01

    To investigate the use of Fourier analysis techniques, model systems had to be designed to test some of the general properties of the interaction of sound with an inhomogeneity. The first models investigated were suspensions of solid spheres in water. These systems allowed comparison between theoretical computation of the frequency dependence of the attenuation coefficient and measurement of the attenuation coefficient over a range of frequencies. Ultrasonic scattering processes were investigated in suspensions of hard spheres in water and in suspensions of hard spheres in polyester resin. The second model system was constructed to test the applicability of partial wave analysis to the description of an inhomogeneity in a solid, and to test the range of material properties over which the measurement systems were valid.
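
    A sketch of one way to measure the frequency dependence of the attenuation coefficient, the spectral-ratio method: compare the spectrum of a pulse transmitted through the suspension with that of a reference pulse. Both waveforms and the attenuation law below are synthetic assumptions.

```python
import numpy as np

fs = 100e6                                    # sample rate [Hz]
t = np.arange(2048) / fs
# Reference pulse: Gaussian-windowed 5 MHz tone burst.
pulse = np.exp(-((t - 5e-6) ** 2) / (2 * (0.2e-6) ** 2)) * np.sin(2 * np.pi * 5e6 * t)

d = 0.01                                      # assumed path length [m]
F = np.fft.rfft(pulse)
f = np.fft.rfftfreq(t.size, 1 / fs)
alpha_true = 50.0 * (f / 5e6) ** 2            # fabricated attenuation law [Np/m]
F_samp = F * np.exp(-alpha_true * d)          # "through-sample" spectrum

# Recover alpha(f) from the two spectra over the usable bandwidth.
band = (f > 2e6) & (f < 8e6)
alpha_est = -np.log(np.abs(F_samp[band]) / np.abs(F[band])) / d
print(np.allclose(alpha_est, alpha_true[band]))
```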

  20. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (puc 19 plasmid from E. coli). DN...

  1. A comparison of geostatistically based inverse techniques for use in performance assessment analysis at the Waste Isolation Pilot Plant Site: Results from Test Case No. 1

    SciTech Connect

    Zimmerman, D.A.; Gallegos, D.P.

    1993-10-01

    The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A "Geostatistics Test Problem" is being developed to evaluate a number of inverse techniques that may be used for flow calculations in the WIPP performance assessment (PA). The Test Problem is actually a series of test cases, each being developed as a highly complex synthetic data set; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. The Test Problem analysis approach is to use a comparison of the probabilistic groundwater travel time (GWTT) estimates produced by each technique as the basis for the evaluation. Participants are given observations of head and transmissivity (possibly including measurement error) or other information such as drawdowns from pumping wells, and are asked to develop stochastic models of groundwater flow for the synthetic system. Cumulative distribution functions (CDFs) of groundwater flow (computed via particle tracking) are constructed using the head and transmissivity data generated through the application of each technique; one semi-analytical method generates the CDFs of groundwater flow directly. This paper describes the results from Test Case No. 1.
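
    A minimal sketch of the evaluation metric described above: an empirical cumulative distribution function of groundwater travel time built from an ensemble of particle-tracking results. The travel times are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)
gwtt = np.sort(10 ** rng.normal(4.0, 0.5, size=1000))   # travel times [years]
cdf = np.arange(1, gwtt.size + 1) / gwtt.size           # empirical CDF

# One possible comparison statistic: the travel time exceeded by 90% of particles.
print(np.interp(0.10, cdf, gwtt))
```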

  2. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
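
    A hedged sketch of the contrast-extraction step: the mean temperature evolution over an anomaly is compared with a nearby sound-area reference, and a peak-contrast feature is read off. The normalization below is one common definition, not necessarily the exact IR Contrast formula, and the video frames are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
frames = 20.0 + rng.random((300, 64, 64))            # stand-in IR video [deg C]

defect = frames[:, 30:34, 30:34].mean(axis=(1, 2))   # pixels over the anomaly
sound = frames[:, 5:15, 5:15].mean(axis=(1, 2))      # sound-area reference

contrast = (defect - sound) / sound                  # one common normalization
t_peak = int(contrast.argmax())                      # feature: time of peak contrast
print(t_peak, contrast[t_peak])
```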

  3. Neutron-based nonintrusive inspection techniques

    NASA Astrophysics Data System (ADS)

    Gozani, Tsahi

    1997-02-01

    Non-intrusive inspection of large objects such as trucks, sea-going shipping containers, air cargo containers, and pallets is gaining attention as a vital tool in combating terrorism, drug smuggling, and other violations of international and national transportation and Customs laws. Neutrons are the preferred probing radiation when material specificity is required, which is most often the case. Great strides have been made in neutron-based inspection techniques. Fast and thermal neutrons, whether steady-state or in microsecond or even nanosecond pulses, are being employed to interrogate, at high speeds, for explosives, drugs, chemical agents, and nuclear and many other smuggled materials. Existing neutron techniques will be compared and their current status reported.

  4. Forensic Analysis using Geological and Geochemical Techniques

    NASA Astrophysics Data System (ADS)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late 1800s, the last decade has seen a marked increase in geo-scientists initiating concept studies using the latest analytical techniques, including studying natural abundance isotope variations, micro analysis with laser ablation ICPMS, and geochemical mapping. Most of the concept studies have shown good potential, but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the EGU 2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project, and other projects and literature to illustrate the important issues at hand.

  5. Advanced NMR-based techniques for pore structure analysis of coal. Quarter report No. 4, 1 October 1992--30 December 1992

    SciTech Connect

    Smith, D.M.

    1992-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. We believe that measurement of the NMR parameters of various gas-phase and adsorbed-phase NMR-active probes can provide the resolution to this problem. We will investigate the dependence of the common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules and the pore surfaces in coals. These molecules have been selected for their chemical and physical properties. A special NMR probe will be constructed which will allow the concurrent measurement of NMR properties and adsorption uptake at a variety of temperatures. All samples will be subjected to a suite of "conventional" pore structure analyses. These include nitrogen adsorption at 77 K with BET analysis, CO2 and CH4 adsorption at 273 K with D-R (Dubinin-Radushkevich) analysis, helium pycnometry, and small angle X-ray scattering, as well as gas diffusion measurements.

  6. The Network Protocol Analysis Technique in Snort

    NASA Astrophysics Data System (ADS)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. A sniffer intercepts packets as binary data in their original assembled message format. To recover the information they contain, the captured bytes must be decoded according to the TCP/IP protocol stack specifications, restoring the format and content of the packets layer by layer, from each protocol header up to the actual data transferred at the application tier.
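
    As a hedged illustration of this layer-by-layer decoding, the sketch below unpacks a fixed 20-byte IPv4 header from raw bytes with Python's standard struct module; the sample bytes are fabricated, and Snort's own decoder (written in C) is not reproduced here.

```python
import struct

# A fabricated 20-byte IPv4 header, as a sniffer might capture it.
raw = bytes.fromhex("4500003c1c4640004006b1e6c0a80001c0a800c7")

ver_ihl, tos, total_len, ident, flags_frag, ttl, proto, cksum, src, dst = \
    struct.unpack("!BBHHHBBH4s4s", raw)   # network byte order

print("version:", ver_ihl >> 4, "header length:", (ver_ihl & 0xF) * 4, "bytes")
print("protocol:", proto, "(6 = TCP, so a TCP header follows)")
print("src:", ".".join(map(str, src)), "dst:", ".".join(map(str, dst)))
```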

  7. COSIMA data analysis using multivariate techniques

    NASA Astrophysics Data System (ADS)

    Silén, J.; Cottin, H.; Hilchenbach, M.; Kissel, J.; Lehto, H.; Siljeström, S.; Varmuza, K.

    2015-02-01

    We describe how to use multivariate analysis of complex TOF-SIMS (time-of-flight secondary ion mass spectrometry) spectra by introducing the method of random projections. The technique allows us to do full clustering and classification of the measured mass spectra. In this paper we use the tool for classification purposes. The presentation describes calibration experiments of 19 minerals on Ag and Au substrates using positive mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
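
    A minimal sketch of the random-projection idea under stated assumptions: synthetic stand-ins for TOF-SIMS spectra (19 classes, 10 spectra each, with a fabricated characteristic peak cluster per class) are projected to a lower-dimensional space that approximately preserves distances, then classified by nearest neighbour.

```python
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(3)
X = rng.poisson(5.0, size=(190, 10_000)).astype(float)  # 19 minerals x 10 spectra
y = np.repeat(np.arange(19), 10)
for k in range(19):                       # fabricated peak cluster per mineral
    X[y == k, 50 * k:50 * k + 20] += 200.0

# Random projection to 200 dimensions, then nearest-neighbour classification.
Xp = GaussianRandomProjection(n_components=200, random_state=3).fit_transform(X)
clf = KNeighborsClassifier(n_neighbors=3).fit(Xp[::2], y[::2])
print("held-out accuracy:", clf.score(Xp[1::2], y[1::2]))
```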

  8. Multiclass pesticide analysis in fruit-based baby food: A comparative study of sample preparation techniques previous to gas chromatography-mass spectrometry.

    PubMed

    Petrarca, Mateus H; Fernandes, José O; Godoy, Helena T; Cunha, Sara C

    2016-12-01

    With the aim of developing a new gas chromatography-mass spectrometry method to analyze 24 pesticide residues in baby foods at the levels imposed by established regulation, two simple, rapid, and environmentally friendly sample preparation techniques based on QuEChERS (quick, easy, cheap, effective, robust and safe) were compared: QuEChERS with dispersive liquid-liquid microextraction (DLLME) and QuEChERS with dispersive solid-phase extraction (d-SPE). Both sample preparation techniques achieved suitable performance criteria, including selectivity, linearity, acceptable recovery (70-120%) and precision (⩽20%). A higher enrichment factor was observed for DLLME, and consequently better limits of detection and quantification were obtained. Nevertheless, d-SPE provided more effective removal of matrix co-extractives from extracts than DLLME, which contributed to lower matrix effects. Twenty-two commercial fruit-based baby food samples were analyzed by the developed method, with procymidone detected in one sample at a level above the legal limit established by the EU. PMID:27374564

  9. Flood alert system based on bayesian techniques

    NASA Astrophysics Data System (ADS)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects on the city of Malaga (600,000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision-making process. The joint use of Artificial Neural Networks (ANN) and BN has served to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit, and additionally attach uncertainty estimates to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also

  10. Oil species identification technique developed by Gabor wavelet analysis and support vector machine based on concentration-synchronous-matrix-fluorescence spectroscopy.

    PubMed

    Wang, Chunyan; Shi, Xiaofeng; Li, Wendong; Wang, Lin; Zhang, Jinliang; Yang, Chun; Wang, Zhendi

    2016-03-15

    Concentration-synchronous-matrix-fluorescence (CSMF) spectroscopy was applied to discriminate oil species by characterizing the concentration-dependent fluorescence properties of petroleum-related samples. A seven-day weathering experiment on 3 crude oil samples from the Bohai Sea platforms of China was carried out under controlled laboratory conditions and showed that weathering had no significant effect on the CSMF spectra. Different feature extraction methods, such as PCA, PLS, and Gabor wavelet analysis, were applied to extract discriminative patterns from CSMF spectra, and classifications were made via SVM to compare their respective performance in oil species recognition. Ideal correct recognition rates of 100% for the different types of oil spill samples and 92% for the closely related source oil samples were achieved by combining Gabor wavelets with SVM, which indicates the method's potential to be developed into a rapid, cost-effective, and accurate forensic oil spill identification technique. PMID:26795119
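
    A sketch pairing Gabor-filter feature extraction with an SVM classifier, the combination reported above. The "CSMF spectra" here are random placeholder images and the filter parameters are illustrative assumptions, so the printed accuracy only demonstrates that the pipeline runs.

```python
import numpy as np
from scipy.ndimage import convolve
from sklearn.svm import SVC

def gabor_kernel(theta, lam=8.0, sigma=4.0, size=21):
    # Real (cosine) part of a Gabor filter at orientation theta.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

rng = np.random.default_rng(4)
imgs = rng.random((60, 64, 64))          # stand-in CSMF matrices
y = np.repeat([0, 1, 2], 20)             # three fabricated oil classes

kernels = [gabor_kernel(t) for t in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)]
feats = np.array([[np.abs(convolve(im, k)).mean() for k in kernels] for im in imgs])
clf = SVC(kernel="rbf").fit(feats[::2], y[::2])
print("accuracy:", clf.score(feats[1::2], y[1::2]))
```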

  11. Laser Scanning–Based Tissue Autofluorescence/Fluorescence Imaging (LS-TAFI), a New Technique for Analysis of Microanatomy in Whole-Mount Tissues

    PubMed Central

    Mori, Hidetoshi; Borowsky, Alexander D.; Bhat, Ramray; Ghajar, Cyrus M.; Seiki, Motoharu; Bissell, Mina J.

    2012-01-01

    Intact organ structure is essential in maintaining tissue specificity and cellular differentiation. Small physiological or genetic variations lead to changes in microanatomy that, if persistent, could have functional consequences and may easily be masked by the heterogeneity of tissue anatomy. Current imaging techniques rely on histological sections that require sample manipulation and are essentially two-dimensional. We have developed a method for three-dimensional imaging of whole-mount, unsectioned mammalian tissues to elucidate subtle and detailed micro- and macroanatomies in adult organs and embryos. We analyzed intact or dissected organ whole mounts with laser scanning–based tissue autofluorescence/fluorescence imaging (LS-TAFI). We obtained clear visualization of microstructures within murine mammary glands and mammary tumors and other organs without the use of immunostaining and without probes or fluorescent reporter genes. Combining autofluorescence with reflected light signals from chromophore-stained tissues allowed identification of individual cells within three-dimensional structures of whole-mounted organs. This technique could be useful for rapid diagnosis of human clinical samples and possibly the effect of subtle variations such as low-dose radiation. PMID:22542846

  12. Debonding damage analysis in composite-masonry strengthening systems with polymer- and mortar-based matrix by means of the acoustic emission technique

    NASA Astrophysics Data System (ADS)

    Verstrynge, E.; Wevers, M.; Ghiassi, B.; Lourenço, P. B.

    2016-01-01

    Different types of strengthening systems, based on fiber reinforced materials, are under investigation for external strengthening of historic masonry structures. A full characterization of the bond behavior and of the short- and long-term failure mechanisms is crucial to ensure effective design, compatibility with the historic substrate and durability of the strengthening solution. Therein, non-destructive techniques are essential for bond characterization, durability assessment and on-site condition monitoring. In this paper, the acoustic emission (AE) technique is evaluated for debonding characterization and localization on fiber reinforced polymer (FRP) and steel reinforced grout-strengthened clay bricks. Both types of strengthening systems are subjected to accelerated ageing tests under thermal cycles and to single-lap shear bond tests. During the reported experimental campaign, AE data from the accelerated ageing tests demonstrated the thermal incompatibility between brick and epoxy-bonded FRP composites, and debonding damage was successfully detected, characterized and located. In addition, a qualitative comparison is made with digital image correlation and infrared thermography, in view of efficient on-site debonding detection.

  13. Some Techniques for Computer-Based Assessment in Medical Education.

    ERIC Educational Resources Information Center

    Mooney, G. A.; Bligh, J. G.; Leinster, S. J.

    1998-01-01

    Presents a system of classification for describing computer-based assessment techniques based on the level of action and educational activity they offer. Illustrates 10 computer-based assessment techniques and discusses their educational value. Contains 14 references. (Author)

  14. Which Combinations of Techniques and Modes of Delivery in Internet-Based Interventions Effectively Change Health Behavior? A Meta-Analysis

    PubMed Central

    van Genugten, Lenneke; Webb, Thomas Llewelyn; van Empelen, Pepijn

    2016-01-01

    Background Many online interventions designed to promote health behaviors combine multiple behavior change techniques (BCTs), adopt different modes of delivery (MoD) (e.g., text messages), and range in how usable they are. Research is therefore needed to examine the impact of these features on the effectiveness of online interventions. Objective This study applies Classification and Regression Trees (CART) analysis to meta-analytic data, in order to identify synergistic effects of BCTs, MoDs, and usability factors. Methods We analyzed data from Webb et al. This review included effect sizes from 52 online interventions targeting a variety of health behaviors and coded the use of 40 BCTs and 11 MoDs. Our research also developed a taxonomy for coding the usability of interventions. Meta-CART analyses were performed using the BCTs and MoDs as predictors and using treatment success (ie, effect size) as the outcome. Results Factors related to usability of the interventions influenced their efficacy. Specifically, subgroup analyses indicated that more efficient interventions (interventions that take little time to understand and use) are more likely to be effective than less efficient interventions. Meta-CART identified one synergistic effect: Interventions that included barrier identification/problem solving and provided rewards for behavior change reported an average effect size that was smaller (ḡ=0.23, 95% CI 0.08-0.44) than interventions that used other combinations of techniques (ḡ=0.43, 95% CI 0.27-0.59). No synergistic effects were found for MoDs or for MoDs combined with BCTs. Conclusions Interventions that take little time to understand and use were more effective than those that require more time. Few specific combinations of BCTs that contribute to the effectiveness of online interventions were found. Furthermore, no synergistic effects between BCTs and MoDs were found, even though MoDs had strong effects when analyzed univariately in the original study

  15. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    SciTech Connect

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  16. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    NASA Astrophysics Data System (ADS)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of information can result in several software problems related to the huge amount of data available. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potentialities of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered, 10 different regions of interest were selected, and two different approaches were followed. The first one (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier. The second approach (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Two strategies were followed: four classes were considered using Spring and thereafter seven classes were considered for OTB/Monteverdi. The SVM classifier produces slightly better results and presents a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.

  17. Automated analysis of non-mass-enhancing lesions in breast MRI based on morphological, kinetic, and spatio-temporal moments and joint segmentation-motion compensation technique

    NASA Astrophysics Data System (ADS)

    Hoffmann, Sebastian; Shutler, Jamie D.; Lobbes, Marc; Burgeth, Bernhard; Meyer-Bäse, Anke

    2013-12-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) represents an established method for the detection and diagnosis of breast lesions. While mass-like enhancing lesions can be easily categorized according to the Breast Imaging Reporting and Data System (BI-RADS) MRI lexicon, a majority of diagnostically challenging lesions, the so-called non-mass-like enhancing lesions, remain both qualitatively and quantitatively difficult to analyze. Thus, the evaluation of kinetic and/or morphological characteristics of non-masses represents a challenging task for an automated analysis and is of crucial importance for advancing current computer-aided diagnosis (CAD) systems. Compared to the well-characterized mass-enhancing lesions, non-masses have ill-defined, blurred tumor borders and a kinetic behavior that is not easily generalizable and thus not readily discriminative between malignant and benign non-masses. To overcome these difficulties and pave the way for novel CAD systems for non-masses, we will evaluate several kinetic and morphological descriptors separately and a novel technique, the Zernike velocity moments, to capture the joint spatio-temporal behavior of these lesions, and additionally consider the impact of non-rigid motion compensation on a correct diagnosis.

  18. Architectural stability analysis of the rotary-laser scanning technique

    NASA Astrophysics Data System (ADS)

    Xue, Bin; Yang, Xiaoxia; Zhu, Jigui

    2016-03-01

    The rotary-laser scanning technique is an important method for scale measurements due to its high accuracy and large measurement range. This paper first introduces a newly designed measurement station which is able to provide two-dimensional measurement information, azimuth and elevation, using the rotary-laser scanning technique, then presents an architectural stability analysis of this technique through detailed theoretical derivation. Based on the designed station, a validation using both experiment and simulation is presented in order to verify the analytic conclusion. The results show that the architectural stability of the rotary-laser scanning technique is affected only by the difference between the two scanning angles, and that the difference which yields the best architectural stability can be calculated from pre-calibrated parameters of the two laser planes. This research gives insight into the rotary-laser scanning technique; moreover, the measurement accuracy of the technique can be further improved based on the results of the study.

  19. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    NASA Astrophysics Data System (ADS)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI, from launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established techniques and more speculative approaches; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  20. CT-based morphometric analysis of C1 laminar dimensions: C1 translaminar screw fixation is a feasible technique for salvage of atlantoaxial fusions

    PubMed Central

    Yew, Andrew; Lu, Derek; Lu, Daniel C.

    2015-01-01

    Background: Translaminar screw fixation has become an alternative in the fixation of the axial and subaxial cervical spine. We report utilization of this approach in the atlas as a salvage technique for atlantoaxial stabilization when C1 lateral mass screws are precluded. To assess the feasibility of translaminar fixation at the atlas, we have characterized the dimensions of the C1 lamina in the general adult population using computed tomography (CT)-based morphometry. Methods: A 46-year-old male with symptomatic atlantoaxial instability secondary to os odontoideum underwent bilateral C1 and C2 translaminar screw/rod fixation as C1 lateral mass fixation was precluded by an anomalous vertebral artery. The follow-up evaluation 2½ years postoperatively revealed an asymptomatic patient without recurrent neck/shoulder pain or clinical signs of instability. To better assess the feasibility of utilizing this approach in the general population, we retrospectively analyzed 502 consecutive cervical CT scans performed over a 3-month period in patients aged over 18 years at a single institution. Measurements of C1 bicortical diameter, bilateral laminar length, height, and angulation were performed. Laminar and screw dimensions were compared to assess instrumentation feasibility. Results: Review of CT imaging found that 75.9% of C1 lamina had a sufficient bicortical diameter, and 63.7% of C1 lamina had sufficient height to accept bilateral translaminar screw placement. Conclusions: CT-based measurement of atlas morphology in the general population revealed that a majority of C1 lamina had sufficient dimensions to accept translaminar screw placement. Although these screws appear to be a feasible alternative when lateral mass screws are precluded, further research is required to determine if they provide comparable fixation strength versus traditional instrumentation methods. PMID:26005585

  1. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price. DATES....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use...

  2. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    NASA Astrophysics Data System (ADS)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian motion is defined as P^2 = a^3, where P is the orbital period in years and a is the semi-major axis in astronomical units (AU). Keplerian motion works on small scales such as the size of the Solar System but not on large scales such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze the Keplerian motion of systems based on stellar age, to determine whether Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods such as probabilistic, linear, and proximity-based models. In probabilistic and statistical models of outliers, the parameters of a closed-form probability distribution are learned in order to detect the outliers. Linear models use regression analysis based techniques for detecting outliers. Proximity-based models use distance-based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density-based algorithms such as kernel density estimation. In this work, we use unsupervised learning algorithms with only the proximity-based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is whether the ratio of planetary mass to stellar mass is less than 0.001. In this work, we present our statistical analysis of the outliers thus detected.
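
    A minimal sketch of one proximity-based model named above: a k-nearest-neighbour distance score over (log period, log mass) pairs, followed by the stated validation criterion (planet-to-star mass ratio below 0.001). The catalogue values are synthetic placeholders.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(5)
period = 10 ** rng.normal(1.0, 1.0, 300)        # orbital periods [days]
m_planet = 10 ** rng.normal(-3.0, 0.7, 300)     # planetary masses [solar masses]
m_star = np.ones(300)                           # stellar masses [solar masses]

X = np.column_stack([np.log10(period), np.log10(m_planet)])
dist, _ = NearestNeighbors(n_neighbors=6).fit(X).kneighbors(X)
score = dist[:, 1:].mean(axis=1)                # mean distance to 5 neighbours
outliers = np.argsort(score)[-10:]              # highest-scoring points

valid = (m_planet / m_star < 1e-3)[outliers]    # validation criterion
print(outliers, valid)
```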

  3. Effective learning techniques for military applications using the Personalized Assistant that Learns (PAL) enhanced Web-Based Temporal Analysis System (WebTAS)

    NASA Astrophysics Data System (ADS)

    LaMonica, Peter; Dziegiel, Roger; Liuzzi, Raymond; Hepler, James

    2009-05-01

    The Personalized Assistant that Learns (PAL) Program is a Defense Advanced Research Projects Agency (DARPA) research effort that is advancing technologies in the area of cognitive learning by developing cognitive assistants to support military users, such as commanders and decision makers. The Air Force Research Laboratory's (AFRL) Information Directorate leveraged several core PAL components and applied them to the Web-Based Temporal Analysis System (WebTAS) so that users of this system can have automated features, such as task learning, intelligent clustering, and entity extraction. WebTAS is a modular software toolset that supports fusion of large amounts of disparate data sets, visualization, project organization and management, pattern analysis and activity prediction, and includes various presentation aids. WebTAS is predominantly used by analysts within the intelligence community and with the addition of these automated features, many transition opportunities exist for this integrated technology. Further, AFRL completed an extensive test and evaluation of this integrated software to determine its effectiveness for military applications in terms of timeliness and situation awareness, and these findings and conclusions, as well as future work, will be presented in this report.

  4. Function Analysis and Decomposition using Function Analysis Systems Technique

    SciTech Connect

    J. R. Wixson

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Mile's function analysis concepts and introduced the methodology called Function Analysis Systems Techniques (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  6. Analysis of diagnostic calorimeter data by the transfer function technique

    NASA Astrophysics Data System (ADS)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
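
    A hedged sketch of the transfer-function idea: if the rear-side temperature history is the front-side energy flux convolved with the wall's thermal impulse response, the flux can be recovered by division in the Fourier domain via the fast Fourier transform. The impulse response, signals, and regularisation constant below are assumptions for illustration.

```python
import numpy as np

n = 1024
t = np.arange(n) * 0.01                          # time axis [s]
h = np.exp(-t / 0.5); h /= h.sum()               # assumed thermal impulse response
flux = np.where((t > 2) & (t < 4), 1.0, 0.0)     # "true" front-side energy flux

# Synthesize the measured rear-side signal by convolution in the Fourier domain.
temp = np.real(np.fft.ifft(np.fft.fft(flux) * np.fft.fft(h)))

# Recover the flux by regularised (Wiener-style) deconvolution.
H = np.fft.fft(h)
eps = 1e-3 * np.abs(H).max()                     # crude regularisation constant
flux_rec = np.real(np.fft.ifft(np.fft.fft(temp) * np.conj(H) / (np.abs(H) ** 2 + eps ** 2)))
print("max reconstruction error:", np.abs(flux_rec - flux).max())
```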

  7. Analysis of diagnostic calorimeter data by the transfer function technique.

    PubMed

    Delogu, R S; Poggi, C; Pimazzoni, A; Rossi, G; Serianni, G

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing. PMID:26932104

  8. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  9. Proteomic Analysis of Vitreous Biopsy Techniques

    PubMed Central

    Skeie, Jessica M.; Brown, Eric N.; Martinez, Harryl D.; Russell, Stephen R.; Birkholz, Emily S.; Folk, James C.; Boldt, H. Culver; Gehrs, Karen M.; Stone, Edwin M.; Wright, Michael E.; Mahajan, Vinit B.

    2013-01-01

    Purpose To compare vitreous biopsy methods using analysis platforms employed in proteomics biomarker discovery. Methods Vitreous biopsies from 10 eyes were collected sequentially using a 23-gauge needle and a 23-gauge vitreous cutter instrument. Paired specimens were evaluated by UV absorbance spectroscopy, SDS-PAGE, and mass-spectrometry (LC-MS/MS). Results The total protein concentration obtained with a needle and vitrectomy instrument biopsy averaged 1.10 mg/ml (SEM = 0.35) and 1.13 mg/ml (SEM = 0.25), respectively. In eight eyes with low or medium viscidity, there was a very high correlation (R2 = 0.934) between the biopsy methods. When data from two eyes with high viscidity vitreous were included, the correlation was reduced (R2 = 0.704). The molecular weight protein SDS-PAGE profiles of paired needle and vitreous cutter samples were similar, except for a minority of pairs with single band intensity variance. Using LC-MS/MS, equivalent peptides were identified with similar frequencies (R2 ≥ 0.90) in paired samples. Conclusion Proteins and peptides collected from vitreous needle biopsies are nearly equivalent to those obtained from a vitreous cutter instrument. This study suggests both techniques may be used for most proteomic and biomarker discovery studies of vitreoretinal diseases, although a minority of proteins and peptides may differ in concentration. PMID:23095728

  10. Nuclear based techniques for detection of contraband

    SciTech Connect

    Gozani, T.

    1993-12-31

    The detection of contraband such as explosives and drugs concealed in luggage or other containers can be quite difficult. Nuclear techniques offer capabilities which are essential to effective detection devices. This report describes the features of various nuclear techniques and instrumentation.

  11. Advanced Techniques for Root Cause Analysis

    Energy Science and Technology Software Center (ESTSC)

    2000-09-19

    Five items make up this package, or can be used individually. The Chronological Safety Management Template utilizes a linear adaptation of the Integrated Safety Management System laid out in the form of a template that greatly enhances the ability of the analyst to perform the first step of any investigation, which is to gather all pertinent facts and identify causal factors. The Problem Analysis Tree is a simple three (3) level problem analysis tree which is easier for organizations outside of WSRC to use. Another part is the Systemic Root Cause Tree. One of the most basic and unique features of Expanded Root Cause Analysis is the Systemic Root Cause portion of the Expanded Root Cause Pyramid. The Systemic Root Causes are even more basic than the Programmatic Root Causes and represent root causes that cut across multiple (if not all) programs in an organization. The Systemic Root Cause portion contains 51 causes embedded at the bottom level of a three-level Systemic Root Cause Tree that is divided into logical, organizationally based categories to assist the analyst. The Computer Aided Root Cause Analysis allows the analyst at each level of the Pyramid to a) obtain a brief description of the cause that is being considered, b) record a decision that the item is applicable, c) proceed to the next level of the Pyramid to see only those items at the next level of the tree that are relevant to the particular cause that has been chosen, and d) at the end of the process automatically print out a summary report of the incident, the causal factors as they relate to the safety management system, the probable causes, apparent causes, Programmatic Root Causes, and Systemic Root Causes for each causal factor and the associated corrective action.

  12. A Quantitative Study of Gully Erosion Based on Object-Oriented Analysis Techniques: A Case Study in Beiyanzikou Catchment of Qixia, Shandong, China

    PubMed Central

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion of a small watershed gully system at Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder line of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder line of gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average range difference between points field-measured along the edge of gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m2 and 5074.1790 m3, and 1316.1250 m2 and 1591.5784 m3, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626
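
    A minimal sketch of the volume step described above, under stated assumptions: a planar pre-erosion surface is fitted by linear regression to elevations along the two extracted shoulder lines, and the erosion volume is the integral of that surface's height above the present DEM over the gully cells. The DEM is synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)
yy, xx = np.mgrid[0:100, 0:100].astype(float)      # 1 m grid cells
dem = 50.0 - 0.05 * yy + 0.2 * rng.random((100, 100))
gully = (xx > 40) & (xx < 60)
dem[gully] -= 2.0                                  # fabricated incised channel

# Fit z = a*x + b*y + c to elevations along the two shoulder lines.
edges = (xx == 40) | (xx == 60)
A = np.column_stack([xx[edges], yy[edges], np.ones(int(edges.sum()))])
coef, *_ = np.linalg.lstsq(A, dem[edges], rcond=None)
surface = coef[0] * xx + coef[1] * yy + coef[2]    # fitted pre-erosion surface

volume = np.sum((surface - dem)[gully].clip(min=0)) * 1.0  # cell area = 1 m^2
print(f"eroded volume: {volume:.1f} m^3")
```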

  13. Investigation of electroforming techniques, literature analysis report

    NASA Technical Reports Server (NTRS)

    Malone, G. A.

    1975-01-01

    A literature analysis is presented of reports, specifications, and documented experiences with the use of electroforming to produce copper and nickel structures for aerospace and other engineering applications. The literature period covered is from 1948 to 1974. Specific effort was made to correlate mechanical property data for the electrodeposited material with known electroforming solution compositions and operating conditions. From this survey, electrolytes are suggested for selection to electroform copper and nickel outer shells on regeneratively cooled thrust chamber liners, and other devices subject to thermal and pressure exposure, based on mechanical properties obtainable, performance under various thermal environments, and ease of process control for product reproducibility. Processes of potential value in obtaining sound bonds between electrodeposited copper and nickel and copper alloy substrates are also discussed.

  14. A technique for human error analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  15. Using remote sensing techniques and field-based structural analysis to explore new gold and associated mineral sites around Al-Hajar mine, Asir terrane, Arabian Shield

    NASA Astrophysics Data System (ADS)

    Sonbul, Abdullah R.; El-Shafei, Mohamed K.; Bishta, Adel Z.

    2016-05-01

    Modern earth resource satellites provide huge amounts of digital imagery at different resolutions. These satellite imageries are considered one of the most significant sources of data for mineral exploration. Image processing techniques were applied to the exposed rocks around the Al-Aqiq area of the Asir terrane in the southern part of the Arabian Shield. The area under study has two sub-parallel N-S trending metamorphic belts of greenschist facies. The first belt is located southeast of Al-Aqiq, where the Al-Hajar Gold Mine is situated. It is essentially composed of metavolcanic and metasedimentary rocks, and it is intruded by different plutonic rocks, primarily diorite, syenite, and porphyritic granite. The second belt is located northwest of Al-Aqiq, and it is composed of metavolcanic and metasedimentary rocks and is intruded by granite bodies. The current study aimed to distinguish the lithological units, detect and map the alteration zones, and extract the major fault lineaments around the Al-Hajar gold prospect. Digital satellite imageries, including Landsat 7 ETM+ multispectral and panchromatic and SPOT-5, were used in addition to field verification. Areas with spectral signatures similar to the prospect were identified in the nearby metamorphic belt; this was considered a target area and was inspected in the field. The relationships between the alteration zones, the mineral deposits, and the structural elements were used to locate the ore-bearing zones in the subsurface. The metasedimentary units of the target area showed dextral-ductile shearing, top-to-the-north, and the presence of a dominant mineralized quartz vein system. The area to the north of the Al-Hajar prospect also showed sub-parallel shear zones along which different types of alteration were detected. Field-based criteria such as hydrothermal breccia, jasper, iron gossans, and porphyritic granite strongly indicate the presence of porphyry-type ore deposits in the Al-Hajar metamorphic belt that

  17. DCT-based cyber defense techniques

    NASA Astrophysics Data System (ADS)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content, in order to attack the end user. Most of the attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
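
    An illustrative sketch in the spirit of the smart-threshold idea (not the paper's exact scheme): zero small high-frequency DCT coefficients of an 8x8 block, the region where fragile embedded payloads tend to reside. The threshold and frequency cut below are assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(7)
block = rng.integers(0, 256, size=(8, 8)).astype(float)   # stand-in image block

coeffs = dctn(block, norm="ortho")
u, v = np.mgrid[0:8, 0:8]
high = (u + v) >= 6                                # high-frequency region
coeffs[high & (np.abs(coeffs) < 10.0)] = 0.0       # threshold small coefficients

cleaned = idctn(coeffs, norm="ortho").clip(0, 255)
print("max pixel change:", np.abs(cleaned - block).max())
```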

  18. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  19. Neutron Activation Analysis: Techniques and Applications

    SciTech Connect

    MacLellan, Ryan

    2011-04-27

    The role of neutron activation analysis in low-energy, low-background experiments is discussed in comparison with competing methods. Radiochemical neutron activation analysis is introduced. The procedure of instrumental neutron activation analysis is detailed, especially with respect to the measurement of trace amounts of natural radioactivity. The determination of the reactor neutron spectrum parameters required for neutron activation analysis is also presented.

  20. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  1. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
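
    One standard way to update a hypothesis node under uncertain (soft) evidence of the kind described above is Jeffrey's rule; the sketch below shows it on a hypothetical two-node network. The network, all probabilities, and the soft-evidence distribution are illustrative assumptions, not the paper's models.

```python
# Minimal sketch of a soft-evidence update via Jeffrey's rule.
import numpy as np

prior_h = np.array([0.3, 0.7])          # P(H): hypothesis node with states {h0, h1}
p_e_given_h = np.array([[0.9, 0.2],     # P(E|H): rows index E states,
                        [0.1, 0.8]])    # columns index H states

# Hard-evidence posteriors P(H|e) via Bayes' rule, one row per E state.
joint = p_e_given_h * prior_h           # P(e, h)
p_e = joint.sum(axis=1)                 # P(e)
p_h_given_e = joint / p_e[:, None]      # P(H|e)

# Soft evidence: the user believes E is in state e0 with probability 0.6.
q_e = np.array([0.6, 0.4])
posterior_h = q_e @ p_h_given_e         # Jeffrey's rule: sum_e q(e) P(H|e)
print(posterior_h)                      # updated distribution over H
```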

  2. Cochlear implant simulator for surgical technique analysis

    NASA Astrophysics Data System (ADS)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  3. Liquid refractometer based on fringe projection technique

    NASA Astrophysics Data System (ADS)

    de Angelis, Marco; De Nicola, Sergio; Ferraro, Pietro; Finizio, Andrea; Pierattini, Giovanni

    1999-08-01

    Measurement of the refractive index of liquids is of great importance in applications such as the characterization and adulteration control of commonly used liquids and in pollution monitoring. We present and discuss a fringe projection technique for measuring the index of refraction of transparent liquid materials.

  4. Analysis and calibration techniques for superconducting resonators.

    PubMed

    Cataldo, Giuseppe; Wollack, Edward J; Barrentine, Emily M; Brown, Ari D; Moseley, S Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented. PMID:25638068
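
    The sketch below shows the general flavor of extracting a resonance frequency and width by fitting a rational model to transmission data. It is a minimal illustration with a simplified single-notch model and synthetic data, far simpler than the paper's full phenomenological and ABCD-matrix treatments.

```python
# Minimal sketch: fit a simplified notch-resonator model to |S21| data.
import numpy as np
from scipy.optimize import curve_fit

def s21_mag(f, f0, q, depth):
    # |S21| of a single notch-type resonator (simplified rational model).
    return np.abs(1 - depth / (1 + 2j * q * (f - f0) / f0))

f = np.linspace(4.99, 5.01, 401)                    # frequency, GHz
rng = np.random.default_rng(8)
data = s21_mag(f, 5.0, 2.0e4, 0.8) + rng.normal(0, 0.005, f.size)

popt, _ = curve_fit(s21_mag, f, data, p0=(5.0, 1.0e4, 0.5))
f0, q = popt[0], popt[1]
print(f"f0 = {f0:.6f} GHz, Q = {q:.0f}, linewidth = {f0 / q * 1e6:.1f} kHz")
```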

  5. Analysis and calibration techniques for superconducting resonators

    NASA Astrophysics Data System (ADS)

    Cataldo, Giuseppe; Wollack, Edward J.; Barrentine, Emily M.; Brown, Ari D.; Moseley, S. Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.

  6. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 7, April 1, 1993--June 30, 1993

    SciTech Connect

    Smith, D.M.

    1993-09-01

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultramicro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. We will investigate the dependence of common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules (¹²⁹Xe, ³He, ²H₂, ¹⁴N₂, ¹⁴NH₃, ¹⁵N₂, ¹³CH₄, ¹³CO₂) and the pore surfaces in coals.

  7. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 3, July 1, 1992--September 30, 1992

    SciTech Connect

    Smith, D.M.

    1992-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultramicro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. We believe that measurement of the NMR parameters of various gas phase and adsorbed phase NMR active probes can provide the resolution to this problem. We now have two suites of well-characterized microporous materials, including oxides (zeolites and silica gel) and activated carbons from our industrial partner, Air Products in Allentown, PA. Our current work may be divided into three areas: small-angle X-ray scattering (SAXS), adsorption, and NMR.

  8. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 9, October 1, 1993--December 30, 1993

    SciTech Connect

    Smith, D.M.

    1993-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultramicro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. We will investigate the dependence of common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules (¹²⁹Xe, ³He, ¹⁴N₂, ¹⁴NH₃, ¹⁵N₂, ¹³CH₄, ¹³CO₂) and the pore surfaces. Our current work may be divided into three areas: small-angle X-ray scattering (SAXS), adsorption, and NMR.

  9. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 8, July 1, 1993--September 30, 1993

    SciTech Connect

    Smith, D.M.

    1993-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultramicro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. The dependence of common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals is investigated. In particular, the interaction between several small molecules (¹²⁹Xe, ³He, ¹⁴N₂, ¹⁴NH₃, ¹⁵N₂, ¹³CH₄, ¹³CO₂) and the pore surfaces is studied.

  10. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 6, January 1, 1993--March 31, 1993

    SciTech Connect

    Smith, D.M.

    1993-08-01

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultramicro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. We will investigate the dependence of common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules (¹²⁹Xe, ³He, ²H₂, ¹⁴N₂, ¹⁴NH₃, ¹⁵N₂, ¹³CH₄, ¹³CO₂) and the pore surfaces in coals.

  11. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    PubMed Central

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate, with a retention apparatus, the force required to dislodge them. Readings were subjected to the nonparametric Friedman two-way analysis of variance followed by Bonferroni-corrected Wilcoxon matched-pairs signed-ranks tests. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g), and the differences between techniques were significant. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique; the least retention was seen with the conventional molding technique. PMID:27382542
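
    The statistical workflow named above (Friedman test followed by Bonferroni-corrected Wilcoxon tests) can be reproduced with standard tools; the sketch below uses hypothetical dislodgement forces generated around the reported group means, not the study's data.

```python
# Minimal sketch of the nonparametric comparison described above.
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(1)
conventional = 2468 + rng.normal(0, 120, 10)   # grams, hypothetical
anchorized   = 2913 + rng.normal(0, 120, 10)
injection    = 3740 + rng.normal(0, 120, 10)

stat, p = friedmanchisquare(conventional, anchorized, injection)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")

# Pairwise follow-up with Wilcoxon signed-rank tests and a Bonferroni
# correction for the three comparisons.
pairs = [(conventional, anchorized), (conventional, injection), (anchorized, injection)]
for a, b in pairs:
    w, pw = wilcoxon(a, b)
    print(f"Wilcoxon W = {w:.1f}, Bonferroni-adjusted p = {min(pw * 3, 1.0):.4f}")
```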

  12. IMAGE-BASED EROSION MEASUREMENT TECHNIQUE

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Two- and three-dimensional analysis using close range digital photographs can be very useful in measuring changes in erosion on the landscape. Computer software exists for conducting photographic analysis but is often either cost prohibitive or very labor intensive to use. This paper describes a ...

  13. Typology of Delivery Quality: Latent Profile Analysis of Teacher Engagement and Delivery Techniques in a School-Based Prevention Intervention, "Keepin' It REAL" Curriculum

    ERIC Educational Resources Information Center

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may…

  14. A comparison of different texture analysis techniques

    SciTech Connect

    Wright, S.I.; Kocks, U.F.

    1996-08-01

    With the advent of automated techniques for measuring individual crystallographic orientations using electron diffraction, there has been an increase in the use of local orientation measurements for measuring textures in polycrystalline materials. Several studies have focused on the number of single orientation measurements necessary to achieve the statistics of more conventional texture measurement techniques, such as pole figure measurement using x-ray and neutron diffraction. This investigation considers this question and also examines the nature of the differences between textures measured using individual orientation measurements and those measured using x-ray diffraction.

  15. Noninvasive in vivo glucose sensing using an iris based technique

    NASA Astrophysics Data System (ADS)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane, and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, a custom ocular fixation device was used. During the experimental time frame, near-infrared-illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based partial least squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke error grid analysis (CEGA). Initial validation results were promising and showed that a high percentage of the predicted glucose concentrations were within 20% of the reference values.
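
    A PLS calibration of the kind named above can be sketched as follows. The feature extraction (here, generic image-derived features) and all data are hypothetical stand-ins for the authors' iris-image inputs; only the overall PLS-plus-20%-screen workflow is illustrated.

```python
# Minimal sketch of an image-feature PLS glucose calibration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_samples, n_features = 60, 40          # e.g., 40 iris sub-region intensities (assumed)
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
glucose = X @ true_w * 5 + 120 + rng.normal(0, 5, n_samples)  # mg/dL, synthetic

X_tr, X_te, y_tr, y_te = train_test_split(X, glucose, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

# A simple Clarke-style screen: fraction of predictions within 20% of reference.
within_20 = np.mean(np.abs(pred - y_te) / y_te < 0.20)
print(f"Fraction within 20% of reference: {within_20:.2f}")
```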

  16. Surveying converter lining erosion state based on laser measurement technique

    NASA Astrophysics Data System (ADS)

    Li, Hongsheng; Shi, Tielin; Yang, Shuzi

    1998-08-01

    It is very important to survey the erosion state of a steelmaking converter lining in real time, so as to optimize the technological process, extend converter life and reduce steelmaking production costs. This paper gives one practical method based on laser measurement techniques. It presents the basic principle of the measurement method, the composition of the measurement system, and research on the key technological problems. The method is based on laser range finding to net points on the surface of the surveyed converter lining, combined with angle measurement of the laser beams. The angle signals are also used to help realize the automatic scanning function. The laser signals are modulated and encoded, and wavelet analysis and other filtering algorithms are adopted to denoise the data and extract useful information. The main ideas of algorithms such as net-point measuring path planning and optimal positioning of the measurement device are also given, in order to improve the measurement precision and real-time performance of the system.

  17. Development of Single-Nucleotide Polymorphism- Based Phylum-Specific PCR Amplification Technique: Application to the Community Analysis Using Ciliates as a Reference Organism

    PubMed Central

    Jung, Jae-Ho; Kim, Sanghee; Ryu, Seongho; Kim, Min-Seok; Baek, Ye-Seul; Kim, Se-Joo; Choi, Joong- Ki; Park, Joong-Ki; Min, Gi-Sik

    2012-01-01

    Despite recent advances in mass sequencing technologies such as pyrosequencing, assessment of culture-independent microbial eukaryote community structures using universal primers remains very difficult due to the tremendous richness and complexity of organisms in these communities. Use of a specific PCR marker targeting a particular group would provide enhanced sensitivity and more in-depth evaluation of microbial eukaryote communities compared to what can be achieved with universal primers. We discovered that many phylum- or group-specific single-nucleotide polymorphisms (SNPs) exist in small subunit ribosomal RNA (SSU rRNA) genes from diverse eukaryote groups. By applying this discovery to a known simple allele-discriminating (SAP) PCR method, we developed a technique that enables the identification of organisms belonging to a specific higher taxonomic group (or phylum) among diverse types of eukaryotes. We performed an assay using two complementary methods, pyrosequencing and clone library screening. Specificities for the targeted group (ciliates) in bulked environmental samples were 94.6% for the clone library and 99.2% for pyrosequencing. In particular, our novel technique showed high selectivity for rare species, a feature that may be more important than the ability to identify quantitatively predominant species in community structure analyses. Additionally, our data revealed that a target-specific library (a ciliate-specific one in the present study) can better explain the ecological features of a sampling locality than a universal library. PMID:22965748

  18. Recent trends in particle size analysis techniques

    NASA Technical Reports Server (NTRS)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed according to three operating principles, including particle size and shape descriptions. Trends in recently developed particle size analysis equipment show that compact electronic circuitry and rapid data processing systems have been widely adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  19. The cast aluminum denture base. Part II: Technique.

    PubMed

    Halperin, A R; Halperin, G C

    1980-07-01

    A technique to wax up and cast an aluminum base and a method to incorporate the base into the final denture base have been discussed. This technique does not use induction casting; rather, it uses two casting ovens and a centrifugal casting machine. PMID:6991680

  20. Liquid Tunable Microlenses based on MEMS techniques

    PubMed Central

    Zeng, Xuefeng; Jiang, Hongrui

    2013-01-01

    The recent rapid development in microlens technology has provided many opportunities for miniaturized optical systems and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest, as their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlenses generally relies on microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses is first discussed, followed by descriptions of the three main categories of tunable microlenses involving MEMS techniques: mechanically driven, electrically driven, and those integrated within microfluidic systems. PMID:24163480

  1. Liquid tunable microlenses based on MEMS techniques

    NASA Astrophysics Data System (ADS)

    Zeng, Xuefeng; Jiang, Hongrui

    2013-08-01

    The recent rapid development in microlens technology has provided many opportunities for miniaturized optical systems and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest, as their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlenses generally relies on microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses is first discussed, followed by descriptions of the three main categories of tunable microlenses involving MEMS techniques: mechanically driven, electrically driven, and those integrated within microfluidic systems.

  2. Speech recognition based on pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Rabiner, Lawrence R.

    1990-05-01

    Algorithms for speech recognition can be characterized broadly as pattern recognition approaches and acoustic phonetic approaches. To date, the greatest degree of success in speech recognition has been obtained using pattern recognition paradigms. Pattern recognition techniques were applied to the problems of isolated word (or discrete utterance) recognition, connected word recognition, and continuous speech recognition. It is shown that understanding (and consequently the resulting recognizer performance) is best for the simplest recognition tasks and is considerably less well developed for large-scale recognition systems.

  3. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George

    2013-01-01

    Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, and translation of an input image. Additionally, the analysis re-uses commonly occurring terms in Zernike calculations to achieve efficiencies over traditional Zernike moment calculation.
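
    Rotation-invariant Zernike moment descriptors of the kind described above can be computed with off-the-shelf tools; the sketch below assumes the mahotas library is available and uses a synthetic disk in place of a real hand segment, so it illustrates only the descriptor computation, not the full pipeline.

```python
# Minimal sketch of Zernike moment descriptors for a binary segment.
import numpy as np
import mahotas

y, x = np.mgrid[:128, :128]
segment = ((x - 64) ** 2 + (y - 64) ** 2 < 40 ** 2).astype(np.uint8)  # toy segment

descriptors = mahotas.features.zernike_moments(segment, radius=40, degree=8)
print(descriptors.shape)   # one magnitude per Zernike polynomial up to degree 8
```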

  4. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, and translation of an input image. Additionally, the analysis re-uses commonly occurring terms in Zernike calculations to achieve efficiencies over traditional Zernike moment calculation.

  5. Survey of immunoassay techniques for biological analysis

    SciTech Connect

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are just as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs.

  6. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creating a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  7. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O., Jr.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
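
    The fit described above (a Pearson Type III distribution applied to the logarithms of annual peaks, by the method of moments) can be sketched as follows; the peak-flow series is hypothetical, and the standard 100-year quantile is shown as an example output.

```python
# Minimal sketch of a log-Pearson Type III fit by the method of moments.
import numpy as np
from scipy import stats

peaks = np.array([1200., 3400., 2100., 5600., 1800., 2900.,
                  4100., 2500., 1500., 3800., 2200., 3100.])  # annual peaks, cfs
logq = np.log10(peaks)

mean, sd = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)

# 100-year flood: quantile with annual exceedance probability 0.01.
# scipy's pearson3 is parameterized directly by the skew of the fitted variate.
q100_log = stats.pearson3.ppf(0.99, skew, loc=mean, scale=sd)
print(f"Estimated 100-year peak discharge: {10 ** q100_log:.0f} cfs")
```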

  8. Laser-induced breakdown spectroscopy technique for quantitative analysis of aqueous solution using matrix conversion based on plant fiber spunlaced nonwovens.

    PubMed

    Chen, Chenghan; Niu, Guanghui; Shi, Qi; Lin, Qingyu; Duan, Yixiang

    2015-10-01

    In the present work, laser-induced breakdown spectroscopy (LIBS) was applied to detect concentrations of chromium and nickel in aqueous solution by matrix conversion, using plant fiber spunlaced nonwovens as a solid-phase support, which effectively avoids inherent difficulties of liquid LIBS analysis such as splashing, quenching effects, and a shorter plasma lifetime. Drops of the sample solution were transferred to the plant fiber spunlaced nonwoven surface and diffused uniformly from the center to the whole area of the substrate. Owing to its good hydrophilicity, the plant fiber spunlaced nonwoven can hold more of the liquid sample, and the surface of this material never wrinkles after being dried in a drying oven, which effectively reduces deviations during LIBS analysis. In addition, the plant fiber spunlaced nonwovens used in the present work are relatively convenient and low cost, and the analysis procedure is simple and fast, which are the unique features of LIBS technology. Therefore, this method has potential applications for practical and in situ analyses. To achieve sensitive elemental detection, the optimal delay time in this experiment was investigated. Under the optimized conditions, the limits of detection for Cr and Ni are 0.7 and 5.7 μg·mL⁻¹, respectively. The results obtained in the present study show that the matrix conversion method is a feasible option for analyzing heavy metals in aqueous solutions by LIBS technology. PMID:26479603
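
    Limits of detection such as those reported above are conventionally derived from a calibration curve with the 3-sigma criterion; the sketch below shows that generic calculation with hypothetical calibration points and blank noise, not the paper's measurements.

```python
# Minimal sketch of a 3-sigma limit-of-detection calculation for a LIBS line.
import numpy as np

conc = np.array([0., 5., 10., 20., 40.])                 # Cr standards, ug/mL (assumed)
intensity = np.array([102., 340., 585., 1050., 2010.])   # line intensity, a.u. (assumed)

slope, intercept = np.polyfit(conc, intensity, 1)        # linear calibration
sigma_blank = 11.0                                       # std. dev. of blank signal (assumed)

lod = 3 * sigma_blank / slope                            # 3-sigma limit of detection
print(f"LOD = {lod:.2f} ug/mL")
```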

  9. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The problem of small sample sizes encountered in the analysis of space-flight data is examined. Because only a small amount of data is available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on the considerations needed to choose the most appropriate test for a given type of analysis.

  10. GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES

    SciTech Connect

    J. R. KAMM; ET AL

    2001-01-01

    The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i. e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. (13), which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.
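
    Of the four measures named above, the structure function is perhaps the simplest to illustrate; the sketch below computes a second-order structure function S2(r) = <|f(x+r) - f(x)|^2> of a synthetic 1-D profile, which stands in for an experimental intensity trace.

```python
# Minimal sketch of a second-order structure function of a disordered signal.
import numpy as np

rng = np.random.default_rng(3)
profile = np.cumsum(rng.normal(size=1024))   # synthetic disordered signal (random walk)

def structure_function(f, max_lag=256, order=2):
    lags = np.arange(1, max_lag)
    return lags, np.array([np.mean(np.abs(f[r:] - f[:-r]) ** order) for r in lags])

lags, s2 = structure_function(profile)
# For a Brownian-like signal S2(r) ~ r, i.e., a log-log slope near 1.
slope = np.polyfit(np.log(lags), np.log(s2), 1)[0]
print(f"log-log slope of S2: {slope:.2f}")
```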

  11. Analysis of signal processing techniques in pulsed thermography

    NASA Astrophysics Data System (ADS)

    Lopez, Fernando; Ibarra-Castanedo, Clemente; Maldague, Xavier; de Paulo Nicolau, Vicente

    2013-05-01

    Pulsed thermography (PT) is one of the most widely used approaches for the inspection of composite materials, its main attraction being its deployment in the transient regime. However, due to the physical phenomena involved during the inspection, the signals acquired by the infrared camera are nearly always affected by external reflections and local emissivity variations. Furthermore, non-uniform heating at the surface and thermal losses at the edges of the material also constrain the detection capability. For these reasons, the thermographic signals should be processed in order to improve, qualitatively and quantitatively, the quality of the thermal images. Signal processing constitutes an important step in the chain of thermal image analysis, especially when defect characterization is required. Several of the signal processing techniques employed nowadays are based on the one-dimensional solution of Fourier's law of heat conduction. This investigation brings into discussion the three most widely used techniques based on the 1D Fourier's law: thermographic signal reconstruction (TSR), differential absolute contrast (DAC) and pulsed phase thermography (PPT), applied to carbon fiber laminated composites. It is of special interest to determine the detection capabilities of each technique, allowing more reliable results when performing an inspection by PT.

  12. Trends and Techniques for Space Base Electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1979-01-01

    Simulations of various phosphorus and boron diffusions in SOS were completed, and a sputtering system, furnaces, and photolithography-related equipment were set up. Double-layer metal experiments initially utilized wet chemistry techniques. By incorporating ultrasonic etching of the vias, premetal cleaning with a modified buffered HF, phosphorus-doped vapox, and extended sintering, yields of 98% were obtained using the standard test pattern. A two-dimensional modeling program was written for simulating short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption used is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. Although the program is incomplete, a solution of the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2-D MOSFET simulation programs is summarized.

  13. Techniques for detumbling a disabled space base

    NASA Technical Reports Server (NTRS)

    Kaplan, M. H.

    1973-01-01

    Techniques and conceptual devices for carrying out detumbling operations are examined, and progress in the development of these concepts is discussed. Devices which reduce tumble to simple spin through active linear motion of a small mass are described, together with a Module for Automatic Dock and Detumble (MADD) that could perform an orbital transfer from the shuttle in order to track and dock at a preselected point on the distressed craft. Once docked, MADD could apply torques by firing thrusters to detumble the passive vehicle. Optimum combinations of mass-motion and external devices for various situations should be developed. The need for completely formulating the automatic control logic of MADD is also emphasized.

  14. Identification of Tea Storage Times by Linear Discrimination Analysis and Back-Propagation Neural Network Techniques Based on the Eigenvalues of Principal Components Analysis of E-Nose Sensor Signals

    PubMed Central

    Yu, Huichun; Wang, Yongwei; Wang, Jun

    2009-01-01

    An electronic nose (E-nose) was employed to detect the aroma of green tea after different storage times. Longjing green tea dry leaves, beverages and residues were detected with an E-nose, respectively. In order to decrease the data dimensionality and optimize the feature vector, the E-nose sensor response data were analyzed by principal components analysis (PCA) and the five main principal components values were extracted as the input for the discrimination analysis. The storage time (0, 60, 120, 180 and 240 days) was better discriminated by linear discrimination analysis (LDA) and was predicted by the back-propagation neural network (BPNN) method. The results showed that the discrimination and testing results based on the tea leaves were better than those based on tea beverages and tea residues. The mean errors of the tea leaf data were 9, 2.73, 3.93, 6.33 and 6.8 days, respectively. PMID:22408494
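
    The feature pipeline described above (PCA reduction to five principal components followed by discriminant analysis of storage-time classes) can be sketched as follows; the sensor matrix is synthetic stand-in data, not the E-nose measurements.

```python
# Minimal sketch of a PCA-then-LDA pipeline for E-nose storage-time classes.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)
days = np.repeat([0, 60, 120, 180, 240], 20)            # storage-time labels
X = rng.normal(size=(100, 10)) + days[:, None] / 120.0  # 10 E-nose sensors (synthetic)

scores = PCA(n_components=5).fit_transform(X)           # five main PCs as features
lda = LinearDiscriminantAnalysis().fit(scores, days)
print(f"LDA training accuracy: {lda.score(scores, days):.2f}")
```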

  15. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... published a proposed rule in the Federal Register at 77 FR 40552 on July 10, 2012, to clarify and pinpoint a... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify and give a precise reference in the use of a price analysis technique in order to establish a...

  16. Soil Analysis using the semi-parametric NAA technique

    SciTech Connect

    Zamboni, C. B.; Silveira, M. A. G.; Medina, N. H.

    2007-10-26

    The semi-parametric neutron activation analysis technique, using Au as a flux monitor, was applied to measure the concentrations of Br, Ca, Cl, K, Mn and Na for soil characterization. The results were compared with those obtained using the instrumental neutron activation analysis technique and were found to be compatible. The viability, advantages, and limitations of using these two analytical methodologies are discussed.

  17. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  18. Comparative analysis of techniques for measuring the modulation transfer functions of charge-coupled devices based on the generation of laser speckle.

    PubMed

    Pozo, Antonio Manuel; Rubiño, Manuel

    2005-03-20

    Two methods for measuring the modulation transfer function (MTF) of a charge-coupled device (CCD) that are based on the generation of laser speckle are analyzed and compared. The method based on a single-slit aperture is a quick method, although the measurements are limited to frequencies below the Nyquist frequency of the device. The double-slit method permits measurements up to approximately 1.8 times the Nyquist frequency, although it is a slower method because of the need to move the CCD. The difference between the MTF values obtained with the two methods is less than 0.1 in magnitude; the root-mean-square error between the two curves is 0.046 (4.6%). PMID:15813255
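
    The core of the speckle-based approach is that the measured power spectral density of a speckle pattern equals the input PSD multiplied by the squared MTF. The sketch below illustrates that processing chain only: the "speckle" frame is white noise (so the input PSD is assumed flat), not a real sensor recording.

```python
# Minimal sketch: estimate an MTF from the row-averaged PSD of a speckle frame.
import numpy as np

rng = np.random.default_rng(5)
frame = rng.normal(size=(256, 256))                # stand-in speckle frame

rows = frame - frame.mean(axis=1, keepdims=True)
psd = np.mean(np.abs(np.fft.rfft(rows, axis=1)) ** 2, axis=0)  # row-averaged PSD

# With a spectrally flat input, measured PSD = MTF^2 x input PSD, so the MTF
# follows from the square root of the normalized PSD.
mtf = np.sqrt(psd / psd[1])
print(mtf[:8])
```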

  19. Comparative analysis of techniques for measuring the modulation transfer functions of charge-coupled devices based on the generation of laser speckle

    NASA Astrophysics Data System (ADS)

    Pozo, Antonio Manuel; Rubiño, Manuel

    2005-03-01

    Two methods for measuring the modulation transfer function (MTF) of a charge-coupled device (CCD) that are based on the generation of laser speckle are analyzed and compared. The method based on a single-slit aperture is a quick method, although the measurements are limited to frequencies below the Nyquist frequency of the device. The double-slit method permits measurements up to approximately 1.8 times the Nyquist frequency, although it is a slower method because of the need to move the CCD. The difference between the MTF values obtained with the two methods is less than 0.1 in magnitude; the root-mean-square error between the two curves is 0.046 (4.6%).

  20. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...
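
    Of the techniques named above, the Monte Carlo component with Latin hypercube sampling can be sketched as follows; the two-parameter response function is a hypothetical stand-in for a WEPP or RZWQM output, and the parameter distributions are assumptions for illustration.

```python
# Minimal sketch of Monte Carlo uncertainty analysis with Latin hypercube sampling.
import numpy as np
from scipy.stats import qmc, norm

sampler = qmc.LatinHypercube(d=2, seed=7)
u = sampler.random(n=500)                       # stratified samples on [0, 1)^2

# Map to assumed parameter distributions (means and sds are illustrative).
k = norm.ppf(u[:, 0], loc=0.5, scale=0.05)      # e.g., erodibility
s = norm.ppf(u[:, 1], loc=0.1, scale=0.02)      # e.g., slope factor

response = k ** 1.5 * (1 + 10 * s)              # stand-in model output
print(f"mean = {response.mean():.3f}, sd = {response.std(ddof=1):.3f}")
```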

  1. Morphometric techniques for orientation analysis of karst in northern Florida

    SciTech Connect

    Jenkins, D.T.; Beck, B.F.

    1985-01-01

    Morphometric techniques for the analysis of karst landscape orientation data based on swallet catchment areas can be highly inadequate. The long axes of catchment areas may not coincide with structural control, especially in regions having very low relief. Better structural correlation was observed using multiple linear trend measurements of closed depressions rather than drainage basins. Trend analysis was performed on four areas of approximately 25 km² each, forming a sequence from the Suwannee River to the Cody Escarpment in northern Florida. This area is a karst plain mantled by 12 to 25 meters of unconsolidated sands and clays. Structural control was examined by tabulating the azimuths of distinct linear trends as determined from depression shapes on 1:24,000 topographic maps. The topography was characterized by 1872 individual swallet catchment areas or 1457 closed depressions. The common geomorphic technique of analyzing orientation data in 10° increments beginning at 0° may yield incorrect peak width and placement. To correctly detect all significant orientation peaks, all possible combinations of peak width and placement must be tested. Fifty-five different plots were reviewed and tested for each area.

  2. Visualization techniques for malware behavior analysis

    NASA Astrophysics Data System (ADS)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via the Internet is a major security threat, so studying malware behavior is important for identifying and classifying it. Using SSDT hooking, we can obtain malware behavior by running a sample in a controlled environment and capturing its interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be compared with those of other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.
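
    Turning a captured event chain into an activity graph, as described above, can be sketched as follows; the event list is hypothetical and the networkx library is an assumed dependency used purely for illustration.

```python
# Minimal sketch: build a directed activity graph from captured events.
import networkx as nx

events = [("malware.exe", "CreateFile", "C:\\temp\\payload.dll"),
          ("malware.exe", "RegSetValue", "HKLM\\...\\Run"),          # hypothetical
          ("malware.exe", "Connect", "203.0.113.7:80")]

g = nx.DiGraph()
for process, action, target in events:
    g.add_edge(process, target, label=action)   # one edge per captured event

print(nx.to_dict_of_dicts(g))                   # adjacency with action labels
```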

  3. Nuclear reaction techniques in materials analysis

    SciTech Connect

    Amsel, G.; Lanford, W.A.

    1984-01-01

    This article discusses nuclear reaction microanalysis (NRA). In NRA, data accumulated in the framework of low-energy nuclear physics are put to advantage for analytical purposes: unknown targets are bombarded and known reactions are observed. For NRA, the accelerator, detectors, and spectrum recording and interpretation must be reliable, simple, and fast. Other MeV ion-beam analytical techniques complementary to NRA are described, such as Rutherford backscattering (RBS), proton-induced x-ray emission (PIXE), and the more recent method of elastic recoil detection (ERD). Applications of NRA range from solid-state physics and electrochemistry, semiconductor technology, metallurgy, materials science, and surface science to biology and archeology.

  4. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line...

  5. Accelerator based techniques for contraband detection

    NASA Astrophysics Data System (ADS)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays have the ability to penetrate various materials to large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and to image the object with an appreciable degree of reliability. Neutron-induced reactions such as (n, γ), (n, n') and (n, p), or proton-induced γ-resonance absorption, are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed, and their advantages, characteristics, and current progress are shown. Areas where use of these methods is currently under evaluation include detection of hidden explosives, illicit drug interdiction, chemical warfare agent identification, nuclear waste assay, nuclear weapons destruction, and others.

  6. Injection Locking Techniques for Spectrum Analysis

    SciTech Connect

    Gathma, Timothy D.; Buckwalter, James F.

    2011-04-19

    Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks high-Q passives and wideband resonator tunability that is necessary for heterodyne implementations of spectrum analyzers. As an alternative to the heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.

  7. Uncertainty analysis technique for OMEGA Dante measurements

    NASA Astrophysics Data System (ADS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
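
    The Monte Carlo parameter-variation idea described above can be sketched as follows: perturb each channel voltage by its one-sigma Gaussian error, push every realization through the unfold, and read error bars off the resulting flux distribution. The unfold function below is a trivial placeholder, and the voltages and errors are hypothetical; only the statistical workflow is illustrated.

```python
# Minimal sketch of Monte Carlo error propagation through an unfold.
import numpy as np

rng = np.random.default_rng(6)
voltages = np.array([1.2, 0.8, 2.4, 1.9])     # measured channel voltages, V (assumed)
sigma = 0.05 * voltages                       # per-channel 1-sigma errors (assumed)

def unfold(v):
    # Placeholder for the real Dante unfold algorithm: here just a weighted sum.
    return np.sum(v * [1.0, 1.3, 0.7, 1.1])

fluxes = np.array([unfold(rng.normal(voltages, sigma)) for _ in range(1000)])
print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std(ddof=1):.3f}")
```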

  8. Uncertainty Analysis Technique for OMEGA Dante Measurements

    SciTech Connect

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  9. Uncertainty analysis technique for OMEGA Dante measurements

    SciTech Connect

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-15

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  10. Automated fluid analysis apparatus and techniques

    DOEpatents

    Szecsody, James E.

    2004-03-16

    An automated device that couples a pair of differently sized sample loops with a syringe pump and a source of degassed water. A fluid sample is mounted at an inlet port and delivered to the sample loops. A selected sample from the sample loops is diluted in the syringe pump with the degassed water and fed to a flow through detector for analysis. The sample inlet is also directly connected to the syringe pump to selectively perform analysis without dilution. The device is airtight and used to detect oxygen-sensitive species, such as dithionite in groundwater following a remedial injection to treat soil contamination.

  11. Techniques for analysing ground-based UV-visible long-term BrO and NO2 observations for satellite validation and trend analysis

    NASA Astrophysics Data System (ADS)

    Kreher, Karin; Johnston, Paul; Hay, Timothy; Liley, Ben; Thomas, Alan; Martinez-Aviles, Monica; Friess, Udo; Bodeker, Greg; Schofield, Robyn; van Roozendael, Michel

    NIWA operates a network of zenith-sky viewing DOAS (Differential Optical Absorption Spectroscopy) instruments to measure NO2 and BrO. The longest existing time series (1981-present) of NO2 has been measured at Lauder (45°S), New Zealand, and the trend of this long-term data set has been studied extensively. Here we present a summary of stratospheric NO2 trends observed at several Northern and Southern Hemisphere stations (including Lauder) and an update of our understanding of the observed hemispheric asymmetry. These trends provide an important anchor for the interpretation of NO2 trends measured by satellites. BrO observations are currently made by NIWA at two Southern Hemisphere sites, Lauder and Arrival Heights (78°S), with each data set spanning more than 15 years. The zenith-sky BrO observations are complemented with direct-sun observations at Lauder since 2001 and with MAX-DOAS (Multi-Axis Differential Optical Absorption Spectroscopy) observations at Arrival Heights (78°S) since 1998. A retrieval technique to separate the tropospheric and stratospheric partial columns of BrO was developed for the combination of zenith-sky and direct-sun measurements, with the zenith-sky observations providing predominantly the information on the stratospheric partial column and the direct-sun observations providing the tropospheric contribution. This retrieval has now been applied to Lauder BrO UV-visible measurements for the whole time period (2001-present) and the updated results, including an upper limit of BrO in the troposphere and the stratospheric bromine loading, will be presented. The retrieval method has now also been extended so that it can be applied to zenith-sky data only. Furthermore, an independent retrieval algorithm has been developed including a forward model capable of dealing with multiple scattering (Monte Carlo radiative transfer model) to enable us to retrieve altitude information in the boundary layer and lower troposphere. This retrieval method has

  12. Permethylation Linkage Analysis Techniques for Residual Carbohydrates

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Permethylation analysis is the classic approach to establishing the position of glycosidic linkages between sugar residues. Typically, the carbohydrate is derivatized to form acid-stable methyl ethers, hydrolyzed, peracetylated, and analyzed by gas chromatography-mass spectrometry (GC-MS). The pos...

  13. Comparison of Hydrogen Sulfide Analysis Techniques

    ERIC Educational Resources Information Center

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  14. Analysis techniques for background rejection at the Majorana Demonstrator

    SciTech Connect

    Cuestra, Clara; Rielage, Keith Robert; Elliott, Steven Ray; Xu, Wenqin; Goett, John Jerome III

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  15. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    NASA Astrophysics Data System (ADS)

    Cuesta, C.; Abgrall, N.; Arnquist, I. J.; Avignone, F. T.; Baldenegro-Barrera, C. X.; Barabash, A. S.; Bertrand, F. E.; Bradley, A. W.; Brudanin, V.; Busch, M.; Buuck, M.; Byram, D.; Caldwell, A. S.; Chan, Y.-D.; Christofferson, C. D.; Detwiler, J. A.; Efremenko, Yu.; Ejiri, H.; Elliott, S. R.; Galindo-Uribarri, A.; Gilliss, T.; Giovanetti, G. K.; Goett, J.; Green, M. P.; Gruszko, J.; Guinn, I. S.; Guiseppe, V. E.; Henning, R.; Hoppe, E. W.; Howard, S.; Howe, M. A.; Jasinski, B. R.; Keeter, K. J.; Kidd, M. F.; Konovalov, S. I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J.; MacMullin, J.; Martin, R. D.; Meijer, S. J.; Mertens, S.; Orrell, J. L.; O'Shaughnessy, C.; Poon, A. W. P.; Radford, D. C.; Rager, J.; Rielage, K.; Robertson, R. G. H.; Romero-Romero, E.; Shanks, B.; Shirchenko, M.; Snyder, N.; Suriano, A. M.; Tedeschi, D.; Trimble, J. E.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Wiseman, C.; Xu, W.; Yakushev, E.; Yu, C.-H.; Yumatov, V.; Zhitnikov, I.

    2015-08-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  16. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    SciTech Connect

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H.; Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P.; Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L.; Avignone, F. T.; Baldenegro-Barrera, C. X.; Bertrand, F. E.; and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  17. Cognitive task analysis: Techniques applied to airborne weapons training

    SciTech Connect

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E.; Carlow Associates, Inc., Fairfax, VA; Martin Marietta Energy Systems, Inc., Oak Ridge, TN; Tennessee Univ., Knoxville, TN )

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  18. Automatic tumor segmentation using knowledge-based techniques.

    PubMed

    Clark, M C; Hall, L O; Goldgof, D B; Velthuizen, R; Murtagh, F R; Silbiger, M S

    1998-04-01

    A system that automatically segments and labels glioblastoma-multiforme tumors in magnetic resonance images (MRI's) of the human brain is presented. The MRI's consist of T1-weighted, proton density, and T2-weighted feature images and are processed by a system which integrates knowledge-based (KB) techniques with multispectral analysis. Initial segmentation is performed by an unsupervised clustering algorithm. The segmented image, along with cluster centers for each class, is provided to a rule-based expert system which extracts the intracranial region. Multispectral histogram analysis separates suspected tumor from the rest of the intracranial region, with region analysis used in performing the final tumor labeling. This system has been trained on three volume data sets and tested on thirteen unseen volume data sets acquired from a single MRI system. The KB tumor segmentation was compared with supervised, radiologist-labeled "ground truth" tumor volumes and supervised k-nearest neighbors tumor segmentations. The results of this system generally correspond well to ground truth, both on a per slice basis and more importantly in tracking total tumor volume during treatment over time. PMID:9688151

  19. Analysis of doxorubicin distribution in MCF-7 cells treated with drug-loaded nanoparticles by combination of two fluorescence-based techniques, confocal spectral imaging and capillary electrophoresis.

    PubMed

    Gautier, Juliette; Munnier, Emilie; Soucé, Martin; Chourpa, Igor; Douziech Eyrolles, Laurence

    2015-05-01

    The intracellular distribution of the anticancer drug doxorubicin (DOX) was followed qualitatively by fluorescence confocal spectral imaging (FCSI) and quantitatively by capillary electrophoresis (CE). FCSI permits the localization of the major fluorescent species in cell compartments, with spectral shifts indicating the polarity of the respective environment. However, distinction between drug and metabolites by FCSI is difficult due to their similar fluorochromes, and direct quantification of their fluorescence is complicated by quantum yield variation between different subcellular environments. On the other hand, capillary electrophoresis with fluorescence detection (CE-LIF) is a quantitative method capable of separating doxorubicin and its metabolites. In this paper, we propose a method for determining drug and metabolite concentration in enriched nuclear and cytosolic fractions of cancer cells by CE-LIF, and we compare these data with those of FCSI. Significant differences in the subcellular distribution of DOX are observed between the drug administered as a molecular solution or as a suspension of drug-loaded iron oxide nanoparticles coated with polyethylene glycol. Comparative analysis of the CE-LIF vs FCSI data may lead to a tentative calibration of this latter method in terms of DOX fluorescence quantum yields in the nucleus and more or less polar regions of the cytosol. PMID:25749791

  20. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes and the increase in the requirements of quality and service have forced a change in the design and application of maintenance, as well as the way in which it is considered within the managerial strategy. There are numerous maintenance activities that must be developed in a service company. As a result the maintenance functions as a whole have to be outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibilities, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on expert systems. By means of rules, the expert system uses the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, in the area of rule-based expert systems.
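
    As a rough illustration of the SMART aggregation step mentioned in the abstract, the sketch below normalizes importance points into weights and combines function ratings into an overall score; the function names, points, and ratings are invented for illustration and are not the paper's actual rules.

```python
# SMART-style aggregation: importance points are normalized into weights,
# and each maintenance function's rating is combined into an overall score.
# Function names, points, and ratings are invented for illustration.
points = {"preventive": 90, "corrective": 70, "spares": 50, "documentation": 40}
ratings = {"preventive": 0.6, "corrective": 0.8, "spares": 0.4, "documentation": 0.7}

total = sum(points.values())
weights = {name: p / total for name, p in points.items()}

overall = sum(weights[name] * ratings[name] for name in points)
print(f"general maintenance state: {overall:.2f}")  # 0 (poor) .. 1 (excellent)
```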

  1. Using Conceptual Analysis To Build Knowledge Bases.

    ERIC Educational Resources Information Center

    Shinghal, Rajjan; Le Xuan, Albert

    This paper describes the methods and techniques called Conceptual Analysis (CA), a rigorous procedure to generate (without involuntary omissions and repetitions) knowledge bases for the development of knowledge-based systems. An introduction is given of CA and how it can be used to produce knowledge bases. A discussion is presented on what is…

  2. On the factors governing water vapor turbulence mixing in the convective boundary layer over land: Concept and data analysis technique using ground-based lidar measurements.

    PubMed

    Pal, Sandip

    2016-06-01

    The convective boundary layer (CBL) turbulence is the key process for exchanging heat, momentum, moisture and trace gases between the earth's surface and the lower part of the troposphere. The turbulence parameterization of the CBL is a challenging but important component in numerical models. In particular, correct estimation of CBL turbulence features, parameterization, and the determination of the contribution of eddy diffusivity are important for simulating convection initiation, and the dispersion of health hazardous air pollutants and greenhouse gases. In general, measurements of higher-order moments of water vapor mixing ratio (q) variability yield unique estimates of turbulence in the CBL. Using the high-resolution lidar-derived profiles of q variance, third-order moment, and skewness and analyzing concurrent profiles of vertical velocity, potential temperature, horizontal wind and time series of near-surface measurements of surface flux and meteorological parameters, a conceptual framework based on a bottom-up approach is proposed here for the first time for a robust characterization of the turbulent structure of the CBL over land, so that our understanding of the processes governing CBL q turbulence can be improved. Finally, principal component analyses will be applied on the lidar-derived long-term data sets of q turbulence statistics to identify the meteorological factors and the dominant physical mechanisms governing the CBL turbulence features. PMID:26950615
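
    The higher-order moments named above are straightforward to compute from a lidar time series of mixing ratio q at each altitude bin. A minimal sketch with synthetic data; the array shapes and values are assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)
# q[time, altitude]: synthetic water vapor mixing ratio series (g/kg) at each bin.
q = rng.normal(8.0, 0.5, size=(3600, 100))

q_prime = q - q.mean(axis=0)            # turbulent fluctuations about the mean
variance = (q_prime**2).mean(axis=0)    # second-order moment profile
third = (q_prime**3).mean(axis=0)       # third-order moment profile
skewness = third / variance**1.5        # skewness profile

print(variance[:3], skewness[:3])       # one value per altitude bin
```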

  3. FDI and Accommodation Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by a trained backpropagation NN, which is then applied to residual generation. Process supervision is applied to discriminate faults on process sensors and process plant parameters. A rule-based expert system implements the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system which could be used as the core of a SCADA or, alternatively, as a complementary supervision tool operating in parallel with the SCADA when applied to a heat exchanger.
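
    A minimal sketch of the residual-generation step described above, with a stand-in function in place of the trained backpropagation network and an illustrative threshold; the paper's actual NN, process model, and rule base are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def nn_model(u):
    """Stand-in for the trained backpropagation NN that maps process inputs
    to the expected sensor reading; a real system would load trained weights."""
    return 2.0 * u + 1.0

u = np.linspace(0.0, 1.0, 200)                      # process input trajectory
y = 2.0 * u + 1.0 + rng.normal(0.0, 0.01, u.size)   # measured sensor output
y[150:] += 0.3                                      # injected sensor fault

residual = y - nn_model(u)            # residual generation against the NN map
fault = np.abs(residual) > 0.1        # illustrative decision threshold
print("first flagged sample:", int(np.argmax(fault)))
```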

  4. Retrieval techniques and information content analysis to improve remote sensing of atmospheric water vapor, liquid water and temperature from ground-based microwave radiometer measurements

    NASA Astrophysics Data System (ADS)

    Sahoo, Swaroop

    Observations of profiles of temperature, humidity and winds with sufficient accuracy and fine vertical and temporal resolution are needed to improve mesoscale weather prediction, track conditions in the lower to mid-troposphere, predict winds for renewable energy, inform the public of severe weather and improve transportation safety. Among these thermodynamic variables, the absolute atmospheric temperature varies only by 15%; in contrast, total water vapor may change by up to 50% over several hours. In addition, numerical weather prediction (NWP) models are initialized using water vapor profile information, so improvements in its accuracy and resolution tend to improve the accuracy of NWP. Current water vapor profile observation systems are expensive and have insufficient spatial coverage to observe humidity in the lower to mid-troposphere. To address this important scientific need, the principal objective of this dissertation is to improve the accuracy, vertical resolution and revisit time of tropospheric water vapor profiles retrieved from microwave and millimeter-wave brightness temperature measurements. This dissertation advances the state of knowledge of retrieval of atmospheric water vapor from microwave brightness temperature measurements. It focuses on optimizing two information sources of interest for water vapor profile retrieval, i.e., independent measurements and background data set size. From a theoretical perspective, it determines sets of frequencies in the ranges of 20-23, 85-90 and 165-200 GHz that are optimal for water vapor retrieval from each of ground-based and airborne radiometers. The maximum number of degrees of freedom for the selected frequencies for ground-based radiometers is 5-6, while the optimum vertical resolution is 0.5 to 1.5 km. On the other hand, the maximum number of degrees of freedom for airborne radiometers is 8-9, while the optimum vertical resolution is 0.2 to 0.5 km. From an experimental perspective, brightness

  5. Techniques for region coding in object-based image compression

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.

    2004-01-01

    Object-based compression (OBC) is an emerging technology that combines region segmentation and coding to produce a compact representation of a digital image or video sequence. Previous research has focused on a variety of segmentation and representation techniques for regions that comprise an image. The author has previously suggested [1] partitioning of the OBC problem into three steps: (1) region segmentation, (2) region boundary extraction and compression, and (3) region contents compression. A companion paper [2] surveys implementationally feasible techniques for boundary compression. In this paper, we analyze several strategies for region contents compression, including lossless compression, lossy VPIC, EPIC, and EBLAST compression, wavelet-based coding (e.g., JPEG-2000), as well as texture matching approaches. This paper is part of a larger study that seeks to develop highly efficient compression algorithms for still and video imagery, which would eventually support automated object recognition (AOR) and semantic lookup of images in large databases or high-volume OBC-format datastreams. Example applications include querying journalistic archives, scientific or medical imaging, surveillance image processing and target tracking, as well as compression of video for transmission over the Internet. Analysis emphasizes time and space complexity, as well as sources of reconstruction error in decompressed imagery.

  6. Envelopment technique and topographic overlays in bite mark analysis

    PubMed Central

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    Aims and Objectives: The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Materials and Methods: Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. Results: The scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare four sequential overlays and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Conclusion: Through our study, we conclude that the third and fourth cuts were the best among the four cuts and that inter- and intraoperator reliability were statistically significant at the 5% level, i.e., the 95% confidence level (P < 0.05). PMID:26816458

  7. Measurement techniques in animal locomotion analysis.

    PubMed

    Schamhardt, H C; van den Bogert, A J; Hartman, W

    1993-01-01

    Animal performance can be determined by subjective observations or objective measurements. Numerical data are superior to the results of subjective observations only when they are the result of measurements carried out to test a well-defined hypothesis or to answer a clear, precisely formulated question. In the analysis of kinematics a careful evaluation of the set-up of the measurement equipment and the resulting accuracy in the data is required. Measurements in three dimensions (3D) are theoretically better than those in 2D. Practically, however, collection, analysis, interpretation and presentation of 3D data are so much more complicated that frequently 2D analysis appears to be more useful. The minimal size of markers necessary to obtain a certain accuracy in kinematic data is usually too big for practical use. Smaller markers impair accuracy. Reduction of measurement noise is obligatory when time derivatives are to be calculated. Skin movement artefacts cannot be removed by data smoothing. Forces occurring between the digits and the ground can be determined using a force plate or an instrumented shoe. A force plate is accurate, but repeated trials are necessary. Using a force shoe, each ground contact results in useful data. However, the shoe itself may affect locomotion. Surface strains on long bones can be recorded relatively easily. Determination of loading forces from surface strains is complicated but can be carried out using multiple strain gauges and a post-mortem calibration test. Strain in tendons is difficult to measure due to problems in defining a 'zero' or reference length. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:8470454

  8. Speckle-adaptive VISAR fringe analysis technique

    NASA Astrophysics Data System (ADS)

    Erskine, David

    2015-06-01

    A line-VISAR (velocity interferometer) is an important diagnostic in shock physics, simultaneously measuring many fringe histories of adjacent portions of a target splayed along a line, with fringes recorded vs time and space by a streak camera. Due to laser illumination speckle (spatial intensity variation), target surface unevenness, or rapid spatial variation of target physics, conventional fringe analysis algorithms which do not properly model these variations can suffer from inferred velocity (fringe phase) errors. A speckle-adaptive algorithm has been developed which senses the interferometer and illumination parameters for each individual row (spatial position Y) of the 2-D interferogram, so that the interferogram can be compensated for Y-dependent nonfringing intensity, fringe visibility, and nonlinear phase distribution. In numerical simulations and on actual data we have found this individual row-by-row modeling improves the accuracy of the result, compared to a conventional column-by-column analysis approach. Prepared by LLNL under Contract DE-AC52-07NA27344.

  9. MULTIELEMENTAL ANALYTICAL TECHNIQUES FOR HAZARDOUS WASTE ANALYSIS: THE STATE-OF-THE-ART

    EPA Science Inventory

    Based on a comprehensive review of the literature, the multielemental techniques of inductively coupled plasma optical emission spectroscopy (ICP), x-ray fluorescence (XRF) and instrumental neutron activation analysis (INAA) have been compared for the determination of antimony, a...

  10. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    PubMed

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of the Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded intensity speckled distributions, corresponding to two states of a diffuser coherently illuminated, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their core's structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture from the set is associated with a rectangular hole that coincides both in shape and size with a pixel from recorded images. We show that the fringe analysis can be conducted as in speckle photography in an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable. PMID:23089799
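
    One common way to build such an analytic signal and pseudophase map is a spiral-phase (Riesz-type) filter applied in the Fourier domain. The sketch below assumes this construction, which may differ in detail from the authors' version of the Riesz integral transform; the input pattern is a synthetic stand-in for a recorded speckle image.

```python
import numpy as np

def pseudophase(intensity):
    """Spiral-phase (Riesz-type) filtering: multiply the 2-D spectrum by
    exp(i*theta) in the frequency domain and invert; the argument of the
    resulting complex signal is the pseudophase map carrying the vortices."""
    spectrum = np.fft.fft2(intensity - intensity.mean())
    ny, nx = intensity.shape
    v, u = np.meshgrid(np.fft.fftfreq(ny), np.fft.fftfreq(nx), indexing="ij")
    spiral = np.exp(1j * np.arctan2(v, u))
    analytic = np.fft.ifft2(spectrum * spiral)
    return np.angle(analytic)

speckle = np.abs(np.random.default_rng(0).normal(size=(256, 256)))  # stand-in pattern
phi = pseudophase(speckle)  # vortices sit where phi winds through 2*pi
```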

  11. Comparison Of Four FFT-Based Frequency-Acquisition Techniques

    NASA Technical Reports Server (NTRS)

    Shah, Biren N.; Hinedi, Sami M.; Holmes, Jack K.

    1993-01-01

    Report presents comparative theoretical analysis of four conceptual techniques for initial estimation of the carrier frequency of a suppressed-carrier, binary-phase-shift-keyed radio signal. Each technique is effected by an open-loop analog/digital signal-processing subsystem that forms part of a Costas-loop phase-error detector functioning in a closed-loop manner overall.

  12. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  13. Validation of Design and Analysis Techniques of Tailored Composite Structures

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed form analysis based on a theoretical model of a single cell tailored box beam and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed form code is consistently able to predict the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed form analysis. Differences between the closed form code representation and the wing box specimen caused large errors in the twist prediction. The closed form analysis prediction of twist has not been validated from this test.

  14. Evaluation of energy system analysis techniques for identifying underground facilities

    SciTech Connect

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C.

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  15. Reliability analysis of laminated CMC components through shell subelement techniques

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.

    1992-01-01

    An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.

  16. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on the utilization of an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a properly energized sample surface. The study investigated the possibility of applying HSI techniques for classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies have been acquired by a laboratory device equipped with an HSI system working in the near infrared field (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of data and for selecting some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics of wheat.
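
    A schematic of the PCA-plus-PLS-DA pipeline on hypercube spectra, using scikit-learn's PLSRegression on one-hot targets as a common PLS-DA stand-in; the data here are synthetic and the component counts are illustrative, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
# X: kernel spectra (samples x 121 wavelengths); y: vitreous / yellow berry / fusarium.
X = rng.normal(size=(90, 121))
X[30:60, 20:40] += 0.8      # give each class a distinguishable spectral feature
X[60:, 60:80] += 0.8
y = np.repeat([0, 1, 2], 30)

scores = PCA(n_components=3).fit_transform(X)  # dimensionality reduction step

Y = np.eye(3)[y]                               # one-hot targets for PLS-DA
pls = PLSRegression(n_components=3).fit(X, Y)
pred = pls.predict(X).argmax(axis=1)           # class = largest predicted response
print("training accuracy:", (pred == y).mean())
```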

  17. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    SciTech Connect

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  18. Search for the top quark using multivariate analysis techniques

    SciTech Connect

    Bhat, P.C.; D0 Collaboration

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel.

  19. Frequency Analysis Techniques for Identification of Viral Genetic Data

    PubMed Central

    Trifonov, Vladimir; Rabadan, Raul

    2010-01-01

    Environmental metagenomic samples and samples obtained as an attempt to identify a pathogen associated with the emergence of a novel infectious disease are important sources of novel microorganisms. The low costs and high throughput of sequencing technologies are expected to allow for the genetic material in those samples to be sequenced and the genomes of the novel microorganisms to be identified by alignment to those in a database of known genomes. Yet, for various biological and technical reasons, such alignment might not always be possible. We investigate a frequency analysis technique which on one hand allows for the identification of genetic material without relying on alignment and on the other hand makes possible the discovery of nonoverlapping contigs from the same organism. The technique is based on obtaining signatures of the genetic data and defining a distance/similarity measure between signatures. More precisely, the signatures of the genetic data are the frequencies of k-mers occurring in them, with k being a natural number. We considered an entropy-based distance between signatures, similar to the Kullback-Leibler distance in information theory, and investigated its ability to categorize negative-sense single-stranded RNA (ssRNA) viral genetic data. Our conclusion is that in this viral context, the technique provides a viable way of discovering genetic relationships without relying on alignment. We envision that our approach will be applicable to other microbial genetic contexts, e.g., other types of viruses, and will be an important tool in the discovery of novel microorganisms. PMID:20824103
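
    The signature-and-distance idea can be sketched in a few lines: count k-mers, normalize to frequencies, and compare signatures with a symmetrized Kullback-Leibler-style divergence. The exact entropy-based distance used by the authors is not reproduced in this record; the smoothing constant below is an assumption.

```python
from collections import Counter
from math import log

def kmer_signature(seq, k=4):
    """Normalized k-mer frequency vector: the 'signature' of the sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def sym_kl(p, q, eps=1e-9):
    """Symmetrized Kullback-Leibler-style divergence between two signatures;
    eps smooths k-mers absent from one of the sequences."""
    keys = set(p) | set(q)
    kl = lambda a, b: sum(a.get(x, eps) * log(a.get(x, eps) / b.get(x, eps)) for x in keys)
    return kl(p, q) + kl(q, p)

s1 = kmer_signature("AUGGCUAGCUAGGCUAACGGAUUACGCUUA")
s2 = kmer_signature("AUGGCUAGCAAGGCUAACGGAUUACGCUUA")
print(sym_kl(s1, s2))  # small distance: the sequences differ by one base
```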

  20. Application of Wavelet Unfolding Technique in Neutron Spectroscopic Analysis

    NASA Astrophysics Data System (ADS)

    Hartman, Jessica; Barzilov, Alexander

    Nonproliferation of nuclear materials is important in the nuclear power industry and fuel cycle facilities. It requires technologies capable of measuring and assessing the radiation signatures of fission events. Neutrons produced in spontaneous or induced fission reactions are mainly fast. The neutron energy information allows characterization of nuclear materials and neutron sources. It can also be applied in remote sensing and source search tasks. The plastic scintillator EJ-299-33A was studied as a fast neutron detector. The detector response to a polyenergetic flux was unfolded using the multiple linear regression method. It yields the intensities of the neutron flux at particular energies, thus enabling spectroscopic analysis. The wavelet technique was evaluated for the unfolding of the neutron spectrum using the scintillator's response functions between 1 MeV and 14 MeV computed with the MCNPX code. This paper presents the computational results of the wavelet-based spectrum unfolding applied to a scintillator detector with neutron/photon pulse shape discrimination properties.

  1. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.; Mahmood, Q.

    1978-01-01

    A sputtering system was developed to deposit aluminum and aluminum alloys by the dc sputtering technique. This system is designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation. This system is now ready for studying the deposition and annealing parameters upon double-level metal preparation. A technique recently applied for semiconductor analysis, the finite element method, was studied for use in the computer modeling of two dimensional MOS transistor structures. It was concluded that the method has not been sufficiently well developed for confident use at this time. An algorithm was therefore developed for implementing a computer study based upon the finite difference method. The program which was developed was modified and used to calculate redistribution data for boron and phosphorus which had been predeposited by ion implantation with given range and straggle conditions. Data were generated for 111-oriented SOS films with redistribution in N2, dry O2 and steam ambients.

  2. Model-checking techniques based on cumulative residuals.

    PubMed

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided. PMID:11890304
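
    A toy version of the cumulative-residual check: fit a deliberately misspecified linear model, cumulate residuals over the covariate, and compare the observed supremum with realizations generated under the assumed model. A random-sign (wild-bootstrap-style) perturbation stands in here for the paper's zero-mean Gaussian process simulation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x**2 + rng.normal(0, 0.1, n)   # true relation is quadratic

# Fit a deliberately misspecified linear model and cumulate residuals over x.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
order = np.argsort(x)
W_obs = np.abs(np.cumsum(resid[order])).max() / np.sqrt(n)

# Realizations under the assumed model via random sign flips of the residuals.
W_sim = np.array([
    np.abs(np.cumsum(resid[order] * rng.choice([-1.0, 1.0], n))).max() / np.sqrt(n)
    for _ in range(1000)
])
print("approximate p-value:", (W_sim >= W_obs).mean())  # small => misspecification
```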

  3. A Word-Based Compression Technique for Text Files.

    ERIC Educational Resources Information Center

    Vernor, Russel L., III; Weiss, Stephen F.

    1978-01-01

    Presents a word-based technique for storing natural language text in compact form. The compressed text consists of a dictionary and a text that is a combination of actual running text and pointers to the dictionary. This technique has shown itself to be effective for both text storage and retrieval. (VT)

  4. Astronomical Image Compression Techniques Based on ACC and KLT Coder

    NASA Astrophysics Data System (ADS)

    Schindler, J.; Páta, P.; Klíma, M.; Fliegel, K.

    This paper deals with a compression of image data in applications in astronomy. Astronomical images have typical specific properties -- high grayscale bit depth, size, noise occurrence and special processing algorithms. They belong to the class of scientific images. Their processing and compression is quite different from the classical approach of multimedia image processing. The database of images from BOOTES (Burst Observer and Optical Transient Exploring System) has been chosen as a source of the testing signal. BOOTES is a Czech-Spanish robotic telescope for observing AGN (active galactic nuclei) and searching for the optical transients of GRBs (gamma-ray bursts). This paper discusses an approach based on an analysis of statistical properties of image data. A comparison of two irrelevancy reduction methods is presented from a scientific (astrometric and photometric) point of view. The first method is based on a statistical approach, using the Karhunen-Loève transform (KLT) with uniform quantization in the spectral domain. The second technique is derived from wavelet decomposition with adaptive selection of used prediction coefficients. Finally, the comparison of three redundancy reduction methods is discussed. Multimedia format JPEG2000 and HCOMPRESS, designed especially for astronomical images, are compared with the new Astronomical Context Coder (ACC) coder based on adaptive median regression.

  5. Principals Use Research-Based Techniques for Facilitating School Effectiveness.

    ERIC Educational Resources Information Center

    Hord, Shirley M.; Hall, Gene E.

    Research shows that principals with strong leadership qualities are a critical factor in effective schools. This paper describes three research based techniques that principals can use when making decisions about how to help teachers develop their skills. The Concerns Based Adoption Model (CBAM) is an empirically based conceptual framework that…

  6. Damage identification techniques via modal curvature analysis: Overview and comparison

    NASA Astrophysics Data System (ADS)

    Dessi, Daniele; Camerlengo, Gabriele

    2015-02-01

    This paper aims to compare several damage identification methods based on the analysis of modal curvature and related quantities (natural frequencies and modal strain energy) by evaluating their performances on the same test case, a damaged Euler-Bernoulli beam. Damage is modelled as a localized and uniform reduction of stiffness so that closed-form expressions of the mode-shape curvatures can be analytically computed and data accuracy, which affects final results, can be controlled. The selected techniques belong to two categories: one includes several methods that need reference data for detecting structural modifications due to damage, the second group, including the modified Laplacian operator and the fractal dimension, avoids the knowledge of the undamaged behavior for issuing a damage diagnosis. To explain better the different performances of the methods, the mathematical formulation has been revised in some cases so as to fit into a common framework where the underlying hypotheses are clearly stated. Because the various damage indexes are calculated on 'exact' data, a sensitivity analysis has been carried out with respect to the number of points where curvature information is available, to the position of damage between adjacent points, to the modes involved in the index computation. In this way, this analysis intends to point out comparatively the capability of locating and estimating damage of each method along with some critical issues already present with noiseless data.
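
    The simplest member of the reference-based family reviewed above is the curvature-difference index: differentiate the mode shapes twice and look for localized deviations between the damaged and intact states. A sketch on a synthetic beam mode; the damage model and magnitudes are invented for illustration.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 101)            # beam coordinate, simply supported
dx = x[1] - x[0]
mode = np.sin(np.pi * x)                  # first mode shape, intact beam
mode_d = mode.copy()
mode_d[45:55] += 0.002 * np.hanning(10)   # small local deviation emulating damage

def curvature(phi, dx):
    """Mode-shape curvature (second spatial derivative) by central differences."""
    return np.gradient(np.gradient(phi, dx), dx)

index = np.abs(curvature(mode_d, dx) - curvature(mode, dx))
print(f"damage indicated near x = {x[index.argmax()]:.2f}")
```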

  7. Wavelet-based polarimetry analysis

    NASA Astrophysics Data System (ADS)

    Ezekiel, Soundararajan; Harrity, Kyle; Farag, Waleed; Alford, Mark; Ferris, David; Blasch, Erik

    2014-06-01

    Wavelet transformation has become a cutting edge and promising approach in the field of image and signal processing. A wavelet is a waveform of effectively limited duration that has an average value of zero. Wavelet analysis is done by breaking up the signal into shifted and scaled versions of the original signal. The key advantage of a wavelet is that it is capable of revealing smaller changes, trends, and breakdown points that are not revealed by other techniques such as Fourier analysis. The phenomenon of polarization has been studied for quite some time and is a very useful tool for target detection and tracking. Long Wave Infrared (LWIR) polarization is beneficial for detecting camouflaged objects and is a useful approach when identifying and distinguishing manmade objects from natural clutter. In addition, the Stokes Polarization Parameters, which are calculated from 0°, 45°, 90°, 135°, right circular, and left circular intensity measurements, provide spatial orientations of target features and suppress natural features. In this paper, we propose a wavelet-based polarimetry analysis (WPA) method to analyze Long Wave Infrared Polarimetry Imagery to discriminate targets such as dismounts and vehicles from background clutter. These parameters can be used for image thresholding and segmentation. Experimental results show the wavelet-based polarimetry analysis is efficient and can be used in a wide range of applications such as change detection, shape extraction, target recognition, and feature-aided tracking.
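
    The Stokes-parameter computation named in the abstract reduces to arithmetic on the six intensity images. A sketch with synthetic data; any subsequent wavelet analysis (e.g., of the degree-of-linear-polarization image) would operate on these maps.

```python
import numpy as np

rng = np.random.default_rng(3)
shape = (64, 64)
# Measured intensity images at the six analyzer settings named in the abstract.
I0, I45, I90, I135, IRC, ILC = (np.abs(rng.normal(1.0, 0.1, shape)) for _ in range(6))

S0 = I0 + I90        # total intensity
S1 = I0 - I90        # horizontal vs vertical linear
S2 = I45 - I135      # +45 deg vs -45 deg linear
S3 = IRC - ILC       # right vs left circular
DoLP = np.sqrt(S1**2 + S2**2) / S0   # degree of linear polarization map
```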

  8. Comparison of laser transit anemometry data analysis techniques

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Gartrell, Luther R.

    1991-01-01

    Two techniques for the extraction of two-dimensional flow information from laser transit anemometry (LTA) data sets are presented and compared via a simulation study and experimental investigation. The methods are a probability density function (PDF) estimation technique and a marginal distribution analysis technique. The simulation study builds on the results of previous work and provides a quantification of the accuracy of both techniques for various LTA data acquisition scenarios. The experimental comparison consists of using an LTA system to survey the flow downstream of a turbulence generator in a small low-speed wind tunnel. The collected data sets are analyzed and compared.

  9. The detection of bulk explosives using nuclear-based techniques

    SciTech Connect

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  10. Wood lens design philosophy based on a binary additive manufacturing technique

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  11. High-level power analysis and optimization techniques

    NASA Astrophysics Data System (ADS)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  12. Application of glyph-based techniques for multivariate engineering visualization

    NASA Astrophysics Data System (ADS)

    Glazar, Vladimir; Marunic, Gordana; Percic, Marko; Butkovic, Zlatko

    2016-01-01

    This article presents a review of glyph-based techniques for engineering visualization as well as practical application for the multivariate visualization process. Two glyph techniques, Chernoff faces and star glyphs, uncommonly used in engineering practice, are described, applied to the selected data set, run through the chosen optimization methods and user evaluated. As an example of how these techniques function, a set of data for the optimization of a heat exchanger with a microchannel coil is adopted for visualization. The results acquired by the chosen visualization techniques are related to the results of optimization carried out by the response surface method and compared with the results of user evaluation. Based on the data set from engineering research and practice, the advantages and disadvantages of these techniques for engineering visualization are identified and discussed.

  13. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be bookkept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
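
    A statistical budget of this kind can be approximated by Monte Carlo sampling of the individual loss terms. The sketch below illustrates the idea under stated assumptions; the loss mechanisms, their distributions, and the power levels are all hypothetical stand-ins, not SIM values.

```python
# Monte Carlo optical power budget: sample each loss mechanism from an
# assumed distribution, sum the losses in dB, and report the confidence
# of meeting a required power level at the metrology detector.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
# (nominal loss in dB, 1-sigma spread in dB) -- illustrative values only
mechanisms = [(1.0, 0.2),   # design efficiency
              (0.5, 0.1),   # material attenuation
              (0.8, 0.3),   # element misalignment
              (0.6, 0.2)]   # coupling efficiency
total_loss_db = sum(rng.normal(m, s, N) for m, s in mechanisms)

source_dbm, required_dbm = 0.0, -4.0        # hypothetical power levels
margin_db = (source_dbm - total_loss_db) - required_dbm
print(f"confidence of positive margin: {np.mean(margin_db > 0):.3f}")
```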

  14. Hyphenated techniques and their applications in natural products analysis.

    PubMed

    Sarker, Satyajit D; Nahar, Lutfun

    2012-01-01

    A technique in which a separation method is coupled with an online spectroscopic detection technology is known as a hyphenated technique, e.g., GC-MS, LC-PDA, LC-MS, LC-FTIR, LC-NMR, LC-NMR-MS, and CE-MS. Recent advances in hyphenated analytical techniques have remarkably widened their applications to the analysis of complex biomaterials, especially natural products. This chapter focuses on the applications of hyphenated techniques to pre-isolation and isolation of natural products, dereplication, online partial identification of compounds, chemotaxonomic studies, chemical finger-printing, quality control of herbal products, and metabolomic studies, and presents specific examples. However, particular emphasis is given to the hyphenated techniques that involve LC as the separation tool. PMID:22367902

  15. Initial planetary base construction techniques and machine implementation

    NASA Technical Reports Server (NTRS)

    Crockford, William W.

    1987-01-01

    Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.

  16. Geotechnical Analysis of Paleoseismic Shaking Using Liquefaction Features: Part I. Major Updating of Analysis Techniques

    USGS Publications Warehouse

    Olson, Scott M.; Green, Russell A.; Obermeier, Stephen F.

    2003-01-01

    A new methodology is proposed for the geotechnical analysis of strength of paleoseismic shaking using liquefaction effects. The proposed method provides recommendations for selection of both individual and regionally located test sites, techniques for validation of field data for use in back-analysis, and use of a recently developed energy-based solution to back-calculate paleoearthquake magnitude and strength of shaking. The proposed method allows investigators to assess the influence of post-earthquake density change and aging. The proposed method also describes how the back-calculations from individual sites should be integrated into a regional assessment of paleoseismic parameters.

  17. Basic Sequence Analysis Techniques for Use with Audit Trail Data

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2008-01-01

    Audit trail analysis can provide valuable insights to researchers and evaluators interested in comparing and contrasting designers' expectations of use and students' actual patterns of use of educational technology environments (ETEs). Sequence analysis techniques are particularly effective but have been neglected to some extent because of real…

  18. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  19. Optical performance monitoring technique using software-based synchronous amplitude histograms.

    PubMed

    Choi, H G; Chang, J H; Kim, Hoon; Chung, Y C

    2014-10-01

    We propose and demonstrate a simple technique to monitor both the optical signal-to-noise ratio (OSNR) and chromatic dispersion (CD) by using the software-based synchronous amplitude histogram (SAH) analysis. We exploit the software-based synchronization technique to construct SAHs from the asynchronously sampled intensities of the signal. The use of SAHs facilitates the accurate extraction of the monitoring parameters at the center of the symbol. Thus, unlike in the case of using the technique based on the asynchronous amplitude histogram (AAH), this technique is not affected by the transient characteristics of the modulated signals. The performance of the proposed monitoring technique is evaluated experimentally by using 10-Gbaud quadrature phase-shift keying (QPSK) and quadrature amplitude modulation (QAM) signals over wide ranges of OSNR and CD. We also evaluate the robustness of the proposed technique to the signal's transient characteristics. PMID:25321978
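
    The essence of the software synchronization step is to recover the symbol phase from asynchronous samples and then keep only the samples that fall near the symbol center. The sketch below shows that selection-and-histogram step under stated assumptions (a known, already-recovered symbol period; synthetic intensity samples).

```python
# Build a synchronous amplitude histogram (SAH) from asynchronously
# sampled intensities, given a symbol period recovered in software.
import numpy as np

def sah(samples, sample_times, symbol_period, window=0.1):
    """Histogram only the samples within +/- window*T of the symbol
    center (phase 0.5 within each symbol period)."""
    phase = np.mod(sample_times, symbol_period) / symbol_period
    at_center = np.abs(phase - 0.5) < window
    return np.histogram(samples[at_center], bins=64)

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1000.0, 50_000))         # sample times, ns
intensity = 1.0 + 0.1 * rng.normal(size=t.size)       # synthetic signal
hist, edges = sah(intensity, t, symbol_period=0.1)    # 10 Gbaud -> 0.1 ns
```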

  20. Computer-aided diagnosis in breast MRI based on unsupervised clustering techniques

    NASA Astrophysics Data System (ADS)

    Meyer-Baese, Anke; Wismueller, Axel; Lange, Oliver; Leinsinger, Gerda

    2004-04-01

    Exploratory data analysis techniques are applied to the segmentation of lesions in MRI mammography as a first step of a computer-aided diagnosis system. Three new unsupervised clustering techniques are tested on biomedical time-series representing breast MRI scans: fuzzy clustering based on deterministic annealing, "neural gas" network, and topographic independent component analysis. While the first two methods enable a correct segmentation of the lesion, the latter, although incorporating a topographic mapping, fails to detect and subclassify lesions.
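
    As a concrete illustration of one of the tested families, the sketch below implements plain fuzzy c-means on synthetic two-dimensional data; it is not the deterministic-annealing variant used in the paper, only the baseline algorithm it extends.

```python
# Baseline fuzzy c-means: alternate membership and center updates.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))       # rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))                  # u_ik ~ d_ik^(-2/(m-1))
        U /= U.sum(axis=1, keepdims=True)
    return U, centers

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
U, centers = fuzzy_c_means(X)
print(np.round(centers, 2))
```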

  1. Recent Electrochemical and Optical Sensors in Flow-Based Analysis

    PubMed Central

    Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn

    2006-01-01

    Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen in this overview. A brief description of the fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA), was reviewed.

  2. Damage detection technique by measuring laser-based mechanical impedance

    SciTech Connect

    Lee, Hyeonseok; Sohn, Hoon

    2014-02-18

    This study proposes a method for the measurement of mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small-sized structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, which show deteriorated performance induced by the effects of (a) Curie temperature limitations, (b) electromagnetic interference (EMI), and (c) bonding layers, etc. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by firing the pulsed laser beam at the target structure, and is acquired by measuring the out-of-plane velocity using a laser vibrometer. The formation of the LMI response is observed through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen in a high-temperature environment.

  3. Perceptually based techniques for semantic image classification and retrieval

    NASA Astrophysics Data System (ADS)

    Depalov, Dejan; Pappas, Thrasyvoulos; Li, Dongge; Gandhi, Bhavan

    2006-02-01

    The accumulation of large collections of digital images has created the need for efficient and intelligent schemes for content-based image retrieval. Our goal is to organize the contents semantically, according to meaningful categories. We present a new approach for semantic classification that utilizes a recently proposed color-texture segmentation algorithm (by Chen et al.), which combines knowledge of human perception and signal characteristics to segment natural scenes into perceptually uniform regions. The color and texture features of these regions are used as medium level descriptors, based on which we extract semantic labels, first at the segment and then at the scene level. The segment features consist of spatial texture orientation information and color composition in terms of a limited number of locally adapted dominant colors. The focus of this paper is on region classification. We use a hierarchical vocabulary of segment labels that is consistent with those used in the NIST TRECVID 2003 development set. We test the approach on a database of 9000 segments obtained from 2500 photographs of natural scenes. For training and classification we use the Linear Discriminant Analysis (LDA) technique. We examine the performance of the algorithm (precision and recall rates) when different sets of features (e.g., one or two most dominant colors versus four quantized dominant colors) are used. Our results indicate that the proposed approach offers significant performance improvements over existing approaches.
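
    The classification step is conventional once the segment features are in hand. A minimal sketch of LDA-based segment classification follows; the feature matrix and label set are synthetic stand-ins for the color/texture descriptors and the segment vocabulary.

```python
# LDA classification of segment feature vectors (synthetic stand-in data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))       # e.g. dominant colors + texture orientation
y = rng.integers(0, 4, size=300)    # e.g. sky / water / foliage / rock labels
print(cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean())
```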

  4. Damage detection technique by measuring laser-based mechanical impedance

    NASA Astrophysics Data System (ADS)

    Lee, Hyeonseok; Sohn, Hoon

    2014-02-01

    This study proposes a method for the measurement of mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small-sized structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, which show deteriorated performance induced by the effects of (a) Curie temperature limitations, (b) electromagnetic interference (EMI), and (c) bonding layers, etc. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by firing the pulsed laser beam at the target structure, and is acquired by measuring the out-of-plane velocity using a laser vibrometer. The formation of the LMI response is observed through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen in a high-temperature environment.

  5. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is

  6. Efficient Plant Supervision Strategy Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; Rolle, Jose Luis Calvo; Castelo, Francisco Javier Perez

    Most non-linear type-one and type-two control systems suffer from a lack of detectability when model-based techniques are applied to FDI (fault detection and isolation) tasks. In general, all types of processes suffer from a lack of detectability, also owing to the ambiguity in discriminating among the process, sensors, and actuators in order to isolate any given fault. This work deals with a strategy to detect and isolate faults that combines massive neural-network-based functional approximation procedures with recursive rule-based techniques applied to a parity space approach.

  7. Clustering and classification techniques for the analysis of vibration signatures

    NASA Astrophysics Data System (ADS)

    Alguindigue, Israel E.; Loskiewicz-Buczak, Anna; Uhrig, Robert E.

    1992-09-01

    A methodology is proposed for the clustering and classification of vibration signatures in the frequency domain. The technique is based on the technologies of neural networks and fuzzy clustering and it is especially suited for the problem of vibration analysis because it permits the incorporation of specific knowledge about the domain in a very simple manner, and because the system learns from actual process data. The system uses the backpropagation algorithm for classification of compressed signatures, where compression is used as a mechanism for noise removal and automatic feature extraction. The clustering system uses the Fuzzy C algorithm with a matrix of weights for the calculation of distances between patterns and centroids. The matrix is used to assign factors of importance to frequencies in the spectrum which are known to be related to particular defects. The two aspects of the analysis (clustering and classification) are complementary because in many cases the exact operating state of a machine cannot be assessed, and clustering may unveil classes of operating states that would not be discovered otherwise. Accurate results were obtained from testing the system on rolling element bearing data.
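
    The defect-specific weighting the abstract describes amounts to a weighted Euclidean distance between a signature and a centroid. A small sketch of that distance, with hypothetical weights emphasizing one frequency band, is given below.

```python
# Weighted distance between a spectrum and a centroid:
# d^2 = (x - c)^T diag(w) (x - c), with larger weights on frequency
# bins known to relate to particular defects.
import numpy as np

def weighted_distance_sq(x, centroid, w):
    diff = x - centroid
    return diff @ (w * diff)

rng = np.random.default_rng(0)
spectrum = rng.random(128)
centroid = rng.random(128)
w = np.ones(128)
w[20:24] = 5.0   # hypothetical: emphasize a bearing-defect frequency band
print(weighted_distance_sq(spectrum, centroid, w))
```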

  8. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.

  9. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special, unresolved object, processing techniques were disappointing due, at least in part, to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.

  10. An ionospheric occultation inversion technique based on epoch difference

    NASA Astrophysics Data System (ADS)

    Lin, Jian; Xiong, Jing; Zhu, Fuying; Yang, Jian; Qiao, Xuejun

    2013-09-01

    Of the ionospheric radio occultation (IRO) electron density profile (EDP) retrievals, the Abel-based calibrated TEC inversion (CTI) is the most widely used technique. In order to eliminate the contribution from the altitudes above the RO satellite, it is necessary to utilize the calibrated TEC to retrieve the EDP, which introduces error due to the coplanar assumption. In this paper, a new technique based on epoch difference inversion (EDI) is proposed to eliminate this error. Comparisons between CTI and EDI have been carried out, taking advantage of simulated and real COSMIC data. The following conclusions can be drawn: the EDI technique can successfully retrieve the EDPs without non-occultation side measurements and shows better performance than the CTI method, especially for lower orbit missions; no matter which technique is used, the inversion results at higher altitudes are better than those at lower altitudes, which can be explained theoretically.

  11. Diode laser based water vapor DIAL using modulated pulse technique

    NASA Astrophysics Data System (ADS)

    Pham, Phong Le Hoai; Abo, Makoto

    2014-11-01

    In this paper, we propose a diode laser based differential absorption lidar (DIAL) for measuring the lower-tropospheric water vapor profile using a modulated pulse technique. The transmitter is based on a single-mode diode laser and a tapered semiconductor optical amplifier with a peak power of 10 W in the 800 nm absorption band, and the receiver telescope diameter is 35 cm. The selected wavelengths are compared to reference wavelengths in terms of random and systematic errors. The key component of the modulated pulse technique, a macropulse, is generated with a repetition rate of 10 kHz, and the modulation within the macropulse is coded according to a pseudorandom sequence with a 100 ns chip width. We evaluate both the single-pulse modulation and the pseudorandom-coded pulse modulation techniques. The water vapor profiles obtained with these modulation techniques are compared with real observation data from summer in Japan.

  12. An analysis of spectral transformation techniques on graphs

    NASA Astrophysics Data System (ADS)

    Djurović, Igor; Sejdić, Ervin; Bulatović, Nikola; Simeunović, Marko

    2015-05-01

    Emerging methods for the spectral analysis of graphs are analyzed in this paper, as graphs are currently used to study interactions in many fields from neuroscience to social networks. There are two main approaches related to the spectral transformation of graphs. The first approach is based on the Laplacian matrix. The graph Fourier transform is defined as an expansion of a graph signal in terms of eigenfunctions of the graph Laplacian. The calculated eigenvalues carry the notion of frequency of graph signals. The second approach is based on the graph weighted adjacency matrix, as it expands the graph signal into a basis of eigenvectors of the adjacency matrix instead of the graph Laplacian. Here, the notion of frequency is obtained from the eigenvalues of the adjacency matrix or its Jordan decomposition. In this paper, the advantages and drawbacks of both approaches are examined. Potential challenges and improvements to graph spectral processing methods are considered, as well as the generalization of graph processing techniques in the spectral domain, its extension to the time-frequency domain, and other potential extensions of classical signal processing concepts to graph datasets. Lastly, an overview of compressive sensing on graphs is given.
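
    The Laplacian-based transform mentioned first is short to state in code. The sketch below builds a small ring graph, forms L = D - A, and expands a signal in the Laplacian eigenbasis; the eigenvalues play the role of frequencies.

```python
# Graph Fourier transform via the combinatorial Laplacian L = D - A.
import numpy as np

n = 8
A = np.zeros((n, n))
for i in range(n):                        # ring-graph adjacency
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

eigvals, eigvecs = np.linalg.eigh(L)      # eigvals ~ graph frequencies
signal = np.sin(2 * np.pi * np.arange(n) / n)
gft = eigvecs.T @ signal                  # forward transform
recovered = eigvecs @ gft                 # inverse transform
print(np.allclose(recovered, signal))     # True
```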

  13. Analysis techniques used on field degraded photovoltaic modules

    SciTech Connect

    Hund, T.D.; King, D.L.

    1995-09-01

    Sandia National Laboratory's PV System Components Department performs comprehensive failure analysis of photovoltaic modules after extended field exposure at various sites around the world. A full spectrum of analytical techniques is used to help identify the causes of degradation. The techniques are used to make solder fatigue life predictions for PV concentrator modules, identify cell damage or current mismatch, and measure the adhesive strength of the module encapsulant.

  14. Advanced airfoil design empirically based transonic aircraft drag buildup technique

    NASA Technical Reports Server (NTRS)

    Morrison, W. D., Jr.

    1976-01-01

    To systematically investigate the potential of advanced airfoils in advanced preliminary design studies, empirical relationships were derived, based on available wind tunnel test data, through which total drag is determined recognizing all major aircraft geometric variables. This technique recognizes a single design lift coefficient and Mach number for each aircraft. Using this technique, drag polars are derived for all Mach numbers up to M_design + 0.05 and for lift coefficients from C_L,design - 0.40 to C_L,design + 0.20.

  15. Intramuscular injection technique: an evidence-based approach.

    PubMed

    Ogston-Tuck, Sherri

    2014-09-30

    Intramuscular injections require a thorough and meticulous approach to patient assessment and injection technique. This article, the second in a series of two, reviews the evidence base to inform safer practice and to consider the evidence for nursing practice in this area. A framework for safe practice is included, identifying important points for safe technique, patient care and clinical decision making. It also highlights the ongoing debate over the selection of intramuscular injection sites, predominantly the ventrogluteal and dorsogluteal muscles. PMID:25249123

  16. An integrated technique for the analysis of skin bite marks.

    PubMed

    Bernitz, Herman; Owen, Johanna H; van Heerden, Willie F P; Solheim, Tore

    2008-01-01

    The high number of murder, rape, and child abuse cases in South Africa has led to increased numbers of bite mark cases being heard in high courts. Objective analysis to match perpetrators to bite marks at crime scenes must be able to withstand vigorous cross-examination to be of value in conviction of perpetrators. An analysis technique is described in four stages, namely determination of the mark to be a human bite mark, pattern association analysis, metric analysis and comparison with the population data, and illustrated by a real case study. New and accepted techniques are combined to determine the likelihood ratio of guilt expressed as one of a range of conclusions described in the paper. Each stage of the analysis adds to the confirmation (or rejection) of concordance between the dental features present on the victim and the dentition of the suspect. The results illustrate identification to a high degree of certainty. PMID:18279256

  17. Dynamic analysis of large structures by modal synthesis techniques.

    NASA Technical Reports Server (NTRS)

    Hurty, W. C.; Hart, G. C.; Collins, J. D.

    1971-01-01

    Several criteria that may be used to evaluate the merits of some of the existing techniques for the dynamic analysis of large structures which involve division into substructures or components are examined. These techniques make use of component displacement modes to synthesize global systems of generalized coordinates and, for that reason, they have come to be known as modal synthesis or component mode methods. Two techniques have been found to be particularly useful - i.e., the modal synthesis method with fixed attachment modes, and the modal synthesis method with free attachment modes. These two methods are treated in detail, and general flow charts are presented for guidance in computer programming.
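
    For orientation, the generic reduction step shared by these methods can be written compactly; the notation below is a standard textbook form, not necessarily that of the paper.

```latex
% Component-mode expansion for substructure s: kept normal modes
% \Phi_s plus attachment modes \Psi_s approximate the physical DOFs.
\[
  u_s \approx \Phi_s\, q_s + \Psi_s\, \alpha_s ,
  \qquad
  M_r = T^{\mathsf{T}} M\, T , \quad K_r = T^{\mathsf{T}} K\, T ,
\]
% where T collects the component bases and the reduced matrices
% M_r, K_r define the synthesized global eigenproblem.
```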

  18. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to ease the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  19. Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternate and complementary with respect to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Also presented are methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  20. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    SciTech Connect

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
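
    The experimental pattern (reduce dimensions, classify, sweep the dimension count) is easy to reproduce. The sketch below uses PCA with a naive Bayes classifier on synthetic stand-in data; the features, labels, and classifier choice are assumptions for illustration, not the paper's exact setup.

```python
# Accuracy versus number of retained PCA dimensions (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))                 # stand-in object-code features
y = (X[:, :4].sum(axis=1) > 0).astype(int)     # stand-in "crypto present" label

for k in (2, 4, 8, 16, 32):
    Xk = PCA(n_components=k).fit_transform(X)
    acc = cross_val_score(GaussianNB(), Xk, y, cv=5).mean()
    print(f"{k:2d} dims: accuracy = {acc:.3f}")
```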

  1. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    PubMed Central

    Almeida, Vânia G.; Vieira, João; Santos, Pedro; Pereira, Tânia; Pereira, H. Catarina; Correia, Carlos; Pego, Mariano; Cardoso, João

    2013-01-01

    The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks individually process each hemodynamic parameter and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from APW. With this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used in the data collection from 50 subjects; (2) the acquired position and amplitude of onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of some morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluate the trained models, using a majority-voting system, against the respective calculated Augmentation Index (AIx). The classification algorithms proved to be efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high-risk labels, retrieved from the multi-parametric approach, and positive AIx values was verified. This approach allows for the design of new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving arterial pulse understanding, especially when compared to traditional single-parameter analysis, where failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation. PMID

  2. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  3. Grid techniques in the analysis of gaseous pollutant propagation

    NASA Astrophysics Data System (ADS)

    Pisarek, Jerzy; Blaszczuk, A.

    2003-10-01

    The article describes trends in the development of gradient techniques used for the analysis of the spatial distribution of the breakdown coefficient of a gas different from the surrounding atmosphere. Depending on the modification made to the Schlieren technique, it was possible to measure in real time the mass distribution with a different range and ratio. In the optical system, periodic patterns (rasters) as well as arrangements of Rife prisms were used. Carbon dioxide and propane were used as gaseous pollutants of the air. The new solution proposed by the authors has turned out to be an effective tool for the analysis of gaseous pollutant distribution processes.

  4. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  5. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  6. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
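
    The core idea, preconditioning with a simplified part of the stiffness matrix, maps directly onto standard sparse PCG tooling. In the sketch below, the tridiagonal part of a synthetic SPD matrix stands in for the orthotropic part of K; the matrix and sizes are hypothetical.

```python
# PCG with a simplified part of K as the preconditioner (SciPy).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
rng = np.random.default_rng(0)
K = sp.random(n, n, density=0.02, random_state=0)
K = (K + K.T) + sp.eye(n) * n            # symmetric positive definite
f = rng.normal(size=n)

# stand-in for the "orthotropic part": main and first off-diagonals of K
P = sp.diags([K.diagonal(-1), K.diagonal(), K.diagonal(1)],
             [-1, 0, 1]).tocsc()
M = spla.LinearOperator((n, n), spla.factorized(P))  # applies P^-1

u, info = spla.cg(K.tocsr(), f, M=M)
print("converged" if info == 0 else f"cg info = {info}")
```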

  7. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    SciTech Connect

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh; Wang, Shaobu; Mackey, Patrick S.; Hines, Paul; Huang, Zhenyu

    2013-11-30

    Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.

  8. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one insures that a product works as intended, and contains no flaws.

  9. Comparative Analysis of Techniques to Purify Plasma Membrane Proteins

    PubMed Central

    Weekes, Michael P.; Antrobus, Robin; Lill, Jennie R.; Duncan, Lidia M.; Hör, Simon; Lehner, Paul J.

    2010-01-01

    The aim of this project was to identify the best method for the enrichment of plasma membrane (PM) proteins for proteomics experiments. Following tryptic digestion and extended liquid chromatography-tandem mass spectrometry acquisitions, data were processed using MaxQuant and Gene Ontology (GO) terms used to determine protein subcellular localization. The following techniques were examined for the total number and percentage purity of PM proteins identified: (a) whole cell lysate (total number, 84–112; percentage purity, 9–13%); (b) crude membrane preparation (104–111; 17–20%); (c) biotinylation of surface proteins with N-hydroxysulfosuccinimydyl-S,S-biotin and streptavidin pulldown (78–115; 27–31%); (d) biotinylation of surface glycoproteins with biocytin hydrazide and streptavidin pulldown (41–54; 59–85%); or (e) biotinylation of surface glycoproteins with amino-oxy-biotin (which labels the sialylated fraction of PM glycoproteins) and streptavidin pulldown (120; 65%). A two- to threefold increase in the overall number of proteins identified was achieved by using stop and go extraction tip (StageTip)-based anion exchange (SAX) fractionation. Combining technique (e) with SAX fractionation increased the number of proteins identified to 281 (54%). Analysis of GO terms describing these proteins identified a large subset of proteins integral to the membrane with no subcellular assignment. These are likely to be of PM location and bring the total PM protein identifications to 364 (68%). This study suggests that selective biotinylation of the cell surface using amino-oxy-biotin in combination with SAX fractionation is a useful method for identification of sialylated PM proteins. PMID:20808639

  10. Knowledge-based GIS techniques applied to geological engineering

    USGS Publications Warehouse

    Usery, E. Lynn; Altheide, Phyllis; Deister, Robin R.P.; Barr, David J.

    1988-01-01

    A knowledge-based geographic information system (KBGIS) approach which requires development of a rule base for both GIS processing and for the geological engineering application has been implemented. The rule bases are implemented in the Goldworks expert system development shell interfaced to the Earth Resources Data Analysis System (ERDAS) raster-based GIS for input and output. GIS analysis procedures including recoding, intersection, and union are controlled by the rule base, and the geological engineering map product is generated by the expert system. The KBGIS has been used to generate a geological engineering map of Creve Coeur, Missouri.

  11. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  12. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  13. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.

  14. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  15. Analysis of leaching data using asymptotic expansion techniques

    SciTech Connect

    Simonson, S.A.; Machiels, A.J.

    1983-01-01

    Asymptotic analysis constitutes a useful technique to determine the adjustable parameters appearing in mathematical models attempting to reproduce some experimental data. In particular, asymptotic expansions of a leach model proposed by A.J. Machiels and C. Pescatore are used to interpret leaching data from PNL 76-68 glass in terms of corrosion velocities and diffusion coefficients.

  16. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  17. Wavelet transformation based watermarking technique for human electrocardiogram (ECG).

    PubMed

    Engin, Mehmet; Cidam, Oğuz; Engin, Erkan Zeki

    2005-12-01

    Nowadays, watermarking has become a technology of choice for a broad range of multimedia copyright protection applications. Watermarks have also been used to embed prespecified data in biomedical signals. Thus, watermarked biomedical signals transmitted over communication channels are resistant to some attacks. This paper investigates a discrete wavelet transform based watermarking technique for signal integrity verification in electrocardiograms (ECG) from four ECG classes, for cardiovascular disease monitoring applications. The proposed technique is evaluated under different noise conditions for different wavelet functions. The Daubechies (db2) wavelet based technique performs better than the Biorthogonal (bior5.5) wavelet based one. For beat-to-beat applications, the performance results for all four ECG classes are moderate. PMID:16235811
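
    A common way to realize such embedding is to quantize selected detail coefficients of the DWT. The sketch below, assuming the PyWavelets package and a quantization-index-modulation rule, illustrates the idea on a synthetic segment; it is not necessarily the paper's exact embedding scheme.

```python
# Embed watermark bits in db2 detail coefficients via quantization.
import numpy as np
import pywt

def embed(signal, bits, delta=0.05):
    coeffs = pywt.wavedec(signal, "db2", level=3)   # [cA3, cD3, cD2, cD1]
    d3 = coeffs[1].copy()
    for i, b in enumerate(bits):
        # quantize to an even multiple of delta, then offset by b*delta;
        # the bit is recovered later as round(d3[i] / delta) mod 2
        d3[i] = 2 * delta * np.round(d3[i] / (2 * delta)) + b * delta
    coeffs[1] = d3
    return pywt.waverec(coeffs, "db2")

ecg = np.sin(np.linspace(0, 4 * np.pi, 512))        # stand-in ECG segment
marked = embed(ecg, bits=[1, 0, 1, 1, 0])
print(np.max(np.abs(marked[:len(ecg)] - ecg)))      # embedding distortion
```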

  18. Comparative Analysis of Different LIDAR System Calibration Techniques

    NASA Astrophysics Data System (ADS)

    Miller, M.; Habib, A.

    2016-06-01

    With light detection and ranging (LiDAR) now a crucial tool for engineering products and on-the-fly spatial analysis, it is necessary for the user community to have standardized calibration methods. The three methods in this study were developed and proven by the Digital Photogrammetry Research Group (DPRG) for airborne LiDAR systems and are as follows: simplified, quasi-rigorous, and rigorous. In lieu of using expensive control surfaces for calibration, these methods compare overlapping LiDAR strips to estimate the systematic errors. These systematic errors are quantified by these methods and include the lever arm biases, boresight biases, range bias and scan angle scale bias. Together, these three methods cover all of the possible flight configurations and levels of data availability. This paper tests the limits of the method with the most assumptions, the simplified calibration, by using data that violates the assumptions its math model is based on, and compares the results to the quasi-rigorous and rigorous techniques. The overarching goal is to provide a LiDAR system calibration that does not require raw measurements and that can be carried out with minimal control and flight lines to reduce costs. This testing is unique because the terrain used for calibration does not contain gable roofs; all other LiDAR system calibration testing and development has been done with terrain containing features with high geometric integrity, such as gable roofs.

  19. "Ayeli": Centering Technique Based on Cherokee Spiritual Traditions.

    ERIC Educational Resources Information Center

    Garrett, Michael Tlanusta; Garrett, J. T.

    2002-01-01

    Presents a centering technique called "Ayeli," based on Cherokee spiritual traditions as a way of incorporating spirituality into counseling by helping clients identify where they are in their journey, where they want to be, and how they can get there. Relevant Native cultural traditions and meanings are explored. (Contains 25 references.) (GCP)

  20. Techniques of Trend Analysis for Monthly Water Quality Data

    NASA Astrophysics Data System (ADS)

    Hirsch, Robert M.; Slack, James R.; Smith, Richard A.

    1982-02-01

    Some of the characteristics that complicate the analysis of water quality time series are non-normal distributions, seasonality, flow relatedness, missing values, values below the limit of detection, and serial correlation. Presented here are techniques that are suitable in the face of the complications listed above for the exploratory analysis of monthly water quality data for monotonic trends. The first procedure described is a nonparametric test for trend applicable to data sets with seasonality, missing values, or values reported as 'less than': the seasonal Kendall test. Under realistic stochastic processes (exhibiting seasonality, skewness, and serial correlation), it is robust in comparison to parametric alternatives, although neither the seasonal Kendall test nor the alternatives can be considered an exact test in the presence of serial correlation. The second procedure, the seasonal Kendall slope estimator, is an estimator of trend magnitude. It is an unbiased estimator of the slope of a linear trend and has considerably higher precision than a regression estimator where data are highly skewed, but somewhat lower precision where the data are normal. The third procedure provides a means for testing for change over time in the relationship between constituent concentration and flow, thus avoiding the problem of identifying trends in water quality that are artifacts of the particular sequence of discharges observed (e.g., drought effects). In this method a flow-adjusted concentration is defined as the residual (actual minus conditional expectation) based on a regression of concentration on some function of discharge. These flow-adjusted concentrations, which may also be seasonal and non-normal, can then be tested for trend by using the seasonal Kendall test.
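
    The seasonal Kendall test itself is compact: compute Kendall's S within each month across years, sum the statistics and their variances over months, and form a z-score. The sketch below implements the basic form (no ties correction, serial correlation ignored) on synthetic data with missing values.

```python
# Basic seasonal Kendall test for monotonic trend in monthly data.
import numpy as np
from scipy.stats import norm

def kendall_s(x):
    """Kendall's S and its no-ties variance for one season's values."""
    x = x[~np.isnan(x)]
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
    return s, n * (n - 1) * (2 * n + 5) / 18.0

def seasonal_kendall(series):
    """series: (years x 12) array, NaN for missing; returns two-sided p."""
    stats = [kendall_s(series[:, m]) for m in range(series.shape[1])]
    S = sum(s for s, _ in stats)
    V = sum(v for _, v in stats)
    z = (S - np.sign(S)) / np.sqrt(V)        # continuity correction
    return 2 * norm.sf(abs(z))

rng = np.random.default_rng(0)
data = 0.05 * np.arange(10)[:, None] + rng.normal(size=(10, 12))
data[rng.random((10, 12)) < 0.1] = np.nan    # 10% missing values
print(f"p-value = {seasonal_kendall(data):.4f}")
```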

  1. Sensitivity analysis technique for application to deterministic models

    SciTech Connect

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method.

  2. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    NASA Astrophysics Data System (ADS)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation of protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. The analysis of these phosphorylation cascades will provide new insights into their physiological roles in many biological processes. Unfortunately, existing methods are limited in their ability to analyze cascade activity. We therefore suggest a microfluidic isoelectric focusing technique (μIEF) for the analysis of cascade activity. Using the technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. In addition, an inhibition assay for kinase activity and an analysis of a real sample have also been conducted. The results indicate that μIEF is an excellent means for studies on phosphorylation cascade activity.

  3. Intestinal Preparation Techniques for Histological Analysis in the Mouse.

    PubMed

    Williams, Jonathan M; Duckworth, Carrie A; Vowell, Kate; Burkitt, Michael D; Pritchard, D Mark

    2016-01-01

    The murine intestinal tract represents a difficult organ system to study due to its long convoluted tubular structure, narrow diameter, and delicate mucosa which undergoes rapid changes after sampling prior to fixation. These features do not make for easy histological analysis as rapid fixation in situ, or after simple removal without careful dissection, results in poor postfixation tissue handling and limited options for high quality histological sections. Collecting meaningful quantitative data by analysis of this tissue is further complicated by the anatomical changes in structure along its length. This article describes two methods of intestinal sampling at necropsy that allow systematic histological analysis of the entire intestinal tract, either through examination of cross sections (circumferences) by the gut bundling technique or longitudinal sections by the adapted Swiss roll technique, together with basic methods for data collection. © 2016 by John Wiley & Sons, Inc. PMID:27248432

  4. Emerging techniques for soil analysis via mid-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, in particular: 1. Attenuated total reflectance (ATR) Attenuated total reflectance is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range as well as the absorbance of some soil constituents (e.g., calcium carbonate) interfere with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species. 2. Photo-acoustic spectroscopy Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are

  5. Rule-based analysis of pilot decisions

    NASA Technical Reports Server (NTRS)

    Lewis, C. M.

    1985-01-01

    The application of the rule identification technique to the analysis of human performance data is proposed. The relation between the language and identifiable consistencies is discussed. The advantages of production system models for the description of complex human behavior are studied. The use of a Monte Carlo significance testing procedure to assure the validity of the rule identification is examined. An example of the rule-based analysis of Palmer's (1983) data is presented.

  6. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling memory capacity and communications bandwidth requirements. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570
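
    The embedding pipeline described (Arnold scrambling of the watermark, insertion into third-level DWT coefficients) can be sketched with PyWavelets. The subband choice, strength alpha, and sizes below are illustrative assumptions, not the authors' settings; extraction would invert these steps with the same keys:

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def arnold(img, iterations=5):
        """Arnold cat-map scrambling: a bijective pixel permutation of a square image."""
        n = img.shape[0]
        y, x = np.mgrid[0:n, 0:n]
        out = img
        for _ in range(iterations):
            out = out[(x + y) % n, (x + 2 * y) % n]
        return out

    def embed(host, mark, alpha=0.05):
        """Add a scrambled watermark to the level-3 approximation subband."""
        coeffs = pywt.wavedec2(host.astype(float), 'haar', level=3)
        coeffs[0] = coeffs[0] + alpha * arnold(mark.astype(float))
        return pywt.waverec2(coeffs, 'haar')

    host = np.random.default_rng(0).random((256, 256))
    mark = (np.random.default_rng(1).random((32, 32)) > 0.5).astype(float)  # fits the 32x32 cA
    marked = embed(host, mark)
    print(np.abs(marked - host).max())  # embedding distortion stays proportional to alpha
    ```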

  8. Fiber probes based optical techniques for biomedical diagnosis

    NASA Astrophysics Data System (ADS)

    Arce-Diego, José L.; Fanjul-Vélez, Félix

    2007-06-01

    Although fiber optics have been applied in optical communication and sensor systems for several years in a very successful way, their first application was developed in medicine in the early 1920s. Manufacturing and developing optical fibers for biomedical purposes has required considerable research effort in order to achieve non-invasive, in-vivo, and real-time diagnosis of different diseases in human or animal tissues. In general, optical fiber probes are designed as a function of the optical measurement technique. In this work, a brief description of the main optical techniques for optical characterization of biological tissues is presented. The recent advances in optical fiber probes for biomedical diagnosis in clinical analysis and optical biopsy, in relation with the different spectroscopic or tomographic optical techniques, are described.

  9. Hyphenated techniques for the analysis of heparin and heparan sulfate

    PubMed Central

    Yang, Bo; Solakyildirim, Kemal; Chang, Yuqing

    2011-01-01

    The elucidation of the structure of glycosaminoglycan has proven to be challenging for analytical chemists. Molecules of glycosaminoglycan have a high negative charge and are polydisperse and microheterogeneous, thus requiring the application of multiple analytical techniques and methods. Heparin and heparan sulfate are the most structurally complex of the glycosaminoglycans and are widely distributed in nature. They play critical roles in physiological and pathophysiological processes through their interaction with heparin-binding proteins. Moreover, heparin and low-molecular weight heparin are currently used as pharmaceutical drugs to control blood coagulation. In 2008, the health crisis resulting from the contamination of pharmaceutical heparin led to considerable attention regarding their analysis and structural characterization. Modern analytical techniques, including high-performance liquid chromatography, capillary electrophoresis, mass spectrometry, and nuclear magnetic resonance spectroscopy, played critical roles in this effort. A successful combination of separation and spectral techniques will clearly provide a critical advantage in the future analysis of heparin and heparan sulfate. This review focuses on recent efforts to develop hyphenated techniques for the analysis of heparin and heparan sulfate. PMID:20853165

  10. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    SciTech Connect

    DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.; THOMAS, EDWARD V.; WUNSCH, DONALD

    2001-09-01

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.
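
    As a stand-in for the statistical anomaly detection baseline in this comparison (not the ACD or Elman network), one can fit a Gaussian model to feature vectors derived from clean audit data, say per-session counts of BSM event types, and flag sessions whose Mahalanobis distance is large. A toy sketch with invented features:

    ```python
    import numpy as np

    class GaussianAnomalyDetector:
        """Flag feature vectors far (in Mahalanobis distance) from clean data."""

        def fit(self, clean):                      # clean: (n_samples, n_features)
            self.mu = clean.mean(axis=0)
            cov = np.cov(clean, rowvar=False)
            cov += 1e-6 * np.eye(cov.shape[0])     # regularize for invertibility
            self.icov = np.linalg.inv(cov)
            return self

        def score(self, x):                        # squared Mahalanobis distance
            d = x - self.mu
            return np.einsum('...i,ij,...j->...', d, self.icov, d)

    # Hypothetical per-session counts of four audit event types:
    rng = np.random.default_rng(1)
    clean = rng.poisson([50, 5, 2, 20], size=(500, 4)).astype(float)
    det = GaussianAnomalyDetector().fit(clean)
    session = np.array([52.0, 4.0, 30.0, 18.0])    # unusual count of event 3
    print(det.score(session) > 20.0)               # True -> flag as suspicious
    ```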

  11. Comment on 'Comparative analysis of the isovolume calibration method for non-invasive respiratory monitoring techniques based on area transduction versus circumference transduction using the connected cylinders model' (2011 Physiol. Meas. 32 1265-74).

    PubMed

    Augousti, A T; Radosz, A

    2015-05-01

    An analysis introduced by the authors in 2011, examining the robustness of the isovolume method for the calibration of the respiratory inductive plethysmograph based on the connected cylinders model (a particular case of Konno and Mead's generalized two-compartment model of respiration), is extended. It is demonstrated that extending this to a more physically realistic geometrical model, termed the connected prismatic elliptical segments model, does not enhance the earlier analysis, and that the analysis can easily be proven to cover all area-based transduction sensors, irrespective of the actual geometry of the compartments. PMID:25903299

  12. Graphene-based terahertz photodetector by noise thermometry technique

    SciTech Connect

    Wang, Ming-Jye; Wang, Ji-Wun; Wang, Chun-Lun; Chiang, Yen-Yu; Chang, Hsian-Hong

    2014-01-20

    We report the characteristics of a graphene-based terahertz (THz) photodetector based on the noise thermometry technique, measuring its noise power at frequencies from 4 to 6 GHz. A hot-electron system in the graphene microbridge is generated by THz photon pumping and creates excess noise power. The equivalent noise temperature and electron temperature increase rapidly in the low-pumping regime and saturate gradually in the high-power regime, which is attributed to a faster energy relaxation process involving stronger electron-phonon interaction. Based on this detector, a conversion efficiency of around 0.15 from THz power to noise power in the 4-6 GHz span has been achieved.
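
    The noise-thermometry conversion itself is the Johnson-Nyquist relation P = k_B·T·B, so a measured excess noise power in the 4-6 GHz span maps directly to an equivalent noise temperature. A quick calculation with assumed numbers:

    ```python
    from scipy.constants import k as k_B  # Boltzmann constant, J/K

    B = 2e9                  # 4-6 GHz measurement span -> 2 GHz bandwidth
    P_noise = 1e-12          # assumed measured excess noise power, W
    T_noise = P_noise / (k_B * B)
    print(f"equivalent noise temperature: {T_noise:.1f} K")  # ~36 K here
    ```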

  13. Comparison of ITS, RAPD and ISSR from DNA-based genetic diversity techniques.

    PubMed

    Poyraz, Ismail

    2016-01-01

    ITS, RAPD-PCR and ISSR-PCR are among the most popular DNA-based techniques extensively applied in determining the genetic diversity of species among populations. However, especially for organisms with high genetic polymorphism, phylogenetic trees drawn from the results of these techniques may differ. To find a meaningful phylogenetic tree, the trees obtained from these different techniques should be compared with the geographic locations of the populations. Lichens have high genetic polymorphism and tolerance of different environmental conditions. In this study, these three DNA-based genetic diversity techniques were compared using different populations of a lichen species (Xanthoria parietina). X. parietina was chosen because of its high genetic diversity within narrow zones. Lichen samples were collected from ten different locations in a narrow transitional climate zone, Bilecik (Turkey). Statistical analyses of all results were calculated using UPGMA analysis. Phylogenetic trees for each technique were drawn and transferred to the Bilecik map for comparative analysis. The results of the three techniques allowed us to verify that populations of X. parietina have high genetic variety within a narrow zone, but the phylogenetic trees obtained from these results were found to be very different. Our comparative analysis demonstrated that the results of these techniques are not similar and have critical differences. We observed that the ITS method provides clearer data and is more successful in genetic diversity analyses of more widely separated populations, in contrast to the ISSR-PCR and RAPD-PCR methods. PMID:27156497
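
    UPGMA is average-linkage hierarchical clustering, so trees like those described can be reproduced from any band-scoring matrix with SciPy; the presence/absence matrix below is a hypothetical stand-in for RAPD or ISSR data:

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, dendrogram

    # Rows = populations, columns = presence/absence of scored bands (hypothetical).
    bands = np.array([[1, 0, 1, 1, 0, 1],
                      [1, 0, 1, 0, 0, 1],
                      [0, 1, 0, 1, 1, 0],
                      [0, 1, 1, 1, 1, 0]])

    d = pdist(bands, metric='jaccard')        # band-sharing dissimilarity
    tree = linkage(d, method='average')       # 'average' linkage == UPGMA
    print(dendrogram(tree, no_plot=True)['ivl'])  # leaf order of the tree
    ```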

  14. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
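
    The essence of a probabilistic power analysis is propagating input uncertainty through the performance model. A brute-force Monte Carlo sketch with a toy solar-array model illustrates the idea (SPACE is far more detailed, and the fast probabilistic techniques referenced avoid naive sampling); all distributions here are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Uncertain inputs (illustrative distributions, not ISS values):
    efficiency = rng.normal(0.14, 0.005, N)      # cell efficiency
    area = rng.normal(375.0, 2.0, N)             # array area, m^2
    degradation = rng.uniform(0.90, 0.98, N)     # lifetime degradation factor
    insolation = 1367.0                          # solar constant, W/m^2

    power = efficiency * area * degradation * insolation / 1000.0  # kW

    print(f"mean capability: {power.mean():.1f} kW")
    print(f"5th percentile : {np.percentile(power, 5):.1f} kW")
    ```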

  15. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    SciTech Connect

    Charlton, William S

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
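
    The Bayesian comparison of a measured isotopic ratio against a precomputed reactor-physics database reduces, in the simplest case, to a grid update. The ratio model, uncertainty, and grid below are invented for illustration, not NOVA's Monteburns database:

    ```python
    import numpy as np

    # Hypothetical database: predicted noble gas isotopic ratio vs burnup (GWd/tU).
    burnup = np.linspace(1, 40, 40)
    predicted_ratio = 0.05 + 0.01 * burnup       # stand-in reactor-physics model

    measured, sigma = 0.26, 0.02                 # measured ratio and 1-sigma error

    prior = np.ones_like(burnup) / burnup.size   # flat prior over the grid
    likelihood = np.exp(-0.5 * ((measured - predicted_ratio) / sigma) ** 2)
    posterior = prior * likelihood
    posterior /= posterior.sum()

    print("inferred burnup:", burnup[np.argmax(posterior)], "GWd/tU")
    ```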

  16. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    PubMed Central

    Lueke, Jonathan; Moussa, Walied A.

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving the functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability to implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient. PMID:22319362

  17. Large areas elemental mapping by ion beam analysis techniques

    NASA Astrophysics Data System (ADS)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) XYZ computer-controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large-range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sorts of samples. Due to its particular characteristics, this is a unique device in the sense of multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion-beam analysis with the advantage of automatic rastering.

  18. Novel techniques and the future of skull base reconstruction.

    PubMed

    Meier, Joshua C; Bleier, Benjamin S

    2013-01-01

    The field of endoscopic skull base surgery has evolved considerably in recent years fueled largely by advances in both imaging and instrumentation. While the indications for these approaches continue to be extended, the ability to reconstruct the resultant defects has emerged as a rate-limiting obstacle. Postoperative failures with current multilayer grafting techniques remain significant and may increase as the indications for endoscopic resections continue to expand. Laser tissue welding represents a novel method of wound repair in which laser energy is applied to a chromophore doped biologic solder at the wound edge to create a laser weld (fig. 1). These repairs are capable of withstanding forces far exceeding those exerted by intracranial pressure with negligible collateral thermal tissue injury. Recent clinical trials have demonstrated the safety and feasibility of endoscopic laser welding while exposing the limitations of first generation hyaluronic acid based solders. Novel supersaturated gel based solders are currently being tested in clinical trials and appear to possess significantly improved viscoelastic properties. While laser tissue welding remains an experimental technique, continued success with these novel solder formulations may catalyze the widespread adoption of this technique for skull base repair in the near future. PMID:23257563

  19. Analysis of monitoring techniques for prestressed concrete cylinder pipe

    SciTech Connect

    Hall, S.C.

    1994-12-31

    Concrete pressure pipe (CPP) is used in water and waste water systems that serve virtually every city in North America. Various techniques are used to evaluate the corrosion state of a buried pipeline. The two most commonly used are the pipe-to-soil (P/S) and cell-to-cell potential techniques. However, only a few references exist relating to the use of these monitoring procedures for CPP. Various corrosion engineering firms have confidence in one or the other technique without being able to provide the rationale for their preference. Both techniques have recently been challenged as being insufficiently reliable for CPP. This project consisted of setting up simulated corrosion cells on a 48-inch (1.22 m) diameter prestressed concrete cylinder pipe (PCCP) line and allowing five corrosion engineering firms the opportunity to use their monitoring techniques to locate corroding sites. This project evaluated existing corrosion monitoring techniques based on measuring electrical potentials on PCCP. It was found that bonded and unbonded prestressed concrete cylinder pipe can be monitored for corrosion depending on the intensity of corrosion and the location of the corrosion site on the pipe circumference.

  20. System availability management technique for reliability and maintainability analysis

    NASA Technical Reports Server (NTRS)

    Davenport, G. K.

    1970-01-01

    Method for total system availability analysis is based on numerical prediction of the reliability, maintainability, and availability of each functional system. It incorporates these functional-system estimates into an overall mathematical model.
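
    The underlying arithmetic is the steady-state availability relation A = MTBF/(MTBF + MTTR), with functional-system availabilities combined according to the system topology. A sketch for a purely series configuration, with illustrative figures:

    ```python
    from math import prod

    def availability(mtbf_hours, mttr_hours):
        """Steady-state availability of one functional system."""
        return mtbf_hours / (mtbf_hours + mttr_hours)

    # (MTBF, MTTR) estimates per functional system, hours -- illustrative only.
    systems = [(2000, 4), (5000, 10), (800, 2)]
    a_each = [availability(m, r) for m, r in systems]
    a_total = prod(a_each)   # series configuration: every system must be up
    print([round(a, 4) for a in a_each], round(a_total, 4))
    ```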

  1. Small area analysis using micro-diffraction techniques

    SciTech Connect

    GOEHNER,RAYMOND P.; TISSOT JR.,RALPH G.; MICHAEL,JOSEPH R.

    2000-02-11

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 µm and 100 µm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, therefore allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 µm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in a SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has
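
    The strain measurement rests on differentiating Bragg's law: a peak shift Δθ maps to lattice strain ε = Δd/d = -cot(θ)·Δθ. A quick check with assumed numbers:

    ```python
    import numpy as np

    theta = np.radians(68.0)          # unstrained Bragg angle (half of 2-theta)
    delta_theta = np.radians(-0.05)   # measured peak shift (illustrative)
    strain = -delta_theta / np.tan(theta)
    print(f"lattice strain: {strain:.2e}")   # ~3.5e-4 (tensile)
    ```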

  2. Analysis of planar scintigraphic images using the Li-Ma technique.

    PubMed

    Karhan, Pavel; Ptáček, Jaroslav; Fiala, Petr; Henzlová, Lenka

    2016-02-01

    A statistics-based approach to the comparison of planar scintigraphic images is introduced to provide additional information to the subtraction method. The proposed procedure leads to parametric images with better noise properties, allowing subsequent statistical analysis. An example application of the technique is given using parathyroid scintigrams. The presented technique is not intended to replace the image subtraction method but offers a tool that may help during the diagnosis-making process. PMID:26936483
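
    Assuming the technique referred to is the significance statistic of Li & Ma (1983), widely used for on/off counting measurements, the parametric image is built by evaluating their Eq. 17 per pixel or region, with α the on/off exposure ratio. The counts below are invented:

    ```python
    import numpy as np

    def li_ma_significance(n_on, n_off, alpha):
        """Li & Ma (1983), Eq. 17: significance of an on-region count excess."""
        term_on = n_on * np.log((1 + alpha) / alpha * n_on / (n_on + n_off))
        term_off = n_off * np.log((1 + alpha) * n_off / (n_on + n_off))
        return np.sqrt(2.0) * np.sqrt(term_on + term_off)

    # Hypothetical counts: region of interest vs reference region, equal exposure.
    print(li_ma_significance(n_on=180, n_off=140, alpha=1.0))  # ~2.2 sigma
    ```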

  3. A unified neural-network-based speaker localization technique.

    PubMed

    Arslan, G; Sakarya, F A

    2000-01-01

    Locating and tracking a speaker in real time using microphone arrays is important in many applications such as hands-free video conferencing, speech processing in large rooms, and acoustic echo cancellation. A speaker can be moving from the far field to the near field of the array, or vice versa. Many neural-network-based localization techniques exist, but they are applicable to either far-field or near-field sources, and are computationally intensive for real-time speaker localization applications because of the wide-band nature of speech. We propose a unified neural-network-based source localization technique, which is simultaneously applicable to wide-band and narrow-band signal sources that are in the far field or near field of a microphone array. The technique exploits a multilayer perceptron feedforward neural network structure and forms the feature vectors by computing the normalized instantaneous cross-power spectrum samples between adjacent pairs of sensors. Simulation results indicate that our technique is able to locate a source with an absolute error of less than 3.5 degrees at a signal-to-noise ratio of 20 dB and a sampling rate of 8000 Hz at each sensor. PMID:18249826
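
    The feature extraction step is easy to reproduce: for each adjacent sensor pair, compute the cross-power spectrum and normalize it so that only phase, which encodes the inter-sensor delay and hence the source direction, is retained. A sketch for one pair with a synthetic delayed signal:

    ```python
    import numpy as np

    def cross_power_features(x1, x2, n_fft=256):
        """Normalized instantaneous cross-power spectrum of one sensor pair."""
        X1 = np.fft.rfft(x1, n_fft)
        X2 = np.fft.rfft(x2, n_fft)
        cps = X1 * np.conj(X2)                 # cross-power spectrum
        cps /= np.abs(cps) + 1e-12             # normalize: keep phase only
        return np.concatenate([cps.real, cps.imag])   # real-valued feature vector

    delay = 3                                  # 3-sample inter-sensor delay
    sig = np.random.default_rng(0).standard_normal(1000)
    features = cross_power_features(sig[delay:], sig[:-delay])
    print(features.shape)                      # input layer size for the MLP
    ```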

  4. Cost-variance analysis by DRGs; a technique for clinical budget analysis.

    PubMed

    Voss, G B; Limpens, P G; Brans-Brabant, L J; van Ooij, A

    1997-02-01

    In this article it is shown how a cost accounting system based on DRGs can be valuable in determining changes in clinical practice and explaining alterations in expenditure patterns from one period to another. A cost-variance analysis is performed using data from the orthopedic department from the fiscal years 1993 and 1994. Differences between predicted and observed cost for medical care, such as diagnostic procedures, therapeutic procedures and nursing care are analyzed into different components: changes in patient volume, case-mix differences, changes in resource use and variations in cost per procedure. Using a DRG cost accounting system proved to be a useful technique for clinical budget analysis. Results may stimulate discussions between hospital managers and medical professionals to explain cost variations integrating medical and economic aspects of clinical health care. PMID:10165044
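
    The decomposition is ordinary variance analysis: the total cost difference between two periods is split into volume, case-mix, and cost-per-case components by changing one factor at a time. A toy two-DRG illustration with invented figures:

    ```python
    # Cases and cost per case for two DRGs in two years (illustrative figures).
    base = {'DRG_A': (100, 2000.0), 'DRG_B': (50, 5000.0)}   # 1993: (n, cost/case)
    new  = {'DRG_A': (110, 2100.0), 'DRG_B': (45, 5600.0)}   # 1994

    def total_cost(d):
        return sum(n * c for n, c in d.values())

    total_var = total_cost(new) - total_cost(base)
    # Volume effect: scale all base volumes by the overall growth in case numbers.
    growth = sum(n for n, _ in new.values()) / sum(n for n, _ in base.values())
    volume_var = (growth - 1) * total_cost(base)
    # Cost-per-case effect: new volumes, change in unit cost.
    price_var = sum(n_new * (c_new - c_old)
                    for (n_new, c_new), (_, c_old) in zip(new.values(), base.values()))
    mix_var = total_var - volume_var - price_var   # remainder = case-mix shift
    print(total_var, volume_var, price_var, mix_var)
    ```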

  5. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from -125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to -170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
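
    The final combination step is simply λ(T) = a(T)·ρ(T)·c_p(T). For instance, with assumed room-temperature PTFE values:

    ```python
    # Thermal conductivity from flash diffusivity, density, and specific heat.
    a = 0.11e-6      # thermal diffusivity, m^2/s (assumed value)
    rho = 2160.0     # density, kg/m^3
    cp = 1050.0      # specific heat, J/(kg K)
    lam = a * rho * cp
    print(f"thermal conductivity: {lam:.3f} W/(m K)")   # ~0.25 W/(m K)
    ```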

  6. Mathematical analysis techniques for modeling the space network activities

    NASA Technical Reports Server (NTRS)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.

  7. Refinement of Techniques for Metallographic Analysis of Highly Dispersed Structures

    NASA Astrophysics Data System (ADS)

    Khammatov, A.; Belkin, D.; Barbina, N.

    2016-01-01

    Flaws are regularly introduced while developing standards and technical specifications. They can appear as minor misprints or as an insufficient description of a technique. Although such flaws are well known, changes to the standards are rarely introduced. This paper shows that the normative documents need to clarify the requirements for the metallurgical microscopes used in the analysis of finely dispersed structures.

  8. Scalable Analysis Techniques for Microprocessor Performance Counter Metrics

    SciTech Connect

    Ahn, D H; Vetter, J S

    2002-07-24

    Contemporary microprocessors provide a rich set of integrated performance counters that allow application developers and system architects alike the opportunity to gather important information about workload behaviors. These counters can capture instruction, memory, and operating system behaviors. Current techniques for analyzing data produced from these counters use raw counts, ratios, and visualization techniques to help users make decisions about their application source code. While these techniques are appropriate for analyzing data from one process, they do not scale easily to new levels demanded by contemporary computing systems. Indeed, the amount of data generated by these experiments is on the order of tens of thousands of data points. Furthermore, if users execute multiple experiments, then we add yet another dimension to this already knotty picture. This flood of multidimensional data can swamp efforts to harvest important ideas from these valuable counters. Very simply, this paper addresses these concerns by evaluating several multivariate statistical techniques on these datasets. We find that several techniques, such as statistical clustering, can automatically extract important features from this data. These derived results can, in turn, be fed directly back to an application developer, or used as input to a more comprehensive performance analysis environment, such as a visualization or an expert system.
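
    A small illustration of the clustering idea with scikit-learn: each row is one process's vector of counter-derived metrics, and k-means groups similar behaviors so only cluster summaries need inspection. The two synthetic behavior classes are invented:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    # Synthetic per-process metrics: [IPC, L1 miss rate, branch miss rate]
    compute_bound = rng.normal([1.8, 0.02, 0.01], 0.05, size=(500, 3))
    memory_bound = rng.normal([0.4, 0.30, 0.02], 0.05, size=(500, 3))
    metrics = np.vstack([compute_bound, memory_bound])

    X = StandardScaler().fit_transform(metrics)      # counters have mixed scales
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    for k in range(2):
        print(k, metrics[labels == k].mean(axis=0).round(3))  # cluster summaries
    ```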

  9. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA. Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using the coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  10. Preliminary assessment of aerial photography techniques for canvasback population analysis

    USGS Publications Warehouse

    Munro, R.E.; Trauger, D.L.

    1976-01-01

    Recent intensive research on the canvasback has focused attention on the need for more precise estimates of population parameters. During the 1972-75 period, various types of aerial photographing equipment were evaluated to determine the problems and potentials for employing these techniques in appraisals of canvasback populations. The equipment and procedures available for automated analysis of aerial photographic imagery were also investigated. Serious technical problems remain to be resolved, but some promising results were obtained. Final conclusions about the feasibility of operational implementation await a more rigorous analysis of the data collected.

  11. Residual stress measurement and analysis using ultrasonic techniques.

    NASA Technical Reports Server (NTRS)

    Noronha, P. J.; Chapman, J. R.; Wert, J. J.

    1973-01-01

    A technique which utilizes ultrasonic radiation has been developed to measure residual stresses in metals. This technique makes it possible to detect and measure the magnitude of the principal stresses and also to obtain their direction. The velocities of ultrasonic waves in materials are measured as the time to travel a fixed path length, and the change in transit time is related to the applied stress. The linear relationship obtained allows a procedure based on this principle to be used for the measurement of residual stress using surface waves and shear waves. A method for plotting stress profiles through a material using surface waves uses varying frequencies for the ultrasonic wave. A limitation of the shear wave method is considered. The system used for this technique is called the Modified Time of Flight System.
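
    The acoustoelastic measurement reduces to a linear relation between stress and the fractional change in transit time, σ = K·Δt/t₀, with K a material constant obtained from a calibration loading test. A quick calculation with hypothetical values:

    ```python
    # Acoustoelastic stress estimate from an ultrasonic time-of-flight change.
    t0 = 10.000e-6        # unstressed transit time over the fixed path, s
    t = 10.008e-6         # measured transit time, s
    K = -110e9            # acoustoelastic constant, Pa (hypothetical calibration)
    stress = K * (t - t0) / t0
    print(f"residual stress: {stress / 1e6:.0f} MPa")   # -88 MPa (compressive)
    ```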

  12. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Functional modeling techniques or object-oriented graphical representations: which are more useful to someone trying to understand the general design or high-level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development, while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  13. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis. PMID:24114889

  14. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search only by address matching based on descriptive data. In addition, there are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, searching for places based on their location, non-point representation of results, as well as displaying search results based on their priority.
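
    The core of the method, fuzzy distance maps combined by overlay, can be sketched on a small grid: each requested place gets a membership surface decaying with distance, and a fuzzy AND (minimum) overlay ranks cells near all places at once. Grid size and decay parameters here are illustrative:

    ```python
    import numpy as np

    def fuzzy_nearness(grid_x, grid_y, px, py, halfwidth=2.0):
        """Membership in 'near (px, py)', decaying smoothly with distance."""
        d = np.hypot(grid_x - px, grid_y - py)
        return 1.0 / (1.0 + (d / halfwidth) ** 2)   # bell-shaped membership

    y, x = np.mgrid[0:50, 0:50]
    near_school = fuzzy_nearness(x, y, 10, 12)      # hypothetical places
    near_park = fuzzy_nearness(x, y, 30, 25)

    overlay = np.minimum(near_school, near_park)    # fuzzy AND of both criteria
    best = np.unravel_index(overlay.argmax(), overlay.shape)
    print("best cell (row, col):", best, "membership:", overlay.max().round(3))
    ```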

  15. Flow management techniques for base and afterbody drag reduction

    NASA Astrophysics Data System (ADS)

    Viswanath, P. R.

    The problem of turbulent base flows and the drag associated with it have been of significant interest in missile as well as fighter aircraft design. Numerous studies in the literature have been devoted to aspects of reducing base drag on two-dimensional as well as on axisymmetric bodies. This paper presents a review of the developments that have taken place on the use of passive techniques or devices for axisymmetric base and net afterbody drag reduction in the absence of jet flow at the base. In particular, the paper discusses the effectiveness of base cavities, ventilated cavities, locked vortex afterbodies, multi-step afterbodies and afterbodies employing a non-axisymmetric boat-tailing concept for base and net drag reduction in different speed regimes. The broad features of the flow and the likely fluid-dynamical mechanisms associated with the device leading to base drag reduction are highlighted. Flight-test results assessing the effectiveness of some of the devices are compared with data from wind tunnels. The present survey indicates that base and net afterbody drag reduction of considerable engineering significance in aerospace applications can be achieved by various passive devices even when the (unmanipulated) base flow is not characterised by vortex shedding.

  16. Wavelet-based techniques for the gamma-ray sky

    NASA Astrophysics Data System (ADS)

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias; Lee, Samuel K.

    2016-07-01

    We demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
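
    A minimal version of the scale separation on a two-dimensional map with PyWavelets: decompose, zero the coefficients at unwanted scales, and reconstruct, leaving only structure in a chosen band of angular scales. The counts map is random, standing in for gamma-ray data:

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(3)
    sky = rng.poisson(5.0, size=(128, 128)).astype(float)   # stand-in counts map

    coeffs = pywt.wavedec2(sky, 'bior2.2', level=4)

    # Zero the smooth map and all but one detail level (coarse-to-fine order).
    filtered = [np.zeros_like(coeffs[0])]
    for j, detail in enumerate(coeffs[1:], start=1):
        keep = (j == 2)                 # pass band: the second-coarsest scale
        filtered.append(tuple(d if keep else np.zeros_like(d) for d in detail))

    band_map = pywt.waverec2(filtered, 'bior2.2')
    print(band_map.shape, band_map.std())   # structure only at the kept scale
    ```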

  17. Hydrocarbon microseepage mapping using signature based target detection techniques

    NASA Astrophysics Data System (ADS)

    Soydan, Hilal; Koz, Alper; Şebnem Düzgün, H.; Aydin Alatan, A.

    2015-10-01

    In this paper, we compare conventional methods for detecting hydrocarbon seepage anomalies with signature based detection algorithms. The Crosta technique [1] is selected as a baseline in the experimental comparisons for the conventional approach. The Crosta technique utilizes the characteristic bands of the searched target for principal component transformation in order to determine the components characterizing the target of interest. The Desired Target Detection and Classification Algorithm (DTDCA), Spectral Matched Filter (SMF), and Normalized Correlation (NC) are employed for signature based target detection. Signature based target detection algorithms are applied to the whole spectrum, benefiting from the information stored in all spectral bands. The selected methods are applied to a multispectral Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image of the study region, with an atmospheric correction prior to running the algorithms. ASTER provides multispectral bands covering the visible, shortwave, and thermal infrared regions, which serves as a useful tool for the interpretation of areas with hydrocarbon anomalies. The exploration area is Gemrik Anticline, located in South East Anatolia, Adıyaman, Bozova Oil Field, where microseeps can be observed with almost no vegetation cover. The spectral signatures collected with an Analytical Spectral Devices Inc. (ASD) spectrometer from the reference valley [2] have been utilized as input to the signature based detection algorithms. The experiments have indicated that DTDCA and SMF outperform the Crosta technique by locating the microseepage patterns along the migration pathways with better contrast. On the other hand, NC has not been able to map the searched target with a visible distinction. It is concluded that signature based algorithms can be more effective than conventional methods for the detection of microseepage induced anomalies.
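
    The spectral matched filter has a standard closed form: with background mean μ and covariance Σ estimated from the scene, each pixel x is scored by (s-μ)ᵀΣ⁻¹(x-μ), normalized so the score equals 1 at x = s. A sketch on a random cube with ASTER-like dimensions:

    ```python
    import numpy as np

    def spectral_matched_filter(cube, target):
        """SMF score per pixel, normalized to 1 when the pixel equals the target."""
        h, w, b = cube.shape
        X = cube.reshape(-1, b)
        mu = X.mean(axis=0)
        Sigma = np.cov(X, rowvar=False) + 1e-6 * np.eye(b)   # regularized
        w_f = np.linalg.solve(Sigma, target - mu)            # filter vector
        scores = (X - mu) @ w_f / ((target - mu) @ w_f)      # normalization
        return scores.reshape(h, w)

    rng = np.random.default_rng(5)
    cube = rng.normal(0.2, 0.02, size=(64, 64, 14))          # 14 ASTER-like bands
    target = rng.normal(0.25, 0.02, size=14)                 # field-measured signature
    scores = spectral_matched_filter(cube, target)
    print((scores > 0.5).sum(), "candidate pixels")          # threshold is arbitrary
    ```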

  18. Novel optical password security technique based on optical fractal synthesizer

    NASA Astrophysics Data System (ADS)

    Wu, Kenan; Hu, Jiasheng; Wu, Xu

    2009-06-01

    A novel optical security technique for safeguarding user passwords based on an optical fractal synthesizer is proposed. A validating experiment has been carried out. In the proposed technique, a user password is protected by being converted to a fractal image. When a user sets up a new password, the password is transformed into a fractal pattern, and the fractal pattern is stored by the authority. When the user is validated online, his or her password is converted to a fractal pattern again to compare with the previously stored fractal pattern. The converting process is called the fractal encoding procedure, which consists of two steps. First, the password is nonlinearly transformed to get the parameters for the optical fractal synthesizer. Then the optical fractal synthesizer is operated to generate the output fractal image. The experimental result proves the validity of our method. The proposed technique bridges the gap between digital security systems and optical security systems and has many advantages, such as a high security level, convenience, flexibility, and hyper extensibility. This provides an interesting optical security technique for the protection of digital passwords.

  19. Manifold learning techniques for the analysis of hyperspectral ocean data

    NASA Astrophysics Data System (ADS)

    Gillis, David; Bowles, Jeffrey; Lamela, Gia M.; Rhea, William J.; Bachmann, Charles M.; Montes, Marcos; Ainsworth, Tom

    2005-06-01

    A useful technique in hyperspectral data analysis is dimensionality reduction, which replaces the original high dimensional data with low dimensional representations. Usually this is done with linear techniques such as linear mixing or principal components (PCA). While often useful, there is no a priori reason for believing that the data is actually linear. Lately there has been renewed interest in modeling high dimensional data using nonlinear techniques such as manifold learning (ML). In ML, the data is assumed to lie on a low dimensional, possibly curved surface (or manifold). The goal is to discover this manifold and therefore find the best low dimensional representation of the data. Recently, researchers at the Naval Research Lab have begun to model hyperspectral data using ML. We continue this work by applying ML techniques to hyperspectral ocean water data. We focus on water since there are underlying physical reasons for believing that the data lies on a certain type of nonlinear manifold. In particular, ocean data is influenced by three factors: the water parameters, the bottom type, and the depth. For fixed water and bottom types, the spectra that arise by varying the depth will lie on a nonlinear, one dimensional manifold (i.e. a curve). Generally, water scenes will contain a number of different water and bottom types, each combination of which leads to a distinct curve. In this way, the scene may be modeled as a union of one dimensional curves. In this paper, we investigate the use of manifold learning techniques to separate the various curves, thus partitioning the scene into homogeneous areas. We also discuss ways in which these techniques may be able to derive various scene characteristics such as bathymetry.
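
    With scikit-learn, an Isomap embedding of simulated shallow-water spectra illustrates how a one-dimensional manifold parameterized by depth can be recovered; the two-term radiative model below is a crude stand-in, not the Naval Research Lab's model:

    ```python
    import numpy as np
    from sklearn.manifold import Isomap

    rng = np.random.default_rng(9)
    depths = rng.uniform(0.5, 10.0, 300)                   # m
    wavelengths = np.linspace(400, 700, 60)                # nm

    # Crude stand-in: bottom reflectance attenuated by depth-dependent
    # absorption, plus a water-column term and sensor noise.
    k = 0.02 + 0.04 * (wavelengths - 400) / 300            # attenuation per band
    spectra = (0.3 * np.exp(-2 * np.outer(depths, k))      # bottom contribution
               + 0.05                                      # water-column term
               + rng.normal(0, 0.002, (300, 60)))          # noise

    embedding = Isomap(n_neighbors=10, n_components=1).fit_transform(spectra)
    # The recovered 1-D coordinate should track depth almost monotonically:
    print(abs(np.corrcoef(embedding[:, 0], depths)[0, 1]).round(3))
    ```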

  20. The analysis of unsteady wind turbine data using wavelet techniques

    SciTech Connect

    Slepski, J.E.; Kirchhoff, R.H.

    1995-09-01

    Wavelet analysis employs a relatively new technique which decomposes a signal into wavelets of finite length. A wavelet map is generated showing the distribution of signal variance in both the time and frequency domain. The first section of this paper begins with an introduction to wavelet theory, contrasting it to standard Fourier analysis. Some simple applications to the processing of harmonic signals are then given. Since wind turbines operate under unsteady stochastic loads, the time series of most machine parameters are non-stationary; wavelet analysis can be applied to this problem. In the second section of this paper, wavelet methods are used to examine data from Phase 2 of the NREL Combined Experiment. Data analyzed includes airfoil surface pressure, and low speed shaft torque. In each case the wavelet map offers valuable insight that could not be made without it.
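
    The wavelet map itself is an array of coefficient magnitudes over time and scale; PyWavelets' continuous transform produces one in a few lines. The test signal below, a steady rotor-frequency tone plus a short gust-like burst, is synthetic:

    ```python
    import numpy as np
    import pywt

    fs = 100.0                                   # sample rate, Hz
    t = np.arange(0, 30, 1 / fs)
    signal = np.sin(2 * np.pi * 1.5 * t)         # steady low-frequency component
    signal[1500:1600] += 5.0 * np.sin(2 * np.pi * 10 * t[1500:1600])  # burst

    scales = np.arange(1, 128)
    coeffs, freqs = pywt.cwt(signal, scales, 'morl', sampling_period=1 / fs)
    power = np.abs(coeffs) ** 2                  # the wavelet map: variance vs (f, t)

    # Where and at what frequency is the variance concentrated?
    i, j = np.unravel_index(power.argmax(), power.shape)
    print(f"peak at t = {t[j]:.1f} s, f = {freqs[i]:.1f} Hz")  # near 15 s, 10 Hz
    ```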

  1. Detection of arterial disorders by spectral analysis techniques.

    PubMed

    Ubeyli, Elif Derya

    2007-01-01

    This paper presents an integrated view of spectral analysis techniques in the detection of arterial disorders. The paper includes illustrative information about feature extraction from signals recorded from arteries. The short-time Fourier transform (STFT) and wavelet transform (WT) were used for spectral analysis of ophthalmic arterial (OA) Doppler signals. Using these spectral analysis methods, the variations in the shape of the Doppler spectra as a function of time were presented in the form of sonograms in order to obtain medical information. These sonograms were then used to compare the applied methods in terms of their frequency resolution and their effects on the determination of OA stenosis. The author suggests that the content of the paper will assist readers in gaining a better understanding of the STFT and WT in the detection of arterial disorders. PMID:17502695

  2. Crankshaft stress analysis; Combination of finite element and classical analysis techniques

    SciTech Connect

    Heath, A.R.; McNamara, P.M.

    1990-07-01

    The conflicting legislative and customer pressures on engine design, for example, combining low friction and a high level of refinement, require sophisticated tools if competitive designs are to be realized. This is particularly true of crankshafts, probably the most analyzed of all engine components. This paper describes the hierarchy of methods used for crankshaft stress analysis with case studies. A computer-based analysis system is described that combines FE and classical methods to allow optimized designs to be produced efficiently. At the lowest level simplified classical techniques are integrated into the CAD-based design process. These methods give the rapid feedback necessary to perform concept design iterations. Various levels of FE analysis are available to carry out more detailed analyses of the crankshaft. The FE studies may feed information to or take information from the classical methods. At the highest level a method for including the load sharing effects of the flexible crankshaft within a flexible block interconnected by nonlinear oil films is described.

  3. Analysis techniques for two-dimensional infrared data

    NASA Technical Reports Server (NTRS)

    Winter, E. M.; Smith, M. C.

    1978-01-01

    In order to evaluate infrared detection and remote sensing systems, it is necessary to know the characteristics of the observational environment. For both scanning and staring sensors, the spatial characteristics of the background may be more of a limitation to the performance of a remote sensor than system noise. This limitation is the so-called spatial clutter limit and may be important for systems design of many earth application and surveillance sensors. The data used in this study is two dimensional radiometric data obtained as part of the continuing NASA remote sensing programs. Typical data sources are the Landsat multi-spectral scanner (1.1 micrometers), the airborne heat capacity mapping radiometer (10.5 - 12.5 micrometers) and various infrared data sets acquired by low altitude aircraft. Techniques used for the statistical analysis of one dimensional infrared data, such as power spectral density (PSD), exceedance statistics, etc. are investigated for two dimensional applicability. Also treated are two dimensional extensions of these techniques (2D PSD, etc.), and special techniques developed for the analysis of 2D data.
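
    The two-dimensional PSD extension is direct: take the squared magnitude of the 2-D FFT of a mean-removed, windowed image, then reduce it to a radial average so background clutter power can be read as a function of spatial frequency. A sketch:

    ```python
    import numpy as np

    def radial_psd(image):
        """2-D power spectral density, radially averaged over spatial frequency."""
        img = image - image.mean()                       # remove the DC term
        win = np.outer(np.hanning(img.shape[0]), np.hanning(img.shape[1]))
        F = np.fft.fftshift(np.fft.fft2(img * win))
        psd2d = np.abs(F) ** 2
        cy, cx = np.array(psd2d.shape) // 2
        y, x = np.indices(psd2d.shape)
        r = np.hypot(y - cy, x - cx).astype(int)         # integer radial bins
        return np.bincount(r.ravel(), weights=psd2d.ravel()) / np.bincount(r.ravel())

    scene = np.random.default_rng(2).normal(size=(256, 256))  # stand-in background
    psd = radial_psd(scene)
    print(psd[:5])   # clutter power at the lowest spatial frequencies
    ```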

  4. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  5. New acquisition techniques and statistical analysis of bubble size distributions

    NASA Astrophysics Data System (ADS)

    Proussevitch, A.; Sahagian, D.

    2005-12-01

    Various approaches have been taken to solve the long-standing problem of determining size distributions of objects embedded in an opaque medium. In the case of vesicles in volcanic rocks, the most reliable technique is 3-D imagery by computed X-ray tomography. However, this method is expensive, requires intensive computational resources, and is thus of limited availability to many investigators. As a cheaper alternative, 2-D cross-sectional data are commonly available but require stereological analysis for 3-D conversion. Stereology techniques for spherical bubbles are quite robust, but elongated non-spherical bubbles require complicated conversion approaches and large observed populations. We have revised computational schemes of applying non-spherical stereology for practical analysis of bubble size distributions. The basic idea of this new approach is to exclude from the conversion those classes (bins) of non-spherical bubbles whose cross-section probability distribution exceeds a maximum value that depends on the mean aspect ratio. Thus, in contrast to traditional stereological techniques, larger bubbles are "predicted" from the rest of the population. As a proof of principle, we have compared distributions so obtained with direct 3-D imagery (X-ray tomography) for non-spherical bubbles from the same samples of vesicular basalts collected from the Colorado Plateau. The results of the comparison demonstrate that in cases where X-ray tomography is impractical, stereology can be used with reasonable reliability, even for non-spherical vesicles.
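
    For background, the classical spherical case that such schemes generalize can be sketched as follows. For a sphere of radius R cut by a random plane, the probability of a section radius in [r1, r2] is (sqrt(R^2 - r1^2) - sqrt(R^2 - r2^2))/R, and section counts per unit area relate to volumetric densities through N_A = N_V x D. A minimal back-substitution sketch (assumed spherical bubbles; this is not the authors' non-spherical scheme, and noisy data can yield negative class counts):

        import numpy as np

        def unfold_spherical(n_a, edges):
            """Recover volumetric size densities N_V from areal section counts N_A.

            n_a[i] : number of cross-sections per unit area with radius in bin i
            edges  : bin edges shared by section radii and sphere radii
            """
            k = len(n_a)
            P = np.zeros((k, k))       # P[i, j] = P(section in bin i | sphere in bin j)
            R = np.asarray(edges[1:], float)   # represent each class by its upper radius
            for j in range(k):
                for i in range(j + 1):
                    r1, r2 = min(edges[i], R[j]), min(edges[i + 1], R[j])
                    P[i, j] = (np.sqrt(R[j]**2 - r1**2) - np.sqrt(R[j]**2 - r2**2)) / R[j]
            # N_A[i] = sum_j N_V[j] * (2 R[j]) * P[i, j]; the sampling length of a
            # sphere is its diameter.  The system is upper triangular, so larger
            # size classes are solved first.
            A = P * (2.0 * R)[None, :]
            return np.linalg.solve(A, np.asarray(n_a, float))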

  6. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample so that near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  7. An osmolyte-based micro-volume ultrafiltration technique.

    PubMed

    Ghosh, Raja

    2014-12-01

    This paper discusses a novel, simple, and inexpensive micro-volume ultrafiltration technique for protein concentration, desalting, buffer exchange, and size-based protein purification. The technique is suitable for processing protein samples in a high-throughput mode. It utilizes a combination of capillary action, and osmosis for drawing water and other permeable species from a micro-volume sample droplet applied on the surface of an ultrafiltration membrane. A macromolecule coated on the permeate side of the membrane functions as the osmolyte. The action of the osmolyte could, if required, be augmented by adding a supersorbent polymer layer over the osmolyte. The mildly hydrophobic surface of the polymeric ultrafiltration membrane used in this study minimized sample droplet spreading, thus making it easy to recover the retained material after separation, without sample interference and cross-contamination. High protein recoveries were observed in the micro-volume ultrafiltration experiments described in the paper. PMID:25284741

  8. New modulation-based watermarking technique for video

    NASA Astrophysics Data System (ADS)

    Lemma, Aweke; van der Veen, Michiel; Celik, Mehmet

    2006-02-01

    Successful watermarking algorithms have already been developed for various applications ranging from meta-data tagging to forensic tracking. Nevertheless, it is worthwhile to develop alternative watermarking techniques that provide a broader basis for meeting emerging services, usage models and security threats. To this end, we propose a new multiplicative watermarking technique for video, which is based on the principles of our successful MASK audio watermark. Audio-MASK embeds the watermark by modulating the short-time envelope of the audio signal and performs detection using a simple envelope detector followed by a SPOMF (symmetrical phase-only matched filter). Video-MASK takes a similar approach and modulates the image luminance envelope. In addition, it incorporates a simple model to account for the luminance sensitivity of the HVS (human visual system). Preliminary tests show the algorithm's transparency and robustness to lossy compression.
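
    The SPOMF detection stage is easy to illustrate. A minimal sketch (the Video-MASK envelope modulation and HVS model themselves are not reproduced here):

        import numpy as np

        def spomf(received, reference):
            """Symmetrical phase-only matched filter: correlate using only the
            phase of both spectra, which sharpens the correlation peak."""
            R = np.fft.fft2(received)
            W = np.fft.fft2(reference)
            cross = R * np.conj(W)
            cross /= np.abs(cross) + 1e-12      # keep phase, discard magnitude
            return np.real(np.fft.ifft2(cross))

        # A pronounced peak in spomf(frame_envelope, watermark_pattern) indicates
        # the watermark's presence (and its translation offset).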

  9. Characterization of high resolution MR images reconstructed by a GRAPPA based parallel technique

    NASA Astrophysics Data System (ADS)

    Banerjee, Suchandrima; Majumdar, Sharmila

    2006-03-01

    This work implemented an auto-calibrating parallel imaging technique and applied it to in vivo magnetic resonance imaging (MRI) of trabecular bone micro-architecture. A Generalized auto-calibrating partially parallel acquisition (GRAPPA) based reconstruction technique using modified robust data fitting was developed. The MR data was acquired with an eight channel phased array receiver on three normal volunteers on a General Electric 3 Tesla scanner. Microstructures comprising the trabecular bone architecture are of the order of 100 microns and hence their depiction requires very high imaging resolution. This work examined the effects of GRAPPA based parallel imaging on signal and noise characteristics and effective spatial resolution in high resolution (HR) images, for the range of undersampling or reduction factors 2-4. Additionally, quantitative analysis was performed to obtain structural measures of trabecular bone from the images. Image quality in terms of contrast and depiction of structures was maintained in parallel images for reduction factors up to 3. Comparison between regular and parallel images suggested similar spatial resolution for both. However, differences in noise characteristics in parallel images compared to regular images affected the thresholding based quantification. This suggested that GRAPPA based parallel images might require different analysis techniques. In conclusion, the study showed the feasibility of using parallel imaging techniques in HR-MRI of trabecular bone, although quantification strategies will have to be further investigated. Reduction of acquisition time using parallel techniques can improve the clinical feasibility of MRI of trabecular bone for prognosis and staging of the skeletal disorder osteoporosis.

  10. Vision based techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight, are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except at regions close to the focus of expansion (FOE). Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.

  11. Laser ablation in liquids as a new technique of sampling in elemental analysis of solid materials

    NASA Astrophysics Data System (ADS)

    Muravitskaya, E. V.; Rosantsev, V. A.; Belkov, M. V.; Ershov-Pavlov, E. A.; Klyachkovskaya, E. V.

    2009-02-01

    Laser ablation in liquid media is considered as a new sample preparation technique for elemental composition analysis of materials using optical emission spectroscopy of inductively coupled plasma (ICP-OES). Solid samples are transformed into uniform colloidal solutions of nanosized analyte particles using laser radiation focused onto the sample surface. The high homogeneity of the resulting solution allows ICP-OES quantitative analysis to be performed, especially for samples that are poorly soluble in acids. The technique is compatible with conventional solution-based standards.

  12. A Novel Graph Based Fuzzy Clustering Technique For Unsupervised Classification Of Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Banerjee, B.; Krishna Moohan, B.

    2014-11-01

    This paper addresses the problem of unsupervised land-cover classification of multi-spectral remotely sensed images in the context of self-learning by exploring different graph based clustering techniques hierarchically. The only assumption used here is that the number of land-cover classes is known a priori. The object based image analysis paradigm, which processes a given image at different levels, has emerged as a popular alternative to pixel based approaches for remote sensing image segmentation, considering the high spatial resolution of the images. A graph based fuzzy clustering technique is proposed here to obtain a better merging of an initially oversegmented image in the spectral domain compared to conventional clustering techniques. Instead of using the Euclidean distance measure, the cumulative graph edge weight is used to find the distance between a pair of points to better cope with the topology of the feature space. In order to handle uncertainty in assigning class labels to pixels, which is not always a crisp allocation for remote sensing data, a fuzzy set theoretic technique is incorporated into the graph based clustering. A Minimum Spanning Tree (MST) based clustering technique is used to over-segment the image at the first level. Furthermore, considering that the spectral signatures of different land-cover classes may overlap significantly, a self-learning based Maximum Likelihood (ML) classifier coupled with an Expectation Maximization (EM) based iterative unsupervised parameter retraining scheme is used to generate the final land-cover classification map. Results on two medium resolution images establish the superior performance of the proposed technique in comparison to the traditional fuzzy c-means clustering technique.
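
    The first-level MST over-segmentation could look like the following sketch (illustrative only; the paper's cumulative edge-weight distance and fuzzy merging are not reproduced, and at least two segments are assumed):

        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
        from scipy.spatial.distance import pdist, squareform

        def mst_oversegment(features, n_segments):
            """Cut the (n_segments - 1) longest MST edges to over-segment
            pixels/objects described by spectral feature vectors."""
            d = squareform(pdist(features))              # dense pairwise distances
            mst = minimum_spanning_tree(d).toarray()
            cut = np.sort(mst[mst > 0])[-(n_segments - 1):].min()
            mst[mst >= cut] = 0                          # remove the longest edges
            n_comp, labels = connected_components(mst, directed=False)
            return labels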

  13. Recording and analysis techniques for high-frequency oscillations

    PubMed Central

    Worrell, G.A.; Jerbi, K.; Kobayashi, K.; Lina, J.M.; Zelmann, R.; Le Van Quyen, M.

    2013-01-01

    In recent years, new recording technologies have advanced such that, at high temporal and spatial resolutions, high-frequency oscillations (HFO) can be recorded in human partial epilepsy. However, because of the deluge of multichannel data generated by these experiments, achieving the full potential of parallel neuronal recordings depends on the development of new data mining techniques to extract meaningful information relating to time, frequency and space. Here, we aim to bridge this gap by focusing on up-to-date recording techniques for measurement of HFO and new analysis tools for their quantitative assessment. In particular, we emphasize how these methods can be applied, what properties might be inferred from neuronal signals, and potentially productive future directions. PMID:22420981
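
    A representative energy-based HFO detector from this literature (a common Staba-style scheme, offered as an illustrative sketch rather than a method proposed by the review; the sampling rate must exceed 1 kHz for the assumed band):

        import numpy as np
        from scipy.signal import butter, sosfiltfilt

        def detect_hfo(eeg, fs, band=(80.0, 500.0), win_s=0.01, k=5.0):
            """Band-pass the trace, compute a short-window RMS envelope, and
            flag excursions above k standard deviations of the envelope."""
            sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
            x = sosfiltfilt(sos, eeg)
            w = max(1, int(win_s * fs))
            rms = np.sqrt(np.convolve(x**2, np.ones(w) / w, mode="same"))
            thresh = rms.mean() + k * rms.std()
            return rms > thresh                  # boolean mask of candidate HFOs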

  14. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. Thermogravimetric (TG) techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. Temperature-programmed desorption (TPD) was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  15. Methodologies and techniques for analysis of network flow data

    SciTech Connect

    Bobyshev, A.; Grigoriev, M.; /Fermilab

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

  16. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally considered, considerations affecting the clustering of minicomputers, low-cost computer systems for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  17. The future of magnetic resonance-based techniques in neurology.

    PubMed

    2001-01-01

    Magnetic resonance techniques have become increasingly important in neurology for defining: 1. brain, spinal cord and peripheral nerve or muscle structure; 2. pathological changes in tissue structures and properties; and 3. dynamic patterns of functional activation of the brain. New applications have been driven in part by advances in hardware, particularly improvements in magnet and gradient coil design. New imaging strategies allow novel approaches to contrast with, for example, diffusion imaging, magnetization transfer imaging, perfusion imaging and functional magnetic resonance imaging. In parallel with developments in hardware and image acquisition have been new approaches to image analysis. These have allowed quantitative descriptions of the image changes to be used for a precise, non-invasive definition of pathology. With the increasing capabilities and specificity of magnetic resonance techniques it is becoming more important that the neurologist is intimately involved in both the selection of magnetic resonance studies for patients and their interpretation. There is a need for considerably improved access to magnetic resonance technology, particularly in the acute or intensive care ward and in the neurosurgical theatre. This report illustrates several key developments. The task force concludes that magnetic resonance imaging is a major clinical tool of growing significance and offers recommendations for maximizing the potential future for magnetic resonance techniques in neurology. PMID:11509077

  18. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    The paper deals with the infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper also describes an anomaly edge detection technique, the half-max technique, which is used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, all of which are based on the models provided in this paper.
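
    The half-max idea reduces to locating the two half-peak crossings of a contrast line profile. A minimal sketch (the NASA tool's exact implementation is not shown):

        import numpy as np

        def half_max_width(profile, pixel_pitch):
            """Estimate indication width from a 1-D contrast profile by finding
            where the curve crosses half its peak, interpolating linearly."""
            p = np.asarray(profile, float)
            half = 0.5 * p.max()
            above = np.where(p >= half)[0]
            i0, i1 = above[0], above[-1]
            # interpolate the left and right half-maximum crossings
            left = i0 - (p[i0] - half) / (p[i0] - p[i0 - 1]) if i0 > 0 else i0
            right = i1 + (p[i1] - half) / (p[i1] - p[i1 + 1]) if i1 < len(p) - 1 else i1
            return (right - left) * pixel_pitch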

  19. Nuclear and radiochemical techniques in chemical analysis. Final report

    SciTech Connect

    Finston, H.L.; Williams, E.T.

    1981-06-01

    The areas studied during the period of the contract included determinations of cross sections for nuclear reactions, determination of neutron capture cross sections of radionuclides, application of special activation techniques and x-ray counting, elucidation of synergic solvent extraction mechanisms, development of new solvent extraction techniques, and the development of a PIXE analytical facility. The thermal neutron capture cross section of ²²Na was determined, and cross sections and energy levels were determined for ²⁰Ne(n,α)¹⁷O, ²⁰Ne(n,p)²⁰F, and ⁴⁰Ar(n,α)³⁷S. Inelastic scattering with 2 to 3 MeV neutrons followed by counting of the metastable states permits analysis of the following elements: In, Sr, Cd, Hg, and Pb. Bromine can be detected in the presence of a 500-fold excess of Na and/or K by thermal neutron activation and x-ray counting, and as little as 0.3 × 10⁻⁹ g of Hg can be detected by this technique. Medium energy neutrons (10 to 160 MeV) have been used to determine Tl, Pb, and Bi by (n,xn) and (n,pxn) reactions. The reaction ¹⁹F(p,α)¹⁶O has been used to determine as little as 50 μmol of Freon-14. Mechanisms for synergic solvent extractions have been elucidated, and a new technique of homogeneous liquid-liquid solvent extraction has been developed in which the neutral complex is rapidly extracted into propylene carbonate by raising and lowering the temperature of the system. An external-beam PIXE system has been developed for trace element analyses of a variety of sample types. Various sample preparation techniques have been applied to a diverse range of samples including marine sediment, coral, coal, and blood.

  20. Antimisting kerosene: Base fuel effects, blending and quality control techniques

    NASA Technical Reports Server (NTRS)

    Yavrouian, A. H.; Ernest, J.; Sarohia, V.

    1984-01-01

    The problems associated with blending of the AMK additive with Jet A, and the base fuel effects on AMK properties, are addressed. The results from the evaluation of some of the quality control techniques for AMK are presented. The principal conclusions of this investigation are: significant compositional differences exist for base fuel (Jet A) within the ASTM specification D1655; higher aromatic content of the base fuel was found to be beneficial for polymer dissolution at ambient (20 C) temperature; using static mixer technology, the antimisting additive (FM-9) is in-line blended with Jet A, producing AMK which has adequate fire-protection properties 15 to 20 minutes after blending; degradability of freshly blended and equilibrated AMK indicated that maximum degradability is reached after adequate fire protection is obtained; the results of AMK degradability as measured by filter ratio confirmed previous RAE data that power requirements to degrade freshly blended AMK are significantly higher than for equilibrated AMK; blending of the additive by using FM-9 concentrate in Jet A produces equilibrated AMK almost instantly; nephelometry offers a simple continuous monitoring capability and is used as a real-time quality control device for AMK; and trajectory (jet thrust) and pressure drop tests are useful laboratory techniques for evaluating AMK quality.

  1. A modal impedance technique for mid and high frequency analysis of an uncertain stiffened composite plate

    NASA Astrophysics Data System (ADS)

    Seçgin, A.; Kara, M.; Ozankan, A.

    2016-03-01

    A modal impedance technique is introduced for mid-frequency vibration analyses. The approach is mainly based on statistical energy analysis (SEA); however, loss factors are determined not only from driving-point mobilities but also from transfer mobilities. The mobilities are computed by finite element modal analysis. The technique takes geometrical complexity and boundary conditions into account to handle their mid-frequency effects. It is applied to a stiffened composite plate having randomized mass, i.e., an uncertain plate. For verification, several numerical and experimental tests are performed. Internal damping of subsystems is evaluated using power injection and is then fed to finite element software to perform numerical analyses. Monte Carlo simulation is employed for the uncertainty analyses. To imitate plate mass heterogeneity, many small masses are used in both the numerical and experimental analyses. It is shown that the proposed technique can reliably be used for vibration analyses of uncertain complex structures from the mid- to high-frequency regions.
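
    The SEA backbone that the technique builds on is a linear power balance among subsystem energies. A two-subsystem sketch (illustrative; the paper's mobility-based loss-factor estimation is not reproduced):

        import numpy as np

        def sea_energies(omega, eta, eta_c, power_in):
            """Solve the two-subsystem SEA power balance for energies E1, E2.
            eta   : (eta1, eta2) internal loss factors
            eta_c : (eta12, eta21) coupling loss factors"""
            eta1, eta2 = eta
            eta12, eta21 = eta_c
            A = omega * np.array([[eta1 + eta12, -eta21],
                                  [-eta12, eta2 + eta21]])
            return np.linalg.solve(A, np.asarray(power_in, float))

        # e.g. 1 W injected into subsystem 1 at 1 kHz
        E = sea_energies(2 * np.pi * 1000, (0.01, 0.02), (0.005, 0.003), [1.0, 0.0])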

  2. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    PubMed Central

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  3. Rapid prototyping of extrusion dies using layer-based techniques

    SciTech Connect

    Misiolek, W.Z.; Winther, K.T.; Prats, A.E.; Rock, S.J.

    1999-02-01

    Extrusion die design and development often requires significant craftsman skill and iterative improvement to arrive at a production-ready die geometry. Constructing the dies used during this iterative process from layers, rather than from one solid block of material, offers unique opportunities to improve die development efficiency when coupled with concepts drawn from the rapid prototyping field. This article presents a proof-of-concept illustrating the potential utility of layer-based extrusion dies for the die design and fabrication process. The major benefits include greater flexibility in the design process, a more efficient, automated fabrication technique, and a means for performing localized die modifications and repairs.

  4. Simultaneous algebraic reconstruction technique based on guided image filtering.

    PubMed

    Ji, Dongjiang; Qu, Gangrong; Liu, Baodong

    2016-07-11

    The challenge of computed tomography is to reconstruct high-quality images from few-view projections. Using a prior guidance image, guided image filtering smoothes images while preserving edge features. The prior guidance image can be incorporated into the image reconstruction process to improve image quality. We propose a new simultaneous algebraic reconstruction technique based on guided image filtering. Specifically, the prior guidance image is updated in the image reconstruction process, merging information iteratively. To validate the algorithm's practicality and efficiency, experiments were performed with numerical phantom projection data and real projection data. The results demonstrate that the proposed method is effective and efficient for nondestructive testing and rock mechanics. PMID:27410859
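
    The guided filtering step at the heart of the method follows He et al.'s local linear model. A grayscale sketch (the SART projection and backprojection machinery is omitted; within each iteration the current reconstruction would be smoothed against the prior guidance image roughly like this):

        import numpy as np
        from scipy.ndimage import uniform_filter

        def guided_filter(guide, src, radius=4, eps=1e-3):
            """Edge-preserving smoothing of `src` steered by `guide`: fit the
            local linear model q = a*I + b per window, then average a and b."""
            guide = np.asarray(guide, float)
            src = np.asarray(src, float)
            size = 2 * radius + 1
            mean = lambda x: uniform_filter(x, size)
            mI, mp = mean(guide), mean(src)
            cov_Ip = mean(guide * src) - mI * mp
            var_I = mean(guide * guide) - mI * mI
            a = cov_Ip / (var_I + eps)
            b = mp - a * mI
            return mean(a) * guide + mean(b)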

  5. Foreign fiber detecting system based on multispectral technique

    NASA Astrophysics Data System (ADS)

    Li, Qi; Han, Shaokun; Wang, Ping; Wang, Liang; Xia, Wenze

    2015-08-01

    This paper presents a foreign fiber detecting system based on multi-spectral techniques. The absorption and reflectivity of foreign fibers differ under different wavelengths of light, so image characteristics differ under different illumination. A contrast pyramid image fusion algorithm with improved adaptive enhancement is used to extract the foreign fibers from the cotton background. The experimental results show that a single light source can detect six kinds of foreign fiber in cotton, while multi-spectral detection can detect eight kinds.

  6. NIOS II processor-based acceleration of motion compensation techniques

    NASA Astrophysics Data System (ADS)

    González, Diego; Botella, Guillermo; Mookherjee, Soumak; Meyer-Bäse, Uwe; Meyer-Bäse, Anke

    2011-06-01

    This paper focuses on the hardware acceleration of motion compensation techniques suitable for MPEG video compression. A plethora of representative motion estimation search algorithms, along with new perspectives on them, are introduced. The methods and designs described here are suited to the medical imaging area, where larger images are involved. The structure of the processing systems considered is a good fit for reconfigurable acceleration. The system is based on an FPGA platform running the Nios II microprocessor with C2H acceleration applied. The paper presents the results in terms of performance and resources needed.
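
    The software baseline that such accelerators speed up is the exhaustive block-matching search. A plain-Python sketch of full-search SAD motion estimation (the C2H-accelerated version would restructure this for hardware):

        import numpy as np

        def full_search_sad(ref, cur, bx, by, block=16, search=8):
            """Find the motion vector minimizing the sum of absolute
            differences (SAD) for the block at (by, bx) in the current frame."""
            tile = cur[by:by + block, bx:bx + block].astype(int)
            best, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                        continue
                    sad = np.abs(ref[y:y + block, x:x + block].astype(int) - tile).sum()
                    if best is None or sad < best:
                        best, best_mv = sad, (dy, dx)
            return best_mv, best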

  7. Laser jamming technique research based on combined fiber laser

    NASA Astrophysics Data System (ADS)

    Jie, Xu; Shanghong, Zhao; Rui, Hou; Shengbao, Zhan; Lei, Shi; Jili, Wu; Shaoqiang, Fang; Yongjun, Li

    2009-06-01

    A compact and light laser jamming source is needed to increase the flexibility of laser jamming techniques. A novel laser jamming source based on combined fiber lasers is proposed. Preliminary experimental results show that power levels in excess of 10 kW could be achieved. An example of laser jamming used against an air-to-air missile is given. It shows that the tracking system completed tracking in only 4 s and settled into a steady state with the laser jamming source as its new tracking target.

  8. Reduction and analysis techniques for infrared imaging data

    NASA Technical Reports Server (NTRS)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. Turning pictures into papers, or equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages; acquisition, reduction, and analysis, concentrating in particular on several of the questions most relevant to the techniques currently applied to near infrared imaging.

  9. Development of solution techniques for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Andrews, J. S.

    1974-01-01

    Nonlinear structural solution methods in the current research literature are classified according to order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.

  10. Techniques for Improving Filters in Power Grid Contingency Analysis

    SciTech Connect

    Adolf, Robert D.; Haglin, David J.; Halappanavar, Mahantesh; Chen, Yousu; Huang, Zhenyu

    2011-12-31

    In large-scale power transmission systems, predicting faults and preemptively taking corrective action to avoid them is essential to preventing rolling blackouts. The computational study of the constantly-shifting state of the power grid and its weaknesses is called contingency analysis. Multiple-contingency planning in the electrical grid is one example of a complex monitoring system where a full computational solution is operationally infeasible. We present a general framework for building and evaluating resource-aware models of filtering techniques for this type of monitoring.

  11. Prompt gamma activation analysis: An old technique made new

    SciTech Connect

    English, Jerry; Firestone, Richard; Perry, Dale; Leung, Ka-Ngo; Reijonen, Jani; Garabedian, Glenn; Bandong, Bryan; Molnar, Gabor; Revay, Zsolt

    2002-12-01

    The long list of contributors to the prompt gamma activation analysis (PGAA) project is important because it highlights the broad cast of active PGAA researchers from various facilities and backgrounds. PGAA is basically a simple process in principle that was traditionally difficult in application. It is an old technique that has for years been tied to and associated exclusively with nuclear reactor facilities, which has limited its acceptance as a general, analytical tool for identifying and quantifying elements or, more precisely, isotopes, whether radioactive or nonradioactive. Field use was not a viable option.

  12. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  13. Microfluidic techniques for high throughput single cell analysis.

    PubMed

    Reece, Amy; Xia, Bingzhao; Jiang, Zhongliang; Noren, Benjamin; McBride, Ralph; Oakey, John

    2016-08-01

    The microfabrication of microfluidic control systems and the development of increasingly sensitive molecular amplification tools have enabled the miniaturization of single-cell analytical platforms. Only recently has the throughput of these platforms increased to a level at which populations can be screened at the single cell level. Techniques based upon both active and passive manipulation are now capable of discriminating between single cell phenotypes for sorting, diagnostic or prognostic applications in a variety of clinical scenarios. The introduction of multiphase microfluidics enables the segmentation of single cells into biochemically discrete picoliter environments. The combination of these techniques is enabling a class of single cell analytical platforms with great potential for data driven biomedicine, genomics and transcriptomics. PMID:27032065

  14. Analysis of a proposed Compton backscatter imaging technique

    NASA Astrophysics Data System (ADS)

    Hall, J.; Jacoby, B.

    1992-12-01

    Imaging techniques which require access to only one side of the object being viewed are potentially useful in situations where conventional projection radiography and tomography cannot be applied, such as looking for voids in a large container where access to the back of the object is inconvenient or even impossible. One-sided imaging techniques are currently being used in nondestructive evaluation of surfaces and shallow subsurface structures. In this work we present both analytical calculations and detailed Monte Carlo simulations aimed at assessing the capability of a proposed Compton backscatter imaging technique designed to detect and characterize voids located several centimeters below the surface of a solid. The proposed technique, based on a scheme suggested by Farmer and Collins, encodes the spatial position and structure of voids in a solid in the energy spectrum of the Compton-scattered photons as recorded by a high resolution detector. Our calculations model a Cs-137 source projecting a 1 sq mm pencil beam of 662 keV gammas into a target slab at an incident angle of 45 degrees and a collimated detector (also oriented at 45 degrees with respect to the surface) which views the beam path at a central angle of 90 degrees. The detector collimator is modeled here as a triangular slit viewing a 2.54 cm (1.000 inch) segment of the beam path at a depth of 2 cm below the surface of the slab. Our results suggest that the proposed technique should be capable of an absolute position resolution of approximately 0.25 mm (approximately equal to 0.010 inches) for isolated voids and an overall object resolution of approximately 1 lp/mm (approximately 0.040 inches). The predicted signal contrast for voids packed with various contraband materials will be discussed as well as multiple scattering contributions to the predicted yields.

  15. Analysis of a proposed Compton backscatter imaging technique

    SciTech Connect

    Hall, J.; Jacoby, B.

    1992-12-01

    Imaging techniques which require access to only one side of the object being viewed are potentially useful in situations where conventional projection radiography and tomography cannot be applied, such as looking for voids in a large container where access to the back of the object is inconvenient or even impossible. One-sided imaging techniques are currently being used in nondestructive evaluation of surfaces and shallow subsurface structures. In this work we present both analytical calculations and detailed Monte Carlo simulations aimed at assessing the capability of a proposed Compton backscatter imaging technique designed to detect and characterize voids located several centimeters below the surface of a solid. The proposed technique, based on a scheme suggested by Farmer and Collins, encodes the spatial position and structure of voids in a solid in the energy spectrum of the Compton-scattered photons as recorded by a high resolution detector. Our calculations model a ¹³⁷Cs source projecting a 1 mm² pencil beam of 662 keV gammas into a target slab at an incident angle of 45° and a collimated detector (also oriented at 45° with respect to the surface) which views the beam path at a central angle of 90°. The detector collimator is modeled here as a triangular slit viewing a 2.54 cm (1.000 in.) segment of the beam path at a depth of 2 cm below the surface of the slab. Our results suggest that the proposed technique should be capable of an absolute position resolution of ≈0.25 mm (≈0.010 in.) for isolated voids and an overall object resolution of ≈1.00 lp/mm (≈0.04 in.). The predicted signal contrast for voids packed with various contraband materials will be discussed as well as multiple scattering contributions to the predicted yields.
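
    The energy-angle relation that underlies the encoding is the Compton formula. A small worked sketch for the geometry described above (illustrative; the depth dependence arises because the angle subtended at the collimated detector varies along the beam path):

        import math

        def compton_energy(e_kev, theta_deg):
            """Photon energy after Compton scattering through theta_deg,
            for incident energy e_kev (electron rest energy 511 keV)."""
            return e_kev / (1.0 + (e_kev / 511.0) * (1.0 - math.cos(math.radians(theta_deg))))

        # For the 662 keV source at the nominal 90-degree central angle:
        # compton_energy(662, 90) ~= 288 keV.  Scatters at different depths
        # along the beam subtend slightly different angles at the slit
        # collimator, so their energies differ -- which is how depth is
        # encoded in the recorded spectrum.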

  16. Video detection and analysis techniques of transient astronomical phenomena

    NASA Technical Reports Server (NTRS)

    Clifton, K. S.; Reese, R., Jr.; Davis, C. W.

    1979-01-01

    Low-light-level television systems have been utilized to gain information on meteors, aurorae, and other faint, transient astronomical phenomena. Such phenomena change not only their position as a function of time, but also their photometric and spectral characteristics in as little as 1/60 second, thus requiring unique methods of analysis. Data observed with television systems and recorded on video tape have been analyzed with a system utilizing both analog and digital techniques. Both off-the-shelf equipment and inhouse developments are used to isolate sequences of moving images and to store them in a form suitable for photometric and spectral reduction. Current emphasis of the analysis effort is directed at the measurement of the first-order emission lines of meteor spectra, the results of which will yield important compositional information concerning the nature of the impinging meteoroid.

  17. Sensitivity analysis techniques for models of human behavior.

    SciTech Connect

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
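
    The global, interaction-aware methods alluded to are typified by variance-based Sobol indices. A self-contained Saltelli-style sketch (an illustrative stand-in, not the report's specific procedure; the toy model is hypothetical):

        import numpy as np

        def sobol_first_order(model, bounds, n=4096, seed=0):
            """Estimate first-order Sobol indices with the Saltelli scheme.
            bounds: (k, 2) input ranges; model: f(X) -> one output per row."""
            rng = np.random.default_rng(seed)
            k = len(bounds)
            lo, hi = np.asarray(bounds, float).T
            A = lo + (hi - lo) * rng.random((n, k))
            B = lo + (hi - lo) * rng.random((n, k))
            yA, yB = model(A), model(B)
            var = np.var(np.concatenate([yA, yB]))
            s1 = np.empty(k)
            for i in range(k):
                ABi = A.copy()
                ABi[:, i] = B[:, i]          # resample only input i
                s1[i] = np.mean(yB * (model(ABi) - yA)) / var
            return s1

        # toy behavioral model with an interaction term
        f = lambda X: X[:, 0] + 2 * X[:, 1] + X[:, 0] * X[:, 2]
        print(sobol_first_order(f, [(0, 1)] * 3))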

  18. Three-dimensional region-based adaptive image processing techniques for volume visualization applications

    NASA Astrophysics Data System (ADS)

    de Deus Lopes, Roseli; Zuffo, Marcelo K.; Rangayyan, Rangaraj M.

    1996-04-01

    Recent advances in three-dimensional (3D) imaging techniques have expanded the scope of applications of volume visualization to many areas such as medical imaging, scientific visualization, robotic vision, and virtual reality. Advanced image filtering, enhancement, and analysis techniques are being developed in parallel in the field of digital image processing. Although the fields cited have many aspects in common, it appears that many of the latest developments in image processing are not being applied to the fullest extent possible in visualization. It is common to encounter the use of rather simple and elementary image pre-processing operations being used in visualization and 3D imaging applications. The purpose of this paper is to present an overview of selected topics from recent developments in adaptive image processing and demonstrate or suggest their applications in volume visualization. The techniques include adaptive noise removal; improvement of contrast and visibility of objects; space-variant deblurring and restoration; segmentation-based lossless coding for data compression; and perception-based measures for analysis, enhancement, and rendering. The techniques share the common base of identification of adaptive regions by region growing, which lends them a perceptual basis related to the human visual system. Preliminary results obtained with some of the techniques implemented so far are used to illustrate the concepts involved, and to indicate potential performance capabilities of the methods.
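
    The shared adaptive-region identification step is plain region growing. A minimal 2-D sketch (the paper's 3-D, perception-based growth criteria would replace the simple tolerance test):

        import numpy as np

        def region_grow(image, seed, tol):
            """Grow a region from `seed` by absorbing 4-connected neighbors
            whose intensity stays within `tol` of the running region mean."""
            h, w = image.shape
            mask = np.zeros((h, w), bool)
            stack = [seed]
            total, count = 0.0, 0
            while stack:
                y, x = stack.pop()
                if not (0 <= y < h and 0 <= x < w) or mask[y, x]:
                    continue
                if count and abs(image[y, x] - total / count) > tol:
                    continue
                mask[y, x] = True
                total += image[y, x]
                count += 1
                stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            return mask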

  19. Comparative study of manual liquid-based cytology (MLBC) technique and direct smear technique (conventional) on fine-needle cytology/fine-needle aspiration cytology samples

    PubMed Central

    Pawar, Prajkta Suresh; Gadkari, Rasika Uday; Swami, Sunil Y.; Joshi, Anil R.

    2014-01-01

    Background: The liquid-based cytology technique enables cells to be suspended in a liquid medium and spread in a monolayer, making for better morphological assessment. Automated techniques have been widely used but are limited by cost and availability. Aim: The aim was to establish a manual liquid-based cytology (MLBC) technique on fine-needle aspiration cytology (FNAC) material and compare its results with the conventional technique. Materials and Methods: In this study, we examined cells trapped in the needle hubs used for the collection of FNAC samples. 50 cases were examined by the MLBC technique and compared with the conventional FNAC technique. By centrifugation, sediment was obtained and an imprint was taken on a defined area. Papanicolaou (Pap) and May-Grünwald Giemsa (MGG) staining was done. Direct smears and MLBC smears were compared for cellularity, background, cellular preservation, and nuclear preservation. Slides were diagnosed independently by two cytologists with more than 5 years' experience. Standard error of proportion was used for statistical analysis. Results: Cellularity was low in MLBC as compared with conventional smears, which is expected as remnant material in the needle hub was used. Nuclei overlapped to a lesser extent and hemorrhage and necrosis were reduced, so cell morphology can be better studied with the MLBC technique. The P value obtained was <0.05. Conclusion: This MLBC technique gives results comparable to the conventional technique, with better morphology. In a setup where aspirators are learners, this technique will ensure adequacy, since the remnant in the needle hub gets processed. PMID:25210235

  20. Evolutionary Based Techniques for Fault Tolerant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Lohn, Jason D.

    2006-01-01

    The use of SRAM-based Field Programmable Gate Arrays (FPGAs) is becoming more and more prevalent in space applications. Commercial-grade FPGAs are potentially susceptible to permanently debilitating Single-Event Latchups (SELs). Repair methods based on Evolutionary Algorithms may be applied to FPGA circuits to enable successful fault recovery. This paper presents the experimental results of applying such methods to repair four commonly used circuits (quadrature decoder, 3-by-3-bit multiplier, 3-by-3-bit adder, 4-to-7 decoder) into which a number of simulated faults have been introduced. The results suggest that evolutionary repair techniques can improve the process of fault recovery when used instead of or as a supplement to Triple Modular Redundancy (TMR), which is currently the predominant method for mitigating FPGA faults.

  1. On combining Laplacian and optimization-based mesh smoothing techniques

    SciTech Connect

    Freitag, L.A.

    1997-07-01

    Local mesh smoothing algorithms have been shown to be effective in repairing distorted elements in automatically generated meshes. The simplest such algorithm is Laplacian smoothing, which moves grid points to the geometric center of incident vertices. Unfortunately, this method operates heuristically and can create invalid meshes or elements of worse quality than those contained in the original mesh. In contrast, optimization-based methods are designed to maximize some measure of mesh quality and are very effective at eliminating extremal angles in the mesh. These improvements come at a higher computational cost, however. In this article the author proposes three smoothing techniques that combine a smart variant of Laplacian smoothing with an optimization-based approach. Several numerical experiments are performed that compare the mesh quality and computational cost for each of the methods in two and three dimensions. The author finds that the combined approaches are very cost effective and yield high-quality meshes.
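
    The "smart" Laplacian variant can be sketched in a few lines: propose the centroid move, then keep it only if the worst incident-element quality does not degrade. An illustrative 2-D triangle-mesh sketch (the quality measure and data layout are assumptions):

        import numpy as np

        def tri_quality(p0, p1, p2):
            """Scaled quality in (0, 1]; equals 1 for an equilateral triangle."""
            u, v, w = p1 - p0, p2 - p1, p0 - p2
            area = 0.5 * abs(u[0] * v[1] - u[1] * v[0])
            return 4 * np.sqrt(3) * area / (u @ u + v @ v + w @ w)

        def smart_laplacian_step(pts, tris, vid, neighbors):
            """Move vertex `vid` to the centroid of its neighbors only if the
            worst incident-triangle quality does not get worse."""
            incident = [t for t in tris if vid in t]
            quality = lambda: min(tri_quality(*(pts[i] for i in t)) for t in incident)
            old_pos, q_old = pts[vid].copy(), quality()
            pts[vid] = pts[list(neighbors)].mean(axis=0)   # Laplacian proposal
            if quality() < q_old:                          # reject bad moves
                pts[vid] = old_pos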

  2. RBF-based technique for statistical demodulation of pathological tremor.

    PubMed

    Gianfelici, Francesco

    2013-10-01

    This paper presents an innovative technique based on the joint approximation capabilities of radial basis function (RBF) networks and the estimation capability of the multivariate iterated Hilbert transform (IHT) for the statistical demodulation of pathological tremor from electromyography (EMG) signals in patients with Parkinson's disease. We define a stochastic model of the multichannel high-density surface EMG by means of RBF networks applied to the reconstruction of the stochastic process (characterizing the disease) modeled by the multivariate relationships generated by the Karhunen-Loève transform in Hilbert spaces. Next, we perform a demodulation of the entire random field by means of the estimation capability of the multivariate IHT in a statistical setting. The proposed method is applied to both simulated signals and data recorded from three Parkinsonian patients, and the results show that the amplitude modulation components of the tremor oscillation can be estimated with a signal-to-noise ratio close to 30 dB, along with a low root-mean-square error for the estimates of the tremor instantaneous frequency. Additionally, comparisons with a large number of techniques based on all the combinations of the RBF, extreme learning machine, backpropagation, and support vector machine used in the first step of the algorithm, and the IHT, empirical mode decomposition, multiband energy separation algorithm, periodic algebraic separation and energy demodulation used in the second step of the algorithm, clearly show the effectiveness of our technique. These results show that the proposed approach is a potentially useful tool for advanced neurorehabilitation technologies that aim at tremor characterization and suppression. PMID:24808594
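
    The single-channel core of Hilbert-based demodulation is standard and easy to sketch (the paper's multivariate iterated HT and RBF modeling are well beyond this fragment):

        import numpy as np
        from scipy.signal import hilbert

        def demodulate(x, fs):
            """Amplitude envelope and instantaneous frequency of a narrowband
            tremor-band signal via the analytic signal."""
            z = hilbert(x)
            envelope = np.abs(z)
            inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)
            return envelope, inst_freq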

  3. Analysis of filter tuning techniques for sequential orbit determination

    NASA Technical Reports Server (NTRS)

    Lee, T.; Yee, C.; Oza, D.

    1995-01-01

    This paper examines filter tuning techniques for a sequential orbit determination (OD) covariance analysis. Recently, there has been a renewed interest in sequential OD, primarily due to the successful flight qualification of the Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) using Doppler data extracted onboard the Extreme Ultraviolet Explorer (EUVE) spacecraft. TONS computes highly accurate orbit solutions onboard the spacecraft in realtime using a sequential filter. As the result of the successful TONS-EUVE flight qualification experiment, the Earth Observing System (EOS) AM-1 Project has selected TONS as the prime navigation system. In addition, sequential OD methods can be used successfully for ground OD. Whether data are processed onboard or on the ground, a sequential OD procedure is generally favored over a batch technique when a realtime automated OD system is desired. Recently, OD covariance analyses were performed for the TONS-EUVE and TONS-EOS missions using the sequential processing options of the Orbit Determination Error Analysis System (ODEAS). ODEAS is the primary covariance analysis system used by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). The results of these analyses revealed a high sensitivity of the OD solutions to the state process noise filter tuning parameters. The covariance analysis results show that the state estimate error contributions from measurement-related error sources, especially those due to the random noise and satellite-to-satellite ionospheric refraction correction errors, increase rapidly as the state process noise increases. These results prompted an in-depth investigation of the role of the filter tuning parameters in sequential OD covariance analysis. This paper analyzes how the spacecraft state estimate errors due to dynamic and measurement-related error sources are affected by the process noise level used. This information is then used to establish
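
    The tuning trade-off at issue can be seen in a scalar Kalman filter. An illustrative sketch (ODEAS's dynamic and measurement models are of course far richer):

        import numpy as np

        def steady_state_gain(q, r, phi=1.0, h=1.0, n_iter=200):
            """Iterate the scalar Kalman Riccati recursion to steady state;
            the gain shows how process noise q trades against measurement
            noise r."""
            p = 1.0
            for _ in range(n_iter):
                p_pred = phi * p * phi + q              # time update
                k = p_pred * h / (h * p_pred * h + r)
                p = (1 - k * h) * p_pred                # measurement update
            return k

        for q in (1e-6, 1e-3, 1e0):
            print(q, steady_state_gain(q, r=1.0))
        # Larger q -> larger gain -> the filter leans on the (noisy)
        # measurements, which is the sensitivity the covariance analysis
        # above quantifies.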

  4. Modern Micro and Nanoparticle-Based Imaging Techniques

    PubMed Central

    Ryvolova, Marketa; Chomoucka, Jana; Drbohlavova, Jana; Kopel, Pavel; Babula, Petr; Hynek, David; Adam, Vojtech; Eckschlager, Tomas; Hubalek, Jaromir; Stiborova, Marie; Kaiser, Jozef; Kizek, Rene

    2012-01-01

    The requirements for early diagnostics as well as effective treatment of insidious diseases such as cancer constantly increase the pressure on development of efficient and reliable methods for targeted drug/gene delivery as well as imaging of the treatment success/failure. One of the most recent approaches covering both the drug delivery as well as the imaging aspects is benefitting from the unique properties of nanomaterials. Therefore a new field called nanomedicine is attracting continuously growing attention. Nanoparticles, including fluorescent semiconductor nanocrystals (quantum dots) and magnetic nanoparticles, have proven their excellent properties for in vivo imaging techniques in a number of modalities such as magnetic resonance and fluorescence imaging, respectively. In this article, we review the main properties and applications of nanoparticles in various in vitro imaging techniques, including microscopy and/or laser breakdown spectroscopy and in vivo methods such as magnetic resonance imaging and/or fluorescence-based imaging. Moreover the advantages of the drug delivery performed by nanocarriers such as iron oxides, gold, biodegradable polymers, dendrimers, lipid based carriers such as liposomes or micelles are also highlighted. PMID:23202187

  5. The Fourier analysis technique and epsilon-pseudo-eigenvalues

    SciTech Connect

    Donato, J.M.

    1993-07-01

    The spectral radii of iteration matrices and the spectra and condition numbers of preconditioned systems are important in forecasting the convergence rates of iterative methods. Unfortunately, the spectra of iteration matrices or preconditioned systems are rarely easily available. The Fourier analysis technique has been shown to be a useful tool in studying the effectiveness of iterative methods by determining approximate expressions for the eigenvalues or condition numbers of matrix systems. For non-symmetric matrices the eigenvalues may be highly sensitive to perturbations. The spectral radii of nonsymmetric iteration matrices may not give a numerically realistic indication of the convergence of the iterative method. Trefethen and others have presented a theory on the use of ε-pseudo-eigenvalues in the study of matrix equations. For Toeplitz matrices, we show that the theory of ε-pseudo-eigenvalues includes the Fourier analysis technique as a limiting case. For non-Toeplitz matrices, the relationship is not clear. We shall examine this relationship for non-Toeplitz matrices that arise when studying preconditioned systems for methods applied to a two-dimensional discretized elliptic differential equation.
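
    The ε-pseudospectrum itself is simple to compute on a grid: it is the region where the smallest singular value of zI - A falls below ε. A sketch using a nonnormal Toeplitz example (a shift matrix, whose eigenvalues are all zero yet whose pseudospectra fill much of the unit disk):

        import numpy as np

        def pseudospectrum(A, re, im):
            """sigma_min(zI - A) on a grid; the epsilon-pseudospectrum is
            the sublevel set where this falls below epsilon."""
            n = A.shape[0]
            smin = np.empty((len(im), len(re)))
            for i, y in enumerate(im):
                for j, x in enumerate(re):
                    z = (x + 1j * y) * np.eye(n) - A
                    smin[i, j] = np.linalg.svd(z, compute_uv=False)[-1]
            return smin

        A = np.diag(np.ones(19), k=1)       # 20x20 shift (Jordan) matrix
        levels = pseudospectrum(A, np.linspace(-1.5, 1.5, 61), np.linspace(-1.5, 1.5, 61))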

  6. Homogenization techniques for the analysis of porous SMA

    NASA Astrophysics Data System (ADS)

    Sepe, V.; Auricchio, F.; Marfia, S.; Sacco, E.

    2016-05-01

    In this paper the mechanical response of porous Shape Memory Alloy (SMA) is modeled. The porous SMA is considered as a composite medium made of a dense SMA matrix with voids treated as inclusions. The overall response of this very special composite is deduced by performing a micromechanical and homogenization analysis. In particular, the incremental Mori-Tanaka averaging scheme is provided; then, the Transformation Field Analysis procedure in its uniform and nonuniform approaches, UTFA and NUTFA respectively, is presented. In particular, the extension of the NUTFA technique proposed by Sepe et al. (Int J Solids Struct 50:725-742, 2013) is presented to investigate the response of porous SMA characterized by closed and open porosity. A detailed comparison between the outcomes provided by the Mori-Tanaka, the UTFA and the proposed NUTFA procedures for porous SMA is presented, through numerical examples for two- and three-dimensional problems. In particular, several values of porosity and different loading conditions, inducing the pseudoelastic effect in the SMA matrix, are investigated. The predictions assessed by the Mori-Tanaka, the UTFA and the NUTFA techniques are compared with the results obtained by nonlinear finite element analyses. A comparison with experimental data available in the literature is also presented.

  7. Statistical analysis of heartbeat data with wavelet techniques

    NASA Astrophysics Data System (ADS)

    Pazsit, Imre

    2004-05-01

    The purpose of this paper is to demonstrate the use of some methods of signal analysis, performed on ECG and in some cases blood pressure signals, for the classification of the health status of the heart of mice and rats. Spectral and wavelet analysis were performed on the raw signals. FFT-based coherence and phase was also calculated between blood pressure and raw ECG signals. Finally, RR-intervals were deduced from the ECG signals and an analysis of the fractal dimensions was performed. The analysis was made on data from mice and rats. A correlation was found between the health status of the mice and the rats and some of the statistical descriptors, most notably the phase of the cross-spectra between ECG and blood pressure, and the fractal properties and dimensions of the interbeat series (RR-interval fluctuations).
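
    Extraction of the RR-interval series could look like the following sketch (a crude amplitude-threshold R-peak detector, offered as an assumption since the paper does not specify its detection method):

        import numpy as np
        from scipy.signal import find_peaks

        def rr_intervals(ecg, fs):
            """Detect R peaks and return the RR-interval series (seconds),
            the raw material for the fractal-dimension analysis above."""
            height = ecg.mean() + 2 * ecg.std()        # crude amplitude gate
            peaks, _ = find_peaks(ecg, height=height, distance=int(0.2 * fs))
            return np.diff(peaks) / fs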

  8. Analysis of non-linearity in differential wavefront sensing technique.

    PubMed

    Duan, Hui-Zong; Liang, Yu-Rong; Yeh, Hsien-Chi

    2016-03-01

    An analytical model of the differential wavefront sensing (DWS) technique based on Gaussian beam propagation has been derived. The analytical model has been verified against the interference signals detected by a quadrant photodiode, calculated using a numerical method. Both the analytical model and the numerical simulation show a milliradian-level non-linearity effect in DWS detection. In addition, beam clipping has a strong influence on the non-linearity of DWS: the larger the beam clipping, the smaller the non-linearity. However, the beam-walk effect has little influence on DWS, and thus it can be ignored in a laser interferometer. PMID:26974079

  9. Thermographic techniques applied to solar collector systems analysis

    SciTech Connect

    Eden, A.

    1980-02-01

    The use of thermography to analyze large solar collector array systems under dynamic operating conditions is discussed. The research at the Solar Energy Research Institute (SERI) in this area has focused on thermographic techniques and equipment to determine temperature distributions, flow patterns, and air blockages in solar collectors. The results of this extensive study, covering many sites and types of collectors, illustrate the capabilities of infrared (IR) analysis as a qualitative analysis tool and operation and maintenance procedure when applied to large arrays. Thermographic analysis of most collector systems qualitatively showed relative temperature distributions that indicated balanced flow patterns. In three significant cases, blocked or broken collector arrays, which previously had gone undetected, were discovered. Using this analysis, validation studies of large computer codes could examine collector arrays for flow patterns or blockages that could cause disagreement between actual and predicted performance. Initial operation and balancing of large systems could be accomplished without complicated sensor systems not needed for normal operations. Maintenance personnel could quickly check their systems without climbing onto the roof and without complicated sensor systems.

  10. Large area photodetector based on microwave cavity perturbation techniques

    SciTech Connect

    Braggio, C.; Carugno, G.; Sirugudu, R. K.; Lombardi, A.; Ruoso, G.

    2014-07-28

    We present a preliminary study to develop a large area photodetector based on a semiconductor crystal placed inside a superconducting resonant cavity. Laser pulses are detected through a variation of the cavity impedance caused by the conductivity change in the semiconductor. A novel method, whereby the designed photodetector is simulated by finite element analysis, makes it possible to perform pulse-height spectroscopy on the reflected microwave signals. We measure an energy sensitivity of 100 fJ in averaging mode without the use of low-noise electronics, and suggest possible ways to further reduce the single-shot detection threshold, based on the results of the described method.

  11. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest that a better safety focus can be achieved at lower cost. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies for low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST, and provide the status of the industry pilot plant effort.

  12. Evaluations of mosquito age grading techniques based on morphological changes.

    PubMed

    Hugo, L E; Quick-Miles, S; Kay, B H; Ryan, P A

    2008-05-01

    Evaluations were made of the accuracy and practicality of mosquito age grading methods based on changes to mosquito morphology; including the Detinova ovarian tracheation, midgut meconium, Polovodova ovariole dilatation, ovarian injection, and daily growth line methods. Laboratory maintained Aedes vigilax (Skuse) and Culex annulirostris (Skuse) females of known chronological and physiological ages were used for these assessments. Application of the Detinova technique to laboratory reared Ae. vigilax females in a blinded trial enabled the successful identification of nulliparous and parous females in 83.7-89.8% of specimens. The success rate for identifying nulliparous females increased to 87.8-98.0% when observations of ovarian tracheation were combined with observations of the presence of midgut meconium. However, application of the Polovodova method only enabled 57.5% of nulliparous, 1-parous, 2-parous, and 3-parous Ae. vigilax females to be correctly classified, and ovarian injections were found to be unfeasible. Poor correlation was observed between the number of growth lines per phragma and the calendar age of laboratory reared Ae. vigilax females. In summary, morphological age grading methods that offer simple two-category predictions (ovarian tracheation and midgut meconium methods) were found to provide high-accuracy classifications, whereas methods that offer the separation of multiple age categories (ovariolar dilatation and growth line methods) were found to be extremely difficult and of low accuracy. The usefulness of the morphology-based methods is discussed in view of the availability of new mosquito age grading techniques based on cuticular hydrocarbon and gene transcription changes. PMID:18533427

  13. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    PubMed

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator," developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program for nanoparticle analysis and compares it to more conventional nanoparticle analysis techniques. The techniques on which we concentrate here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information on the particles than the manual technique. However, particle shapes that are very different from spherical proved problematic for the novel program as well. When compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the average data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample. PMID:27030469

  14. Analysis techniques for airborne laser range safety evaluations

    NASA Astrophysics Data System (ADS)

    Ramsburg, M. S.; Jenkins, D. L.; Doerflein, R. D.

    1982-08-01

    Techniques to evaluate the safety of airborne laser operations on the range are reported. The objectives of the safety evaluations were to (1) protect civilian and military personnel from the hazards associated with lasers, (2) provide users with the least restrictive constraints in which to perform their mission while still maintaining an adequate degree of safety, and (3) develop a data base for the Navy in the event of suspected laser exposure or other related incidents involving military or civilian personnel. A microcomputer code, written in ANSI FORTRAN 77, has been developed to provide safe flight profiles for airborne laser systems. The output of this code can also be used to establish operating areas for ground-based lasers. Input to the code includes the laser output parameters, the nominal ocular hazard distance (NOHD) and assigned buffer zone for the laser system, as well as parameters describing the geometry of the range.
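
    The core quantity behind such a code is the NOHD. A minimal sketch, assuming a circular beam and the standard far-field formula; all input values below are illustrative, and the maximum permissible exposure (MPE) must come from the applicable laser-safety standard:

        import numpy as np

        def nohd(power_w, mpe_w_m2, divergence_rad, aperture_m):
            # NOHD = (sqrt(4*P / (pi*MPE)) - a) / phi, for emitted power P,
            # maximum permissible exposure MPE, exit aperture a, divergence phi
            return (np.sqrt(4.0 * power_w / (np.pi * mpe_w_m2)) - aperture_m) / divergence_rad

        # e.g. 1 W laser, MPE 25.5 W/m^2, 1 mrad divergence, 5 mm exit aperture
        print(nohd(1.0, 25.5, 1e-3, 5e-3), "m")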

  15. Geospatial Products and Techniques at the Center for Transportation Analysis

    SciTech Connect

    Chin, Shih-Miao; Hwang, Ho-Ling; Peterson, Bruce E

    2008-01-01

    This paper highlights geospatial science-related innovations and developments conducted by the Center for Transportation Analysis (CTA) at the Oak Ridge National Laboratory. CTA researchers have been developing integrated inter-modal transportation solutions through innovative and cost-effective research and development for many years. Specifically, this paper profiles CTA-developed Geographic Information System (GIS) products that are publicly available. Examples of these GIS-related products include: the CTA Transportation Networks; GeoFreight system; and the web-based Multi-Modal Routing Analysis System. In addition, an application on assessment of railroad Hazmat routing alternatives is also discussed.

  16. Methods and Techniques for miRNA Data Analysis.

    PubMed

    Cristiano, Francesca; Veltri, Pierangelo

    2016-01-01

    Genomic data analysis consists of techniques to analyze and extract information from genes. In particular, genome sequencing technologies make it possible to characterize genomic profiles and to identify biomarkers and mutations that are relevant for diagnosis and for the design of clinical therapies. Studies often concern the identification of genes related to inherited disorders, but mutations and phenotypes are now also considered in disease studies and drug design, as well as for the identification of biomarkers for early detection. Gene mutations are studied by comparing fold changes across many redundant numeric and string representations of the analyzed genes. Biological instruments, starting from biological samples, generate arrays of data representing the nucleotide sequences of known genes, often in a sequence that is not well characterized, and the analysis typically involves thousands of repetitions of such gene representations and signatures. High-performance platforms and optimized algorithms are required to manipulate the gigabytes of raw data generated by these instruments, such as NGS (Next-Generation Sequencing) and microarray platforms. Data analysis also requires several tools and databases that store gene targets, gene ontologies, and gene-disease associations. In this chapter we present an overview of available software platforms for genomic data analysis, as well as available databases with their query engines. PMID:26069024

  17. Detecting Molecular Properties by Various Laser-Based Techniques

    SciTech Connect

    Hsin, Tse-Ming

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy, and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from the cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer, with a corresponding energy transfer time of ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of dimer geometries. Direct observation of vibrational peaks and their evolution for coumarin 153 in the electronic excited state was demonstrated using fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents (methanol, acetonitrile, and butanol), a vibrational peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, together with biomimetic containers (liposomes), allows the measurement of the enzymatic activity of individual alkaline phosphatase molecules from bovine intestinal mucosa without potential interference from glass surfaces. The result showed a wide distribution of enzyme reactivity; protein structural variation is one of the major reasons responsible for this highly heterogeneous behavior.

  18. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Postflight test analysis data processing techniques for shuttle approach and landing test (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described following the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented, and conclusions are given.

  19. The combined use of order tracking techniques for enhanced Fourier analysis of order components

    NASA Astrophysics Data System (ADS)

    Wang, K. S.; Heyns, P. S.

    2011-04-01

    Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
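
    Computed order tracking, one of the two techniques combined in the paper, can be sketched as angle-domain resampling followed by an FFT. The snippet below is a simplified illustration (linear interpolation, once-per-revolution tachometer pulses assumed), not the authors' implementation:

        import numpy as np

        def computed_order_tracking(x, t, tacho_times, samples_per_rev=64):
            # Shaft angle (in revolutions) versus time from tachometer pulses
            revs = np.arange(tacho_times.size, dtype=float)
            angle_of_t = np.interp(t, tacho_times, revs)
            # Resample the signal at constant angle increments
            angle_grid = np.arange(revs[0], revs[-1], 1.0 / samples_per_rev)
            t_grid = np.interp(angle_grid, angle_of_t, t)
            x_angle = np.interp(t_grid, t, x)
            # FFT of the angle-domain signal yields an order spectrum
            spectrum = np.abs(np.fft.rfft(x_angle)) / x_angle.size
            orders = np.fft.rfftfreq(x_angle.size, d=1.0 / samples_per_rev)
            return orders, spectrum

    In the sequential scheme the abstract describes, a Vold-Kalman filter would first isolate the order content of interest before a step of this kind.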

  20. Use of PIXE analysis technique for the study of Beirut amphora production in the Roman period

    NASA Astrophysics Data System (ADS)

    Roumié, M.; Waksman, S. Y.; Nsouli, B.; Reynolds, P.; Lemaı̂tre, S.

    2004-01-01

    Ion Beam Analysis techniques were developed and utilized for applications in the domain of archeology at the accelerator laboratory of the Lebanese Atomic Energy Commission. The characterization of Beirut kiln materials, mainly amphora ceramics from the Roman period, was done using the PIXE technique. In two runs with 1 and 3 MeV protons, 20 major and trace elements were measured. A classification of some 70 ceramic objects, based on elemental composition and multivariate statistical techniques, was thereby obtained, providing the first step of a Lebanese database for future studies. Furthermore, the analysis of carrot amphorae found in Gaul (south of France) showed that some of them were Beirut products, emphasizing the role of Beirut in Mediterranean trade in the Roman period.

  1. Vegetation change detection based on image fusion technique

    NASA Astrophysics Data System (ADS)

    Jia, Yonghong; Liu, Yueyan; Yu, Hui; Li, Deren

    2005-10-01

    The change detection of land use and land cover has always been a focus of remote sensing study and application. Based on image fusion techniques, a new approach is proposed for detecting vegetation change from the vector of brightness index (BI) and perpendicular vegetation index (PVI) extracted from multi-temporal remotely sensed imagery. The procedure is as follows. First, the Landsat ETM+ imagery is geometrically corrected and registered. Second, bands 2, 3, and 4 and the panchromatic image of Landsat ETM+ are fused by an à trous wavelet fusion, and bands 1, 2, and 3 of SPOT are registered to the fused images. Third, the brightness index and perpendicular vegetation index are extracted from the SPOT images and the fused images, respectively. Finally, change vectors are obtained and used to detect vegetation change. The test results show that this approach detects vegetation change efficiently.
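
    A minimal sketch of the change-vector step, using one common definition of the two indices; the soil-line coefficients and the threshold are assumptions, and the red/nir arrays for the two dates are assumed inputs:

        import numpy as np

        def bi_pvi(red, nir, soil_a=1.2, soil_b=0.04):
            # Brightness index and perpendicular vegetation index;
            # soil_a, soil_b are the assumed soil-line slope and intercept
            bi = np.sqrt((red ** 2 + nir ** 2) / 2.0)
            pvi = (nir - soil_a * red - soil_b) / np.sqrt(1.0 + soil_a ** 2)
            return bi, pvi

        bi1, pvi1 = bi_pvi(red_t1, nir_t1)                   # date 1 (fused image)
        bi2, pvi2 = bi_pvi(red_t2, nir_t2)                   # date 2 (SPOT image)
        magnitude = np.hypot(bi2 - bi1, pvi2 - pvi1)         # change intensity
        direction = np.arctan2(pvi2 - pvi1, bi2 - bi1)       # change type
        changed = magnitude > np.percentile(magnitude, 95)   # assumed threshold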

  2. Protein elasticity probed with two synchrotron-based techniques.

    SciTech Connect

    Leu, B. M.; Alatas, A.; Sinn, H.; Alp, E. E.; Said, A.; Yavas, H.; Zhao, J.; Sage, J. T.; Sturhahn, W.; X-Ray Science Division; Hasylab; Northeastern Univ.

    2010-02-25

    Compressibility characterizes three interconnected properties of a protein: dynamics, structure, and function. The compressibility values available in the literature for the electron-carrying protein cytochrome c, as for other proteins, vary considerably. Here, we apply two synchrotron-based techniques - nuclear resonance vibrational spectroscopy and inelastic x-ray scattering - to measure the adiabatic compressibility of this protein. This is the first report of the compressibility of any material measured with this method. Unlike previously used methods, this novel approach probes the protein globally and at ambient pressure, does not require separating the protein and solvent contributions to the total compressibility, and uses samples that contain the heme iron, as in the native state. We show, by comparing our results with molecular dynamics predictions, that the compressibility is almost independent of temperature. We discuss potential applications of this method to other materials beyond proteins.
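
    The reported quantity reduces to a one-line relation between sound velocity and compressibility; the sketch below uses illustrative numbers, not the paper's values:

        def adiabatic_compressibility(density_kg_m3, sound_velocity_m_s):
            # beta_s = 1 / (rho * v^2), with the sound velocity extracted
            # from the low-frequency part of the vibrational data
            return 1.0 / (density_kg_m3 * sound_velocity_m_s ** 2)

        print(adiabatic_compressibility(1.2e3, 1.8e3), "Pa^-1")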

  3. Validation techniques for fault emulation of SRAM-based FPGAs

    SciTech Connect

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and estimating failure rates. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If a fault emulation system does not mimic the radiation environment, it will generate erroneous data and incorrect predictions of the FPGA's behavior in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  4. Active-contour-based image segmentation using machine learning techniques.

    PubMed

    Etyngier, Patrick; Ségonne, Florent; Keriven, Renaud

    2007-01-01

    We introduce a non-linear shape prior for the deformable model framework that we learn from a set of shape samples using recent manifold learning techniques. We model a category of shapes as a finite dimensional manifold which we approximate using Diffusion maps. Our method computes a Delaunay triangulation of the reduced space, considered as Euclidean, and uses the resulting space partition to identify the closest neighbors of any given shape based on its Nyström extension. We derive a non-linear shape prior term designed to attract a shape towards the shape prior manifold at given constant embedding. Results on shapes of ventricle nuclei demonstrate the potential of our method for segmentation tasks. PMID:18051143

  5. Mars laser altimeter based on a single photon ranging technique

    NASA Technical Reports Server (NTRS)

    Prochazka, Ivan; Hamal, Karel; Sopko, B.; Pershin, S.

    1993-01-01

    The Mars 94/96 Mission will carry, among other things, a balloon probe experiment. The balloon, with its scientific cargo in the gondola underneath, will drift in the Mars atmosphere; its altitude will range from zero at night up to 5 km at noon. The gondola altitude will be determined accurately by an altimeter. As the balloon gondola mass is strictly limited, the altimeter's total mass and power consumption are critical: the maximum allowed is a few hundred grams and a few tens of milliwatts of average power consumption. We proposed, designed, and constructed a laser altimeter based on the single photon ranging technique. Topics covered include the principle of operation, altimeter construction, and ground tests.

  6. Validation techniques for fault emulation of SRAM-based FPGAs

    DOE PAGESBeta

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and estimating failure rates. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If a fault emulation system does not mimic the radiation environment, it will generate erroneous data and incorrect predictions of the FPGA's behavior in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  7. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques.

    PubMed

    Parkash, Om; Shueb, Rafidah Hanim

    2015-10-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  8. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    PubMed Central

    Parkash, Om; Hanim Shueb, Rafidah

    2015-01-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  9. Uncertainty analysis on the design of thermal conductivity measurement by a guarded cut-bar technique

    NASA Astrophysics Data System (ADS)

    Xing, Changhu; Jensen, Colby; Ban, Heng; Phillips, Jeffrey

    2011-07-01

    A technique adapted from the guarded-comparative-longitudinal heat flow method was selected for measuring the thermal conductivity of a nuclear fuel compact over a temperature range characteristic of its usage. This technique fulfills the requirement for non-destructive measurement of the composite compact. Although numerous measurement systems have been built around the guarded-comparative method, the systematic (bias) and measurement (precision) uncertainties associated with this technique have not been fully analyzed. In addition to the geometric contribution to the bias error, which has been analyzed previously, this paper studies the working conditions, another potential error source. Using finite element analysis, this study shows the effect of these two error sources on the thermal conductivity measurement and the limitations they impose on the design selection of various parameters, considering their effect on the precision error. The results and conclusions provide a valuable reference for designing and operating an experimental measurement system using this technique.
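
    The working equation of the guarded-comparative method is a simple flux balance; a minimal sketch, with generic symbols and two identical reference bars assumed:

        def specimen_conductivity(k_ref, dT_top, dT_bot, dT_spec,
                                  L_ref, L_spec, A_ref, A_spec):
            # One-dimensional heat flow q = k*A*dT/L is assumed equal through
            # the stack; the specimen flux is taken as the mean of the fluxes
            # in the two reference (meter) bars, then solved for k_spec.
            q_mean = 0.5 * k_ref * A_ref * (dT_top + dT_bot) / L_ref
            return q_mean * L_spec / (A_spec * dT_spec)

    The bias errors studied in the paper correspond to violations of the one-dimensional assumption behind this relation (radial losses, gaps, guard imbalance).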

  10. General Approach To Materials Classification Using Neutron Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Solovyev, Vladimir G.; Koltick, David S.

    2006-03-01

    The "neutron in, gamma out" method of elemental analysis has been known and used in many applications as an elemental analysis tool. This method is non-intrusive, non-destructive, fast, and precise. This set of advantages makes neutron analysis attractive for an even wider variety of uses beyond simple elemental analysis. The question addressed within this study is under what conditions neutron analysis can be used to differentiate materials of interest from a group or class of materials, given that what is truly of interest is the molecular content of any sample under interrogation. The purpose of the study was to develop a neutron-based scanner for rapid differentiation of classes of materials sealed in small bottles. The developed scanner employs a D-T neutron generator as the neutron source and HPGe gamma detectors. Materials can be placed into classes by many different properties; however, the neutron analysis method can use only a few of them, such as the elemental content, stoichiometric ratios, and density of the scanned material. The set of parameters obtainable through neutron analysis serves as the basis for a hyperspace in which each point corresponds to a certain scanned material, and sub-volumes of the hyperspace correspond to different classes of materials. Among the most important properties of the materials are the stoichiometric ratios of the elements comprising them. Constructing an algorithm for converting the observed gamma ray counts into quantities of the elements in the scanned sample is a crucial part of the analysis. Gamma rays produced in both fast inelastic scatterings and neutron captures are considered. The presence of certain elements, such as hydrogen and chlorine, can significantly change the neutron dynamics within the sample and, in turn, the development of the characteristic gamma lines. These effects have been studied and corresponding algorithms have been developed to account for them.

  11. Comparing uncertainty analysis techniques for a SWAT application to the Chaohe Basin in China

    NASA Astrophysics Data System (ADS)

    Yang, Jing; Reichert, Peter; Abbaspour, K. C.; Xia, Jun; Yang, Hong

    2008-08-01

    Distributed watershed models are increasingly being used to support decisions about alternative management strategies in the areas of land use change, climate change, water allocation, and pollution control. For this reason it is important that these models pass through a careful calibration and uncertainty analysis. To fulfil this demand, in recent years, scientists have developed various uncertainty analysis techniques for watershed models. To determine the differences and similarities of these techniques we compared five uncertainty analysis procedures: Generalized Likelihood Uncertainty Estimation (GLUE), Parameter Solution (ParaSol), Sequential Uncertainty Fitting algorithm (SUFI-2), and a Bayesian framework implemented using Markov chain Monte Carlo (MCMC) and Importance Sampling (IS) techniques. As these techniques differ in their philosophies and leave the user some freedom in formulating the generalized likelihood measure, objective function, or likelihood function, a literal comparison between them is not possible. As there is only a small spectrum of different applications in hydrology for the first three techniques, we made this choice according to their typical use in hydrology. For Bayesian inference, we used a recently developed likelihood function that does not obviously violate the statistical assumptions, namely a continuous-time autoregressive error model. We implemented all these techniques for the Soil and Water Assessment Tool (SWAT) and applied them to the Chaohe Basin in China. We compared the results with respect to the posterior parameter distributions, the performance of the best estimates, prediction uncertainty, conceptual basis, computational efficiency, and difficulty of implementation. The comparison results for these categories are listed and the advantages and disadvantages are analyzed. From the point of view of the authors, if computationally feasible, Bayesian-based approaches are the most recommendable.
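
    Of the compared procedures, GLUE is the simplest to outline. The sketch below is a generic Monte Carlo implementation with a Nash-Sutcliffe likelihood and an assumed behavioural threshold; simulator and prior_sampler are hypothetical stand-ins for the SWAT run and the parameter prior:

        import numpy as np

        def weighted_quantile(values, weights, q):
            idx = np.argsort(values)
            cw = np.cumsum(weights[idx])
            return np.interp(q, cw / cw[-1], values[idx])

        def glue(simulator, prior_sampler, observed, n_samples=10000, threshold=0.5):
            thetas, likes, sims = [], [], []
            for _ in range(n_samples):
                theta = prior_sampler()
                sim = simulator(theta)
                # Nash-Sutcliffe efficiency as the informal likelihood measure
                ns = 1.0 - np.sum((sim - observed) ** 2) / np.sum(
                    (observed - observed.mean()) ** 2)
                if ns > threshold:                    # keep behavioural runs only
                    thetas.append(theta); likes.append(ns); sims.append(sim)
            w = np.array(likes); w /= w.sum()         # normalised likelihood weights
            sims = np.array(sims)
            # likelihood-weighted 5%/95% prediction bounds at each time step
            lower = np.array([weighted_quantile(sims[:, j], w, 0.05)
                              for j in range(sims.shape[1])])
            upper = np.array([weighted_quantile(sims[:, j], w, 0.95)
                              for j in range(sims.shape[1])])
            return np.array(thetas), w, lower, upper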

  12. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    PubMed Central

    Wang, Chuji; Sahay, Peeyush

    2009-01-01

    Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only offer high sensitivity and high selectivity, comparable to those of MS-based techniques, but also near real-time response, low instrument costs, and POC capability. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions, and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide and nitric oxide, are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503
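
    The absorption-based techniques listed above ultimately invert the Beer-Lambert law; a minimal sketch with illustrative numbers (the cross-section, path length, and absorption depth are assumptions):

        import numpy as np

        def number_density(I0, I, sigma_cm2, path_cm):
            # Beer-Lambert: I = I0 * exp(-sigma * N * L); solve for the
            # absorber number density N in molecules per cm^3
            return np.log(I0 / I) / (sigma_cm2 * path_cm)

        # 0.1% fractional absorption over a 10 m effective path,
        # with an assumed absorption cross-section of 1e-19 cm^2
        print(number_density(1.0, 0.999, 1e-19, 1000.0))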

  13. Novel technique for coal pyrolysis and hydrogenation product analysis

    SciTech Connect

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture, even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general, above that size, uncertainties in rate constant and thermodynamic data do not allow products of mixed hydrocarbon pyrolyses to be predicted a priori with a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions, with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  14. Transit Spectroscopy: new data analysis techniques and interpretation

    NASA Astrophysics Data System (ADS)

    Tinetti, Giovanna; Waldmann, Ingo P.; Morello, Giuseppe; Tessenyi, Marcell; Varley, Ryan; Barton, Emma; Yurchenko, Sergey; Tennyson, Jonathan; Hollis, Morgan

    2014-11-01

    Planetary science beyond the boundaries of our Solar System is today in its infancy. Until a couple of decades ago, the detailed investigation of planetary properties was restricted to objects orbiting inside the Kuiper Belt. Today, we cannot ignore that the number of known planets has increased by two orders of magnitude, nor that these planets resemble almost nothing among the objects present in our own Solar System. A key observable for planets is the chemical composition and state of their atmosphere. To date, two methods can be used to sound exoplanetary atmospheres: transit and eclipse spectroscopy, and direct imaging spectroscopy. Although the field of exoplanet spectroscopy has been very successful in past years, a few serious hurdles need to be overcome to progress in this area: in particular, instrument systematics are often difficult to disentangle from the signal, and data are sparse and often not recorded simultaneously, causing degeneracy of interpretation. We present here new data analysis techniques and interpretations developed by the “ExoLights” team at UCL to address these issues. Said techniques include statistical tools, non-parametric machine-learning algorithms, optimized radiative transfer models, and spectroscopic line lists. These new tools have been successfully applied to existing data recorded with space and ground instruments, shedding new light on our knowledge and understanding of these alien worlds.

  15. An objective isobaric/isentropic technique for upper air analysis

    NASA Technical Reports Server (NTRS)

    Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.

    1981-01-01

    An objective meteorological analysis technique is presented whereby both horizontal and vertical upper air analyses are performed. The same process is used to interpolate grid-point values from the upper-air station data for grid points on an isobaric surface and on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure-height values are interpolated from data that lie on the isentropic surface passing through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The objective products of version A generally correspond fairly well to the subjective analyses and the station data, and depict the structure of the upper fronts, tropopauses, and jet streams fairly well. The version B objective products correspond more closely to the subjective analyses and show the same strong gradients across the upper front, with only minor smoothing.
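
    The anisotropic weighting idea can be conveyed with a toy inverse-distance scheme in which distances are shortened along the local flow, so stations upstream and downstream receive more weight. This illustrates the concept only, not the authors' scheme; the stretch factor and names are assumptions:

        import numpy as np

        def anisotropic_interp(xg, yg, xs, ys, vals, u, v, stretch=3.0):
            e = np.array([u, v]) / np.hypot(u, v)     # unit vector along the flow
            dx, dy = xs - xg, ys - yg
            along = dx * e[0] + dy * e[1]             # station offset along flow
            cross = -dx * e[1] + dy * e[0]            # offset across flow
            d2 = (along / stretch) ** 2 + cross ** 2  # anisotropically stretched distance
            w = 1.0 / (d2 + 1e-6)
            return np.sum(w * vals) / np.sum(w)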

  16. One-Dimensional Analysis Techniques for Pulsed Blowing Distribution

    NASA Astrophysics Data System (ADS)

    Chambers, Frank

    2005-11-01

    Pulsed blowing offers reductions in bleed air requirements for aircraft flow control. Efficient pulsed blowing systems require careful design to minimize bleed air use while distributing blowing to multiple locations. Pulsed blowing systems start with a steady flow supply and process it to generate a pulsatile flow. The fluid-acoustic dynamics of the system play an important role in overall effectiveness. One-dimensional analysis techniques that in the past have been applied to ventilation systems and internal combustion engines have been adapted to pulsed blowing. Pressure wave superposition and reflection are used with the governing equations of continuity, momentum and energy to determine particle velocities and pressures through the flow field. Simulations have been performed to find changes in the amplitude and wave shape as pulses are transmitted through a simple pulsed blowing system. A general-purpose code is being developed to simulate wave transmission and allow the determination of blowing system dynamic parameters.

  17. Radial Velocity Data Analysis with Compressed Sensing Techniques

    NASA Astrophysics Data System (ADS)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched for at once, and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect, but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analyses, though obtained more straightforwardly. We further show that 55 Cnc e could have been detected, and 55 Cnc f suspected, in early measurements from the Lick Observatory and the Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
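
    The flavor of a sparse multi-frequency search can be conveyed with a greedy toy: orthogonal matching pursuit over a dictionary of sinusoids, refitting all selected frequencies jointly at each step. This is a deliberately simplified stand-in, not the paper's compressed sensing solver with its Gaussian-process noise model:

        import numpy as np

        def omp_periodogram(t, rv, freqs, n_planets=3):
            sel, resid = [], rv - rv.mean()
            for _ in range(n_planets):
                # pick the frequency most correlated with the current residual
                power = [abs(np.sum(resid * np.exp(-2j * np.pi * f * t)))
                         for f in freqs]
                sel.append(freqs[int(np.argmax(power))])
                # joint least-squares refit of all selected sinusoids
                X = np.column_stack(
                    [np.cos(2 * np.pi * f * t) for f in sel]
                    + [np.sin(2 * np.pi * f * t) for f in sel]
                    + [np.ones_like(t)])
                coef, *_ = np.linalg.lstsq(X, rv, rcond=None)
                resid = rv - X @ coef
            return sel, resid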

  18. Image analysis technique applied to lock-exchange gravity currents

    NASA Astrophysics Data System (ADS)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
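
    The per-pixel calibration can be sketched as a vectorized least-squares fit over a stack of uniform-concentration calibration images; the linear model and all variable names are assumptions:

        import numpy as np

        def fit_pixel_calibration(grey_stack, conc):
            # grey_stack: (n, H, W) images of the tank at n known uniform dye
            # concentrations conc; fit grey = a*conc + b independently per pixel
            n, H, W = grey_stack.shape
            coeffs = np.polyfit(conc, grey_stack.reshape(n, -1), 1)
            return coeffs[0].reshape(H, W), coeffs[1].reshape(H, W)

        def grey_to_concentration(frame, a, b):
            # invert the per-pixel calibration for one instantaneous image
            return (frame - b) / a

    The mass-conservation check then amounts to rescaling the recovered field so that the total dye mass in the tank stays constant in time.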

  19. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. . Dept. of Computer Sciences); Noordewier, M.O. . Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.

  20. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.

  1. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. The analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system to variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems; application to sampled-data systems is also explored. The technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining which system elements have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, relative stability criteria based on the concept of singular values were explored.
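
    The central computation is the singular value decomposition of a closed-loop (e.g. return-difference) matrix and the first-order sensitivity of each singular value to a parameter. A minimal sketch using the standard gradient formula; the matrices M and dM/dp are assumed inputs:

        import numpy as np

        def singular_value_sensitivities(M, dM_dp):
            # For M = U diag(s) V^H, the gradient of each singular value is
            # d s_i / dp = Re( u_i^H (dM/dp) v_i )
            U, s, Vh = np.linalg.svd(M)
            grads = np.array([np.real(U[:, i].conj() @ dM_dp @ Vh[i, :].conj())
                              for i in range(s.size)])
            return s, grads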

  2. Reduction of large set data transmission using algorithmically corrected model-based techniques for bandwidth efficiency

    NASA Astrophysics Data System (ADS)

    Khair, Joseph Daniel

    Communication requirements and demands on deployed systems are increasing daily. This increase is due to the desire for more capability, but also, due to the changing landscape of threats on remote vehicles. As such, it is important that we continue to find new and innovative ways to transmit data to and from these remote systems, consistent with this changing landscape. Specifically, this research shows that data can be transmitted to a remote system effectively and efficiently with a model-based approach using real-time updates, called Algorithmically Corrected Model-based Technique (ACMBT), resulting in substantial savings in communications overhead. To demonstrate this model-based data transmission technique, a hardware-based test fixture was designed and built. Execution and analysis software was created to perform a series of characterizations demonstrating the effectiveness of the new transmission method. The new approach was compared to a traditional transmission approach in the same environment, and the results were analyzed and presented. A Figure of Merit (FOM) was devised and presented to allow standardized comparison of traditional and proposed data transmission methodologies alongside bandwidth utilization metrics. The results of this research have successfully shown the model-based technique to be feasible. Additionally, this research has opened the trade space for future discussion and implementation of this technique.

  3. Adolescent baseball pitching technique: lower extremity biomechanical analysis.

    PubMed

    Milewski, Matthew D; Õunpuu, Sylvia; Solomito, Matthew; Westwell, Melany; Nissen, Carl W

    2012-11-01

    Documentation of the lower extremity motion patterns of adolescent pitchers is an important part of understanding the pitching motion and the implications of lower extremity technique for upper extremity loads, injury, and performance. The purpose of this study was to take the initial step in this process by documenting the biomechanics of the lower extremities during the pitching cycle in adolescent pitchers and comparing these findings with the published data for older pitchers. Three-dimensional motion analysis using a comprehensive lower extremity model was used to evaluate the fastball pitch technique in adolescent pitchers. Thirty-two pitchers with a mean age of 12.4 years (range 10.5-14.7 years) and at least 2 years of experience were included in this study. The pitchers showed a mean of 49 ± 12° of knee flexion of the lead leg at foot contact. They tended to maintain this position through ball release, and then extended the knee during the follow-through phase (ball release to maximal internal glenohumeral rotation). The lead leg hip rapidly progressed into adduction and flexion during the arm cocking phase, with a range of motion of 40 ± 10° adduction and 30 ± 13° flexion. The lead hip mean peak adduction velocity was 434 ± 83°/s and flexion velocity was 456 ± 156°/s. Simultaneously, the trailing leg hip rapidly extended, approaching a mean peak extension of -8 ± 5° at 39% of the pitch cycle, which is close to passive range of motion constraints. Peak hip abduction of the trailing leg at foot contact was -31 ± 12°, which also approached passive range of motion constraints. Differences and similarities were also noted between the adolescent lower extremity kinematics and those of adult pitchers; however, a more comprehensive analysis using similar methods is needed for a complete comparison. PMID:22660979

  4. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication

  5. Comparison of gas chromatographic hyphenated techniques for mercury speciation analysis.

    PubMed

    Nevado, J J Berzas; Martín-Doimeadios, R C Rodríguez; Krupp, E M; Bernardo, F J Guzmán; Fariñas, N Rodríguez; Moreno, M Jiménez; Wallace, D; Ropero, M J Patiño

    2011-07-15

    In this study, we evaluate the advantages and disadvantages of three hyphenated techniques for mercury speciation analysis in different sample matrices, using gas chromatography (GC) with mass spectrometry (GC-MS), inductively coupled plasma mass spectrometry (GC-ICP-MS), and pyrolysis atomic fluorescence (GC-pyro-AFS) detection. Aqueous ethylation with NaBEt(4) was required in all cases. All systems were validated with respect to precision, with repeatability and reproducibility <5% RSD, confirmed by the Snedecor F-test. All methods proved to be robust according to a Plackett-Burman design for 7 factors and 15 experiments, with calculations carried out using the procedures described by Youden and Steiner. In order to evaluate accuracy, certified reference materials (DORM-2 and DOLT-3) were analyzed after closed-vessel microwave extraction with tetramethylammonium hydroxide (TMAH). No statistically significant differences from the certified values were found (p=0.05). The suitability for the analysis of water samples with different organic matter and chloride contents was evaluated by recovery experiments in synthetic spiked waters. Absolute detection and quantification limits were in the range of 2-6 pg for GC-pyro-AFS and 1-4 pg for GC-MS, with GC-ICP-MS, at 0.05-0.21 pg, showing the best limits of detection of the three systems employed. However, all systems are sufficiently sensitive for mercury speciation in environmental samples, with GC-MS and GC-ICP-MS offering isotope analysis capabilities for the use of species-specific isotope dilution analysis, and GC-pyro-AFS being the most cost-effective alternative. PMID:21641604

  6. Advanced spatio-temporal filtering techniques for photogrammetric image sequence analysis in civil engineering material testing

    NASA Astrophysics Data System (ADS)

    Liebold, F.; Maas, H.-G.

    2016-01-01

    The paper shows advanced spatial, temporal and spatio-temporal filtering techniques which may be used to reduce noise effects in photogrammetric image sequence analysis tasks and tools. As a practical example, the techniques are validated in a photogrammetric spatio-temporal crack detection and analysis tool applied in load tests in civil engineering material testing. The load test technique is based on monocular image sequences of a test object under varying load conditions. The first image of a sequence is defined as a reference image under zero load, wherein interest points are determined and connected in a triangular irregular network structure. For each epoch, these triangles are compared to the reference image triangles to search for deformations. The result of the feature point tracking and triangle comparison process is a spatio-temporally resolved strain value field, wherein cracks can be detected, located and measured via local discrepancies. The strains can be visualized as a color-coded map. In order to improve the measuring system and to reduce noise, the strain values of each triangle must be treated in a filtering process. The paper shows the results of various filter techniques in the spatial and in the temporal domain as well as spatio-temporal filtering techniques applied to these data. The best results were obtained by a bilateral filter in the spatial domain and by a spatio-temporal EOF (empirical orthogonal function) filtering technique.
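
    Of the filters compared, the bilateral filter selected for the spatial domain is easy to sketch: each value is replaced by an average of its neighbours, weighted by both spatial distance and value similarity, so noise is smoothed while the sharp strain discontinuities that indicate cracks are preserved. A minimal dense-grid version follows; the kernel widths are assumptions, and the paper itself operates on triangle-based strain values rather than a regular grid:

        import numpy as np

        def bilateral_filter(field, radius=2, sigma_s=1.5, sigma_r=0.1):
            H, W = field.shape
            out = np.empty_like(field)
            ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
            w_spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
            pad = np.pad(field, radius, mode='edge')
            for i in range(H):
                for j in range(W):
                    patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                    # range kernel: down-weight neighbours with dissimilar values
                    w_range = np.exp(-(patch - field[i, j]) ** 2 / (2 * sigma_r ** 2))
                    w = w_spatial * w_range
                    out[i, j] = np.sum(w * patch) / np.sum(w)
            return out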

  7. Novel failure analysis techniques using photon probing with a scanning optical microscope

    SciTech Connect

    Cole, E.I. Jr.; Soden, J.M.; Rife, J.L.; Barton, D.L.; Henderson, C.L.

    1993-12-31

    Three new failure analysis techniques for integrated circuits (ICs) have been developed using localized photon probing with a scanning optical microscope (SOM). The first two are light-induced voltage alteration (LIVA) imaging techniques that (1) localize open-circuited and damaged junctions and (2) image transistor logic states. The third technique uses the SOM to control logic states optically from the IC backside. LIVA images are produced by monitoring the voltage fluctuations of a constant current power supply as a laser beam is scanned over the IC. High selectivity for localizing defects has been demonstrated using the LIVA approach. Logic state mapping results, similar to previous work using biased optical beam induced current (OBIC) and laser probing approaches have also been produced using LIVA. Application of the two LIVA based techniques to backside failure analysis has been demonstrated using an infrared laser source. Optical logic state control is based upon earlier work examining transistor response to photon injection. The physics of each method and their applications for failure analysis are described.

  8. Analysis techniques for eddy current imaging of carbon fiber materials

    NASA Astrophysics Data System (ADS)

    Schulze, Martin H.; Meyendorf, Norbert; Heuer, Henning

    2010-04-01

    Carbon fiber materials are becoming increasingly important for many applications. Unlike for metals, technological parameters and certified quality control mechanisms for raw carbon fiber (RCF) materials have not yet been developed, and there is no efficient and reliable testing system for in-line inspections and consecutive manual inspections of RCF and post-laminated carbon fiber reinforced plastics (CFRP). Based upon the multi-frequency eddy current system developed at Fraunhofer IZFP, structural and hidden defects such as missing carbon fiber bundles, lanes, suspensions, fringes, missing sewing threads, and angle errors can be detected. Using an optimized sensor array and intelligent image pre-processing algorithms, the complex impedance signal can be allocated to different carbon fiber layers. This technique enables the detection of defects at depths of up to 5 layers, with freely scalable measuring resolution and testing frequency. Appropriate parameter lists for optimal defect classification are available. The dimensions of the smallest detectable flaws are in the range of a few millimeters. Algorithms and basic eddy current C-scan processing techniques for carbon fiber material testing are described in this paper.

  9. Parameter tuning of PVD process based on artificial intelligence technique

    NASA Astrophysics Data System (ADS)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Owing to its previous adoption in similar optimization problems, a genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from the GA is expected to produce the desired zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time, and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations, and the results of the computational experiment were compared with the actual results from the laboratory experiment. Based on this comparison, the GA proved reliable for optimizing the parameter combination before tuning the RF magnetron sputtering machine. To verify the result, the GA was also compared with other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that the GA was reliable in solving this RF magnetron sputtering parameter tuning problem and achieved better accuracy in the optimization based on the fitness evaluation.
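
    A minimal GA of the kind described, for the three process parameters, might look as follows; the parameter ranges, GA settings, and the placeholder fitness function are all assumptions (in practice the fitness would score the resulting ZnO film quality):

        import numpy as np

        rng = np.random.default_rng(0)
        bounds = np.array([[50., 300.],     # RF power (W), assumed range
                           [10., 120.],     # deposition time (min), assumed range
                           [25., 400.]])    # substrate temperature (C), assumed range

        def fitness(x):
            # placeholder: replace with a measured or modelled film-quality score
            return -np.sum(((x - bounds.mean(axis=1)) / np.ptp(bounds, axis=1)) ** 2)

        pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(25, 3))
        for _ in range(100):
            f = np.array([fitness(ind) for ind in pop])
            # binary tournament selection
            pairs = rng.integers(0, 25, size=(25, 2))
            winners = np.where(f[pairs[:, 0]] > f[pairs[:, 1]],
                               pairs[:, 0], pairs[:, 1])
            parents = pop[winners]
            # single-point crossover with a random partner
            children = parents.copy()
            partner = rng.permutation(25)
            for i, cut in enumerate(rng.integers(1, 3, size=25)):
                children[i, cut:] = parents[partner[i], cut:]
            # Gaussian mutation, scaled to each parameter range
            children += rng.normal(0.0, 0.02, children.shape) * np.ptp(bounds, axis=1)
            pop = np.clip(children, bounds[:, 0], bounds[:, 1])

        best = pop[np.argmax([fitness(ind) for ind in pop])]
        print(best)   # candidate (power, time, temperature) setting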

  10. Retinoblastoma-comparative analysis of external radiotherapy techniques, including an IMRT technique

    SciTech Connect

    Reisner, Marcio Lemberg . E-mail: mreisner@uol.com.br; Viegas, Celia Maria Pais; Grazziotin, Rachele Zanchet; Santos Batista, Delano Valdivino; Carneiro, Tulio Meneses; Mendonca de Araujo, Carlos Manoel; Marchiori, Edson

    2007-03-01

    Purpose: To compare the numerous external radiotherapy (RT) techniques for the treatment of retinoblastoma, as well as an intensity-modulated RT (IMRT) technique. The latter was elaborated to evaluate the potential dose reduction in the surrounding tissue, as well as the potential avoidance of subdosage in the ora serrata retinae. Methods and Materials: A 2-year-old patient with unilateral retinoblastoma underwent CT. With the aid of an ophthalmologist, the ocular structures were delimited, and 13 techniques described in published reports were reproduced on three-dimensional planning software and identified according to their authors. A technique with four noncoplanar fields using IMRT was also elaborated. These techniques were compared according to the dose to the ora serrata retinae, lens, orbit (volume that received a dose of ≥20 Gy), vitreous, optic nerve, lacrimal gland (volume that received a dose of ≥34 Gy), and cornea, and according to their ease of reproducibility. Results: The techniques that attained the therapeutic dose to the ora serrata retinae were the IMRT technique and the techniques of Haye, Cassady, Cormack, and al-Beteri. The Cormack technique had the lowest volume that received a dose of ≥20 Gy in the orbit, followed by the IMRT technique. The IMRT technique also achieved the lowest volume that received a dose of ≥34 Gy (14%) in the lacrimal gland. The Abramson/McCormick/Blach, Cassady, Reese, and Schipper techniques were the easiest to reproduce and the Chin technique the most complex. Conclusion: Retinoblastoma treatment with IMRT has an advantage over the other techniques, because it allows for the greatest reduction of dose to the orbit and lacrimal gland, while maintaining the therapeutic dose to the ora serrata retinae and vitreous.

  11. Ares Launch Vehicle Transonic Buffet Testing and Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Piatak, David J.; Sekula, Martin K.; Rausch, Russ D.

    2010-01-01

    It is necessary to define the launch vehicle buffet loads to ensure that structural components and vehicle subsystems possess adequate strength, stress, and fatigue margins when the vehicle's structural dynamic response to buffet forcing functions is considered. To obtain these forcing functions, the accepted method is to perform wind-tunnel testing of a rigid model instrumented with hundreds of unsteady pressure transducers designed to measure the buffet environment across the desired frequency range. The buffet wind-tunnel test program for the Ares Crew Launch Vehicle employed 3.5-percent-scale rigid models of the Ares I and Ares I-X launch vehicles, each instrumented with 256 unsteady pressure transducers. These models were tested at transonic conditions at the Transonic Dynamics Tunnel at NASA Langley Research Center. The ultimate deliverables of the Ares buffet test program are buffet forcing functions (BFFs) derived by integrating the measured fluctuating pressures on the rigid wind-tunnel models. These BFFs are then used as input to a multi-mode structural analysis to determine the vehicle response to buffet and the resulting buffet loads and accelerations. This paper discusses the development of the Ares I and I-X rigid buffet model test programs from the standpoint of model design, instrumentation system design, test implementation, and data analysis techniques to yield final products, and presents normalized sectional buffet forcing function root-mean-squared levels.
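
    The integration step that turns measured pressures into forcing functions can be sketched as follows. The geometry, transducer layout, sample counts, and pressure data below are invented placeholders, not the Ares test configuration.

```python
# Hedged sketch: a sectional buffet forcing function from a ring of
# unsteady pressure transducers. All numbers are illustrative.
import numpy as np

n_sensors, n_samples = 16, 4096
theta = np.linspace(0, 2 * np.pi, n_sensors, endpoint=False)  # ring angles
radius, dz = 0.15, 0.05           # model radius [m], axial strip width [m]
area = radius * (2 * np.pi / n_sensors) * dz  # panel area per transducer

rng = np.random.default_rng(1)
p = rng.normal(scale=200.0, size=(n_sensors, n_samples))  # fluctuating Pa

# Integrate pressure times the inward normal over the ring to get
# lateral force time histories at this station.
fy = -(p * np.sin(theta)[:, None] * area).sum(axis=0)
fz = -(p * np.cos(theta)[:, None] * area).sum(axis=0)

q = 30e3                          # dynamic pressure [Pa] for normalization
print("normalized RMS Fy:", fy.std() / (q * radius * dz))
```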

  12. Automated target recognition technique for image segmentation and scene analysis

    NASA Astrophysics Data System (ADS)

    Baumgart, Chris W.; Ciarcia, Christopher A.

    1994-03-01

    Automated target recognition (ATR) software has been designed to perform image segmentation and scene analysis. Specifically, this software was developed as a package for the Army's Minefield and Reconnaissance and Detector (MIRADOR) program. MIRADOR is an on/off-road, remote-control, multisensor system designed to detect buried and surface-emplaced metallic and nonmetallic antitank mines. The basic requirements for this ATR software were: (1) the ability to separate target objects from the background in low signal-to-noise conditions; (2) the ability to handle a relatively high dynamic range in imaging light levels; (3) the ability to compensate for or remove light-source effects such as shadows; and (4) the ability to identify target objects as mines. Image segmentation and target evaluation were performed using an integrated and parallel processing approach. Three basic techniques (texture analysis, edge enhancement, and contrast enhancement) were used collectively to extract all potential mine target shapes from the basic image. Target evaluation was then performed using a combination of size, geometrical, and fractal characteristics, which resulted in a calculated probability for each target shape. Overall results with this algorithm were quite good, though there is a tradeoff between detection confidence and the number of false alarms. This technology also has applications in the areas of hazardous waste site remediation, archaeology, and law enforcement.
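
    A rough sketch of the three-cue parallel extraction is given below. The specific filters (local variance for texture, Sobel for edges, top-hat for contrast), window sizes, and thresholds are assumptions, not the MIRADOR implementation.

```python
# Illustrative three-cue segmentation: texture, edge, and contrast maps
# are thresholded and OR-ed into candidate target regions.
import numpy as np
from scipy import ndimage

def candidate_mask(img: np.ndarray) -> np.ndarray:
    texture = ndimage.generic_filter(img, np.var, size=7)
    edges = np.hypot(ndimage.sobel(img, 0), ndimage.sobel(img, 1))
    contrast = ndimage.white_tophat(img, size=15)
    cues = [(m > m.mean() + 2 * m.std()) for m in (texture, edges, contrast)]
    mask = cues[0] | cues[1] | cues[2]
    return ndimage.binary_opening(mask, iterations=1)  # drop speckle

rng = np.random.default_rng(2)
scene = rng.normal(size=(96, 96))
scene[40:52, 40:52] += 3.0        # synthetic mine-like patch
labels, n = ndimage.label(candidate_mask(scene))
print(n, "candidate shapes")
```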

  13. Spatiotemporal analysis of olive flowering using geostatistical techniques.

    PubMed

    Rojo, Jesús; Pérez-Badia, Rosa

    2015-02-01

    Analysis of flowering patterns in the olive (Olea europaea L.) is of considerable agricultural and ecological interest, and also provides valuable information for allergy sufferers, enabling identification of the major sources of airborne pollen at any given moment by interpreting the aerobiological data recorded in pollen traps. The present spatiotemporal analysis of olive flowering in central Spain combined geostatistical techniques with a Geographic Information System, and compared results for flowering intensity with airborne pollen records. The results were used to obtain continuous phenological maps which capture the pattern of succession of olive flowering. The results also show that, although the highest airborne olive-pollen counts were recorded during the greatest flowering intensity of the groves closest to the pollen trap, the counts recorded at the start of the pollen season were not linked to local olive groves, which had not yet begun to flower. To detect remote sources of olive pollen, several episodes of pollen recorded before the local flowering season were analysed using the HYSPLIT trajectory model, and the findings showed that western, southern and southwestern winds transported pollen grains into the study area from earlier-flowering groves located outside the territory. PMID:25461089

  14. Sensitivity-analysis techniques: self-teaching curriculum

    SciTech Connect

    Iman, R.L.; Conover, W.J.

    1982-06-01

    This self-teaching curriculum on sensitivity-analysis techniques consists of three parts: (1) use of the Latin Hypercube Sampling Program (Iman, Davenport and Ziegler, Latin Hypercube Sampling (Program User's Guide), SAND79-1473, January 1980); (2) use of the Stepwise Regression Program (Iman, et al., Stepwise Regression with PRESS and Rank Regression (Program User's Guide), SAND79-1472, January 1980); and (3) application of the procedures to sensitivity and uncertainty analyses of the groundwater transport model MWFT/DVM (Campbell, Iman and Reeves, Risk Methodology for Geologic Disposal of Radioactive Waste - Transport Model Sensitivity Analysis, SAND80-0644, NUREG/CR-1377, June 1980; Campbell, Longsine, and Reeves, The Distributed Velocity Method of Solving the Convective-Dispersion Equation, SAND80-0717, NUREG/CR-1376, July 1980). This curriculum is one in a series developed by Sandia National Laboratories for transfer of the capability to use the technology developed under the NRC-funded High Level Waste Methodology Development Program.
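
    The Latin hypercube step at the heart of part (1) can be reproduced today without the original SAND79-1473 Fortran program; the sketch below uses scipy's qmc module as a modern stand-in, with hypothetical parameter ranges.

```python
# Modern stand-in for the Latin hypercube sampling step. The three
# parameter ranges are hypothetical, not the curriculum's inputs.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=42)
unit = sampler.random(n=100)                 # 100 samples in [0, 1)^3
# Scale to assumed ranges, e.g. for a transport-model study:
lower, upper = [1e-6, 0.01, 100.0], [1e-3, 0.30, 5000.0]
samples = qmc.scale(unit, lower, upper)
print(samples.shape, samples.min(axis=0), samples.max(axis=0))
```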

  15. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    PubMed

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. An array of linear polarizers is introduced to achieve linear truncation on the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction from two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method to a known public-key attack is also provided. PMID:24979424
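
    The Jones-matrix step can be illustrated directly. The sketch below computes the transmitted intensity when an elliptically polarized state passes a linear polarizer at angle theta; the field amplitudes and phase are illustrative values, not the paper's encryption keys.

```python
# Jones-calculus sketch: intensity transmittance of elliptically
# polarized light through a linear polarizer at angle theta.
import numpy as np

def linear_polarizer(theta: float) -> np.ndarray:
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]])

# Elliptical state: Ex, Ey amplitudes with a relative phase delta.
Ex, Ey, delta = 1.0, 0.6, np.pi / 3
E_in = np.array([Ex, Ey * np.exp(1j * delta)])

for theta in (0, np.pi / 4, np.pi / 2):
    E_out = linear_polarizer(theta) @ E_in
    I = np.abs(E_out[0]) ** 2 + np.abs(E_out[1]) ** 2  # transmitted intensity
    print(f"theta={theta:.2f} rad  I={I:.3f}")
```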

  16. Plasma-based ambient mass spectrometry techniques: The current status and future prospective.

    PubMed

    Ding, Xuelu; Duan, Yixiang

    2015-01-01

    Plasma-based ambient mass spectrometry is emerging as a frontier technology for direct analysis of samples, employing low-energy plasma as the ionization reagent. The versatile sources of ambient mass spectrometry (MS) can be classified according to the plasma formation approach: corona discharge, glow discharge, dielectric barrier discharge, and microwave-induced discharge. These techniques allow pretreatment-free detection of samples, ranging from biological materials (e.g., flies, bacteria, plants, tissues, peptides, metabolites, and lipids) to pharmaceuticals, food-stuffs, polymers, chemical warfare reagents, and daily-use chemicals. In most cases, plasma-based ambient MS performs well as a qualitative tool and as an analyzer for semi-quantitation. Herein, we provide an overview of the key concepts, mechanisms, and applications of plasma-based ambient MS techniques, and discuss the challenges and outlook. PMID:24338668

  17. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying such phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena in the framework of a model in order to simulate the real phenomena is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Attempts to find appropriate validation techniques for ABM therefore seem necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.
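
    One simple validation tactic, comparing an emergent model statistic against a known analytical result, can be shown with a toy random-walk ABM. Everything below (agent count, step rule) is an invented illustration, not a geospatial model.

```python
# Toy ABM validation check: the mean squared displacement of random-walk
# agents should match the analytical expectation 2 * n_steps.
import numpy as np

rng = np.random.default_rng(3)
n_agents, n_steps = 500, 200
pos = np.zeros((n_agents, 2))
for _ in range(n_steps):
    pos += rng.choice([-1.0, 1.0], size=(n_agents, 2))  # unit step per axis

msd_model = (pos ** 2).sum(axis=1).mean()
msd_theory = 2.0 * n_steps        # E[x^2 + y^2] for unit steps on two axes
print(f"model MSD={msd_model:.1f}, theory={msd_theory:.1f}")
```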

  18. Dynamic digital watermark technique based on neural network

    NASA Astrophysics Data System (ADS)

    Gu, Tao; Li, Xu

    2008-04-01

    An algorithm for dynamic watermarking based on a neural network is presented, which is more robust against false-authentication attacks and watermark-tampering operations than single-watermark embedding methods. (1) Five binary images used as watermarks are coded into a binary array; every 0 or 1 is enlarged fivefold by an information-enlarging technique, so the total number of 0s and 1s is 5*N, where N is the original total number of the watermarks' binary bits. (2) The seed image pixel p(x,y) and its 3×3 neighborhood pixels p(x-1,y-1), p(x-1,y), p(x-1,y+1), p(x,y-1), p(x,y+1), p(x+1,y-1), p(x+1,y), p(x+1,y+1) are chosen as one sample space. The pixel p(x,y) is used as the neural network target and the other eight pixel values are used as the neural network inputs. (3) To make the neural network learn the sample space, 5*N pixel values and their closely related pixel values are randomly chosen with a password from a color BMP-format image and used to train the neural network. (4) A four-layer neural network is constructed to describe the nonlinear mapping between inputs and outputs. (5) One bit from the array is embedded by adjusting the polarity between a chosen pixel value and the output value of the model. (6) A randomizer generates a number to determine the count of watermarks for retrieval. The randomly selected watermarks can be retrieved using the restored neural network output value, the corresponding image pixel values, and the restore function, without knowing the original image or watermarks (the restored coded-watermark bit = 1 if o(x,y) (restored) > p(x,y) (reconstructed), else the coded-watermark bit = 0). The retrieved watermarks differ on each extraction. The proposed technique can offer more watermarking proofs than a single-watermark embedding algorithm. Experimental results show that the proposed technique is very robust against some image processing operations and JPEG lossy compression. Therefore, the algorithm can be used to protect the copyright of an important image.
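
    The polarity-embedding idea in steps (2)-(5) can be sketched compactly. The sketch below substitutes a small scikit-learn regressor for the paper's four-layer network; the network size, embedding margin, and training-sample count are assumptions.

```python
# Hedged sketch of polarity embedding: a small regressor predicts a pixel
# from its 3x3 neighborhood; one bit is embedded by nudging the pixel
# above/below the prediction. Sizes and margin are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
img = rng.integers(0, 256, size=(64, 64)).astype(float)

def neighborhood(im, x, y):
    patch = im[x - 1:x + 2, y - 1:y + 2].ravel()
    return np.delete(patch, 4)          # 8 neighbors, center removed

xs = rng.integers(1, 63, size=(400, 2))
X = np.array([neighborhood(img, x, y) for x, y in xs])
t = np.array([img[x, y] for x, y in xs])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                     random_state=0).fit(X, t)

def embed_bit(im, x, y, bit, margin=4.0):
    pred = model.predict(neighborhood(im, x, y)[None])[0]
    im[x, y] = pred + margin if bit else pred - margin  # set polarity

def extract_bit(im, x, y):
    return int(im[x, y] > model.predict(neighborhood(im, x, y)[None])[0])

embed_bit(img, 30, 30, 1)
print("recovered bit:", extract_bit(img, 30, 30))
```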

  19. Integration of geological remote-sensing techniques in subsurface analysis

    USGS Publications Warehouse

    Taranik, James V.; Trautwein, Charles M.

    1976-01-01

    Geological remote sensing is defined as the study of the Earth utilizing electromagnetic radiation which is either reflected or emitted from its surface in wavelengths ranging from 0.3 micrometre to 3 metres. The natural surface of the Earth is composed of a diversified combination of surface cover types, and geologists must understand the characteristics of surface cover types to successfully evaluate remotely-sensed data. In some areas landscape surface cover changes throughout the year, and analysis of imagery acquired at different times of year can yield additional geological information. Integration of different scales of analysis allows landscape features to be effectively interpreted. Interpretation of the static elements displayed on imagery is referred to as an image interpretation. Image interpretation is dependent upon: (1) the geologist's understanding of the fundamental aspects of image formation, and (2) his ability to detect, delineate, and classify image radiometric data; recognize radiometric patterns; and identify landscape surface characteristics as expressed on imagery. A geologic interpretation integrates surface characteristics of the landscape with subsurface geologic relationships. Development of a geologic interpretation from imagery is dependent upon: (1) the geologist's ability to interpret geomorphic processes from their static surface expression as landscape characteristics on imagery, (2) his ability to conceptualize the dynamic processes responsible for the evolution of interpreted geologic relationships (his ability to develop geologic models). The integration of geologic remote-sensing techniques in subsurface analysis is illustrated by development of an exploration model for ground water in the Tucson area of Arizona, and by the development of an exploration model for mineralization in southwest Idaho.

  20. Topology-based Feature Definition and Analysis

    SciTech Connect

    Weber, Gunther H.; Bremer, Peer-Timo; Gyulassy, Attila; Pascucci, Valerio

    2010-12-10

    Defining high-level features, detecting them, tracking them and deriving quantities based on them is an integral aspect of modern data analysis and visualization. In combustion simulations, for example, burning regions, which are characterized by high fuel consumption, are a possible feature of interest. Detecting these regions makes it possible to derive statistics about their size and track them over time. However, features of interest in scientific simulations are extremely varied, making it challenging to develop cross-domain feature definitions. Topology-based techniques offer an extremely flexible means for general feature definitions and have proven useful in a variety of scientific domains. This paper provides a brief introduction to topological structures like the contour tree and Morse-Smale complex and shows how to apply them to define features in different science domains such as combustion. The overall goal is to provide an overview of these powerful techniques and start a discussion of how they can aid in the analysis of astrophysical simulations.
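
    The superlevel-set idea underlying merge and contour trees can be illustrated with the combustion example: burning regions are connected components of the set where fuel consumption exceeds a threshold. The field and threshold below are synthetic.

```python
# Minimal superlevel-set feature extraction: label connected "burning
# regions" of {f >= threshold} and derive per-feature statistics.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
fuel_rate = ndimage.gaussian_filter(rng.normal(size=(128, 128)), sigma=6)

threshold = fuel_rate.mean() + fuel_rate.std()
labels, n = ndimage.label(fuel_rate >= threshold)  # superlevel components
sizes = ndimage.sum(np.ones_like(fuel_rate), labels, range(1, n + 1))
print(n, "burning regions, sizes:", np.asarray(sizes).astype(int))
```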

  1. An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP Techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1972-01-01

    Preliminary results from the Ouachita portion of the Texoma frame of data indicate many potential applications in the analysis and interpretation of ERTS data. One of the more significant aspects of this analysis sequence has been the investigation of a technique to relate ERTS analysis and surface observation analysis. At present, a sequence involving (1) preliminary analysis based solely upon the spectral characteristics of the data, followed by (2) a surface observation mission to obtain visual information and oblique photography of particular points of interest in the test site area, appears to provide an extremely efficient technique for obtaining meaningful surface observation data. Following such a procedure permits concentration on particular points of interest in the entire ERTS frame and thereby makes the surface observation data obtained particularly significant and meaningful. The analysis of the Texoma frame has also been significant in demonstrating a fast-turnaround analysis capability. Additionally, the analysis has shown the potential accuracy and degree of complexity of features that can be identified and mapped using ERTS data.

  2. Subcellular chemical and morphological analysis by stimulated Raman scattering microscopy and image analysis techniques

    PubMed Central

    D’Arco, Annalisa; Brancati, Nadia; Ferrara, Maria Antonietta; Indolfi, Maurizio; Frucci, Maria; Sirleto, Luigi

    2016-01-01

    The visualization of heterogeneous morphology, and the segmentation and quantification of image features, are crucial for nonlinear optics microscopy applications, spanning from imaging of living cells or tissues to biomedical diagnostics. In this paper, a methodology combining stimulated Raman scattering microscopy and image analysis techniques is presented. The basic idea is to join the potential of the vibrational contrast of stimulated Raman scattering with the strength of image analysis techniques in order to delineate subcellular morphology with chemical specificity. Validation tests on label-free imaging of polystyrene beads and of adipocyte cells are reported and discussed. PMID:27231626

  4. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    SciTech Connect

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic had been limited to measuring motion at points or lines across a target. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs. time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (≈0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.

  5. A dynamic mechanical analysis technique for porous media

    PubMed Central

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in

  6. Data analysis techniques: a tool for cumulative exposure assessment.

    PubMed

    Lalloué, Benoît; Monnez, Jean-Marie; Padilla, Cindy; Kihal, Wahida; Zmirou-Navier, Denis; Deguen, Séverine

    2015-01-01

    Everyone is subject to environmental exposures from various sources, with negative health impacts (air, water and soil contamination, noise, etc.) or with positive effects (e.g., green space). Studies considering such complex environmental settings in a global manner are rare. We propose to use statistical factor and cluster analyses to create a composite exposure index with a data-driven approach, with a view to assessing the environmental burden experienced by populations. We illustrate this approach in a large French metropolitan area. The study was carried out in the Great Lyon area (France, 1.2 M inhabitants) at the census block group (BG) scale. As environmental indicators we used ambient air NO2 annual concentrations, noise levels and proximity to green spaces, industrial plants, polluted sites and road traffic. They were synthesized using Multiple Factor Analysis (MFA), a data-driven technique without a priori modeling, followed by hierarchical clustering to create BG classes. The first components of the MFA explained, respectively, 30, 14, 11 and 9% of the total variance. Clustering into five classes grouped: (1) a particular type of large BGs without population; (2) BGs of green residential areas, with fewer negative exposures than average; (3) BGs of residential areas near midtown; (4) BGs close to industries; and (5) midtown urban BGs, with more negative exposures than average and fewer green spaces. Other numbers of classes were tested in order to assess a variety of clusterings. We present an approach using statistical factor and cluster analysis techniques, which seem to be overlooked for assessing cumulative exposure in complex environmental settings. Although it cannot be applied directly for risk or health effect assessment, the resulting index can help to identify hot spots of cumulative exposure, to prioritize urban policies or to compare the environmental burden across study areas in an epidemiological framework. PMID:25248936
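
    The shape of the pipeline can be sketched with off-the-shelf tools, using PCA as a simple stand-in for MFA (the two differ in how variable groups are weighted). The six indicator columns below are synthetic, not the Lyon data.

```python
# Sketch: dimensionality reduction (PCA as an MFA stand-in) followed by
# Ward hierarchical clustering into five block-group classes.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(10)
# Columns: NO2, noise, green space, industry, polluted sites, traffic.
X = rng.normal(size=(500, 6))

scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(X))
classes = fcluster(linkage(scores, method="ward"), t=5, criterion="maxclust")
print(np.bincount(classes)[1:])          # block groups per class
```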

  7. A dynamic mechanical analysis technique for porous media.

    PubMed

    Pattison, Adam Jeffry; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-02-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite-element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a nonlinear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1-14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
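
    The classical viscoelastic DMA reduction that this work generalizes can be sketched as follows: extract the complex Fourier coefficient of stress and strain at the drive frequency and take their ratio. The signals below are synthetic; the poroelastic FE inversion itself is not reproduced.

```python
# Classical DMA reduction: storage and loss moduli from sinusoidal
# stress/strain records via single-bin demodulation at the drive frequency.
import numpy as np

fs, f0, T = 1000.0, 5.0, 4.0             # sample rate, drive freq, duration
t = np.arange(0, T, 1 / fs)
strain = 0.01 * np.sin(2 * np.pi * f0 * t)
stress = 800.0 * np.sin(2 * np.pi * f0 * t + 0.12)  # 0.12 rad loss angle

def phasor(sig):
    # Complex Fourier coefficient at f0 (exact over integer periods).
    return 2 * np.mean(sig * np.exp(-2j * np.pi * f0 * t))

E_star = phasor(stress) / phasor(strain)  # complex modulus E' + iE''
print(f"E' = {E_star.real:.0f} Pa, E'' = {E_star.imag:.0f} Pa, "
      f"tan(delta) = {E_star.imag / E_star.real:.3f}")
```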

  8. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    PubMed Central

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-01

    With the wide application of vision-based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of these techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144

  9. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    SciTech Connect

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-07-18

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available require a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or a large number of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as mass loading, flow rate and resin performance, is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is being evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of possible synergistic effects in the presence of iron.

  10. Agent-based forward analysis

    SciTech Connect

    Kerekes, Ryan A.; Jiao, Yu; Shankar, Mallikarjun; Potok, Thomas E.; Lusk, Rick M.

    2008-01-01

    We propose software agent-based "forward analysis" for efficient information retrieval in a network of sensing devices. In our approach, processing is pushed to the data at the edge of the network via intelligent software agents rather than pulling data to a central facility for processing. The agents are deployed with a specific query and perform varying levels of analysis of the data, communicating with each other and sending only relevant information back across the network. We demonstrate our concept in the context of face recognition using a wireless test bed comprised of PDA cell phones and laptops. We show that agent-based forward analysis can provide a significant increase in retrieval speed while decreasing bandwidth usage and information overload at the central facility.

  11. The use of DEM analysis for structural characterization of landslide-prone areas in crystalline rock slopes using GIS-based techniques. The case of the Matter Valley, Switzerland

    NASA Astrophysics Data System (ADS)

    Yugsi Molina, F. X.; Loew, S.; Button, E.

    2009-04-01

    Mountainous regions influenced by glacial processes are often prone to slope instabilities. One reason for this relationship is their characteristic morphology (high relief and steep slopes) and the surface processes associated with glacial advance and retreat. In the Matter Valley, Switzerland, these factors interact with brittle-ductile faults and joint sets and induce rock slope failures at multiple scales, including the 3×10^7 m³ Randa and the 1×10^5 m³ Medji events. The general lithological and tectonic disposition in the study area is quite homogeneous, while the local fracture systems and their characteristics vary spatially. These features provide the opportunity to evaluate potential relationships between the local fracture systems, the potential failure modes they develop, and the observed slope morphology and its state of stability. To investigate this hypothesis, the fracture pattern of the area was analyzed using a new combination of data collected in the field and data extracted from an airborne LIDAR high-resolution DEM (SWISSTOPO, 2 m pixel resolution). This is possible for the area because the fracture pattern has been observed to have a strong influence on the morphology of the slopes. To identify slope faces controlled by structures, a 3D shaded relief map of the area was produced. A 3D shaded relief map is a color-coded image based on HSV color composition showing changes in color according to changes in slope orientation (dip and dip direction). A careful selection of the planes used for the analysis was carried out, taking into consideration that not all values in the 3D shaded relief image represent fracture orientations; this is due to multiple factors such as the cell size of the DEM, the presence of land cover (soil), and the presence of overhanging blocks. Selection of cells was done using 3D visualizations (an orthophoto mosaic created with aerial photographs acquired in 2005 was used as the top-most layer) and photographs of the

  12. A Rapid, Fluorescence-Based Field Screening Technique for Organic Species in Soil and Water Matrices.

    PubMed

    Russell, Amber L; Martin, David P; Cuddy, Michael F; Bednar, Anthony J

    2016-06-01

    Real-time detection of hydrocarbon contaminants in the environment presents analytical challenges because traditional laboratory-based techniques are cumbersome and not readily field-portable. In the current work, a method for rapid and semi-quantitative detection of organic contaminants, primarily crude oil, in natural water and soil matrices has been developed. Detection limits in the parts-per-million and parts-per-billion ranges were accomplished when using visual and digital detection methods, respectively. The extraction technique was modified from standard methodologies used for hydrocarbon analysis and provides a straightforward separation that can remove interference from complex natural constituents. For water samples this method is semi-quantitative, with recoveries ranging from 70% to 130%, while measurements of soil samples are more qualitative due to lower extraction efficiencies related to the limitations of field-deployable procedures. PMID:26988223

  13. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    NASA Astrophysics Data System (ADS)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy must change because the current imitational, gravitational, frontal, isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water (a global deficit of up to 4-15 times) and adverse effects on soils and landscapes. Current irrigation methods do not control the spread of water through the soil continuum. Preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost to the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for sustainable natural resources management. The new paradigm of irrigation is based on an intra-soil pulse-discrete method of water supply into the soil continuum by injection in small discrete portions. An individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder is positioned at a depth of 10 to 30 cm, with a diameter of 1-2 cm. Within 5-10 min after injection, the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The resulting soil watering cylinder lies at a depth of 5-50 cm, with a diameter of 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding each cylinder remains relatively dry and mechanically stable. After water injection, the soil structure in the cylinder recovers quickly because there is no compression from the stable adjoining soil volume, and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential

  14. Dimensional Changes of Acrylic Resin Denture Bases: Conventional Versus Injection-Molding Technique

    PubMed Central

    Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam

    2014-01-01

    Objective: Acrylic resin denture bases undergo dimensional changes during polymerization. Injection-molding techniques are reported to reduce these changes and thereby improve the physical properties of denture bases. The aim of this study was to compare the dimensional changes of specimens processed by conventional and injection-molding techniques. Materials and Methods: SR-Ivocap Triplex Hot resin was used for the conventional pressure-packed technique and SR-Ivocap High Impact for the injection-molding technique. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using the t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. Results: After each water storage period, the acrylic specimens produced by injection exhibited smaller dimensional changes than those produced by the conventional technique. Curing shrinkage was compensated by water sorption, with increasing water storage time decreasing dimensional changes. Conclusion: Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used, and the SR-Ivocap injection procedure exhibited higher dimensional accuracy compared to conventional molding. PMID:25584050

  15. Novel technique: a pupillometer-based objective chromatic perimetry

    NASA Astrophysics Data System (ADS)

    Rotenstreich, Ygal; Skaat, Alon; Sher, Ifat; Kolker, Andru; Rosenfeld, Elkana; Melamed, Shlomo; Belkin, Michael

    2014-02-01

    Evaluation of the visual field (VF) is important for clinical diagnosis and patient monitoring. Current VF methods are subjective and require patient cooperation. Here we developed a novel objective perimetry technique based on the pupil response (PR) to multifocal chromatic stimuli in normal subjects and in patients with glaucoma and retinitis pigmentosa (RP). A computerized infrared video pupillometer was used to record the PR to short- and long-wavelength stimuli (peaks at 485 nm and 620 nm, respectively) at light intensities of 15-100 cd·s/m² at thirteen different points of the VF. The RP study included 30 eyes of 16 patients and 20 eyes of 12 healthy participants. The glaucoma study included 22 eyes of 11 patients and 38 eyes of 19 healthy participants. Significantly reduced PR was observed in RP patients in response to short-wavelength stimuli at 40 cd·s/m² in nearly all perimetric locations (P < 0.05). By contrast, RP patients demonstrated nearly normal PR to long-wavelength stimuli in the majority of perimetric locations. The glaucoma group showed significantly reduced PR to long- and short-wavelength stimuli at high intensity in all perimetric locations (P < 0.05). The PR of glaucoma patients was significantly lower than normal in response to short-wavelength stimuli at low intensity, mostly in central and 20° locations (P < 0.05). This study demonstrates the feasibility of using pupillometer-based chromatic perimetry for objectively assessing VF defects, retinal function and optic nerve damage in patients with retinal dystrophies and glaucoma. Furthermore, this method may be used to distinguish between the damaged cells underlying the VF defect.

  16. Age estimation based on Kvaal's technique using digital panoramic radiographs

    PubMed Central

    Mittal, Samta; Nagendrareddy, Suma Gundareddy; Sharma, Manisha Lakhanpal; Agnihotri, Poornapragna; Chaudhary, Sunil; Dhillon, Manu

    2016-01-01

    Introduction: Age estimation is important for administrative and ethical reasons and also because of legal consequences. The dental pulp undergoes regression in size with increasing age due to secondary dentin deposition and can be used as a parameter for age estimation even beyond 25 years of age. Kvaal et al. developed a method for chronological age estimation based on pulp size using periapical dental radiographs. There is a need to test this method of age estimation in the Indian population using simple tools like digital imaging on living individuals, without requiring extraction of teeth. Aims and Objectives: To estimate the chronological age of subjects by Kvaal's method using digital panoramic radiographs, and to test the validity of the regression equations given by Kvaal et al. Materials and Methods: The study sample included a total of 152 subjects in the age group of 14-60 years. Measurements were performed on standardized digital panoramic radiographs based on Kvaal's method. Different regression formulae were derived and the age was assessed. The assessed age was then compared to the actual age of the patient using Student's t-test. Results: No significant difference between the mean chronological age and the estimated age was observed. However, the mean ages estimated using the regression equations given previously by Kvaal et al. significantly underestimated the chronological age in the present study sample. Conclusion: The results support the feasibility of this technique when regression equations are derived on digital panoramic radiographs, but they negate the applicability of the same regression equations as given by Kvaal et al. to the study population.
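
    A Kvaal-style calibration can be sketched as an ordinary linear regression of age on a pulp/tooth ratio, checked with a t-test as in the study. The data below are synthetic, not the 152-subject sample, and the coefficients are illustrative.

```python
# Sketch of a Kvaal-style calibration: regress age on a pulp/tooth ratio,
# then compare estimated vs. actual ages with a paired t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
age = rng.uniform(14, 60, size=152)
# Pulp/tooth ratio shrinks with age (secondary dentin deposition).
ratio = 0.45 - 0.004 * age + rng.normal(scale=0.02, size=age.size)

slope, intercept, r, p, se = stats.linregress(ratio, age)
age_hat = intercept + slope * ratio
t_stat, p_paired = stats.ttest_rel(age, age_hat)
print(f"age = {intercept:.1f} + {slope:.1f} * ratio  (r={r:.2f})")
print(f"paired t-test vs. actual age: p={p_paired:.2f}")
```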

  17. A new membrane-based crystallization technique: tests on lysozyme

    NASA Astrophysics Data System (ADS)

    Curcio, Efrem; Profio, Gianluca Di; Drioli, Enrico

    2003-01-01

    The great importance of protein science both in industrial and scientific fields, in conjunction with the intrinsic difficulty of growing macromolecular crystals, stimulates the development of new observations and ideas that can be useful in initiating more systematic studies using novel approaches. In this regard, an innovative technique, based on the employment of microporous hydrophobic membranes in order to promote the formation of lysozyme crystals from supersaturated solutions, is introduced in this work. Operational principles and possible advantages, both in terms of controlled extraction of solvent by acting on the concentration of the stripping solution and reduced induction times, are outlined. Theoretical developments and experimental results concerning the mass transfer, in vapour phase, through the membrane are presented, as well as the results from X-ray diffraction to 1.7 Å resolution of the obtained lysozyme crystals using NaCl as the crystallizing agent and sodium acetate as the buffer. Crystals were found to be tetragonal with unit cell dimensions of a = b = 79.1 Å and c = 37.9 Å; the overall Rmerge on intensities in the resolution range from 25 to 1.7 Å was, in the best case, 4.4%.

  18. Research on technique of wavefront retrieval based on Foucault test

    NASA Astrophysics Data System (ADS)

    Yuan, Lvjun; Wu, Zhonghua

    2010-05-01

    During fine grinding of the best-fit sphere and the initial stage of polishing, the surface error of large-aperture aspheric mirrors is too large to test with a common interferometer. The Foucault test is widely used in fabricating large-aperture mirrors. However, the optical path is disturbed seriously by air turbulence, and changes of light and dark zones cannot be identified, which often impairs judgment and leads to mistakes in diagnosing the surface error of the whole mirror. To solve this problem, this research presents wavefront retrieval based on the Foucault test through digital image processing and quantitative calculation. Firstly, a real Foucault image is obtained by collecting a variety of images with a CCD and averaging them to eliminate air turbulence. Secondly, gray values are converted into surface error values through principle derivation, mathematical modeling, and software programming. Thirdly, the linear deviation introduced by defocus is removed by the least-squares method to obtain the real surface error. Finally, from the real surface error, the wavefront map, gray contour map and corresponding pseudo-color contour map are plotted. The experimental results indicate that the three-dimensional wavefront map and two-dimensional contour map accurately and intuitively show the surface error of the whole mirror under test, and they are helpful for grasping the surface error as a whole. The technique can be used to guide the fabrication of large-aperture, long-focal-length mirrors during grinding and the initial stage of polishing the aspheric surface, which greatly improves fabrication efficiency and precision.
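
    The least-squares step in the third stage can be illustrated directly: fit piston, tilt and a defocus (r²) term to the retrieved error map and subtract them. The synthetic map below stands in for Foucault-derived surface-error values.

```python
# Least-squares removal of piston, tilt, and defocus from a surface-error
# map over a circular pupil. The map itself is synthetic.
import numpy as np

n = 256
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
mask = x ** 2 + y ** 2 <= 1.0                    # circular pupil

surface = 0.3 * x + 0.1 * y + 0.8 * (x**2 + y**2) \
          + 0.05 * np.sin(8 * x) * np.cos(5 * y)  # error + tilt/defocus

# Basis: piston, tilt-x, tilt-y, defocus; solve on pupil pixels only.
A = np.column_stack([np.ones(mask.sum()), x[mask], y[mask],
                     (x**2 + y**2)[mask]])
coef, *_ = np.linalg.lstsq(A, surface[mask], rcond=None)
residual = surface[mask] - A @ coef              # remaining surface error
print("removed coefficients:", np.round(coef, 3))
print("residual RMS:", residual.std())
```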

  19. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Sklute, Elizabeth; Dyare, Melinda D

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
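
    The PLS calibration step can be sketched with scikit-learn. The spectra and concentrations below are synthetic stand-ins for the 18-sample dataset; the component count and cross-validation folds are illustrative choices.

```python
# PLS calibration sketch: regress element concentration on LIBS spectra,
# evaluated by cross-validation. Data are synthetic stand-ins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(8)
n_samples, n_channels = 18, 2000
conc = rng.uniform(0, 10, size=n_samples)        # e.g. wt% of one oxide
lines = np.exp(-0.5 * ((np.arange(n_channels) - 700) / 3.0) ** 2)
spectra = conc[:, None] * lines + rng.normal(scale=0.05,
                                             size=(n_samples, n_channels))

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, conc, cv=6).ravel()
rmse = np.sqrt(np.mean((pred - conc) ** 2))
print(f"cross-validated RMSE: {rmse:.2f} wt%")
```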

  20. Analysis of compressive fracture in rock using statistical techniques

    SciTech Connect

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.
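
    The percolation-theory analysis of cracked-site clusters can be illustrated with a lattice sketch: label connected clusters and test for a spanning cluster. The cracking probability below is illustrative (near the 2D site-percolation threshold of about 0.593).

```python
# Percolation-style cluster analysis: label 4-connected clusters of
# cracked lattice sites and check for a top-to-bottom spanning cluster.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(9)
cracked = rng.random((100, 100)) < 0.59     # site-cracking probability

labels, n = ndimage.label(cracked)          # default: 4-connectivity
top = set(labels[0][labels[0] > 0])
bottom = set(labels[-1][labels[-1] > 0])
spanning = top & bottom
print(n, "clusters;",
      "spanning fracture" if spanning else "no spanning cluster")
```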