Science.gov

Sample records for analysis technique based

  1. Key Point Based Data Analysis Technique

    NASA Astrophysics Data System (ADS)

    Yang, Su; Zhang, Yong

    In this paper, a new framework for data analysis based on the "key points" in the data distribution is proposed. Here, the key points comprise three types of data points: bridge points, border points, and skeleton points, of which the bridge points are our main contribution. For each type of key point, we have developed a corresponding detection algorithm and tested its effectiveness on several synthetic data sets. We further developed a new hierarchical clustering algorithm, SPHC (Skeleton Point based Hierarchical Clustering), to demonstrate possible applications of the acquired key points. On several real-world data sets, we experimentally show that SPHC performs better than several classical clustering algorithms, including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.
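
    The SPHC algorithm itself is the paper's contribution and is not reproduced here. As a point of reference, the sketch below (Python, assuming NumPy and scikit-learn, with a synthetic two-moon data set standing in for the real-world data) runs several of the classical baselines named in the abstract (single-link and complete-link hierarchical clustering, k-means and DBSCAN; Ncut is omitted) and scores them with the adjusted Rand index.

      import numpy as np
      from sklearn.datasets import make_moons
      from sklearn.cluster import AgglomerativeClustering, KMeans, DBSCAN
      from sklearn.metrics import adjusted_rand_score

      # synthetic placeholder data, not the data sets used in the paper
      X, y_true = make_moons(n_samples=300, noise=0.05, random_state=0)

      baselines = {
          "single-link": AgglomerativeClustering(n_clusters=2, linkage="single"),
          "complete-link": AgglomerativeClustering(n_clusters=2, linkage="complete"),
          "k-means": KMeans(n_clusters=2, n_init=10, random_state=0),
          "DBSCAN": DBSCAN(eps=0.2, min_samples=5),
      }

      for name, model in baselines.items():
          labels = model.fit_predict(X)
          print(f"{name:14s} ARI = {adjusted_rand_score(y_true, labels):.3f}")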

  2. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    NASA Astrophysics Data System (ADS)

    Singh Duksh, Yograj; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-05-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the finite-difference time-domain (FDTD) numerical technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used to estimate crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. The analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results: the crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE.
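
    As a rough illustration of the FDTD update loop and the Courant limit mentioned above, the following Python sketch advances the telegrapher's equations for a single lossy RLC line on a staggered grid; the per-unit-length values are assumed for illustration, and the coupled bundled-SWCNT model of the paper (with mutual inductance and capacitance terms) is not reproduced.

      import numpy as np

      # assumed per-unit-length parameters (ohm/m, H/m, F/m), not the paper's values
      R, L, C = 1.0e3, 1.0e-6, 1.0e-10
      length, nx = 1.0e-3, 200
      dx = length / nx
      dt = 0.9 * dx * np.sqrt(L * C)        # time step kept below the Courant limit
      nt = 2000

      V = np.zeros(nx + 1)                  # node voltages
      I = np.zeros(nx)                      # branch currents on the staggered grid

      for n in range(nt):
          V[0] = 1.0                        # unit-step source at the near end
          # current update (series resistance gives the loss term)
          I += -(dt / (L * dx)) * (V[1:] - V[:-1]) - (dt * R / L) * I
          # voltage update at interior nodes
          V[1:-1] += -(dt / (C * dx)) * (I[1:] - I[:-1])
          V[-1] = V[-2]                     # crude open-circuit far-end condition

      print("far-end voltage after %d steps: %.3f V" % (nt, V[-1]))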

  3. Improved mesh based photon sampling techniques for neutron activation analysis

    SciTech Connect

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-07-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
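
    For reference, the alias method mentioned above can be sketched in a few lines of Python (Walker/Vose construction in O(n), sampling in O(1) per draw); the voxel probabilities below are an assumed toy distribution, not output from the MCNP subroutine.

      import random

      def build_alias(probs):
          """Vose construction of the alias tables in O(n)."""
          n = len(probs)
          scaled = [p * n for p in probs]
          prob_table, alias = [0.0] * n, [0] * n
          small = [i for i, p in enumerate(scaled) if p < 1.0]
          large = [i for i, p in enumerate(scaled) if p >= 1.0]
          while small and large:
              s, l = small.pop(), large.pop()
              prob_table[s], alias[s] = scaled[s], l
              scaled[l] -= 1.0 - scaled[s]
              (small if scaled[l] < 1.0 else large).append(l)
          for i in small + large:          # leftovers are numerically ~1
              prob_table[i] = 1.0
          return prob_table, alias

      def sample(prob_table, alias):
          """O(1) draw: pick a column uniformly, then accept it or take its alias."""
          i = random.randrange(len(prob_table))
          return i if random.random() < prob_table[i] else alias[i]

      voxel_probs = [0.1, 0.4, 0.2, 0.3]   # assumed voxel source strengths
      table, alias_tbl = build_alias(voxel_probs)
      counts = [0] * len(voxel_probs)
      for _ in range(100000):
          counts[sample(table, alias_tbl)] += 1
      print([c / 100000 for c in counts]) # should approach voxel_probs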

  4. GC-Based Techniques for Breath Analysis: Current Status, Challenges, and Prospects.

    PubMed

    Xu, Mingjun; Tang, Zhentao; Duan, Yixiang; Liu, Yong

    2016-07-01

    Breath analysis is a noninvasive diagnostic method that profiles a person's physical state through the volatile organic compounds in the breath. It has huge potential in the field of disease diagnosis. In order to offer opportunities for practical applications, various GC-based techniques have been investigated for on-line breath analysis, since GC is the preferred technique for mixed-gas separation. This article reviews the development of breath analysis and GC-based techniques in basic breath research, covering sampling methods, preconcentration methods, conventional GC-based techniques, and newly developed GC techniques for breath analysis. The combination of GC and newly developed detection techniques takes advantage of the virtues of each. In addition, portable GC and micro GC are poised to become field GC-based techniques in breath analysis. Challenges faced by GC-based techniques for breath analysis are discussed candidly. Effective cooperation among experts from different fields is urgently needed to promote the development of breath analysis.

  5. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture-analysis-based method for diagnosing Basal Cell Carcinoma (BCC) skin cancer using optical images taken from suspicious skin regions. We first extracted Run Length Matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a Multi-Layer Perceptron (MLP) classifier to classify the images as BCC or normal cases. Experiments showed that detecting BCC cancer based on optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.
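
    A minimal sketch of this kind of pipeline is given below, assuming scikit-image (for grey-level co-occurrence, i.e. Haralick-style, features) and scikit-learn (for the MLP); the run-length-matrix features and the feature selection step are omitted, and the images and labels are random placeholders rather than the study's data.

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops
      from sklearn.neural_network import MLPClassifier

      def glcm_features(img):
          """img: 2-D uint8 grey-level patch; returns a small Haralick-style feature vector."""
          glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                              levels=256, symmetric=True, normed=True)
          props = ["contrast", "homogeneity", "energy", "correlation"]
          return np.hstack([graycoprops(glcm, p).ravel() for p in props])

      rng = np.random.default_rng(0)
      X = np.array([glcm_features(rng.integers(0, 256, (64, 64), dtype=np.uint8))
                    for _ in range(40)])
      y = rng.integers(0, 2, 40)            # dummy labels: 0 = normal, 1 = BCC

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      clf.fit(X, y)
      print("training accuracy on dummy data:", clf.score(X, y))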

  6. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    NASA Astrophysics Data System (ADS)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways of measuring drift relies on instrumentation, e.g. an ionosonde. Drift estimation with an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even Europe, is not dense enough to obtain a global or continental drift map. In order to overcome these difficulties, we propose a technique for ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. The result is a close-to-real electron density profile in which ray tracing can be performed. These profiles can be constructed periodically, with a period as short as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of interest. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.

  7. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    NASA Astrophysics Data System (ADS)

    Festa, G.; Pietropaolo, A.; Grazzi, F.; Sutton, L. F.; Scherillo, A.; Bognetti, L.; Bini, A.; Barzagli, E.; Schooneveld, E.; Andreani, C.

    2013-09-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation of materials. The aim of this study is to show the potential application of the approach using multiple and integrated neutron-based techniques for musical instruments. Such samples, in the broad scenario of cultural heritage, represent an exciting research field and may provide an interesting link between different disciplines such as nuclear physics, metallurgy and acoustics.

  8. Sex-based differences in lifting technique under increasing load conditions: A principal component analysis.

    PubMed

    Sheppard, P S; Stevenson, J M; Graham, R B

    2016-05-01

    The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopted a similar lifting technique. In addition, as load increased male and female participants changed their lifting technique in a similar manner. PMID:26851478
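
    The waveform-PCA step described above can be sketched as follows in Python (NumPy and scikit-learn assumed); the waveforms here are synthetic placeholders, and the 90% trace criterion is applied to the cumulative explained variance.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      n_trials, n_frames = 75, 101          # e.g. lift waveforms time-normalised to 101 frames
      waveforms = np.cumsum(rng.standard_normal((n_trials, n_frames)), axis=1)

      pca = PCA()
      scores = pca.fit_transform(waveforms)
      cum_var = np.cumsum(pca.explained_variance_ratio_)
      n_keep = int(np.searchsorted(cum_var, 0.90) + 1)   # 90% trace criterion
      print("retained PCs:", n_keep)

      # single-component reconstruction along PC1 (mean + score * loading)
      pc1_recon = pca.mean_ + np.outer(scores[:, 0], pca.components_[0])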

  9. Robust and discriminating method for face recognition based on correlation technique and independent component analysis model.

    PubMed

    Alfalou, A; Brosseau, C

    2011-03-01

    We demonstrate a novel technique for face recognition. Our approach relies on the performances of a strongly discriminating optical correlation method along with the robustness of the independent component analysis (ICA) model. Simulations were performed to illustrate how this algorithm can identify a face with images from the Pointing Head Pose Image Database. While maintaining algorithmic simplicity, this approach based on ICA representation significantly increases the true recognition rate compared to that obtained using our previously developed all-numerical ICA identity recognition method and another method based on optical correlation and a standard composite filter. PMID:21368935

  10. Application of Petri net based analysis techniques to signal transduction pathways

    PubMed Central

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-01-01

    Background Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODEs based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the systems behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. Methods We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to build systematically a discrete model, which reflects provably the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as case study. Results We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically

  11. Validity of content-based techniques to distinguish true and fabricated statements: A meta-analysis.

    PubMed

    Oberlader, Verena A; Naefgen, Christoph; Koppehele-Gossel, Judith; Quinten, Laura; Banse, Rainer; Schmidt, Alexander F

    2016-08-01

    Within the scope of judicial decisions, approaches to distinguish between true and fabricated statements have been of particular importance since ancient times. Although methods focusing on "prototypical" deceptive behavior (e.g., psychophysiological phenomena, nonverbal cues) have largely been rejected with regard to validity, content-based techniques constitute a promising approach and are well established within the applied forensic context. The basic idea of this approach is that experience-based and nonexperience-based statements differ in their content-related quality. In order to test the validity of the most prominent content-based techniques, criteria-based content analysis (CBCA) and reality monitoring (RM), we conducted a comprehensive meta-analysis on English- and German-language studies. Based on a variety of decision criteria, 56 studies were included revealing an overall effect size of g = 1.03 (95% confidence interval [0.78, 1.27], Q = 420.06, p < .001, I2 = 92.48%, N = 3,429). There was no significant difference in the effectiveness of CBCA and RM. Additionally, we investigated a number of moderator variables, such as characteristics of participants, statements, and judgment procedures, as well as general study characteristics. Results showed that the application of all CBCA criteria outperformed any incomplete CBCA criteria set. Furthermore, statement classification based on discriminant functions revealed higher discrimination rates than decisions based on sum scores. Finally, unpublished studies showed higher effect sizes than studies published in peer-reviewed journals. All results are discussed in terms of their significance for future research (e.g., developing standardized decision rules) and practical application (e.g., user training, applying complete criteria set). (PsycINFO Database Record PMID:27149290
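
    The pooled statistics reported above (overall g with its confidence interval, Q and I2) follow from standard random-effects formulas; the sketch below illustrates them with a DerSimonian-Laird tau-squared estimate on made-up per-study effect sizes, not the 56 studies of the meta-analysis.

      import numpy as np

      g = np.array([0.8, 1.2, 0.6, 1.5, 0.9])       # per-study Hedges' g (assumed)
      v = np.array([0.05, 0.10, 0.04, 0.20, 0.08])  # per-study sampling variances (assumed)

      w = 1.0 / v                                   # fixed-effect weights
      g_fe = np.sum(w * g) / np.sum(w)
      Q = np.sum(w * (g - g_fe) ** 2)               # heterogeneity statistic
      df = len(g) - 1
      I2 = max(0.0, (Q - df) / Q) * 100.0
      tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

      w_re = 1.0 / (v + tau2)                       # random-effects weights
      g_re = np.sum(w_re * g) / np.sum(w_re)
      se_re = np.sqrt(1.0 / np.sum(w_re))
      print(f"g = {g_re:.2f} [{g_re - 1.96 * se_re:.2f}, {g_re + 1.96 * se_re:.2f}], "
            f"Q = {Q:.2f}, I2 = {I2:.1f}%")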

  12. Advanced SuperDARN meteor wind observations based on raw time series analysis technique

    NASA Astrophysics Data System (ADS)

    Tsutsumi, M.; Yukimatu, A. S.; Holdsworth, D. A.; Lester, M.

    2009-04-01

    The meteor observation technique based on SuperDARN raw time series analysis has been upgraded. This technique extracts meteor information as a by-product and does not degrade the quality of normal SuperDARN operations. In the upgrade the radar operating system (RADOPS) has been modified so that it can oversample every 15 km during normal operations, which have a range resolution of 45 km. As an alternative method for better range determination, a frequency domain interferometry (FDI) capability was also coded into RADOPS, where the operating radio frequency can be changed every pulse sequence. Test observations were conducted using the CUTLASS Iceland East and Finland radars, where oversampling and FDI operation (two frequencies separated by 3 kHz) were simultaneously carried out. Meteor ranges obtained with both ranging techniques agreed very well. The ranges were then combined with the interferometer data to estimate meteor echo reflection heights. Although there were still some ambiguities in the arrival angles of echoes because of the rather long antenna spacing of the interferometers, the heights and arrival angles of most meteor echoes were more accurately determined than previously. Wind velocities were successfully estimated over the height range of 84 to 110 km. The FDI technique developed here can be further applied to the common SuperDARN operation, and studies of fine horizontal structures of F-region plasma irregularities are expected in the future.

  13. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    NASA Astrophysics Data System (ADS)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served for centuries as an important primary approach to diagnosing cardiovascular diseases (CVDs). Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet, most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological changes in heart valves. Applying the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from 5 different types of abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
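
    The two signal-processing steps named above (DWT decomposition and Shannon envelope) can be sketched as follows, assuming NumPy and PyWavelets; the input is a synthetic heart-sound-like signal and the paper's actual feature definitions are not reproduced.

      import numpy as np
      import pywt

      fs = 2000.0
      t = np.arange(0, 1.0, 1.0 / fs)
      hs = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.3) / 0.05) ** 2)   # toy heart-sound burst
      hs += 0.3 * np.random.default_rng(0).standard_normal(t.size)         # additive noise

      # discrete wavelet decomposition; keep a single mid-frequency detail band
      coeffs = pywt.wavedec(hs, "db6", level=4)
      kept = [np.zeros_like(c) for c in coeffs]
      kept[2] = coeffs[2]                                                  # keep only the cD3 band
      band = pywt.waverec(kept, "db6")[: hs.size]

      # Shannon envelope: -x^2 * log(x^2) on the amplitude-normalised signal
      x = band / (np.max(np.abs(band)) + 1e-12)
      envelope = -x ** 2 * np.log(x ** 2 + 1e-12)
      print("envelope peak at t = %.3f s" % t[np.argmax(envelope)])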

  14. Applications of synchrotron-based micro-imaging techniques for the analysis of Cultural Heritage materials

    SciTech Connect

    Cotte, Marine; Chilida, Javier; Walter, Philippe; Taniguchi, Yoko; Susini, Jean

    2009-01-29

    The analysis of Cultural Heritage objects is often technically challenging. When analyzing micro-fragments, the amount of matter is usually very small, hence requiring sensitive techniques. These samples, in particular painting fragments, may present multi-layered structures with layer thicknesses of ≈10 μm. This favors micro-imaging techniques with good lateral resolution (about one micrometer) that allow each layer to be studied separately. In addition, the samples are usually chemically very complex, as they are made of mineral and organic matter, amorphous and crystallized phases, and major and minor elements. Accordingly, a multi-modal approach is generally essential to resolve the chemical complexity of such hybrid materials. Different examples are given to illustrate the various possibilities of synchrotron-based micro-imaging techniques, such as micro X-ray diffraction, micro X-ray fluorescence, micro X-ray absorption spectroscopy and micro FTIR spectroscopy. The focus is on paintings, but the whole range of museum objects (from soft matter such as paper or wood to hard matter such as metal and glass) is also considered.

  15. Analysis of food proteins and peptides by mass spectrometry-based techniques.

    PubMed

    Mamone, Gianfranco; Picariello, Gianluca; Caira, Simonetta; Addeo, Francesco; Ferranti, Pasquale

    2009-10-23

    Mass spectrometry has arguably become the core technology for the characterization of food proteins and peptides. The application of mass spectrometry-based techniques for the qualitative and quantitative analysis of the complex protein mixtures contained in most food preparations is playing a decisive role in the understanding of their nature, structure, functional properties and impact on human health. The application of mass spectrometry to protein analysis has been revolutionized in recent years by the development of soft ionization techniques such as electrospray ionization and matrix-assisted laser desorption/ionization, and by the introduction of multi-stage and 'hybrid' analyzers able to generate de novo amino acid sequence information. The interfacing of mass spectrometry with protein databases has resulted in entirely new possibilities for protein characterization, including high-sensitivity mapping (femtomole to attomole levels) of post-translational and other chemical modifications, protein conformations, and protein-protein and protein-ligand interactions, and for proteomic studies in general, building up the core platform of modern proteomic science. MS-based strategies for food and nutrition proteomics can now address a wide range of analytical questions, which include issues related to food quality and safety, certification and traceability of (typical) products, and the definition of the structure/function relationship of food proteins and peptides. These different aspects are necessarily interconnected and can be effectively understood and elucidated only through the use of integrated, up-to-date analytical approaches. In this review, the main aspects of current and prospective applications of mass spectrometry and proteomic technologies to the structural characterization of food proteins are presented, with a focus on issues related to their detection, identification, and quantification, relevant for their biochemical, technological and

  16. Subdivision based isogeometric analysis technique for electric field integral equations for simply connected structures

    NASA Astrophysics Data System (ADS)

    Li, Jie; Dault, Daniel; Liu, Beibei; Tong, Yiying; Shanker, Balasubramaniam

    2016-08-01

    The analysis of electromagnetic scattering has long been performed on a discrete representation of the geometry. This representation is typically continuous but not differentiable. The need to define physical quantities on this geometric representation has led to the development of sets of basis functions that must satisfy constraints at the boundaries of the elements/tessellations (viz., continuity of normal or tangential components across element boundaries). For electromagnetics, these result in curl- or div-conforming basis sets. The geometric representation used for analysis is in stark contrast with that used for design, wherein the surface representation is higher-order differentiable. Using this representation for both the geometry and the physics on the geometry has several advantages, as elucidated in Hughes et al. (2005) [7]. Until now, the bulk of the literature on isogeometric methods has been limited to solid mechanics, with some effort to create NURBS-based basis functions for electromagnetic analysis. In this paper, we present the first complete isogeometric solution methodology for the electric field integral equation as applied to simply connected structures. The paper proceeds systematically from surface representation using subdivision, through the definition of vector basis functions on this surface, to fidelity in the solution of integral equations. We also present techniques to stabilize the solution at low frequencies and to impose a Calderón preconditioner. Several results presented serve to validate the proposed approach as well as demonstrate some of its capabilities.

  17. An Overview of Micromechanics-Based Techniques for the Analysis of Microstructural Randomness in Functionally Graded Materials

    SciTech Connect

    Ferrante, Fernando J.; Brady, Lori L. Graham; Acton, Katherine; Arwade, Sanjay R.

    2008-02-15

    A review of current research efforts to develop micromechanics-based techniques for the study of microstructural randomness of functionally graded materials is presented, along with a framework developed by the authors of this paper that includes stochastic simulation of statistically inhomogeneous samples and a windowing technique coupled with a micromechanical homogenization technique. The methodology is illustrated through the analysis of one sample coupled with finite element modeling.

  18. Operational modal analysis via image based technique of very flexible space structures

    NASA Astrophysics Data System (ADS)

    Sabatini, Marco; Gasbarri, Paolo; Palmerini, Giovanni B.; Monti, Riccardo

    2013-08-01

    Vibrations are one of the most important topics in the engineering design of flexible structures. The importance of this problem increases when a very flexible system is considered, and this is often the case for space structures. In order to identify the modal characteristics, in terms of natural frequencies and relevant modal parameters, ground tests are performed. However, these parameters can vary with the operating conditions of the system. In order to continuously monitor the modal characteristics during the satellite lifetime, an operational modal analysis is mandatory. This kind of analysis is usually performed by using classical accelerometers or strain gauges and by properly analyzing the acquired output. In this paper a different approach to vibration data acquisition is pursued via an image-based technique. In order to simulate a flexible satellite, a free-flying platform is used; the problem is further complicated by the fact that the overall system, consisting of a highly rigid bus and very flexible panels, must necessarily be modeled as a multibody system. In the experimental campaign, the camera, placed on the bus, is used to identify the eigenfrequencies of the vibrating structure; in this case thin aluminum plates simulate very flexible solar panels. The structure is excited by a hammer or studied during a fast attitude maneuver. The experimental results are investigated and compared with numerical simulations obtained via FEM-multibody software, and the relevant results are presented and discussed.

  19. Instanton-based techniques for analysis and reduction of error floor of LDPC codes

    SciTech Connect

    Chertkov, Michael; Chilappagari, Shashi K; Stepanov, Mikhail G; Vasic, Bane

    2008-01-01

    We describe a family of instanton-based optimization methods developed recently for the analysis of the error floors of low-density parity-check (LDPC) codes. Instantons are the most probable configurations of the channel noise which result in decoding failures. We show that the general idea and the respective optimization technique are applicable broadly to a variety of channels, discrete or continuous, and to a variety of sub-optimal decoders. Specifically, we consider iterative belief propagation (BP) decoders, Gallager-type decoders, and linear programming (LP) decoders operating over the additive white Gaussian noise channel (AWGNC) and the binary symmetric channel (BSC). The instanton analysis suggests that the underlying topological structures of the most probable instantons of the same code under different channels and decoders are related to each other. Armed with this understanding of the graphical structure of the instanton and its relation to decoding failures, we suggest a method to construct codes whose Tanner graphs are free of these structures and thus have less significant error floors.

  1. Polyspectral signal analysis techniques for condition based maintenance of helicopter drive-train system

    NASA Astrophysics Data System (ADS)

    Hassan Mohammed, Mohammed Ahmed

    For efficient maintenance of a diverse fleet of aircraft and rotorcraft, effective condition-based maintenance (CBM) must be established based on vibration signals monitored from rotating components. In this dissertation, we present theory and applications of polyspectral signal processing techniques for condition monitoring of critical components in the AH-64D helicopter tail rotor drive train system. Currently available vibration-monitoring tools are mostly built around auto- and cross-power spectral analysis, which has limited performance in detecting frequency correlations higher than second order. Studying higher order correlations and their Fourier transforms, higher order spectra, provides more information about the vibration signals, which helps in building more accurate diagnostic models of the mechanical system. Based on higher order spectral analysis, different signal processing techniques are developed to assess the health condition of critical rotating components in the AH-64D helicopter drive-train. Based on the cross-bispectrum, a quadratic nonlinear transfer function is presented to model second-order nonlinearity in a drive shaft running between the two hanger bearings. Then, the quadratic-nonlinearity coupling coefficient between frequency harmonics of the rotating shaft is used as a condition metric to study different seeded shaft faults compared to the baseline case, namely shaft misalignment, shaft imbalance, and a combination of shaft misalignment and imbalance. The proposed quadratic-nonlinearity metric shows better capabilities in distinguishing the four studied shaft settings than the conventional linear coupling based on the cross-power spectrum. We also develop a new concept of Quadratic-Nonlinearity Power-Index spectrum, QNLPI(f), that can be used in signal detection and classification, based on the bicoherence spectrum. The proposed QNLPI(f) is derived as a projection of the three-dimensional bicoherence spectrum onto a two-dimensional spectrum that
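
    As background for the cross-bispectrum and bicoherence quantities mentioned above, the Python sketch below estimates a segment-averaged (auto-)bispectrum and squared bicoherence for a synthetic signal containing a quadratically coupled tone pair; it is a generic illustration, not the dissertation's processing chain, and all parameter values are assumed.

      import numpy as np

      fs, nfft, nseg = 1024, 256, 64
      rng = np.random.default_rng(0)
      t = np.arange(nseg * nfft) / fs
      f1, f2 = 64.0, 96.0
      x = (np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
           + 0.5 * np.cos(2 * np.pi * (f1 + f2) * t)       # quadratically coupled component
           + 0.5 * rng.standard_normal(t.size))

      k = np.arange(nfft // 2)
      K1, K2 = np.meshgrid(k, k, indexing="ij")
      B = np.zeros((nfft // 2, nfft // 2), dtype=complex)  # bispectrum accumulator
      P12 = np.zeros((nfft // 2, nfft // 2))
      P3 = np.zeros((nfft // 2, nfft // 2))
      for s in range(nseg):
          X = np.fft.fft(x[s * nfft:(s + 1) * nfft] * np.hanning(nfft))
          X3 = X[K1 + K2]
          B += X[K1] * X[K2] * np.conj(X3)
          P12 += np.abs(X[K1] * X[K2]) ** 2
          P3 += np.abs(X3) ** 2

      bicoh = np.abs(B) ** 2 / (P12 * P3 + 1e-12)          # squared bicoherence
      i, j = np.unravel_index(np.argmax(bicoh), bicoh.shape)
      print("strongest coupling near (%.0f Hz, %.0f Hz)" % (i * fs / nfft, j * fs / nfft))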

  2. An efficient technique for nuclei segmentation based on ellipse descriptor analysis and improved seed detection algorithm.

    PubMed

    Xu, Hongming; Lu, Cheng; Mandal, Mrinal

    2014-09-01

    In this paper, we propose an efficient method for segmenting cell nuclei in skin histopathological images. The proposed technique consists of four modules. First, it separates the nuclei regions from the background with an adaptive threshold technique. Next, an elliptical descriptor is used to detect the isolated nuclei with elliptical shapes. This descriptor classifies the nuclei regions based on two ellipticity parameters. Nuclei clumps and nuclei with irregular shapes are then localized by an improved seed detection technique based on voting in the eroded nuclei regions. Finally, undivided nuclei regions are segmented by a marker-controlled watershed algorithm. Experimental results on 114 different image patches indicate that the proposed technique provides superior performance in nuclei detection and segmentation.
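
    A minimal sketch of the pipeline stages named above (adaptive thresholding, seed detection and marker-controlled watershed) is given below, using SciPy and scikit-image on a stock sample image instead of histopathology; the ellipse-descriptor classification and the voting-based seed refinement are specific to the paper and are omitted.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import data, filters, feature, measure, segmentation

      img = data.coins()                    # stand-in image, not a histopathology patch

      # adaptive (local) threshold separates foreground regions from background
      mask = img > filters.threshold_local(img, block_size=151)

      # seed detection: local maxima of the distance transform act as markers
      distance = ndi.distance_transform_edt(mask)
      coords = feature.peak_local_max(distance, min_distance=15, labels=mask)
      markers = np.zeros_like(distance, dtype=int)
      markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)

      # marker-controlled watershed splits touching regions
      labels = segmentation.watershed(-distance, markers, mask=mask)
      print("segmented regions:", len(measure.regionprops(labels)))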

  3. Performance Analysis of SAC Optical PPM-CDMA System-Based Interference Rejection Technique

    NASA Astrophysics Data System (ADS)

    Alsowaidi, N.; Eltaif, Tawfig; Mokhtar, M. R.

    2016-03-01

    In this paper, we theoretically analyse an optical code division multiple access (OCDMA) system based on successive interference cancellation (SIC) using pulse position modulation (PPM), considering the interference between users, imperfect cancellation during the cancellation process, and receiver noise. A spectral amplitude coding (SAC) scheme is used to suppress the overlap between users and reduce the effect of receiver noise. The theoretical analysis of the multiple access interference (MAI)-limited performance of this approach indicates the influence of the size of M-ary PPM on the OCDMA system. The OCDMA system performance improves with increasing M-ary PPM. It was found that the SIC/SAC-OCDMA system using the PPM technique, with modified prime (MPR) codes as signature sequences, offers a significant improvement over the one without cancellation: it can support up to 103 users at the benchmark bit error rate (BER) of 10^-9 with prime number p = 11, while the system without the cancellation scheme can support only up to 52 users.

  4. A neighbourhood analysis based technique for real-time error concealment in H.264 intra pictures

    NASA Astrophysics Data System (ADS)

    Beesley, Steven T. C.; Grecos, Christos; Edirisinghe, Eran

    2007-02-01

    H.264's extensive use of context-based adaptive binary arithmetic or variable-length coding makes streams highly susceptible to channel errors, a common occurrence over networks such as those used by mobile devices. Even a single bit error will cause a decoder to discard all stream data up to the next fixed-length resynchronisation point; in the worst case an entire slice is lost. In cases where retransmission and forward error concealment are not possible, a decoder should conceal any erroneous data in order to minimise the impact on the viewer. Stream errors can often be spotted early in the decode cycle of a macroblock, which, if aborted, can free unused processor cycles; these can instead be used to conceal errors at minimal cost, even as part of a real-time system. This paper demonstrates a technique that utilises Sobel convolution kernels to quickly analyse the neighbourhood surrounding erroneous macroblocks before performing a weighted multi-directional interpolation. This generates significantly improved statistical (PSNR) and visual (IEEE structural similarity) results when compared to the commonly used weighted pixel value averaging. Furthermore, it is also computationally scalable, both during analysis and concealment, achieving maximum performance from the spare processing power available.

  5. Spatio-temporal analysis of discharge regimes based on hydrograph classification techniques in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Chen, Xiaofei; Bloeschl, Guenter; Blaschke, Alfred Paul; Silasari, Rasmiaditya; Exner-Kittridge, Mike

    2016-04-01

    Stream discharge and groundwater hydrographs integrate the spatial and temporal variations of the small-scale hydrological response. Characterizing the discharge response regime of drained farmland is essential for irrigation strategies and hydrologic modeling. Especially for agricultural basins, diurnal hydrographs from drainage discharges have been investigated to infer drainage processes of varying magnitudes. To explore the variability of discharge responses, we developed an objective method to characterize and classify discharge hydrographs based on magnitude and time-series features. Cluster analysis (hierarchical k-means) and principal component analysis are applied to discharge time series and groundwater level hydrographs to analyze their event characteristics, tested on 8 different discharge and 18 groundwater level hydrographs. Owing to the variability of rainfall activity, system location, discharge regime and pre-event soil moisture conditions in the catchment, three main clusters of discharge hydrographs are identified. The results show that: (1) the hydrographs from these drainage discharges had similar shapes but different magnitudes for an individual rainstorm; the similarity also holds for the overland flow discharge and the spring system; (2) within each cluster the similarity of shape persisted, but the rising slopes differ owing to different antecedent wetness conditions and rainfall accumulation, while the differences in regression slope can be explained by system location and discharge area; and (3) surface water always has a closely proportional relation with soil moisture throughout the year, whereas only after soil moisture exceeds a certain threshold does the outflow of tile drainage systems have a directly proportional relationship with soil moisture and an inverse relationship with the groundwater levels. Finally, we discuss the potential application of hydrograph classification in a wider range of
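
    As an illustration of the classification idea described above, the sketch below reduces a set of synthetic event hydrographs with PCA and groups them with k-means (NumPy and scikit-learn assumed); it does not reproduce the catchment data or the exact hierarchical k-means / PCA workflow of the study.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      t = np.linspace(0, 48, 97)                         # 48 h event at 30-min steps

      def event(peak, t_peak, recession):
          """Simple gamma-like synthetic event hydrograph (placeholder)."""
          h = (t / t_peak) ** 2 * np.exp(-(t - t_peak) / recession)
          return peak * h / h.max()

      hydrographs = np.array([event(rng.uniform(0.5, 5.0),
                                    rng.uniform(4, 12),
                                    rng.uniform(3, 15)) for _ in range(26)])

      X = StandardScaler().fit_transform(hydrographs)
      scores = PCA(n_components=3).fit_transform(X)      # event-shape features
      clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
      print("events per cluster:", np.bincount(clusters))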

  6. A Study on Brain Mapping Technique Based on Hierarchical Decomposition Analysis

    NASA Astrophysics Data System (ADS)

    Oura, Kunihiko

    In this paper, a brain functional mapping method based on hierarchical decomposition analysis (HDA) is proposed. HDA is a multi-dimensional AR modeling method, well known for its validity in detecting temporal lobe seizures. The author transforms the estimated AR model into the form of a transfer function from the inner blood-flow signal to the cerebral cortex. The signal used for HDA is the oxygenated hemoglobin density (HbO), which is measured by near-infrared spectroscopy (NIRS). Comparing two tasks that involve arithmetic, the difference in brain activity becomes clear with the proposed technique.

  7. Wilcoxon signed-rank-based technique for the pulse-shape analysis of HPGe detectors

    NASA Astrophysics Data System (ADS)

    Martín, S.; Quintana, B.; Barrientos, D.

    2016-07-01

    The characterization of the electric response of segmented-contact high-purity germanium detectors requires scanning systems capable of accurately associating each pulse with the position of the interaction that generated it. This process requires an algorithm sensitive to changes above the electronic noise in the pulse shapes produced at different positions, depending on the resolution of the Ge crystal. In this work, a pulse-shape comparison technique based on the Wilcoxon signed-rank test has been developed. It provides a method to distinguish pulses coming from different interaction points in the germanium crystal. Therefore, this technique is a necessary step for building a reliable pulse-shape database that can be used later for the determination of the position of interaction for γ-ray tracking spectrometry devices such as AGATA, GRETA or GERDA. The method was validated by comparison with a χ2 test using simulated and experimental pulses corresponding to a Broad Energy germanium detector (BEGe).
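
    A minimal illustration of the comparison step, assuming SciPy: the Wilcoxon signed-rank test is applied to the sample-wise differences between two normalised pulses (synthetic here), alongside a simple chi-square-like figure of merit of the kind the method is benchmarked against.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      t = np.linspace(0, 1, 200)
      pulse_a = 1 - np.exp(-t / 0.15) + 0.01 * rng.standard_normal(t.size)
      pulse_b = 1 - np.exp(-t / 0.17) + 0.01 * rng.standard_normal(t.size)  # slightly different rise

      stat, p_value = stats.wilcoxon(pulse_a - pulse_b)
      chi2_like = np.sum((pulse_a - pulse_b) ** 2 / 0.01 ** 2) / t.size
      print(f"Wilcoxon p = {p_value:.3g}, reduced chi2-like metric = {chi2_like:.2f}")
      # a small p-value suggests the two pulses come from different interaction positions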

  8. Complexity analysis of sleep and alterations with insomnia based on non-invasive techniques.

    PubMed

    Holloway, Philip M; Angelova, Maia; Lombardo, Sara; St Clair Gibson, Alan; Lee, David; Ellis, Jason

    2014-04-01

    For the first time, fractal analysis techniques are implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia with comparisons made against healthy subjects. Analysis was carried out for 21 healthy individuals with no diagnosed sleep disorders and 26 subjects diagnosed with acute insomnia during night-time hours. Detrended fluctuation analysis was applied in order to look for 1/f-fluctuations indicative of high complexity. The aim is to investigate whether complexity analysis can differentiate between people who sleep normally and people who suffer from acute insomnia. We hypothesize that the complexity will be higher in subjects who suffer from acute insomnia owing to increased night-time arousals. This hypothesis, although contrary to much of the literature surrounding complexity in physiology, was found to be correct for our study. The complexity results for nearly all of the subjects fell within a 1/f-range, indicating the presence of underlying control mechanisms. The subjects with acute insomnia displayed significantly higher correlations, confirmed by significance testing, possibly a result of too much activity in the underlying regulatory systems. Moreover, we found a linear relationship between complexity and variability, both of which increased with the onset of insomnia. Complexity analysis is very promising and could prove to be a useful non-invasive identifier for people who suffer from sleep disorders such as insomnia.
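
    Detrended fluctuation analysis as referred to above can be sketched in a few lines of Python (NumPy assumed): integrate the series, detrend it in windows of size n, and fit the slope alpha of log F(n) versus log n; white noise gives alpha near 0.5 and 1/f-like signals give alpha near 1. The actigraphy data themselves are of course not reproduced here.

      import numpy as np

      def dfa_exponent(x, scales):
          """Return the DFA scaling exponent alpha for signal x."""
          y = np.cumsum(x - np.mean(x))                  # integrated profile
          F = []
          for n in scales:
              n_seg = len(y) // n
              segs = y[: n_seg * n].reshape(n_seg, n)
              t = np.arange(n)
              resid = []
              for seg in segs:
                  coef = np.polyfit(t, seg, 1)           # local linear trend per window
                  resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
              F.append(np.sqrt(np.mean(resid)))
          return np.polyfit(np.log(scales), np.log(F), 1)[0]

      rng = np.random.default_rng(0)
      white = rng.standard_normal(10000)
      print("alpha for white noise (expect ~0.5):",
            round(dfa_exponent(white, [16, 32, 64, 128, 256]), 2))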

  9. Fluorous affinity-based separation techniques for the analysis of biogenic and related molecules.

    PubMed

    Hayama, Tadashi; Yoshida, Hideyuki; Yamaguchi, Masatoshi; Nohta, Hitoshi

    2014-12-01

    Perfluoroalkyl-containing compounds have a unique 'fluorous' property that refers to the remarkably specific affinity they share. Fluorous compounds can be easily isolated from non-fluorous species on the perfluoroalkyl-functionalized stationary phases used in fluorous solid-phase extraction and fluorous liquid chromatography by means of fluorous-fluorous interactions (fluorophilicity). Recently, this unique specificity has been applied to the highly selective enrichment and analysis of different classes of biogenic and related compounds in complex samples. Because the biogenic compounds are generally not 'fluorous', they must be derivatized with appropriate perfluoroalkyl group-containing reagent in order to utilize fluorous interaction. In this review, we introduce the application of fluorous affinity techniques including derivatization methods to biogenic sample analysis. PMID:24865313

  10. Mass Spectrometry Based Imaging Techniques for Spatially Resolved Analysis of Molecules

    PubMed Central

    Matros, Andrea; Mock, Hans-Peter

    2013-01-01

    Higher plants are composed of a multitude of tissues with specific functions, reflected by distinct profiles for transcripts, proteins, and metabolites. Comprehensive analysis of metabolites and proteins has advanced tremendously within recent years, and this progress has been driven by the rapid development of sophisticated mass spectrometric techniques. In most of the current “omics”-studies, analysis is performed on whole organ or whole plant extracts, leading to the loss of spatial information. Mass spectrometry imaging (MSI) techniques have opened a new avenue to obtain information on the spatial distribution of metabolites and of proteins. Pioneered in the field of medicine, the approaches are now applied to study the spatial profiles of molecules in plant systems. A range of different plant organs and tissues have been successfully analyzed by MSI, and patterns of various classes of metabolites from primary and secondary metabolism could be obtained. It can be envisaged that MSI approaches will substantially contribute to building spatially resolved biochemical networks. PMID:23626593

  11. Advanced NMR-based techniques for pore structure analysis of coal

    SciTech Connect

    Smith, D.M.

    1992-01-01

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores but they also exhibit meso and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited because of this broad pore size range, the microporosity, the reactive nature of coal, the requirement that samples be completely dried, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. We believe that measurement of the NMR parameters of various gas phase and adsorbed phase NMR active probes can provide a resolution to this problem. We will investigate the dependence of the common NMR parameters such as chemical shifts and relaxation times of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules and the pore surfaces in coals. These molecules have been selected for their chemical and physical properties. A special NMR probe will be constructed which will allow the concurrent measurement of NMR properties and adsorption uptake at a variety of temperatures. All samples will be subjected to a suite of 'conventional' pore structure analyses. These include nitrogen adsorption at 77 K with BET analysis, CO2 and CH4 adsorption at 273 K with D-R (Dubinin-Radushkevich) analysis, helium pycnometry, and small angle X-ray scattering, as well as gas diffusion measurements.

  12. Analysis to feature-based video stabilization/registration techniques within application of traffic data collection

    NASA Astrophysics Data System (ADS)

    Sadat, Mojtaba T.; Viti, Francesco

    2015-02-01

    Machine vision is rapidly gaining popularity in the field of Intelligent Transportation Systems. In particular, advantages are foreseen from the exploitation of Aerial Vehicles (AV) in delivering a superior view of traffic phenomena. However, vibration on AVs makes it difficult to extract moving objects on the ground. To partly overcome this issue, image stabilization/registration procedures are adopted to correct and stitch multiple frames taken of the same scene but from different positions, angles, or sensors. In this study, we examine the impact of multiple feature-based techniques for stabilization, and we show that the SURF detector outperforms the others in terms of time efficiency and output similarity.
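
    For orientation, feature-based frame registration of the kind compared in the study can be sketched with OpenCV as below; ORB is used instead of SURF (SURF sits in OpenCV's non-free contrib module), and the function simply warps the current frame back onto the previous one with a RANSAC-estimated homography.

      import cv2
      import numpy as np

      def register_to_previous(prev_gray, curr_gray):
          """Warp curr_gray onto prev_gray using ORB matches and a RANSAC homography."""
          orb = cv2.ORB_create(nfeatures=1000)
          kp1, des1 = orb.detectAndCompute(prev_gray, None)
          kp2, des2 = orb.detectAndCompute(curr_gray, None)
          matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
          matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
          src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
          dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
          H, _ = cv2.findHomography(dst, src, cv2.RANSAC, 3.0)   # maps current -> previous
          h, w = prev_gray.shape[:2]
          return cv2.warpPerspective(curr_gray, H, (w, h))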

  13. Analysis of meteorological variables in the Australasian region using ground- and space-based GPS techniques

    NASA Astrophysics Data System (ADS)

    Kuleshov, Yuriy; Choy, Suelynn; Fu, Erjiang Frank; Chane-Ming, Fabrice; Liou, Yuei-An; Pavelyev, Alexander G.

    2016-07-01

    Results of analysis of meteorological variables (temperature and moisture) in the Australasian region using the global positioning system (GPS) radio occultation (RO) and GPS ground-based observations verified with in situ radiosonde (RS) data are presented. The potential of using ground-based GPS observations for retrieving column integrated precipitable water vapour (PWV) over the Australian continent has been demonstrated using the Australian ground-based GPS reference stations network. Using data from the 15 ground-based GPS stations, the state of the atmosphere over Victoria during a significant weather event, the March 2010 Melbourne storm, has been investigated, and it has been shown that the GPS observations have potential for monitoring the movement of a weather front with a sharp moisture contrast. Temperature and moisture variability in the atmosphere over various climatic regions (the Indian and the Pacific Oceans, the Antarctic and Australia) has been examined using satellite-based GPS RO and in situ RS observations. Investigating recent atmospheric temperature trends over Antarctica, the time series of the collocated GPS RO and RS data were examined, and strong cooling in the lower stratosphere and warming through the troposphere over Antarctica have been identified, in agreement with outputs of climate models. With further expansion of Global Navigation Satellite Systems (GNSS), it is expected that GNSS satellite- and ground-based measurements will be able to provide an order of magnitude more data, which in turn could significantly advance weather forecasting services, climate monitoring and analysis in the Australasian region.

  14. Novel Recognition Method of Blast Furnace Dust Composition by Multifeature Analysis Based on Comprehensive Image-Processing Techniques

    NASA Astrophysics Data System (ADS)

    Guo, Hongwei; Su, Buxin; Bai, Zhenlong; Zhang, Jianliang; Li, Xinyu

    2014-11-01

    The traditional artificial recognition methods for the blast furnace dust composition have several disadvantages, including a great deal of information to process, complex operation, and low working efficiency. In this article, a multifeature analysis method based on comprehensive image-processing techniques is proposed to recognize the blast furnace dust composition automatically. First, the artificial recognition and feature analysis, which includes image preprocessing, Harris corner feature, Canny edge feature, and Ruffle feature analysis, was designed to build the template image, so that any unknown dust digital image can be tested. Second, the composition of coke, microvariation pulverized coal, vitric, ash, and iron from the dust is distinguished according to their different ranges of values based on the multifeature analysis. The method is valid for recognizing the blast furnace dust composition automatically, and it is fast and has a high recognition accuracy.
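
    The per-image feature extraction named above (Harris corners and Canny edges) can be sketched with OpenCV as below; the Ruffle feature and the value ranges used to separate coke, pulverized coal, vitric, ash and iron are specific to the paper and are not reproduced, and the test image is a random placeholder.

      import cv2
      import numpy as np

      def dust_image_features(gray):
          """gray: 2-D uint8 micrograph of dust particles; returns two simple densities."""
          blur = cv2.GaussianBlur(gray, (5, 5), 0)
          harris = cv2.cornerHarris(np.float32(blur), 2, 3, 0.04)   # blockSize=2, ksize=3, k=0.04
          corner_density = float(np.mean(harris > 0.01 * harris.max()))
          edges = cv2.Canny(blur, 50, 150)
          edge_density = float(np.mean(edges > 0))
          return {"corner_density": corner_density, "edge_density": edge_density}

      dummy = np.random.default_rng(0).integers(0, 256, (128, 128), dtype=np.uint8)
      print(dust_image_features(dummy))     # placeholder image, for illustration only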

  15. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications, including industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the data obtained, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high speed of acquisition, potential for high accuracy, and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capturing process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes two to four machine vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms, both for detecting, identifying and tracking similar targets and for marker-less object motion capture, has been developed and tested. The results of the algorithm evaluation show high robustness and reliability for various motion analysis tasks in technical and biomechanical applications.

  16. Analysis and coding technique based on computational intelligence methods and image-understanding architecture

    NASA Astrophysics Data System (ADS)

    Kuvychko, Igor

    2000-05-01

    Human vision involves higher-level knowledge and top-down processes for resolving ambiguity and uncertainty in real images. Even very advanced low-level image processing cannot give any advantage without a highly effective knowledge-representation and reasoning system, which is the solution of the image-understanding problem. Methods of image analysis and coding are directly based on the methods of knowledge representation and processing. This article suggests such models and mechanisms in the form of a Spatial Turing Machine that, in place of symbols and tapes, works with hierarchical networks represented dually as discrete and continuous structures. Such networks are able to perform both graph and diagrammatic operations, which are the basis of intelligence. Computational intelligence methods provide the transformation of continuous image information into discrete structures, making it available for analysis. The article shows that symbols naturally emerge in such networks, giving the opportunity to use symbolic operations. Such a framework naturally combines methods of machine learning, classification and analogy with induction, deduction and other methods of higher-level reasoning. Based on these principles, the image-understanding system provides more flexible ways of handling ambiguity and uncertainty in real images and does not require supercomputers. This opens the way to new technologies in computer vision and image databases.

  18. Numerical analysis of radiation propagation in innovative volumetric receivers based on selective laser melting techniques

    NASA Astrophysics Data System (ADS)

    Alberti, Fabrizio; Santiago, Sergio; Roccabruna, Mattia; Luque, Salvador; Gonzalez-Aguilar, Jose; Crema, Luigi; Romero, Manuel

    2016-05-01

    Volumetric absorbers constitute one of the key elements for achieving high thermal conversion efficiencies in concentrating solar power plants. Regardless of the working fluid or thermodynamic cycle employed, design trends towards higher absorber output temperatures are widespread, which leads to a general need for components with high solar absorptance, high conduction within the receiver material, high internal convection, low radiative and convective heat losses, and high mechanical durability. In this context, the use of advanced manufacturing techniques, such as selective laser melting, has allowed for the fabrication of intricate geometries that are capable of fulfilling the previous requirements. This paper presents a parametric design and analysis of the optical performance of volumetric absorbers of variable porosity, conducted by means of detailed numerical ray tracing simulations. Sections of variable macroscopic porosity along the absorber depth were constructed by the fractal growth of single-cell structures. Measures of performance analyzed include optical reflection losses from the absorber front and rear faces, penetration of radiation inside the absorber volume, and radiation absorption as a function of absorber depth. The effects of engineering design parameters such as absorber length and wall thickness, material reflectance, and porosity distribution on the optical performance of absorbers are discussed, and general design guidelines are given.

  19. OSSE spectral analysis techniques

    NASA Technical Reports Server (NTRS)

    Purcell, W. R.; Brown, K. M.; Grabelsky, D. A.; Johnson, W. N.; Jung, G. V.; Kinzer, R. L.; Kroeger, R. A.; Kurfess, J. D.; Matz, S. M.; Strickman, M. S.

    1992-01-01

    Analysis of the spectra from the Oriented Scintillation Spectrometer Experiment (OSSE) is complicated because of the typically low signal-to-noise ratio (approx. 0.1 percent) and the large background variability. The OSSE instrument was designed to address these difficulties by periodically offset-pointing the detectors from the source to perform background measurements. These background measurements are used to estimate the background during each of the source observations. The resulting background-subtracted spectra can then be accumulated and fitted for spectral lines and/or continua. Data selection based on various environmental parameters can be performed at various stages of the analysis procedure. In order to achieve the instrument's statistical sensitivity, however, it will be necessary for investigators to develop a detailed understanding of the instrument operation, data collection, and the background spectrum and its variability. The major steps in the OSSE spectral analysis process are briefly described, including a discussion of the OSSE background spectrum and examples of several observational strategies.
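    As a rough illustration of the offset-pointing strategy described above, the following Python sketch subtracts a background estimated from two bracketing offset-pointed spectra and propagates Poisson counting uncertainties. The toy spectra and the simple two-point background model are assumptions for illustration, not the OSSE pipeline.

import numpy as np

def background_subtract(source_spec, bg_before, bg_after):
    """Estimate the background during a source pointing as the mean of the
    bracketing offset-pointed spectra and subtract it (simplified sketch)."""
    background = 0.5 * (bg_before + bg_after)
    net = source_spec - background
    # Propagate counting (Poisson) uncertainties in quadrature.
    sigma = np.sqrt(source_spec + 0.25 * (bg_before + bg_after))
    return net, sigma

# Toy spectra (counts per channel); real OSSE data would be accumulated over
# many offset-pointing cycles and screened on environmental parameters.
rng = np.random.default_rng(0)
bg1 = rng.poisson(1000.0, size=256).astype(float)
bg2 = rng.poisson(1000.0, size=256).astype(float)
src = rng.poisson(1001.0, size=256).astype(float)  # roughly 0.1% signal
net, sigma = background_subtract(src, bg1, bg2)
print(net.mean(), sigma.mean())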

  20. Assessing morphological and DNA-based diet analysis techniques in a generalist predator, the arrow squid Nototodarus gouldi.

    PubMed

    Braley, Michelle; Goldsworthy, Simon D; Page, Brad; Steer, Mike; Austin, Jeremy J

    2010-05-01

    Establishing the diets of marine generalist consumers is difficult, with most studies limited to the use of morphological methods for prey identification. Such analyses rely on the preservation of diagnostic hard parts, which can limit taxonomic resolution and introduce biases. DNA-based analyses provide a method to assess the diets of marine species, potentially overcoming many of the limitations introduced by other techniques. This study compared the effectiveness of morphological and DNA-based analysis for determining the diet of a free-ranging generalist predator, the arrow squid (Nototodarus gouldi). A combined approach was more effective than using either of the methods in isolation. Nineteen unique prey taxa were identified, of which six were found by both methods, 10 were only detected using DNA and three were only identified using morphological methods. Morphological techniques only found 50% of the total number of identifiable prey taxa, whereas DNA-based techniques found 84%. This study highlights the benefits of using a combination of techniques to detect and identify prey of generalist marine consumers.

  1. An accurate, convective energy equation based automated meshing technique for analysis of blood vessels and tissues.

    PubMed

    White, J A; Dutton, A W; Schmidt, J A; Roemer, R B

    2000-01-01

    An automated three-element meshing method for generating finite element based models for the accurate thermal analysis of blood vessels imbedded in tissue has been developed and evaluated. The meshing method places eight noded hexahedral elements inside the vessels where advective flows exist, and four noded tetrahedral elements in the surrounding tissue. The higher order hexahedrals are used where advective flow fields occur, since high accuracy is required and effective upwinding algorithms exist. Tetrahedral elements are placed in the remaining tissue region, since they are computationally more efficient and existing automatic tetrahedral mesh generators can be used. Five noded pyramid elements connect the hexahedrals and tetrahedrals. A convective energy equation (CEE) based finite element algorithm solves for the temperature distributions in the flowing blood, while a finite element formulation of a generalized conduction equation is used in the surrounding tissue. Use of the CEE allows accurate solutions to be obtained without the necessity of assuming ad hoc values for heat transfer coefficients. Comparisons of the predictions of the three-element model to analytical solutions show that the three-element model accurately simulates temperature fields. Energy balance checks show that the three-element model has small, acceptable errors. In summary, this method provides an accurate, automatic finite element gridding procedure for thermal analysis of irregularly shaped tissue regions that contain important blood vessels. At present, the models so generated are relatively large (in order to obtain accurate results) and are, thus, best used for providing accurate reference values for checking other approximate formulations to complicated, conjugated blood heat transfer problems.

  2. An Analysis Technique for Active Neutron Multiplicity Measurements Based on First Principles

    SciTech Connect

    Evans, Louise G; Goddard, Braden; Charlton, William S; Peerani, Paolo

    2012-08-13

    Passive neutron multiplicity counting is commonly used to quantify the total mass of plutonium in a sample without prior knowledge of the sample geometry. However, passive neutron counting is less applicable to uranium measurements due to the low spontaneous fission rates of uranium. Active neutron multiplicity measurements are therefore used to determine the {sup 235}U mass in a sample. Unfortunately, there are still additional challenges to overcome for uranium measurements, such as the coupling of the active source and the uranium sample. Techniques such as the coupling method have been developed to help reduce the dependence of active uranium measurements on calibration curves, although they still require known standards of similar geometry. An advanced active neutron multiplicity measurement method is being developed by Texas A&M University, in collaboration with Los Alamos National Laboratory (LANL), in an attempt to overcome the calibration curve requirements. This method can be used to quantify the {sup 235}U mass in a sample containing uranium without using calibration curves. Furthermore, this method is based on existing detectors and nondestructive assay (NDA) systems, such as the LANL Epithermal Neutron Multiplicity Counter (ENMC). This method uses an inexpensive boron carbide liner to shield the uranium sample from thermal and epithermal neutrons while allowing fast neutrons to reach the sample. Due to the relatively low and constant energy-dependent fission and absorption cross-sections of uranium isotopes at high neutron energies, fast neutrons can penetrate the sample without significant attenuation. Fast neutron interrogation therefore creates a homogeneous fission rate in the sample, allowing first-principles methods to be used to determine the {sup 235}U mass in the sample. This paper discusses the measurement method concept and development, including measurements and simulations performed to date, as well as the potential

  3. Cost Analysis of MRI Services in Iran: An Application of Activity Based Costing Technique

    PubMed Central

    Bayati, Mohsen; Mahboub Ahari, Alireza; Badakhshan, Abbas; Gholipour, Mahin; Joulaei, Hassan

    2015-01-01

    Background: Considerable development of MRI technology in diagnostic imaging, the high cost of MRI technology, and controversial issues concerning official charges (tariffs) were the main motivations to define and implement this study. Objectives: The present study aimed to calculate the unit cost of MRI services using activity-based costing (ABC) as a modern cost accounting system and to compare the calculated unit costs fairly with official charges (tariffs). Materials and Methods: We included both direct and indirect costs of MRI services delivered in fiscal year 2011 in the Shiraz Shahid Faghihi hospital. The direct allocation method was used for distribution of overhead costs. We used a micro-costing approach to calculate the unit cost of all different MRI services. Clinical cost data were retrieved from the hospital registration system. The straight-line method was used for depreciation cost estimation. To cope with uncertainty and to increase the robustness of the study results, unit costs of 33 MRI services were calculated in terms of two scenarios. Results: The total annual cost of the MRI activity center (AC) was calculated at USD 400,746 and USD 532,104 based on the first and second scenarios, respectively. Ten percent of the total cost was allocated from supportive departments. The annual variable costs of the MRI center were calculated at USD 295,904. Capital costs measured at USD 104,842 and USD 236,200 resulted from the first and second scenario, respectively. Existing tariffs for more than half of the MRI services were above the calculated costs. Conclusion: As a public hospital, there are considerable limitations in both the financial and administrative databases of Shahid Faghihi hospital. Labor cost has the greatest share of the total annual cost of Shahid Faghihi hospital. The gap between unit costs and tariffs implies that the claim for extra budget from health providers may not be relevant for all services delivered by the studied MRI center. With some adjustments, ABC could be implemented in MRI
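    The unit-cost arithmetic behind activity-based costing can be sketched in a few lines. The figures, service names, and resource-use weights below are entirely hypothetical and do not reproduce the study's data; the sketch only shows how an annual activity-center cost could be spread over services according to their weighted volumes.

# Minimal activity-based costing sketch for an imaging activity center.
# All figures below are hypothetical and do not reproduce the study's data.

direct_costs = {"labor": 150_000.0, "materials": 40_000.0, "depreciation": 90_000.0}
overhead_allocated = 0.10 * 300_000.0  # direct allocation from support departments

annual_cost = sum(direct_costs.values()) + overhead_allocated

# Annual volume of each MRI service and its relative resource-use weight.
services = {"brain_mri": (1200, 1.0), "knee_mri": (800, 0.8), "cardiac_mri": (150, 2.5)}

total_weighted_volume = sum(n * w for n, w in services.values())
unit_costs = {name: annual_cost * w / total_weighted_volume
              for name, (n, w) in services.items()}
for name, cost in unit_costs.items():
    print(f"{name}: {cost:.2f} USD per scan")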

  4. Experimental investigation of evanescence-based infrared biodetection technique for micro-total-analysis systems

    NASA Astrophysics Data System (ADS)

    Chandrasekaran, Arvind; Packirisamy, Muthukumaran

    2009-09-01

    The advent of microoptoelectromechanical systems (MOEMS) and their integration with other technologies such as microfluidics, microthermal systems, immunoproteomics, etc., has led to the concept of integrated micro-total-analysis systems (μTAS), or Lab-on-a-Chip, for chemical and biological applications. Recently, research and development of μTAS has grown significantly across several biodetection sciences, in situ medical diagnoses, and point-of-care testing applications. However, it is essential to develop suitable biophysical label-free detection methods for the success, reliability, and ease of use of μTAS. We proposed an infrared (IR)-based evanescent wave detection system on the silicon-on-insulator platform for biodetection with μTAS. The system operates on the principle of the bio-optical interaction that occurs due to the evanescence of light from the waveguide device. The feasibility of biodetection has been experimentally investigated through the detection of horseradish peroxidase upon its reaction with hydrogen peroxide.

  5. A new approach to the analysis of alpha spectra based on neural network techniques

    NASA Astrophysics Data System (ADS)

    Baeza, A.; Miranda, J.; Guillén, J.; Corbacho, J. A.; Pérez, R.

    2011-10-01

    The analysis of alpha spectra requires good radiochemical procedures in order to obtain well differentiated alpha peaks in the spectrum, and the easiest way to analyze them is by directly summing the counts obtained in the Regions of Interest (ROIs). However, the low-energy tails of the alpha peaks frequently make this simple approach unworkable because some peaks partially overlap. Many fitting procedures have been proposed to solve this problem, most of them based on semi-empirical mathematical functions that emulate the shape of a theoretical alpha peak. The main drawback of these methods is that the great number of fitting parameters used means that their physical meaning is obscure or completely lacking. We propose another approach—the application of an artificial neural network. Instead of fitting the experimental data to a mathematical function, the fit is carried out by an artificial neural network (ANN) that has previously been trained to model the shape of an alpha peak using as training patterns several polonium spectra obtained from actual samples analyzed in our laboratory. In this sense, the ANN is able to learn the shape of an actual alpha peak. We have designed such an ANN as a feed-forward multi-layer perceptron with supervised training based on a back-propagation algorithm. The fitting procedure is based on the experimental observables that are characteristic of alpha peaks—the number of counts of the maximum and several peak widths at different heights. Polonium isotope spectra were selected because the alpha peaks corresponding to 208Po, 209Po, and 210Po are monoenergetic and well separated. The uncertainties introduced by this fitting procedure were less than the counting uncertainties. This new approach was applied to the problem of resolving overlapping peaks. Firstly, a theoretical study was carried out by artificially overlapping alpha peaks from actual samples in order to test the ability of the ANN to resolve each peak. Then, the ANN
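    As a loose illustration of fitting a learned peak shape rather than an analytic function, the sketch below trains a small feed-forward network (scikit-learn's MLPRegressor, not the authors' back-propagation implementation) on a synthetic alpha peak with a low-energy tail. The peak parameters are invented stand-ins for the polonium training spectra used in the paper.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for a measured 210Po peak: a Gaussian with a low-energy
# exponential tail (the authors trained on real polonium spectra instead).
channels = np.arange(0, 200)
peak_center, width, tail = 120.0, 3.0, 8.0
shape = np.exp(-0.5 * ((channels - peak_center) / width) ** 2)
shape += 0.3 * np.exp((channels - peak_center) / tail) * (channels < peak_center)
counts = 5000.0 * shape / shape.max()

# Train a small feed-forward network to reproduce the normalized peak shape
# as a function of the distance from the peak maximum.
X = (channels - peak_center).reshape(-1, 1) / width
y = counts / counts.max()
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(X, y)

# The trained network could then be scaled and shifted to fit (and separate)
# overlapping peaks in an unknown spectrum.
fitted = net.predict(X) * counts.max()
print("max residual (counts):", np.abs(fitted - counts).max())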

  6. Advanced NMR-based techniques for pore structure analysis of coal. Final project report

    SciTech Connect

    Smith, D.M.; Hua, D.W.

    1996-02-01

    During the 3-year term of the project, new methods have been developed for characterizing the pore structure of porous materials such as coals, carbons, and amorphous silica gels. In general, these techniques revolve around: (1) combining multiple techniques such as small-angle x-ray scattering (SAXS) and adsorption of contrast-matched adsorbates, or {sup 129}Xe NMR and thermoporometry (the change in freezing point with pore size); (2) combining adsorption isotherms over several pressure ranges to obtain a more complete description of pore filling; or (3) applying NMR ({sup 129}Xe, {sup 14}N{sub 2}, {sup 15}N{sub 2}) techniques to well-defined porous solids with pores in the large micropore size range (>1 nm).

  7. Novel Laser-Based Technique is Ideal for Real-Time Environmental Analysis

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 2005

    2005-01-01

    Ocean Optics offers laser-induced breakdown spectrometer systems (LIBS) that can be used to identify light to heavy metals in a variety of sample types and geometries in environmental analysis applications. LIBS are versatile, real-time, high-resolution analyzers for qualitative analysis, in less than one second, of every element in solids,…

  8. Advanced NMR-based techniques for pore structure analysis of coal

    SciTech Connect

    Smith, D.M.

    1992-01-01

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores, but they also exhibit meso- and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited by this broad pore size range, by microporosity, by the reactive nature of coal, by the requirement that samples be completely dried, and by network/percolation effects. Small-angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. We believe that measurement of the NMR parameters of various gas-phase and adsorbed-phase NMR-active probes can provide the resolution to this problem. We now have two suites of well-characterized microporous materials, including oxides (zeolites and silica gel) and activated carbons from our industrial partner, Air Products in Allentown, PA. Our current work may be divided into three areas: small-angle X-ray scattering (SAXS), adsorption, and NMR.

  9. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    NASA Astrophysics Data System (ADS)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-01

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  10. Error-reduction techniques and error analysis for fully phase- and amplitude-based encryption.

    PubMed

    Javidi, B; Towghi, N; Maghzi, N; Verrall, S C

    2000-08-10

    The performance of fully phase- and amplitude-based encryption processors is analyzed. The effects of noise perturbations on the encrypted information are considered. A thresholding method of decryption that further reduces the mean-squared error (MSE) for the fully phase- and amplitude-based encryption processes is provided. The proposed thresholding scheme significantly improves the performance of fully phase- and amplitude-based encryption, as measured by the MSE metric. We obtain analytical MSE bounds when thresholding is used for both decryption methods, and we also present computer-simulation results. These results show that the fully phase-based method is more robust. We also give a formal proof of a conjecture about the decrypted distribution of distorted encrypted information. This allows the analytical bounds of the MSE to be extended to more general non-Gaussian, nonadditive, nonstationary distortions. Computer simulations support this extension.

  11. Comparative Study of Various Normal Mode Analysis Techniques Based on Partial Hessians

    PubMed Central

    GHYSELS, AN; VAN SPEYBROECK, VERONIQUE; PAUWELS, EWALD; CATAK, SARON; BROOKS, BERNARD R.; VAN NECK, DIMITRI; WAROQUIER, MICHEL

    2014-01-01

    Standard normal mode analysis becomes problematic for complex molecular systems, as a result of both the high computational cost and the excessive amount of information when the full Hessian matrix is used. Several partial Hessian methods have been proposed in the literature, yielding approximate normal modes. These methods aim at reducing the computational load and/or calculating only the relevant normal modes of interest in a specific application. Each method has its own (dis)advantages and application field but guidelines for the most suitable choice are lacking. We have investigated several partial Hessian methods, including the Partial Hessian Vibrational Analysis (PHVA), the Mobile Block Hessian (MBH), and the Vibrational Subsystem Analysis (VSA). In this article, we focus on the benefits and drawbacks of these methods, in terms of the reproduction of localized modes, collective modes, and the performance in partially optimized structures. We find that the PHVA is suitable for describing localized modes, that the MBH not only reproduces localized and global modes but also serves as an analysis tool of the spectrum, and that the VSA is mostly useful for the reproduction of the low frequency spectrum. These guidelines are illustrated with the reproduction of the localized amine-stretch, the spectrum of quinine and a bis-cinchona derivative, and the low frequency modes of the LAO binding protein. PMID:19813181
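    A minimal sketch of the partial-Hessian idea, assuming the PHVA-style approximation in which environment atoms are simply held fixed: keep only the Hessian block of the active atoms, mass-weight it, and diagonalize. The random 3-atom Hessian is a placeholder, and unit handling is omitted.

import numpy as np

def phva_frequencies(hessian, masses, active_atoms):
    """Partial Hessian Vibrational Analysis sketch: keep only the Hessian
    block of the active atoms (the environment is treated as fixed),
    mass-weight it, and diagonalize.

    hessian      : (3N, 3N) Cartesian Hessian in consistent units
    masses       : (N,) atomic masses
    active_atoms : indices of atoms belonging to the subsystem of interest
    """
    idx = np.concatenate([np.arange(3 * a, 3 * a + 3) for a in active_atoms])
    h_sub = hessian[np.ix_(idx, idx)]
    m_sub = np.repeat(masses[active_atoms], 3)
    mw = h_sub / np.sqrt(np.outer(m_sub, m_sub))   # mass-weighted block
    eigvals, modes = np.linalg.eigh(mw)
    # Harmonic frequencies scale with sqrt(eigenvalue); negative eigenvalues
    # (imaginary modes) are clipped to zero in this sketch.
    freqs = np.sqrt(np.clip(eigvals, 0.0, None))
    return freqs, modes

# Tiny synthetic example: 3 atoms, random symmetric Hessian, 2 active atoms.
rng = np.random.default_rng(1)
a = rng.normal(size=(9, 9))
hess = a + a.T
freqs, modes = phva_frequencies(hess, masses=np.array([12.0, 1.0, 1.0]),
                                active_atoms=[1, 2])
print(freqs)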

  12. FBGs cascade interrogation technique based on wavelength-to-delay mapping and KLT analysis

    NASA Astrophysics Data System (ADS)

    Hervás, J.; Barrera, D.; Fernández-Pousa, Carlos R.; Sales, S.

    2016-05-01

    The Karhunen-Loeve transform (KLT) is applied to the coarsely sampled impulse response generated by an FBG cascade in order to calculate the temperature change experienced by the FBGs. Thanks to a dispersive medium, the wavelength change produced by the temperature change results in a delay shift of the sample generated by an FBG; this delay shift is recorded in the eigenvalues calculated by the KLT routine, allowing the temperature variation to be measured. Although the FBG samples are represented by only four points, a continuous temperature measurement can be performed thanks to the KLT algorithm. This represents a three-order-of-magnitude reduction in the number of points, giving the method low computational complexity. Simulations are performed to validate the interrogation technique and estimate its performance, and an experimental example is provided to demonstrate real operation.
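    A rough sketch of the eigenvalue-tracking idea, under the assumption that the KLT is applied to the autocorrelation matrix of the four recorded samples (the paper's exact KLT routine may differ). The dominant eigenvalue changes smoothly as the reflected pulse shifts within its sampling window, which is the observable mapped back to wavelength and hence temperature.

import numpy as np
from scipy.linalg import toeplitz

def klt_eigenvalue(samples):
    """Karhunen-Loeve-style transform of a short sample vector: eigendecompose
    its autocorrelation (Toeplitz) matrix and return the dominant eigenvalue."""
    acf = np.correlate(samples, samples, mode="full")[len(samples) - 1:]
    eigvals = np.linalg.eigvalsh(toeplitz(acf))
    return eigvals[-1]

def four_point_pulse(delay, width=1.0):
    """Coarse 4-sample representation of an FBG reflection pulse whose position
    in the sampling window shifts with the wavelength-to-delay mapping
    produced by the dispersive medium."""
    t = np.arange(4.0)
    return np.exp(-0.5 * ((t - 1.5 - delay) / width) ** 2)

# The eigenvalue changes with sub-sample delay even though only four points
# per grating are recorded.
for delay in np.linspace(0.0, 1.0, 5):
    value = klt_eigenvalue(four_point_pulse(delay))
    print(f"delay {delay:+.2f} samples -> dominant eigenvalue {value:.4f}")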

  13. Model building techniques for analysis.

    SciTech Connect

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that, in essence, come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  14. [Analyzer Design of Atmospheric Particulate Matter's Concentration and Elemental Composition Based on β and X-Ray's Analysis Techniques].

    PubMed

    Ge, Liang-quan; Liu, He-fan; Zeng, Guo-qiang; Zhang, Qing-xian; Ren, Mao-qiang; Li, Dan; Gu, Yi; Luo, Yao-yao; Zhao, Jian-kun

    2016-03-01

    Monitoring atmospheric particulate matter requires real-time analysis of particulate matter concentrations and of the types and contents of the elements they contain. An analyzer based on β-ray and X-ray analysis techniques was designed to meet these demands. Applying the β-ray attenuation law and the principle of energy-dispersive X-ray fluorescence analysis, the paper introduces the analyzer's overall design scheme, structure, FPGA circuit hardware, and software. The analyzer can measure the concentration, elements, and elemental contents of atmospheric particulate matter by on-line analysis. Pure elemental particle standard samples were prepared by deposition and used to calibrate the analyzer. The analyzer can monitor atmospheric particulate matter concentrations (TSP, PM10, and PM2.5) and 30 kinds of elements and their contents. Comparing the analyzer's measurements with the Chengdu Environmental Protection Agency's particulate matter monitoring results for the eastern suburbs of Chengdu shows high consistency. The analyzer is also highly sensitive in monitoring particulate matter containing heavy metal elements (such as As, Hg, Cd, Cr, and Pb). Technical performance testing shows that the analyzer offers continuous measurement, a low detection limit, quick analysis, and ease of use. In conclusion, the analyzer can meet the demands for analyzing atmospheric particulate matter concentrations, elements, and elemental contents in urban environmental monitoring. PMID:27400540
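    The β-ray attenuation step can be illustrated with a short calculation: the areal density of the collected dust follows from the ratio of beta count rates before and after deposition, and dividing by the sampled air volume gives the mass concentration. The attenuation coefficient, spot area, and flow rate below are placeholder values, not the instrument's calibration.

import math

def particulate_concentration(i0, i, mu=0.25, spot_area=1.0, flow_lpm=16.7,
                              minutes=60.0):
    """Beta-attenuation sketch: recover the areal density of dust collected on
    a filter spot from the beta count rates before (i0) and after (i)
    deposition, then convert to a mass concentration.

    mu        : mass attenuation coefficient, cm^2/mg (placeholder value)
    spot_area : deposition spot area, cm^2
    flow_lpm  : sampling flow rate, litres per minute
    minutes   : sampling duration
    """
    areal_density_mg_cm2 = math.log(i0 / i) / mu
    mass_ug = areal_density_mg_cm2 * spot_area * 1000.0
    volume_m3 = flow_lpm * minutes / 1000.0
    return mass_ug / volume_m3  # micrograms per cubic metre

print(f"{particulate_concentration(12000.0, 11500.0):.1f} ug/m3")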

  15. [Measurement Error Analysis and Calibration Technique of NTC - Based Body Temperature Sensor].

    PubMed

    Deng, Chi; Hu, Wei; Diao, Shengxi; Lin, Fujiang; Qian, Dahong

    2015-11-01

    An NTC thermistor-based wearable body temperature sensor was designed. This paper describes the design principles and realization method of the NTC-based body temperature sensor. The sources of temperature measurement error of the body temperature sensor are analyzed in detail, and an automatic measurement and calibration method for the ADC error is given. The results showed that the measurement accuracy of the calibrated body temperature sensor is better than ±0.04 degrees C. The temperature sensor offers the advantages of high accuracy, small size, and low power consumption.
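    For orientation, the resistance-to-temperature conversion at the heart of such a sensor can be sketched with the simple Beta-parameter model; the production design would instead use its calibrated fit together with the ADC error correction described in the paper, and the parameter values below are typical catalogue numbers rather than the paper's.

import math

def ntc_temperature_c(r_measured, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Convert an NTC thermistor resistance to temperature using the simple
    Beta-parameter model. Parameter values are typical, not from the paper."""
    t0_k = t0_c + 273.15
    t_k = 1.0 / (1.0 / t0_k + math.log(r_measured / r0) / beta)
    return t_k - 273.15

# Example: a 10 kOhm (at 25 degC) thermistor reading 7355 Ohm.
print(f"{ntc_temperature_c(7355.0):.2f} degC")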

  16. Application of an ensemble technique based on singular spectrum analysis to daily rainfall forecasting.

    PubMed

    Baratta, Daniela; Cicioni, Giovambattista; Masulli, Francesco; Studer, Léonard

    2003-01-01

    In previous work, we proposed a constructive methodology for temporal data learning supported by results and prescriptions related to the embedding theorem, using singular spectrum analysis both to reduce the effects of possible discontinuities in the signal and to implement an efficient ensemble method. In this paper we present new results concerning the application of this approach to forecasting the individual rainfall intensity series collected by 135 stations distributed in the Tiber basin. The average RMS error of the obtained forecasts is less than 3 mm of rain. PMID:12672433
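    A bare-bones sketch of the singular spectrum analysis step, assuming the standard embed/SVD/diagonal-averaging formulation: the series is embedded in a trajectory matrix, decomposed, and the leading components are reconstructed. The toy rainfall-like series is synthetic and unrelated to the Tiber basin data.

import numpy as np

def ssa_components(series, window, n_components=3):
    """Basic singular spectrum analysis: embed the series in a trajectory
    matrix, take its SVD, and reconstruct the leading components by
    diagonal averaging."""
    n = len(series)
    k = n - window + 1
    trajectory = np.column_stack([series[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(trajectory, full_matrices=False)
    recon = np.zeros((n_components, n))
    counts = np.zeros(n)
    for i in range(k):
        counts[i:i + window] += 1
    for c in range(n_components):
        elem = s[c] * np.outer(u[:, c], vt[c])
        for i in range(k):
            recon[c, i:i + window] += elem[:, i]
        recon[c] /= counts
    return recon

# Toy daily rainfall-like series (hypothetical, not Tiber basin data).
rng = np.random.default_rng(5)
rain = np.maximum(0.0, 5 + 4 * np.sin(np.arange(365) * 2 * np.pi / 365) +
                  rng.normal(scale=3.0, size=365))
components = ssa_components(rain, window=30)
print(components.shape)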

  17. Frontier-based techniques in measuring hospital efficiency in Iran: a systematic review and meta-regression analysis

    PubMed Central

    2013-01-01

    Background In recent years, there has been growing interest in measuring the efficiency of hospitals in Iran and several studies have been conducted on the topic. The main objective of this paper was to review studies in the field of hospital efficiency and examine the estimated technical efficiency (TE) of Iranian hospitals. Methods Persian and English databases were searched for studies related to measuring hospital efficiency in Iran. Ordinary least squares (OLS) regression models were applied for statistical analysis. The PRISMA guidelines were followed in the search process. Results A total of 43 efficiency scores from 29 studies were retrieved and used to approach the research question. Data envelopment analysis was the principal frontier efficiency method in the estimation of efficiency scores. The pooled estimate of mean TE was 0.846 (±0.134). There was a considerable variation in the efficiency scores between the different studies performed in Iran. There were no differences in efficiency scores between data envelopment analysis (DEA) and stochastic frontier analysis (SFA) techniques. The reviewed studies are generally similar and suffer from similar methodological deficiencies, such as no adjustment for case mix and quality of care differences. The results of OLS regression revealed that studies that included more variables and more heterogeneous hospitals generally reported higher TE. Larger sample size was associated with reporting lower TE. Conclusions The features of frontier-based techniques had a profound impact on the efficiency scores among Iranian hospital studies. These studies suffer from major methodological deficiencies and were of sub-optimal quality, limiting their validity and reliability. It is suggested that improving data collection and processing in Iranian hospital databases may have a substantial impact on promoting the quality of research in this field. PMID:23945011

  18. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    DOE PAGES

    Miller, Mary A.; Tangyunyong, Paiboon; Edward I. Cole, Jr.

    2016-01-12

    In this study, laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  19. Semi-Automated Classification of Gray Scale Aerial Photographs using Geographic Object Based Image Analysis (GEOBIA) Technique

    NASA Astrophysics Data System (ADS)

    Harb Rabia, Ahmed; Terribile, Fabio

    2013-04-01

    Aerial photography is an important source of high-resolution remotely sensed data. Before 1970, aerial photographs were the only remote sensing data source for land use and land cover classification. Using these old aerial photographs improves the final output of land use and land cover change detection. However, classic techniques of aerial photograph classification, such as manual interpretation or on-screen digitization, require great experience, long processing time, and vast effort. A new technique needs to be developed in order to reduce processing time and effort and to give better results. Geographic object based image analysis (GEOBIA) is a newly developed area of Geographic Information Science and remote sensing in which images are automatically segmented into objects of similar spectral, temporal, and spatial characteristics. Unlike pixel-based techniques, GEOBIA deals with object properties such as texture, square fit, roundness, and many other properties that can improve classification results. The GEOBIA technique can be divided into two main steps: segmentation and classification. The segmentation step groups adjacent pixels into objects of similar spectral and spatial characteristics. The classification step assigns classes to the generated objects based on the characteristics of the individual objects. This study aimed to use the GEOBIA technique to develop a novel approach for land use and land cover classification of aerial photographs that saves time and effort and gives improved results. Aerial photographs from 1954 of Valle Telesina in Italy were used in this study. Images were rectified and georeferenced in ArcMap using topographic maps. Images were then processed in eCognition software to generate a land use and land cover map of 1954. A decision tree rule set was developed in eCognition to classify the images, and finally nine classes of general land use and land cover in the study area were recognized (forest, trees stripes, agricultural

  20. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidic-related techniques for the isolation, trapping, and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell. PMID:26213918

  1. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
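    A minimal sketch of the linear, quadratic, and exponential trend fits named in the standard, applied to a hypothetical monthly indicator; the exponential model is fitted by regressing the logarithm of the data on time, which is valid only for positive-valued series.

import numpy as np

# Hypothetical time-series of a monthly reliability indicator.
t = np.arange(24, dtype=float)
rng = np.random.default_rng(3)
y = 100.0 + 1.5 * t + rng.normal(scale=2.0, size=t.size)

# Linear and quadratic trends via least-squares polynomial fits.
linear = np.polyfit(t, y, 1)
quadratic = np.polyfit(t, y, 2)

# Exponential trend y = a * exp(b t), fitted by regressing log(y) on t
# (valid here because all values are positive).
b, log_a = np.polyfit(t, np.log(y), 1)
exponential = (np.exp(log_a), b)

print("linear slope per month:   ", linear[0])
print("quadratic coefficients:   ", quadratic)
print("exponential growth rate b:", exponential[1])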

  2. Spectroscopic Techniques for Atmospheric Analysis

    SciTech Connect

    Bililign, Solomon

    2009-07-06

    Several analytical and optical techniques for atmospheric analysis are discussed, and environmental constraints for real-world applications are mentioned. Special emphasis is given to cavity ring-down spectroscopy, which is described as a very sensitive method for atmospheric trace gas detection.

  3. Comparative analysis of DNA polymorphisms and phylogenetic relationships among Syzygium cumini Skeels based on phenotypic characters and RAPD technique

    PubMed Central

    Singh, Jitendra P; Singh, AK; Bajpai, Anju; Ahmad, Iffat Zareen

    2014-01-01

    The Indian black berry (Syzygium cumini Skeels) has great nutraceutical and medicinal properties. As in other fruit crops, fruit characteristics are important attributes for differentiation and were determined for different accessions of S. cumini. The fruit weight, length, breadth, length:breadth ratio, pulp weight, pulp content, seed weight, and pulp:seed ratio varied significantly among accessions. Molecular characterization was carried out using the PCR-based RAPD technique. Out of 80 RAPD primers, only 18 produced stable polymorphisms that were used to examine the phylogenetic relationship. A total of 207 loci were generated, of which 201 were found to be polymorphic. The average genetic dissimilarity among jamun accessions was 97 per cent. The phylogenetic relationship was also examined by principal coordinates analysis (PCoA), which explained 46.95 per cent cumulative variance. The two-dimensional PCoA analysis showed grouping of the different accessions, plotted in four sub-plots representing clusters of accessions. The UPGMA (r = 0.967) and NJ (r = 0.987) dendrograms constructed from the dissimilarity matrix revealed a good degree of fit with the cophenetic correlation value. The dendrogram grouped the accessions into three main clusters according to their eco-geographical regions, which gives useful insight into their phylogenetic relationships. PMID:24966521
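    The band-scoring and UPGMA steps can be sketched as follows, assuming a simple 0/1 band-presence matrix and Jaccard dissimilarity (the study's exact dissimilarity coefficient is not restated here); the cophenetic correlation is the goodness-of-fit figure the abstract quotes for its dendrograms.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram, cophenet

# Toy 0/1 band-presence matrix: rows are accessions, columns are RAPD loci
# (the study scored 207 loci across its jamun accessions).
rng = np.random.default_rng(7)
bands = rng.integers(0, 2, size=(6, 40)).astype(bool)

# Jaccard dissimilarity between accessions, then UPGMA (average linkage).
dist = pdist(bands, metric="jaccard")
tree = linkage(dist, method="average")

# Cophenetic correlation measures how well the dendrogram preserves the
# original dissimilarities (the study reports r = 0.967 for UPGMA).
r, _ = cophenet(tree, dist)
print(f"cophenetic correlation: {r:.3f}")
# dendrogram(tree)  # would draw the tree if matplotlib is available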

  4. Evaluating the adequacy of maximum contaminant levels as health-protective cleanup goals: an analysis based on Monte Carlo techniques.

    PubMed

    Finley, B L; Scott, P; Paustenbach, D J

    1993-12-01

    At many sites in the United States, health-based remediation goals for contaminated groundwater have been set at levels far below USEPA's drinking water standards (i.e., maximum contaminant levels or MCLs). This is due to the fact that, while the USEPA must often consider technical and economic factors (e.g., cost of compliance, risk/benefit analysis) when setting MCLs for public water systems, cleanup goals for contaminated groundwater are often based solely on conservative "point" estimates of exposure. One of the more recent refinements in the risk assessment process is the use of ranges of exposure estimates or "probability density functions" (PDFs), rather than fixed point estimates, to estimate exposure and chemical uptake. This approach provides a more thorough description of the range of potential risks, rather than a single "worst-case" value, and allows one to understand the conservatism inherent in assessments based on regulatory default parameters. This paper uses a number of PDFs and the Monte Carlo technique to assess whether the USEPA's MCLs for drinking water are sufficiently low to protect persons exposed to these levels. A case study involving daily exposure to tapwater containing MCL concentrations of tetrachloroethylene, chloroform, bromoform, and vinyl chloride is presented. Several direct and indirect exposure pathways are evaluated, including inhalation and dermal contact while showering, direct ingestion, and inhalation of emissions from household fixtures and appliances. PDFs for each exposure factor are based on the most recent and applicable data available. Our analysis indicates that the estimated increased cancer risks at the 50th and 95th percentile of exposure are within the range of increased cancer risks typically considered acceptable at Superfund sites (10(-4)-10(-6)). These results suggest that, at least for some chemicals, groundwater need not be cleaned-up to concentrations less than drinking water standards (i.e., MCLs) to
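    A toy Monte Carlo of the kind described, with all distributions, the contaminant concentration, and the slope factor chosen as placeholders rather than the paper's PDFs: exposure factors are sampled, a lifetime average daily dose is formed, and the 50th and 95th percentile incremental risks are reported.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical exposure-factor distributions (not the paper's PDFs).
ingestion_l_day = rng.lognormal(mean=np.log(1.2), sigma=0.4, size=n)    # L/day
body_weight_kg = rng.normal(loc=70.0, scale=12.0, size=n).clip(40, 120)
exposure_years = rng.uniform(9, 30, size=n)

conc_mg_l = 0.005      # contaminant at its MCL, mg/L (example value)
slope_factor = 0.05    # cancer slope factor, (mg/kg-day)^-1, placeholder

# Lifetime average daily dose (averaged over a 70-year lifetime) and
# incremental cancer risk for each simulated person.
ladd = conc_mg_l * ingestion_l_day * exposure_years / (body_weight_kg * 70.0)
risk = slope_factor * ladd

for q in (50, 95):
    print(f"{q}th percentile risk: {np.percentile(risk, q):.2e}")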

  5. Visualization and Analysis of Wireless Sensor Network Data for Smart Civil Structure Applications Based On Spatial Correlation Technique

    NASA Astrophysics Data System (ADS)

    Chowdhry, Bhawani Shankar; White, Neil M.; Jeswani, Jai Kumar; Dayo, Khalil; Rathi, Manorma

    2009-07-01

    Disasters affecting infrastructure, such as the 2001 earthquakes in India, 2005 in Pakistan, 2008 in China and the 2004 tsunami in Asia, provide a common need for intelligent buildings and smart civil structures. Now, imagine massive reductions in the time to get the infrastructure working again, real-time information on damage to buildings, massive reductions in the cost and time to certify that structures are undamaged and can still be operated, and reductions in the number of structures to be rebuilt (if they are known not to be damaged). Achieving these ideas would lead to huge, quantifiable, long-term savings to government and industry. Wireless sensor networks (WSNs) can be deployed in buildings to make any civil structure both smart and intelligent. WSNs have recently gained much attention in both the public and research communities because they are expected to bring a new paradigm to the interaction between humans, environment, and machines. This paper presents the deployment of WSN nodes in the Top Quality Centralized Instrumentation Centre (TQCIC). We created an ad hoc networking application to collect real-time data sensed from the nodes that were randomly distributed throughout the building. If the sensors are relocated, then the application automatically reconfigures itself in the light of the new routing topology. WSNs are event-based systems that rely on the collective effort of several micro-sensor nodes, which are continuously observing a physical phenomenon. WSN applications require spatially dense sensor deployment in order to achieve satisfactory coverage. The degree of spatial correlation increases with decreasing inter-node separation. Energy consumption is reduced dramatically by having only those sensor nodes with unique readings transmit their data. We report on an algorithm based on a spatial correlation technique that assures high QoS (in terms of SNR) of the network as well as proper utilization of energy, by suppressing redundant data transmission
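    A minimal sketch of the suppression idea, under the assumption that a node withholds its transmission when a nearby node has already reported a similar reading; the coordinates, radius, and tolerance are invented, and a deployed protocol would negotiate this over the radio rather than in one loop.

import math

# Hypothetical node readings: (x, y, sensed value). Nodes whose reading is
# within `tolerance` of an already-reporting neighbour closer than `radius`
# suppress their own transmission, exploiting spatial correlation.
readings = [(0.0, 0.0, 21.4), (1.0, 0.5, 21.5), (5.0, 5.0, 24.9),
            (1.2, 0.2, 21.6), (5.5, 4.8, 25.1)]

def select_reporters(readings, radius=2.0, tolerance=0.5):
    reporters = []
    for x, y, value in readings:
        redundant = any(
            math.hypot(x - rx, y - ry) < radius and abs(value - rv) < tolerance
            for rx, ry, rv in reporters
        )
        if not redundant:
            reporters.append((x, y, value))
    return reporters

print(select_reporters(readings))  # only nodes with unique readings transmit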

  6. Comparison of two headspace sampling techniques for the analysis of off-flavour volatiles from oat based products.

    PubMed

    Cognat, Claudine; Shepherd, Tom; Verrall, Susan R; Stewart, Derek

    2012-10-01

    Two different headspace sampling techniques were compared for analysis of aroma volatiles from freshly produced and aged plain oatcakes. Solid phase microextraction (SPME) using a Carboxen-Polydimethylsiloxane (PDMS) fibre and entrainment on Tenax TA within an adsorbent tube were used for collection of volatiles. The effects of variation in the sampling method were also considered using SPME. The data obtained using both techniques were processed by multivariate statistical analysis (PCA). Both techniques showed similar capacities to discriminate between the samples at different ages. Discrimination between fresh and rancid samples could be made on the basis of changes in the relative abundances of 14-15 of the constituents in the volatile profiles. A significant effect on the detection level of volatile compounds was observed when samples were crushed and analysed by SPME-GC-MS, in comparison to undisturbed product. The applicability and cost effectiveness of both methods were considered. PMID:25005987

  7. A new technique for calculating reentry base heating. [analysis of laminar base flow field of two dimensional reentry body

    NASA Technical Reports Server (NTRS)

    Meng, J. C. S.

    1973-01-01

    The laminar base flow field of a two-dimensional reentry body has been studied by Telenin's method. The flow domain was divided into strips along the x-axis, and the flow variations were represented by Lagrange interpolation polynomials in the transformed vertical coordinate. The complete Navier-Stokes equations were used in the near wake region, and the boundary layer equations were applied elsewhere. The boundary conditions consisted of the flat plate thermal boundary layer in the forebody region and the near wake profile in the downstream region. The resulting two-point boundary value problem of 33 ordinary differential equations was then solved by the multiple shooting method. The detailed flow field and thermal environment in the base region are presented in the form of temperature contours, Mach number contours, velocity vectors, pressure distributions, and heat transfer coefficients on the base surface. The maximum heating rate was found on the centerline, and the two-dimensional stagnation point flow solution was adequate to estimate the maximum heating rate so long as the local Reynolds number could be obtained.

  8. Analysis of Land Covers over Northern Peninsular Malaysia by Using ALOS-PALSAR Data Based on Frequency-Based Contextual and Neural Network Classification Technique

    NASA Astrophysics Data System (ADS)

    Lim, H. S.; MatJafri, M. Z.; Abdullah, K.; Saleh, N. Mohd.

    2008-11-01

    Optical and microwave remote sensing data have been widely used in land cover and land use classification. Optical satellite remote sensing methods are more appropriate but require cloud-free conditions for the data to be useful, especially in the Equatorial region. In the Equatorial region, cloud-free acquisitions can be rare, reducing these sensors' applicability to such studies. ALOS-PALSAR data can be acquired day and night irrespective of weather conditions. This paper presents a comparison between frequency-based contextual and neural network classification techniques using ALOS-PALSAR data for land cover assessment in Northern Peninsular Malaysia. The ALOS-PALSAR data acquired on 10 November 2006 were classified into vegetation, urban, water, and other land features. The PALSAR training areas were chosen based on optical satellite imagery and were classified using supervised classification methods. Supervised classification techniques were used in the classification analysis, and the best supervised classifier was chosen based on the highest overall accuracy and Kappa statistic. Based on the results produced by this study, the utility of ALOS-PALSAR data as an alternative data source for land cover classification in Peninsular Malaysia can be pointed out.

  9. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  10. A preliminary structural analysis of space-base living quarters modules to verify a weight-estimating technique

    NASA Technical Reports Server (NTRS)

    Grissom, D. S.; Schneider, W. C.

    1971-01-01

    The determination of a base line (minimum weight) design for the primary structure of the living quarters modules in an earth-orbiting space base was investigated. Although the design is preliminary in nature, the supporting analysis is sufficiently thorough to provide a reasonably accurate weight estimate of the major components that are considered to comprise the structural weight of the space base.

  11. Applications of electrochemical techniques in mineral analysis.

    PubMed

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the recent decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry, coulometry, etc., have made significant contributions to the analysis of minerals such as clay, sulfide, oxide, and oxysalt. The discussion is organized by both the type of EC technique used and the kind of mineral analyzed. Furthermore, minerals used as electrode modification materials for EC analysis are also summarized. Accordingly, research gaps and future development trends in these areas are discussed.

  12. Proteomic data analysis of glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke

    2016-05-01

    Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy, and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches, as traditional techniques are insufficient to interpret and visualize the resulting experimental data. The emphasis of this paper lies in the application of novel approaches for visualization, clustering, and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information is insufficient for classifying GSCs and paving the way to improved therapeutics for the heterogeneous glioma.

  13. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  14. Balloon-based interferometric techniques

    NASA Technical Reports Server (NTRS)

    Rees, David

    1985-01-01

    A balloon-borne triple-etalon Fabry-Perot Interferometer, observing the Doppler shifts of absorption lines caused by molecular oxygen and water vapor in the far red/near infrared spectrum of backscattered sunlight, has been used to evaluate a passive spaceborne remote sensing technique for measuring winds in the troposphere and stratosphere. There have been two successful high altitude balloon flights of the prototype UCL instrument from the National Scientific Balloon Facility at Palestine, TX (May 80, Oct. 83). The results from these flights have demonstrated that an interferometer with adequate resolution, stability and sensitivity can be built. The wind data are of comparable quality to those obtained from operational techniques (balloon and rocket sonde, cloud-top drift analysis, and the gradient wind analysis of satellite radiance measurements). However, the interferometric data can provide a regular global grid, over a height range from 5 to 50 km, in regions of clear air. Between the middle troposphere (5 km) and the upper stratosphere (40 to 50 km), an optimized instrument can make wind measurements over the daylit hemisphere with an accuracy of about 3 to 5 m/sec (2 sigma). It is possible to obtain full height profiles between altitudes of 5 and 50 km, with 4 km height resolution and a spatial resolution of about 200 km along the orbit track. Below an altitude of about 10 km, Fraunhofer lines of solar origin are possible targets of the Doppler wind analysis. Above an altitude of 50 km, the weakness of the backscattered solar spectrum (decreasing air density) is coupled with the low absorption cross-section of all atmospheric species in the spectral region up to 800 nm (where imaging photon detectors can be used), causing the along-the-track resolution (or error) to increase beyond values useful for operational purposes. Within the region of optimum performance (5 to 50 km), however, the technique is a valuable potential complement to existing wind
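    The underlying measurement is a Doppler shift, so the conversion from observed wavelength shift to line-of-sight wind speed is a one-line calculation; the line wavelength and shift below are illustrative values only.

# Line-of-sight wind speed from the Doppler shift of an absorption line,
# the quantity the interferometer measures (illustrative numbers only).
C = 299_792_458.0          # speed of light, m/s
lambda_rest = 762.0e-9     # O2 A-band line near 762 nm, m
delta_lambda = 1.0e-14     # measured wavelength shift, m (hypothetical)

v_los = C * delta_lambda / lambda_rest
print(f"line-of-sight wind: {v_los:.1f} m/s")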

  15. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    SciTech Connect

    Yonghua Zhang

    2002-05-27

    Sample preparation has been one of the major bottlenecks for many high throughput analyses. The purpose of this research was to develop new sample preparation and integration approaches for DNA sequencing, PCR-based DNA analysis, and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser-induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks into DNA capillary-array sequencers. Protocols for directly sequencing plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or a heating block. Upon heating, the plasmids were released while chromosomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR-based DNA analysis, they demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical samples without purification. After the PCR reaction using cheek cell, blood, or HIV-1 gag DNA, the reaction mixtures were injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection and the elimination of purification of the DNA sample before or after the PCR reaction will make this approach an

  16. A novel fast and flexible technique of radical kinetic behaviour investigation based on pallet for plasma evaluation structure and numerical analysis

    NASA Astrophysics Data System (ADS)

    Malinowski, Arkadiusz; Takeuchi, Takuya; Chen, Shang; Suzuki, Toshiya; Ishikawa, Kenji; Sekine, Makoto; Hori, Masaru; Lukasiak, Lidia; Jakubowski, Andrzej

    2013-07-01

    This paper describes a new, fast, and case-independent technique for sticking coefficient (SC) estimation based on the pallet for plasma evaluation (PAPE) structure and numerical analysis. Our approach does not require a complicated structure, apparatus, or time-consuming measurements, but offers high data reliability and high flexibility. Thermal analysis is also possible. This technique has been successfully applied to the estimation of the very low SC of hydrogen radicals on chemically amplified ArF 193 nm photoresist (the main goal of this study). The upper bound of our technique has been determined by investigating the SC of fluorine radicals on polysilicon (at elevated temperature). Sources of estimation error and ways to reduce it are also discussed. The results of this study give insight into the process kinetics; not only are they helpful for better process understanding, but they may additionally serve as parameters in the development of a phenomenological model for predictive modelling of etching for ultimate CMOS topography simulation.

  17. Judging complex movement performances for excellence: a principal components analysis-based technique applied to competitive diving.

    PubMed

    Young, Cole; Reinkensmeyer, David J

    2014-08-01

    Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard half pipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring - gross body path, splash area, and board tip motion - to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization.
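
    As a rough illustration of the approach described above, the sketch below runs PCA over posture frames to obtain eigenpostures and then regresses judges' scores on the temporal weighting coefficients plus a few gross descriptors. All array shapes, feature choices and data are invented placeholders, not the study's data.

```python
# Hypothetical sketch of the eigenposture idea: PCA on posture frames,
# then linear regression from eigendive features to judges' scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_dives, n_frames, n_angles = 60, 100, 12            # dives x time samples x joint angles
postures = rng.normal(size=(n_dives * n_frames, n_angles))
scores = rng.uniform(0, 10, size=n_dives)             # placeholder judges' scores

# 1) Eigenpostures: principal components of all posture frames.
pca = PCA(n_components=5).fit(postures)
eigenpostures = pca.components_                        # (5, n_angles) basis postures

# 2) Temporal weighting coefficients: project each dive's frames onto the
#    eigenpostures and summarize them (here simply by their mean and std).
weights = pca.transform(postures).reshape(n_dives, n_frames, -1)
temporal_feats = np.concatenate([weights.mean(axis=1), weights.std(axis=1)], axis=1)

# 3) Append coarse descriptors assumed to matter (body path length, splash area,
#    board tip excursion) and regress onto the scores ("eigendive" space).
extra_feats = rng.normal(size=(n_dives, 3))            # hypothetical gross descriptors
X = np.hstack([temporal_feats, extra_feats])
model = LinearRegression().fit(X, scores)
print("R^2 on training dives:", model.score(X, scores))
```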

  18. Applicability of neuro-fuzzy techniques in predicting ground-water vulnerability: a GIS-based sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Dixon, B.

    2005-07-01

    Modeling groundwater vulnerability reliably and cost effectively for non-point source (NPS) pollution at a regional scale remains a major challenge. In recent years, Geographic Information Systems (GIS), neural networks and fuzzy logic techniques have been used in several hydrological studies. However, few of these research studies have undertaken an extensive sensitivity analysis. The overall objective of this research is to examine the sensitivity of neuro-fuzzy models used to predict groundwater vulnerability in a spatial context by integrating GIS and neuro-fuzzy techniques. The specific objectives are to assess the sensitivity of neuro-fuzzy models in a spatial domain using GIS by varying (i) the shape of the fuzzy sets, (ii) the number of fuzzy sets, and (iii) the learning and validation parameters (including rule weights). The neuro-fuzzy models were developed using NEFCLASS-J software on a JAVA platform and were loosely integrated with a GIS. Four plausible parameters critical to the transport of contaminants through the soil profile to the groundwater were included: soil hydrologic group, depth of the soil profile, soil structure (pedality points) of the A horizon, and land use. In order to validate the model predictions, coincidence reports were generated among model inputs, model predictions, and well/spring contamination data for NO 3-N. A total of 16 neuro-fuzzy models were developed for selected sub-basins of the Illinois River Watershed, AR. The sensitivity analysis showed that the neuro-fuzzy models were sensitive to the shape of the fuzzy sets, the number of fuzzy sets, the nature of the rule weights, and the validation techniques used during the learning processes. Compared to bell-shaped and triangular-shaped membership functions, the neuro-fuzzy models with a trapezoidal membership function were the least sensitive to the various permutations and combinations of the learning and validation parameters. Overall, Models 11 and 8 showed relatively higher coincidence with well
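
    The membership-function shapes varied in the sensitivity analysis can be illustrated with a minimal sketch; the breakpoints and the normalized soil-depth input below are assumptions chosen only to show how trapezoidal, triangular and bell-shaped sets differ.

```python
# Minimal sketch of the three membership-function shapes compared in the
# sensitivity analysis, evaluated on a hypothetical normalized soil-depth input.
import numpy as np

def triangular(x, a, b, c):
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def trapezoidal(x, a, b, c, d):
    return np.clip(np.minimum(np.minimum((x - a) / (b - a), 1.0), (d - x) / (d - c)), 0.0, 1.0)

def bell(x, mean, sigma):
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

depth = np.linspace(0.0, 1.0, 11)                      # normalized soil-profile depth
print("triangular :", triangular(depth, 0.2, 0.5, 0.8).round(2))
print("trapezoidal:", trapezoidal(depth, 0.2, 0.4, 0.6, 0.8).round(2))
print("bell       :", bell(depth, 0.5, 0.15).round(2))
```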

  19. Video based lifting technique coding system.

    PubMed

    Hsiang, S M; Brogmus, G E; Martin, S E; Bezverkhny, I B

    1998-03-01

    Despite automation and improved working conditions, many materials in industry are still handled manually. Among the basic activities involved in manual materials handling, lifting is the one most frequently associated with low-back pain (LBP). Biomechanical analysis techniques have been used to better understand the risk factors associated with manual handling, but because these techniques require specialized equipment and highly trained personnel, and interfere with normal business operations, they are limited in their usefulness. A video based lifting technique analysis system (the VidLiTeC™ System) is presented that provides quantifiable, non-invasive biomechanical analysis of the dynamic features of lifting with high inter-coder reliability and low sensitivity to absolute errors. Analysis of results from a laboratory experiment and from field-collected videotape are described that support the reliability, sensitivity, and accuracy claims of the VidLiTeC™ System. The VidLiTeC™ System allows technicians with minimal training and low-tech equipment (a camcorder) to collect large sets of lifting data without interfering with normal business operations. A reasonably accurate estimate of the peak compressive force on the L5/S1 joint can be made from the data collected. Such a system can be used to collect quantified data on lifting techniques that can be related to LBP reporting.
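
    For orientation only, a simplified static sagittal-plane estimate of L5/S1 compression is sketched below. It is not the VidLiTeC™ model; the masses, moment arms and trunk angle are textbook-style assumptions chosen only to illustrate how such an estimate is formed.

```python
# Simplified static, sagittal-plane estimate of L5/S1 compression during a lift.
# NOT the VidLiTeC model; all values are illustrative assumptions.
import math

g = 9.81
load_mass = 15.0          # kg, object being lifted
torso_mass = 40.0         # kg, trunk + head + arms above L5/S1
load_arm = 0.45           # m, horizontal distance load-to-L5/S1
torso_arm = 0.25          # m, horizontal distance trunk centre of mass-to-L5/S1
muscle_arm = 0.05         # m, assumed erector spinae moment arm
trunk_angle = math.radians(45.0)   # trunk flexion from vertical

# Extensor moment about L5/S1 that the back muscles must balance
extensor_moment = g * (load_mass * load_arm + torso_mass * torso_arm)
muscle_force = extensor_moment / muscle_arm

# Compression = muscle force + gravity component of trunk/load along the spine axis
compression = muscle_force + g * (load_mass + torso_mass) * math.cos(trunk_angle)
print(f"Estimated L5/S1 compression: {compression / 1000:.1f} kN")
```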

  20. Extension of an Itô-based general approximation technique for random vibration of a BBW general hysteresis model part II: Non-Gaussian analysis

    NASA Astrophysics Data System (ADS)

    Davoodi, H.; Noori, M.

    1990-07-01

    The work presented in this paper constitutes the second phase of on-going research aimed at developing mathematical models for representing general hysteretic behavior of structures and approximation techniques for the computation and analysis of the response of hysteretic systems to random excitations. In this second part, the technique previously developed by the authors for the Gaussian response analysis of non-linear systems with general hysteretic behavior is extended to the non-Gaussian analysis of these systems. This approximation technique is based on the approach proposed independently by Ibrahim and Wu-Lin. In this work, up to fourth order moments of the response co-ordinates are obtained for the Bouc-Baber-Wen smooth hysteresis model. Such higher order statistics have not previously been available for general hysteresis models using existing approximation methods. Second order moments obtained for the model by this non-Gaussian closure scheme are compared with equivalent linearization and Gaussian closure results via Monte Carlo simulation (MCS). Higher order moments are compared with the simulation results. The study, performed for a wide range of degradation parameters and input power spectral density (PSD) levels, shows that the non-Gaussian responses obtained by this approach are in better agreement with the MCS results than the linearized and Gaussian ones. This approximation technique can provide information on higher order moments for general hysteretic systems. This information is valuable in the random vibration and reliability analysis of hysteretically yielding structures.
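
    A minimal numerical sketch of the underlying hysteresis model is given below: a single-degree-of-freedom oscillator with the basic Bouc-Wen law driven by a white-noise proxy, from which response moments can be estimated by simulation. The Baber-Noori degradation and pinching terms of the full BBW model are omitted, and all parameters are illustrative.

```python
# Single-degree-of-freedom oscillator with basic Bouc-Wen hysteresis under
# white-noise excitation; degradation terms of the full BBW model are omitted.
import numpy as np
from scipy.integrate import solve_ivp

m, c, k, alpha = 1.0, 0.1, 1.0, 0.5            # mass, damping, stiffness, rigidity ratio
A, beta, gamma, n = 1.0, 0.5, 0.5, 1.0         # Bouc-Wen shape parameters

rng = np.random.default_rng(1)
dt, T = 0.01, 50.0
t_grid = np.arange(0.0, T, dt)
excitation = rng.normal(size=t_grid.size) / np.sqrt(dt)   # white-noise proxy

def force(t):
    return excitation[min(int(t / dt), excitation.size - 1)]

def rhs(t, y):
    x, v, z = y
    zdot = A * v - beta * abs(v) * abs(z) ** (n - 1) * z - gamma * v * abs(z) ** n
    vdot = (force(t) - c * v - alpha * k * x - (1 - alpha) * k * z) / m
    return [v, vdot, zdot]

sol = solve_ivp(rhs, (0.0, T), [0.0, 0.0, 0.0], t_eval=t_grid, max_step=dt)
x = sol.y[0]
print("response moments (1st-4th):", [float(np.mean(x ** p)) for p in (1, 2, 3, 4)])
```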

  1. Photogrammetric Techniques for Road Surface Analysis

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing the characteristics of road texture to be analysed and the pavement behaviour to be monitored. The second technique provides a dense 3D model of the road suitable for estimating road macro parameters.

  2. Task-based strategy for optimized contrast enhanced breast imaging: Analysis of six imaging techniques for mammography and tomosynthesis

    SciTech Connect

    Ikejimba, Lynda C.; Kiarashi, Nooshin; Ghate, Sujata V.; Samei, Ehsan; Lo, Joseph Y.

    2014-06-15

    Purpose: The use of contrast agents in breast imaging has the capability of enhancing nodule detectability and providing physiological information. Accordingly, there has been a growing trend toward using iodine as a contrast medium in digital mammography (DM) and digital breast tomosynthesis (DBT). Widespread use raises concerns about the best way to use iodine in DM and DBT, and thus a comparison is necessary to evaluate typical iodine-enhanced imaging methods. This study used a task-based observer model to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: unsubtracted mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Methods: Imaging performance was characterized using a detectability index d′, derived from the system task transfer function (TTF), an imaging task, iodine signal difference, and the noise power spectrum (NPS). The task modeled a 10 mm diameter lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d′ was generated as a function of dose and iodine concentration. Results: For all iodine concentrations and doses, temporal subtraction techniques for mammography and tomosynthesis yielded the highest d′, while dual energy techniques for both modalities demonstrated the next best performance. Unsubtracted imaging resulted in the lowest d′ values for both modalities, with unsubtracted mammography performing the worst out of all six paradigms. Conclusions: At any dose, temporal subtraction imaging provides the greatest detectability, with temporally subtracted DBT performing best. The authors attribute the successful performance to excellent cancellation of
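
    A sketch of how such a detectability index can be assembled is shown below, using a standard non-prewhitening formulation with a disc task function; the TTF and NPS arrays are placeholders, not the measured quantities of this study.

```python
# Non-prewhitening detectability index d' from a task function W, task transfer
# function TTF and noise power spectrum NPS on a radial frequency grid.
# All arrays below are placeholders, not measured data.
import numpy as np
from scipy.special import j1

f = np.linspace(0.01, 5.0, 500)                  # spatial frequency, cycles/mm
lesion_diameter = 10.0                           # mm, task object
contrast = 1.0                                   # iodine signal difference (arbitrary units)

# Task function: Fourier transform magnitude of a uniform disc of that diameter
W = contrast * lesion_diameter * j1(np.pi * lesion_diameter * f) / (2.0 * f)

TTF = np.exp(-f / 1.5)                           # placeholder system task transfer function
NPS = 1e-5 * (1.0 + 0.5 / (f + 0.1))             # placeholder noise power spectrum

# Non-prewhitening observer in 2D radial coordinates (area element 2*pi*f df)
num = np.trapz((W * TTF) ** 2 * 2 * np.pi * f, f) ** 2
den = np.trapz((W * TTF) ** 2 * NPS * 2 * np.pi * f, f)
d_prime = np.sqrt(num / den)
print(f"d' = {d_prime:.1f}")
```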

  3. Task-based strategy for optimized contrast enhanced breast imaging: analysis of six imaging techniques for mammography and tomosynthesis

    NASA Astrophysics Data System (ADS)

    Ikejimba, Lynda; Kiarashi, Nooshin; Lin, Yuan; Chen, Baiyu; Ghate, Sujata V.; Zerhouni, Moustafa; Samei, Ehsan; Lo, Joseph Y.

    2012-03-01

    Digital breast tomosynthesis (DBT) is a novel x-ray imaging technique that provides 3D structural information of the breast. In contrast to 2D mammography, DBT minimizes tissue overlap, potentially improving cancer detection and reducing the number of unnecessary recalls. The addition of a contrast agent to DBT and mammography for lesion enhancement has the benefit of providing functional information of a lesion, as lesion contrast uptake and washout patterns may help differentiate between benign and malignant tumors. This study used a task-based method to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: contrast enhanced mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Imaging performance was characterized using a detectability index d', derived from the system task transfer function (TTF), an imaging task, iodine contrast, and the noise power spectrum (NPS). The task modeled a 5 mm lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d' was generated as a function of dose and iodine concentration. In general, higher dose gave higher d', but for the lowest iodine concentration and lowest dose, dual energy subtraction tomosynthesis and temporal subtraction tomosynthesis demonstrated the highest performance.

  4. A RT-based Technique for the Analysis and the Removal of Titan's Atmosphere by Cassini/VIMS-IR data

    NASA Astrophysics Data System (ADS)

    Sindoni, G.; Tosi, F.; Adriani, A.; Moriconi, M. L.; D'Aversa, E.; Grassi, D.; Oliva, F.; Dinelli, B. M.; Castelli, E.

    2015-12-01

    Since 2004, the Visual and Infrared Mapping Spectrometer (VIMS), together with the CIRS and UVIS spectrometers, aboard the Cassini spacecraft has provided insight into the atmospheres of Saturn and Titan through remote sensing observations. The presence of clouds and aerosols in Titan's dense atmosphere makes the analysis of the surface radiation a difficult task. For this purpose, an atmospheric radiative transfer (RT) model is required. The implementation of an RT code, which includes multiple scattering, in an inversion algorithm based on the Bayesian approach can provide strong constraints on both the surface albedo and the atmospheric composition. Applying the retrieval procedure we have developed to VIMS-IR spectra acquired in nadir or slant geometries allows us to retrieve the equivalent opacity of Titan's atmosphere in terms of variable aerosol and gaseous content. Thus, the separation of the atmospheric and surface contributions in the observed spectrum is possible. The atmospheric removal procedure was tested on the spectral range 1-2.2 μm of publicly available VIMS data covering the Ontario Lacus and Ligeia Mare regions. The retrieval of the accurate composition of Titan's atmosphere is a much more complex task. So far, the information about the vertical structure of the atmosphere from limb spectra was mostly derived under conditions where scattering could be neglected [1,2]. Indeed, since the very high aerosol load in the middle-low atmosphere produces strong scattering effects on the measured spectra, the analysis requires RT modeling that takes into account multiple scattering in a spherical-shell geometry. Therefore, an innovative method we are developing based on the Monte Carlo approach can provide important information about the vertical distribution of the aerosols and the gases composing Titan's atmosphere. [1] Bellucci et al. (2009), Icarus, 201(1), 198-216. [2] de Kok et al. (2007), Icarus, 191(1), 223-235.

  5. Integration of conventional GIS-based techniques and remote sensing analysis to landslide risk assessment at basin scale

    NASA Astrophysics Data System (ADS)

    Agili, F.; Bartolomei, A.; Casagli, N.; Catani, F.; Ermini, L.; Farina, P.; Kukavicic, M.; Mirannalti, M.; Moretti, S.; Righini, G.

    2003-04-01

    This note concerns the preliminary results gathered in a research project aimed at landslide risk assessment in the Arno River basin (9000 km^2). The project, sponsored by the Basin Authority of the Arno River, started in 2002 and will finish in 2004. The objective of the project is to update the landslide risk cartography related to the PAI document (Piano Assetto Idrogeologico) with reference to the Italian Law 267/1998. Different types of products will be generated: updates of the existing inventory maps and the definition and application of a methodology for landslide hazard and risk mapping. Conventional methods, such as aerial-photo interpretation and field surveys, are coupled with the use of different remote sensing methods, and all the data are integrated within a GIS environment. The analysis of remote sensing data concerns both optical and radar images. In particular, for the analysis of optical data, panchromatic and multispectral Landsat images are used in order to update the Corine standard land cover maps. In addition, high resolution images (Ikonos and Quickbird), acquired in stereoscopic configuration, are analysed to complement the aerial-photo interpretation. Differential SAR interferometry, implemented using ERS and JERS data, is used to detect new mass movements not yet observed and to evaluate the state of activity of known phenomena. Such data represent the base needed to produce the final landslide risk cartography.

  6. Automatic system for brain MRI analysis using a novel combination of fuzzy rule-based and automatic clustering techniques

    NASA Astrophysics Data System (ADS)

    Hillman, Gilbert R.; Chang, Chih-Wei; Ying, Hao; Kent, T. A.; Yen, John

    1995-05-01

    Analysis of magnetic resonance images (MRI) of the brain permits the identification and measurement of brain compartments. These compartments include normal subdivisions of brain tissue, such as gray matter, white matter and specific structures, and also include pathologic lesions associated with stroke or viral infection. A fuzzy system has been developed to analyze images of animal and human brain, segmenting the images into physiologically meaningful regions for display and measurement. This image segmentation system consists of two stages: a fuzzy rule-based system and the fuzzy c-means algorithm (FCM). The first stage is a fuzzy rule-based system which classifies most pixels in MR images into several known classes and one 'unclassified' group, whose pixels fail to fit the predetermined rules. In the second stage, the system uses the result of the first stage as initial estimates for the properties of the compartments and applies FCM to classify all the previously unclassified pixels. The initial prototypes are estimated by using the averages of the previously classified pixels. The combined processes constitute a fast, accurate and robust image segmentation system. This method can be applied to many clinical image segmentation problems. While the rule-based portion of the system allows specialized knowledge about the images to be incorporated, the FCM allows the resolution of ambiguities that result from noise and artifacts in the image data. The volumes and locations of the compartments can easily be measured and reported quantitatively once they are identified. It is easy to adapt this approach to new imaging problems, by introducing a new set of fuzzy rules and adjusting the number of expected compartments. However, for the purpose of building a practical fully automatic system, a rule learning mechanism may be necessary to improve the efficiency of modification of the fuzzy rules.
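
    The two-stage idea can be sketched as follows: unambiguous pixels are first labelled by simple rules, their class means seed the fuzzy c-means prototypes, and FCM then assigns the remaining pixels. The synthetic one-dimensional intensities and threshold rules below are stand-ins for the actual MR data and fuzzy rule base.

```python
# Stage 1 (stand-in rules) seeds Stage 2 (fuzzy c-means) with class prototypes.
import numpy as np

rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(0.2, 0.05, 500),     # e.g. CSF-like intensities
                         rng.normal(0.5, 0.05, 500),     # gray-matter-like
                         rng.normal(0.8, 0.05, 500)])    # white-matter-like

# Stage 1: crude rules classify only the unambiguous pixels.
labels = np.full(pixels.size, -1)
labels[pixels < 0.30] = 0
labels[(pixels > 0.45) & (pixels < 0.55)] = 1
labels[pixels > 0.70] = 2
prototypes = np.array([pixels[labels == k].mean() for k in range(3)])

# Stage 2: fuzzy c-means on all pixels, initialized with those prototypes.
m = 2.0                                                   # fuzziness exponent
for _ in range(100):
    d = np.abs(pixels[:, None] - prototypes[None, :]) + 1e-12
    u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    prototypes = (u ** m).T @ pixels / (u ** m).sum(axis=0)

print("final prototypes:", prototypes.round(3))
```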

  7. Spectral compressor vibration analysis techniques

    SciTech Connect

    Hanson, M.L.

    1982-02-01

    Studies at GAT have verified that the spectral distribution of energy in gaseous diffusion compressor vibrations contains information pertinent to the state of the compressor's "health." Based on that conclusion, vibration analysis capabilities were included in the CUP computer data acquisition system. In order for that information to be used for diagnosis of incipient failure mechanisms, however, spectral features must be empirically associated with actual malfunctions and validated statistically as diagnostic symptoms. When the system was acquired, indicators were generally unknown except for those associated with unbalance, misalignment, secondary surge and severe resonant blade vibrations. Others must be developed as in-service malfunctions occur. The power spectral density function (PSDF) has historically been used to compute vibration spectra. Accurate, high-resolution power density spectra require long data-acquisition periods, which is inconsistent with frequent examinations of all up-rated compressors. Detection of gross spectral changes indicative of a need for detailed analyses has been accomplished at a rate of less than 1 minute per compressor. An optimum analytical sequence will be based on trade-offs. Work is in progress to identify additional malfunction indicators and investigate tools other than the PSDF to provide faster diagnoses. 6 figs.
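
    A minimal sketch of computing such a vibration spectrum with Welch's PSD estimate is shown below; the signal, sampling rate and segment length are assumptions, and the segment length illustrates the trade-off between spectral resolution and data-acquisition time noted above.

```python
# Vibration spectrum via Welch's PSD estimate; the signal is synthetic
# (a 60 Hz rotational component plus noise), not compressor data.
import numpy as np
from scipy.signal import welch

fs = 2048.0                                    # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
vib = 0.5 * np.sin(2 * np.pi * 60 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)

# Longer segments give finer frequency resolution but need longer acquisitions.
freqs, psd = welch(vib, fs=fs, nperseg=4096)
peak = freqs[np.argmax(psd)]
print(f"dominant spectral line near {peak:.1f} Hz")
```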

  8. Terahertz Wide-Angle Imaging and Analysis on Plane-wave Criteria Based on Inverse Synthetic Aperture Techniques

    NASA Astrophysics Data System (ADS)

    Gao, Jing Kun; Qin, Yu Liang; Deng, Bin; Wang, Hong Qiang; Li, Jin; Li, Xiang

    2016-04-01

    This paper presents two parts of work on terahertz imaging applications. The first part aims at solving the problems that occur as the rotation angle increases. To compensate for the nonlinearity of terahertz radar systems, a calibration signal acquired from a bright target is always used. Generally, this compensation inserts an extra linear phase term into the intermediate frequency (IF) echo signal, which is undesirable in large-rotation-angle imaging applications. We carried out a detailed theoretical analysis of this problem, and a minimum entropy criterion was employed to estimate and compensate for the linear-phase errors. In the second part, the effects of spherical waves on terahertz inverse synthetic aperture imaging are analyzed. Analytic criteria for the plane-wave approximation were derived for different rotation angles. Experimental results for corner reflectors and an aircraft model, based on a 330-GHz linear frequency-modulated continuous wave (LFMCW) radar system, validated the necessity and effectiveness of the proposed compensation. Comparison of the experimental images obtained under the plane-wave assumption and with spherical-wave correction also showed high consistency with the analytic criteria we derived.
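
    The minimum-entropy idea can be illustrated with a toy search: below, a quadratic phase error (used here as a convenient stand-in for the residual linear-phase term treated in the paper) is recovered as the coefficient whose compensation minimizes the entropy of the compressed profile. The signal is synthetic.

```python
# Toy minimum-entropy phase-error estimation on a synthetic one-dimensional echo.
import numpy as np

rng = np.random.default_rng(0)
N = 256
n = np.arange(N)
true_a = 3e-4                                         # rad/sample^2, unknown error to recover
spectrum = np.zeros(N, complex)
spectrum[[60, 90, 150]] = [1.0, 0.7, 0.4]             # three point scatterers
echo = np.fft.ifft(spectrum) * np.exp(1j * true_a * n ** 2)
echo += 0.01 * (rng.normal(size=N) + 1j * rng.normal(size=N))

def entropy(profile):
    p = np.abs(profile) ** 2
    p /= p.sum()
    return float(-np.sum(p * np.log(p + 1e-12)))

# Grid search: the candidate that best refocuses the profile minimizes entropy.
candidates = np.linspace(0.0, 1e-3, 1001)
ent = [entropy(np.fft.fft(echo * np.exp(-1j * a * n ** 2))) for a in candidates]
best = candidates[int(np.argmin(ent))]
print(f"estimated phase coefficient: {best:.2e} (true {true_a:.2e})")
```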

  9. Implementation of a routine genetic evaluation for longevity based on survival analysis techniques in dairy cattle populations in Switzerland.

    PubMed

    Vukasinovic, N; Moll, J; Casanova, L

    2001-09-01

    Genetic evaluation of sires for functional longevity of their daughters based on survival analysis has been implemented in the populations of Braunvieh, Simmental, and Holstein cattle in Switzerland. A Weibull mixed sire-maternal grandsire survival model was used to estimate breeding values of sires with data on cows that calved since April 1, 1980. Data on Braunvieh and Simmental cows included about 1.1 million records; data on Holstein cows comprised about 220,000 records. The data contained approximately 20 to 24% right-censored records and 6 to 9% left-truncated records. Besides the random sire and maternal grandsire effects, the model included effects of herd-year-season, age at first calving, parity, stage of lactation, alpine pasturing (Braunvieh and Simmental), and relative milk yield and relative fat and protein percentage within herd to account for culling for production. Heritabilities of functional longevity, estimated on a subset of the data including approximately 150,000 animals, were 0.181, 0.198, and 0.184 for Braunvieh, Simmental, and Holstein, respectively. Breeding values were estimated for all sires with at least six daughters or three granddaughters in the data. Breeding values of sires are expressed in months of functional productive life and published in sire catalogs along with breeding values for production traits.
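
    As a loose illustration, a Weibull survival regression on simulated cow records can be fitted with the lifelines package as sketched below. This is a simplified fixed-effects stand-in, not the Weibull mixed sire-maternal grandsire model used in the study; the data and covariates are invented.

```python
# Simplified Weibull survival regression on simulated records (fixed effects only).
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "length_of_life": rng.weibull(1.5, size=n) * 60.0,   # months of productive life
    "observed": rng.random(n) > 0.22,                    # roughly 22% right-censored
    "age_first_calving": rng.normal(27, 2, n),           # months, invented covariate
    "relative_milk_yield": rng.normal(0, 1, n),          # within-herd deviation, invented
})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="length_of_life", event_col="observed")
aft.print_summary()
```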

  10. School Principals' Personal Constructs Regarding Technology: An Analysis Based on Decision-Making Grid Technique

    ERIC Educational Resources Information Center

    Bektas, Fatih

    2014-01-01

    This study aims to determine the similarities and differences between existing school principals' personal constructs of "ideal principal qualities" in terms of technology by means of the decision-making grid technique. The study has a phenomenological design, and the study group consists of 17 principals who have been serving at…

  11. Visual exploratory analysis of DCE-MRI data in breast cancer based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Meyer-Bäse, Anke; Lespinats, Sylvain; Steinbrücker, Frank; Saalbach, Axel; Schlossbauer, Thomas; Barbu, Adrian

    2009-04-01

    Visualization of multi-dimensional data sets has become a critical and significant area in modern medical image processing. To analyze such high dimensional data, novel nonlinear embedding approaches are becoming increasingly important for showing dependencies among the data in a two- or three-dimensional space. This paper investigates the potential of novel nonlinear dimensionality reduction techniques and compares their results with proven nonlinear techniques when applied to the differentiation of malignant and benign lesions described by high-dimensional data sets arising from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). Two important visualization modalities in medical imaging are presented: mapping onto a lower-dimensional data manifold and image fusion.

  12. A Three Corner Hat-based analysis of station position time series for the assessment of inter-technique precision at ITRF co-located sites

    NASA Astrophysics Data System (ADS)

    Abbondanza, C.; Chin, T. M.; Gross, R. S.; Heflin, M. B.; Hurst, K. J.; Parker, J. W.; Wu, X.; Altamimi, Z.

    2012-12-01

    Assessing the uncertainty in geodetic positioning is a crucial factor when combining independent space-geodetic solutions for the computation of the International Terrestrial Reference Frame (ITRF). ITRF is a combined product based on the stacking of VLBI, GPS, SLR and DORIS solutions and merging the single technique reference frames with terrestrial local tie measurements at co-located sites. In current ITRF realizations, the uncertainty evaluation of the four techniques relies on the analysis of the post-fit residuals, which are a by-product of the combination process. An alternative approach to the assessment of the inter-technique precision can be offered by a Three Corner Hat (TCH) analysis of the non-linear residual time series obtained at ITRF co-location sites as a by-product of the stacking procedure. Non-linear residuals of station position time series stemming from global networks of the four techniques can be modeled as a composition of periodic signals (commonly annual and semi-annual) and stochastic noise, typically characterized as a combination of flicker and white noise. Pair-wise differences of station position time series of at least three co-located instruments can be formed with the aim of removing the common geophysical signal and characterizing the inter-technique precision. The application of TCH relies on the hypothesis of absence of correlation between the error processes of the four techniques and assumes the stochastic noise to be Gaussian. While the hypothesis of statistical independence between the space-geodetic technique errors is amply verified, the assumption of pure white noise for the stochastic error processes appears more questionable. In fact, previous studies focused on geodetic positioning have consistently shown that flicker noise generally prevails over white noise in the analysis of global-network GPS time series, whereas in VLBI, SLR and DORIS time series Gaussian noise is predominant. In this investigation, TCH is applied
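
    The core TCH computation is compact enough to sketch: with three co-located residual series whose errors are assumed independent, the common signal cancels in pair-wise differences and the individual error variances follow from the classic three-cornered-hat formula. The series below are synthetic stand-ins.

```python
# Classic Three Corner Hat variance separation for three co-located series.
import numpy as np

rng = np.random.default_rng(0)
n = 500
signal = 2.0 * np.sin(2 * np.pi * np.arange(n) / 365.25)      # common seasonal signal, mm
A = signal + rng.normal(0, 1.0, n)     # e.g. GPS-like residuals, 1.0 mm noise
B = signal + rng.normal(0, 2.0, n)     # e.g. VLBI-like residuals, 2.0 mm noise
C = signal + rng.normal(0, 3.0, n)     # e.g. SLR-like residuals, 3.0 mm noise

# The common signal cancels in the pair-wise differences.
vAB, vAC, vBC = np.var(A - B), np.var(A - C), np.var(B - C)
var_A = 0.5 * (vAB + vAC - vBC)
var_B = 0.5 * (vAB + vBC - vAC)
var_C = 0.5 * (vAC + vBC - vAB)
print("estimated sigmas [mm]:", np.sqrt([var_A, var_B, var_C]).round(2))
```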

  13. [Approach to the jugular foramen and related structures - an analysis of the surgical technique based on cadaver simulation].

    PubMed

    Ladziński, Piotr; Maliszewski, Mariusz; Kaspera, Wojciech; Szczygieł, Majchrzak; Tymowski, Michał

    2011-01-01

    This study presents consecutive stages of the approach to the jugular foramen and related structures. Eleven simulations of the approach were performed on non-fixed human cadavers without any known pathologies in the head and neck. The consecutive stages of the procedure were documented with photographs and schematic diagrams. The starting point for the discussed approach is removal of the mastoid and petrosal parts of the temporal bone, as well as the jugular process and the jugular tuberculum. It allows penetration of the jugular foramen from the back. Widening of the approach enables penetration of the jugular foramen from above and the front. Approach to the jugular foramen is a reproducible technique, which provides surgical penetration of this foramen and related structures. This approach is particularly useful in the surgical treatment of tumours expanding in the petrous pyramid, surroundings of the petrosal part of the internal carotid artery, cerebellopontine angle, subtemporal fossa and nervous-vascular bundle of the neck. PMID:21866483

  14. Application of Electromigration Techniques in Environmental Analysis

    NASA Astrophysics Data System (ADS)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    The inherently trace-level concentrations of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made in widening the applicability of these techniques (mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography) to the analysis of real-world environmental samples, and in improving the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the domain of capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is placed on pre-capillary and on-capillary chromatography- and electrophoresis-based concentration of analytes and on detection improvement.

  15. [Acupoints selecting and medication rules analysis based on data mining technique for bronchial asthma treated with acupoint application].

    PubMed

    Wang, Zhaohui; Han, Dongyue; Qie, Lili; Liu, Chang; Wang, Fuchun

    2015-06-01

    Clinical literature on bronchial asthma treated with acupoint application, published from January 2000 to March 2014 and indexed in modern periodical databases, was retrieved by computer. Acupoint selection and medication rules for bronchial asthma treated with acupoint application were analyzed with the cluster analysis and frequency analysis methods of data mining. A total of 38 articles were eventually included, covering 25 acupoints and 42 medicines. The results indicate that, in acupoint selection, Feishu (BL 13) is used as the main acupoint and three groups of bladder meridian and conception vessel acupoints are applied alternately; among the medicines, Baijiezi (Brassica alba Boiss), Xixin (Radix et Rhizoma Asari), Gansui (Radix Kansui), Yanhusuo (Corydalis) and Mahuang (Radix et Rhizoma Ephedrae) are primarily adopted, with epispastic medicines being the main ones; the medicines mostly belong to the lung meridian, the main medicines remain largely unchanged, and Shengjiang is used as the guiding drug.

  16. [Development of Selective LC Analysis Method for Biogenic and Related Compounds Based on a Fluorous Affinity Technique].

    PubMed

    Hayama, Tadashi

    2015-01-01

    A separation-oriented derivatization method combined with LC has been developed for the selective analysis of biogenic and related compounds. In this method, we utilized the specific affinity among perfluoroalkyl-containing ('fluorous') compounds, known as fluorophilicity. Our strategy involves the derivatization of target analytes with perfluoroalkyl reagents, followed by selective retention of the derivatives on a perfluoroalkyl-modified stationary-phase LC column. The perfluoroalkylated derivatives are strongly retained on the column owing to their fluorophilicity, whereas non-derivatized species, such as sample matrices, are hardly retained. Therefore, utilizing this derivatization method, target analytes can be determined selectively without interference from matrices. This method has been successfully applied to the LC analysis of some biogenic and related compounds in complex biological samples. PMID:26329550

  17. Permethylation Linkage Analysis Techniques for Residual Carbohydrates

    NASA Astrophysics Data System (ADS)

    Price, Neil P. J.

    Permethylation analysis is the classic approach to establishing the position of glycosidic linkages between sugar residues. Typically, the carbohydrate is derivatized to form acid-stable methyl ethers, hydrolyzed, peracetylated, and analyzed by gas chromatography-mass spectrometry. The positions of glycosidic linkages in the starting carbohydrate are apparent from the mass spectra, as determined by the locations of the acetyl residues. The completeness of permethylation is dependent upon the choice of base catalyst and is readily confirmed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. For the permethylation of β-cyclodextrin, Hakomori dimsyl base is shown to be superior to the NaOH-dimethyl sulfoxide system, and the use of the latter resulted in selective under-methylation of the 3-hydroxy groups. These techniques are highly applicable to residual carbohydrates from biofuel processes.

  18. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analyses are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences in the three techniques lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data are presented.

  19. Bone quality around bioactive silica-based coated stainless steel implants: analysis by micro-Raman, XRF and XAS techniques.

    PubMed

    Ballarre, Josefina; Desimone, Paula M; Chorro, Matthieu; Baca, Matías; Orellano, Juan Carlos; Ceré, Silvia M

    2013-11-01

    Surface modification of surgical stainless steel implants by sol-gel coatings has been proposed as a tool to generate a surface that, besides being protective, could also create a "bioactive" interface to generate a natural bonding between the metal surface and the existing bone. The aim of this work is to analyze the quality and formation of bone around hybrid bioactive coatings containing glass-ceramic particles, made by the sol-gel process on 316L stainless steel used as a permanent implant, in terms of mineralization, calcium content and bone maturity with micro-Raman, X-ray microfluorescence and X-ray absorption techniques. Uncoated implants seem to generate a thin bone layer at the beginning of the osseointegration process, which then separates from the surface over time. The hybrid coatings without glass-ceramic particles generate new bone around the implants, with a high concentration of Ca and P at the implant/tissue interface. This fact seems to be related to the presence of silica nanoparticles in the layer. The addition of bioactive particles promotes and enhances the bone quality, with a homogeneous Ca and P content and a low rate of beta carbonate substitution and crystallinity, similar to young and mechanically resistant bone. PMID:24076155

  20. Bone quality around bioactive silica-based coated stainless steel implants: analysis by micro-Raman, XRF and XAS techniques.

    PubMed

    Ballarre, Josefina; Desimone, Paula M; Chorro, Matthieu; Baca, Matías; Orellano, Juan Carlos; Ceré, Silvia M

    2013-11-01

    Surface modification of surgical stainless steel implants by sol-gel coatings has been proposed as a tool to generate a surface that, besides being protective, could also create a "bioactive" interface to generate a natural bonding between the metal surface and the existing bone. The aim of this work is to analyze the quality and formation of bone around hybrid bioactive coatings containing glass-ceramic particles, made by the sol-gel process on 316L stainless steel used as a permanent implant, in terms of mineralization, calcium content and bone maturity with micro-Raman, X-ray microfluorescence and X-ray absorption techniques. Uncoated implants seem to generate a thin bone layer at the beginning of the osseointegration process, which then separates from the surface over time. The hybrid coatings without glass-ceramic particles generate new bone around the implants, with a high concentration of Ca and P at the implant/tissue interface. This fact seems to be related to the presence of silica nanoparticles in the layer. The addition of bioactive particles promotes and enhances the bone quality, with a homogeneous Ca and P content and a low rate of beta carbonate substitution and crystallinity, similar to young and mechanically resistant bone.

  1. Analysis of Different Classification Techniques for Two-Class Functional Near-Infrared Spectroscopy-Based Brain-Computer Interface

    PubMed Central

    Qureshi, Nauman Khalid; Noori, Farzan Majeed; Hong, Keum-Shik

    2016-01-01

    We analyse and compare the classification accuracies of six different classifiers for a two-class mental task (mental arithmetic and rest) using functional near-infrared spectroscopy (fNIRS) signals. The signals of the mental arithmetic and rest tasks from the prefrontal cortex region of the brain for seven healthy subjects were acquired using a multichannel continuous-wave imaging system. After removal of the physiological noises, six features were extracted from the oxygenated hemoglobin (HbO) signals. Two- and three-dimensional combinations of those features were used for classification of mental tasks. In the classification, six different classifiers, linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), k-nearest neighbour (kNN), the Naïve Bayes approach, support vector machine (SVM), and artificial neural networks (ANN), were utilized. With these classifiers, the average classification accuracies among the seven subjects for the 2- and 3-dimensional combinations of features were 71.6, 90.0, 69.7, 89.8, 89.5, and 91.4% and 79.6, 95.2, 64.5, 94.8, 95.2, and 96.3%, respectively. ANN showed the maximum classification accuracies: 91.4 and 96.3%. In order to validate the results, a statistical significance test was performed, which confirmed that the p values were statistically significant relative to all of the other classifiers (p < 0.005) using HbO signals. PMID:27725827
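
    A sketch of such a classifier comparison with scikit-learn is given below; the feature vectors are synthetic placeholders rather than the study's fNIRS HbO features.

```python
# Six-classifier comparison on placeholder two-class feature vectors.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a 3-dimensional feature combination, two classes.
X, y = make_classification(n_samples=200, n_features=3, n_informative=3,
                           n_redundant=0, random_state=0)
classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "kNN": KNeighborsClassifier(),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(),
    "ANN": MLPClassifier(max_iter=2000, random_state=0),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"{name:12s} {100 * acc:5.1f}%")
```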

  2. Hybrid analysis techniques for software fault detection

    SciTech Connect

    Young, M.T.

    1989-01-01

    Since the question 'Does program P obey specification S?' cannot be decided in general, every practical software validation technique must compromise accuracy in some way. Testing techniques admit the possibility that a fault will be undetected, as the price for quitting after a finite number of test cases. Formal verification admits the possibility that a proof will not be found for a valid assertion, as the price for quitting after a finite amount of proof effort. No technique so dominates others that a wise validation strategy consists of applying that technique alone; rather, effective validation requires applying several techniques. This dissertation contributes to the understanding of synergistic combinations of fault detection techniques. A framework for comparing techniques and considering their combinations is developed. Techniques that fold a state space depend critically on leaving out the right details to make the space smaller or more regular. One often wishes to argue that simplifications will not hide any of the errors a technique is designed to detect. This claim is formalized as a relation between models of execution, and sufficient conditions for establishing the relation with respect to specification formulas expressed in temporal logic are proved. The framework and theory are applied to two problems in reachability analysis of concurrent software. A method for limiting combinatorial explosion by parceling the analysis of large systems is described. It is justified by showing that analysis of each parcel is an error-preserving abstraction of the global analysis. To ameliorate the problem of spurious error reports, a technique combining reachability analysis with symbolic execution is devised. Soundness of the hybrid technique is established by showing that reachability analysis is an error-preserving abstraction of symbolic execution. A prototype implementation of tools to support analysis of concurrent software is described.

  3. Analysis of algebraic reconstruction technique for accurate imaging of gas temperature and concentration based on tunable diode laser absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Hui-Hui, Xia; Rui-Feng, Kan; Jian-Guo, Liu; Zhen-Yu, Xu; Ya-Bai, He

    2016-06-01

    An improved algebraic reconstruction technique (ART) combined with tunable diode laser absorption spectroscopy (TDLAS) is presented in this paper for determining the two-dimensional (2D) distribution of H2O concentration and temperature in a simulated combustion flame. This work aims to simulate the reconstruction of spectroscopic measurements from a multi-view parallel-beam scanning geometry and to analyze the effect of the number of projection rays on reconstruction accuracy. It is shown that reconstruction quality increases dramatically as the number of projection rays increases, up to about 180 rays for a 20 × 20 grid; beyond that point, the number of projection rays has little influence on reconstruction accuracy. The temperature reconstruction results are clearly more accurate than the water vapor concentration obtained by the traditional concentration calculation method. An innovative way to greatly reduce the error of the concentration reconstruction and improve the reconstruction quality is also proposed, and the capability of this new method is evaluated using appropriate assessment parameters. With this new approach, not only is the concentration reconstruction accuracy greatly improved, but a suitable parallel-beam arrangement is also put forward for high reconstruction accuracy and simplicity of experimental validation. Finally, a bimodal structure of the combustion region is assumed to demonstrate the robustness and universality of the proposed method. Numerical investigation indicates that the proposed TDLAS tomographic algorithm is capable of detecting accurate temperature and concentration profiles. This feasible reconstruction approach is expected to resolve several key issues in practical combustion devices. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61205151), the National Key Scientific Instrument and Equipment Development Project of China (Grant
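
    The basic ART update can be sketched as a Kaczmarz sweep over the projection rays, as below; the random ray-weight matrix and toy 10 × 10 grid are placeholders, not the paper's multi-view parallel-beam geometry or TDLAS absorption model.

```python
# One way to write the ART (Kaczmarz) iteration for tomographic reconstruction:
# x holds the unknown field on a grid, A the per-ray path-length weights,
# b the measured line-of-sight integrals (all synthetic here).
import numpy as np

rng = np.random.default_rng(0)
n_rays, n_cells = 200, 100                  # toy 10 x 10 grid, 200 rays
x_true = rng.random(n_cells)                # "true" field to recover
A = rng.normal(size=(n_rays, n_cells))      # placeholder ray-weight matrix
b = A @ x_true                              # simulated projections

x = np.zeros(n_cells)
relax = 1.0                                 # relaxation factor
for sweep in range(100):                    # full sweeps over all rays
    for i in range(n_rays):
        a_i = A[i]
        x += relax * (b[i] - a_i @ x) / (a_i @ a_i) * a_i

print("relative reconstruction error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```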

  4. Analysis of algebraic reconstruction technique for accurate imaging of gas temperature and concentration based on tunable diode laser absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Hui-Hui, Xia; Rui-Feng, Kan; Jian-Guo, Liu; Zhen-Yu, Xu; Ya-Bai, He

    2016-06-01

    An improved algebraic reconstruction technique (ART) combined with tunable diode laser absorption spectroscopy (TDLAS) is presented in this paper for determining the two-dimensional (2D) distribution of H2O concentration and temperature in a simulated combustion flame. This work aims to simulate the reconstruction of spectroscopic measurements from a multi-view parallel-beam scanning geometry and to analyze the effect of the number of projection rays on reconstruction accuracy. It is shown that reconstruction quality increases dramatically as the number of projection rays increases, up to about 180 rays for a 20 × 20 grid; beyond that point, the number of projection rays has little influence on reconstruction accuracy. The temperature reconstruction results are clearly more accurate than the water vapor concentration obtained by the traditional concentration calculation method. An innovative way to greatly reduce the error of the concentration reconstruction and improve the reconstruction quality is also proposed, and the capability of this new method is evaluated using appropriate assessment parameters. With this new approach, not only is the concentration reconstruction accuracy greatly improved, but a suitable parallel-beam arrangement is also put forward for high reconstruction accuracy and simplicity of experimental validation. Finally, a bimodal structure of the combustion region is assumed to demonstrate the robustness and universality of the proposed method. Numerical investigation indicates that the proposed TDLAS tomographic algorithm is capable of detecting accurate temperature and concentration profiles. This feasible reconstruction approach is expected to resolve several key issues in practical combustion devices. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61205151), the National Key Scientific Instrument and Equipment Development Project of China (Grant

  5. Analysis of the nonlinear behavior of shear-Alfvén modes in tokamaks based on Hamiltonian mapping techniques

    NASA Astrophysics Data System (ADS)

    Briguglio, S.; Wang, X.; Zonca, F.; Vlad, G.; Fogaccia, G.; Di Troia, C.; Fusco, V.

    2014-11-01

    We present a series of numerical simulation experiments set up to illustrate the fundamental physics processes underlying the nonlinear dynamics of Alfvénic modes resonantly excited by energetic particles in tokamak plasmas and of the ensuing energetic particle transports. These phenomena are investigated by following the evolution of a test particle population in the electromagnetic fields computed in self-consistent MHD-particle simulation performed by the HMGC code. Hamiltonian mapping techniques are used to extract and illustrate several features of wave-particle dynamics. The universal structure of resonant particle phase space near an isolated resonance is recovered and analyzed, showing that bounded orbits and untrapped trajectories, divided by the instantaneous separatrix, form phase space zonal structures, whose characteristic non-adiabatic evolution time is the same as the nonlinear time of the underlying fluctuations. Bounded orbits correspond to a net outward resonant particle flux, which produces a flattening and/or gradient inversion of the fast ion density profile around the peak of the linear wave-particle resonance. The connection of this phenomenon to the mode saturation is analyzed with reference to two different cases: a Toroidal Alfvén eigenmode in a low shear magnetic equilibrium and a weakly unstable energetic particle mode for stronger magnetic shear. It is shown that, in the former case, saturation is reached because of radial decoupling (resonant particle redistribution matching the mode radial width) and is characterized by a weak dependence of the mode amplitude on the growth rate. In the latter case, saturation is due to resonance detuning (resonant particle redistribution matching the resonance width) with a stronger dependence of the mode amplitude on the growth rate.

  6. Analysis of the nonlinear behavior of shear-Alfvén modes in tokamaks based on Hamiltonian mapping techniques

    SciTech Connect

    Briguglio, S.; Vlad, G.; Fogaccia, G.; Di Troia, C.; Fusco, V.; Wang, X.; Zonca, F.

    2014-11-15

    We present a series of numerical simulation experiments set up to illustrate the fundamental physics processes underlying the nonlinear dynamics of Alfvénic modes resonantly excited by energetic particles in tokamak plasmas and of the ensuing energetic particle transports. These phenomena are investigated by following the evolution of a test particle population in the electromagnetic fields computed in self-consistent MHD-particle simulation performed by the HMGC code. Hamiltonian mapping techniques are used to extract and illustrate several features of wave-particle dynamics. The universal structure of resonant particle phase space near an isolated resonance is recovered and analyzed, showing that bounded orbits and untrapped trajectories, divided by the instantaneous separatrix, form phase space zonal structures, whose characteristic non-adiabatic evolution time is the same as the nonlinear time of the underlying fluctuations. Bounded orbits correspond to a net outward resonant particle flux, which produces a flattening and/or gradient inversion of the fast ion density profile around the peak of the linear wave-particle resonance. The connection of this phenomenon to the mode saturation is analyzed with reference to two different cases: a Toroidal Alfvén eigenmode in a low shear magnetic equilibrium and a weakly unstable energetic particle mode for stronger magnetic shear. It is shown that, in the former case, saturation is reached because of radial decoupling (resonant particle redistribution matching the mode radial width) and is characterized by a weak dependence of the mode amplitude on the growth rate. In the latter case, saturation is due to resonance detuning (resonant particle redistribution matching the resonance width) with a stronger dependence of the mode amplitude on the growth rate.

  7. Physicochemical bases of differences between the sedimentometric and laser-diffraction techniques of soil particle-size analysis

    NASA Astrophysics Data System (ADS)

    Fedotov, G. N.; Shein, E. V.; Putlyaev, V. I.; Arkhangel'Skaya, T. A.; Eliseev, A. V.; Milanovskii, E. Yu.

    2007-03-01

    Comparison of the particle-size distributions in different soils showed that the sedimentation method (Kachinskii pipette method) gives higher (by 1.5-5 times) values of the clay content than the laser diffraction method. This is related to the significant variation in the density of soil solids, which is taken to be constant in the sedimentation method; therefore, particles of significantly larger size and lower density fall into this fraction. Using optical, electron, and confocal microscopy, it was shown that the low density of silt-sized soil particles falling into the sedimentometric clay fraction is related to the organomineral shell (film) around the soil microparticles. This shell contributes to the linking of microparticles into aggregates with a lower average density. As a result, these aggregates are significantly larger and less dense, yet settle during the sedimentation particle-size analysis with the same velocity as small particles of the average solid-phase density.
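
    The physical argument can be made concrete with Stokes' law: settling velocity depends on both particle size and density, so a low-density organo-mineral aggregate settles like a much smaller quartz particle and is counted as clay by the pipette method. The values below are illustrative.

```python
# Stokes settling velocity and the equivalent diameter assigned by a
# sedimentation method that assumes the standard solid-phase density.
import numpy as np

g, mu, rho_f = 9.81, 1.0e-3, 1000.0            # gravity, water viscosity (Pa s), water density

def stokes_velocity(diameter, rho_p):
    """Terminal settling velocity (m/s) of a sphere in water."""
    return (rho_p - rho_f) * g * diameter ** 2 / (18.0 * mu)

d_aggregate, rho_aggregate = 10e-6, 1400.0     # 10 um aggregate with a low density
v = stokes_velocity(d_aggregate, rho_aggregate)

# Equivalent Stokes diameter if the standard density of soil solids
# (about 2650 kg/m^3) is assumed, as in the pipette method:
d_equivalent = np.sqrt(18.0 * mu * v / ((2650.0 - rho_f) * g))
print(f"settles like a {d_equivalent * 1e6:.1f} um quartz particle")
```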

  8. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin' it REAL curriculum.

    PubMed

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L; Krieger, Janice L

    2014-12-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula, however, less is known about implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin' REAL (kiR) substance use prevention curriculum. Each of the 10, 40-45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers' delivery techniques (e.g., lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention.
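
    As a rough stand-in for the latent profile analysis (which is usually fitted with dedicated mixture-modelling software), a Gaussian mixture over per-video ratings can be sketched as below; the five-profile choice and rating dimensions are placeholders and the data are synthetic.

```python
# Gaussian-mixture stand-in for a latent profile analysis of coded video ratings.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# columns: lecture, discussion, role-play usage; attentiveness, enthusiasm, positivity
ratings = rng.random((276, 6))                 # 276 coded videos (synthetic ratings)

gm = GaussianMixture(n_components=5, covariance_type="diag", random_state=0).fit(ratings)
profiles = gm.predict(ratings)
print("profile sizes:", np.bincount(profiles))
print("BIC:", round(gm.bic(ratings), 1))
```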

  9. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin’ it REAL curriculum

    PubMed Central

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula, however, less is known about implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin’ REAL (kiR) substance use prevention curriculum. Each of the 10, 40–45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers’ delivery techniques (e.g. lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention. PMID:25274721

  10. Typology of delivery quality: latent profile analysis of teacher engagement and delivery techniques in a school-based prevention intervention, keepin' it REAL curriculum.

    PubMed

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L; Krieger, Janice L

    2014-12-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula, however, less is known about implementation quality of prevention content, especially among teachers who may or may not have a prevention background. The goal of the current study is to add to the scholarly literature on implementation quality for a school-based substance use prevention intervention. Twenty-five schools in Ohio and Pennsylvania implemented the original keepin' REAL (kiR) substance use prevention curriculum. Each of the 10, 40-45 min lessons of the kiR curriculum was video recorded. Coders observed and rated a random sample of 276 videos reflecting 78 classes taught by 31 teachers. Codes included teachers' delivery techniques (e.g., lecture, discussion, demonstration and role play) and engagement with students (e.g. attentiveness, enthusiasm and positivity). Based on the video ratings, a latent profile analysis was run to identify typology of delivery quality. Five profiles were identified: holistic approach, attentive teacher-orientated approach, enthusiastic lecture approach, engaged interactive learning approach and skill practice-only approach. This study provides a descriptive typology of delivery quality while implementing a school-based substance use prevention intervention. PMID:25274721

  11. An inertial sensor-based system for spatio-temporal analysis in classic cross-country skiing diagonal technique.

    PubMed

    Fasel, Benedikt; Favre, Julien; Chardonnens, Julien; Gremion, Gérald; Aminian, Kamiar

    2015-09-18

    The present study proposes a method based on ski-fixed inertial sensors to automatically compute spatio-temporal parameters (phase durations, cycle speed and cycle length) for the diagonal stride in classical cross-country skiing. The proposed system was validated against a marker-based motion capture system during indoor treadmill skiing. Skiing movement of 10 junior to world-cup athletes was measured for four different conditions. The accuracy (i.e. median error) and precision (i.e. interquartile range of error) of the system was below 6 ms for cycle duration and ski thrust duration and below 35 ms for pole push duration. Cycle speed precision (accuracy) was below 0.1 m/s (0.005 m/s) and cycle length precision (accuracy) was below 0.15 m (0.005 m). The system was sensitive to changes of conditions and was accurate enough to detect significant differences reported in previous studies. Since capture volume is not limited and setup is simple, the system would be well suited for outdoor measurements on snow.
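
    The accuracy and precision figures quoted above are, respectively, the median and interquartile range of the error against the reference system; a minimal sketch of that computation on hypothetical paired cycle durations follows (the reference values and sensor noise level are assumptions).

    ```python
    # Sketch: accuracy (median error) and precision (interquartile range of error)
    # of sensor-derived cycle durations against a hypothetical reference system.
    import numpy as np

    rng = np.random.default_rng(1)
    reference = rng.uniform(1.2, 1.8, size=200)               # reference cycle durations (s)
    sensor = reference + rng.normal(0.0, 0.004, size=200)     # hypothetical sensor values

    error = sensor - reference
    accuracy = np.median(error)                               # median error
    precision = np.subtract(*np.percentile(error, [75, 25]))  # interquartile range

    print(f"accuracy = {accuracy*1000:.2f} ms, precision = {precision*1000:.2f} ms")
    ```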

  12. An inertial sensor-based system for spatio-temporal analysis in classic cross-country skiing diagonal technique.

    PubMed

    Fasel, Benedikt; Favre, Julien; Chardonnens, Julien; Gremion, Gérald; Aminian, Kamiar

    2015-09-18

    The present study proposes a method based on ski-fixed inertial sensors to automatically compute spatio-temporal parameters (phase durations, cycle speed and cycle length) for the diagonal stride in classical cross-country skiing. The proposed system was validated against a marker-based motion capture system during indoor treadmill skiing. Skiing movement of 10 junior to world-cup athletes was measured for four different conditions. The accuracy (i.e. median error) and precision (i.e. interquartile range of error) of the system was below 6 ms for cycle duration and ski thrust duration and below 35 ms for pole push duration. Cycle speed precision (accuracy) was below 0.1 m/s (0.005 m/s) and cycle length precision (accuracy) was below 0.15 m (0.005 m). The system was sensitive to changes of conditions and was accurate enough to detect significant differences reported in previous studies. Since capture volume is not limited and setup is simple, the system would be well suited for outdoor measurements on snow. PMID:26209087

  13. A dual resolution measurement based Monte Carlo simulation technique for detailed dose analysis of small volume organs in the skull base region

    NASA Astrophysics Data System (ADS)

    Yeh, Chi-Yuan; Tung, Chuan-Jung; Chao, Tsi-Chain; Lin, Mu-Han; Lee, Chung-Chi

    2014-11-01

    The purpose of this study was to examine dose distribution of a skull base tumor and surrounding critical structures in response to high dose intensity-modulated radiosurgery (IMRS) with Monte Carlo (MC) simulation using a dual resolution sandwich phantom. The measurement-based Monte Carlo (MBMC) method (Lin et al., 2009) was adopted for the study. The major components of the MBMC technique involve (1) the BEAMnrc code for beam transport through the treatment head of a Varian 21EX linear accelerator, (2) the DOSXYZnrc code for patient dose simulation and (3) an EPID-measured efficiency map which describes non-uniform fluence distribution of the IMRS treatment beam. For the simulated case, five isocentric 6 MV photon beams were designed to deliver a total dose of 1200 cGy in two fractions to the skull base tumor. A sandwich phantom for the MBMC simulation was created based on the patient's CT scan of a skull base tumor [gross tumor volume (GTV)=8.4 cm3] near the right 8th cranial nerve. The phantom, consisting of a 1.2-cm-thick skull base region, had a voxel resolution of 0.05×0.05×0.1 cm3 and was sandwiched between 0.05×0.05×0.3 cm3 slices of a head phantom. A coarser 0.2×0.2×0.3 cm3 single resolution (SR) phantom was also created for comparison with the sandwich phantom. A particle history of 3×10^8 for each beam was used for simulations of both the SR and the sandwich phantoms to achieve a statistical uncertainty of <2%. Our study showed that the percentage of the planning target volume (PTV) receiving at least 95% of the prescribed dose (VPTV95) was 96.9%, 96.7% and 99.9% for the TPS, SR, and sandwich phantom, respectively. The maximum and mean doses to large organs such as the PTV, brain stem, and parotid gland for the TPS, SR and sandwich MC simulations did not show any significant difference; however, significant dose differences were observed for very small structures like the right 8th cranial nerve, right cochlea, right malleus and right semicircular canal. Dose
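
    The coverage metric VPTV95 used above is the percentage of PTV voxels receiving at least 95% of the prescribed dose; a minimal sketch of that calculation on a placeholder voxel dose array follows (the dose values are assumptions, not the simulated distributions).

    ```python
    # Sketch: computing VPTV95, the percentage of PTV voxels receiving at least 95%
    # of the prescribed dose, from a voxelized dose distribution (placeholder data).
    import numpy as np

    prescribed_dose = 1200.0  # cGy, as in the abstract
    rng = np.random.default_rng(2)
    ptv_dose = rng.normal(1210.0, 30.0, size=8400)  # hypothetical doses in PTV voxels (cGy)

    v_ptv95 = np.mean(ptv_dose >= 0.95 * prescribed_dose) * 100.0
    print(f"VPTV95 = {v_ptv95:.1f}%")
    ```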

  14. Innovative Techniques Simplify Vibration Analysis

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  15. A numerical comparison of sensitivity analysis techniques

    SciTech Connect

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analysis methodologies, but none as comprehensive as the current work.
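
    As a concrete illustration of the simplest class of methods compared in such studies, the sketch below ranks parameters of a toy model by one-at-a-time variation; the model, baseline values, and 10% perturbation are assumptions, not the tritium dosimetry model used in the paper.

    ```python
    # Sketch: one-at-a-time (OAT) sensitivity ranking for a toy model.
    import numpy as np

    def model(params):
        a, b, c = params
        return a * np.exp(-b) + c**2

    baseline = np.array([1.0, 0.5, 2.0])
    perturbation = 0.10  # vary each parameter by +/-10%

    sensitivities = {}
    for i, name in enumerate(["a", "b", "c"]):
        lo, hi = baseline.copy(), baseline.copy()
        lo[i] *= 1 - perturbation
        hi[i] *= 1 + perturbation
        # Normalized index: relative output change per relative input change.
        sensitivities[name] = abs(model(hi) - model(lo)) / (2 * perturbation * abs(model(baseline)))

    ranking = sorted(sensitivities, key=sensitivities.get, reverse=True)
    print("parameter ranking (most to least influential):", ranking)
    ```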

  16. Advanced techniques in current signature analysis

    SciTech Connect

    Smith, S.F.; Castleberry, K.N.

    1992-03-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (>1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable "smart" CSA instrumentation in the next several years. 3 refs.
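
    A minimal sketch of the underlying spectral step follows: computing the spectrum of a synthetic amplitude-modulated motor current and locating the carrier and running-speed sidebands. The 60 Hz line frequency, 20 Hz modulation, and peak threshold are illustrative assumptions, not values from the field measurements.

    ```python
    # Sketch: spectral analysis of a synthetic motor current containing a 60 Hz line
    # component and amplitude-modulation sidebands at +/-20 Hz (a running-speed analogue).
    import numpy as np
    from scipy.signal import find_peaks

    fs = 5000.0                          # sampling rate (Hz)
    t = np.arange(0, 2.0, 1.0 / fs)
    current = (1.0 + 0.05 * np.cos(2 * np.pi * 20 * t)) * np.sin(2 * np.pi * 60 * t)

    spectrum = np.abs(np.fft.rfft(current * np.hanning(len(current))))
    freqs = np.fft.rfftfreq(len(current), d=1.0 / fs)

    peaks, _ = find_peaks(spectrum, height=0.01 * spectrum.max())
    print("spectral peaks (Hz):", np.round(freqs[peaks], 1))  # expect ~40, 60, 80 Hz
    ```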

  17. Crude oil price forecasting based on hybridizing wavelet multiple linear regression model, particle swarm optimization techniques, and principal component analysis.

    PubMed

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelet and multiple linear regressions (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first selected to decompose an original time series into several subseries with different scales. Then, principal component analysis (PCA) is used in processing subseries data in MLR for crude oil price forecasting. Particle swarm optimization (PSO) is used to obtain the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily crude oil market, West Texas Intermediate (WTI), has been used as the case study. The time series prediction performance of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series.
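
    A minimal sketch of the wavelet-regression idea follows: decompose the price series with a discrete wavelet transform, reconstruct one sub-series per level, reduce them with PCA, and regress the next-day price on the result. The PyWavelets `db4` wavelet, the synthetic price series, and the omission of the PSO parameter search are all assumptions of this sketch, not the paper's exact Mallat-transform pipeline.

    ```python
    # Sketch of the wavelet-regression idea: decompose the price series with a DWT,
    # then regress next-day price on PCA-reduced sub-series. Data are synthetic.
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    prices = np.cumsum(rng.normal(0, 1, 512)) + 60.0   # synthetic WTI-like series

    # Multilevel DWT, then reconstruct one sub-series per coefficient level.
    coeffs = pywt.wavedec(prices, "db4", level=3)
    subseries = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        subseries.append(pywt.waverec(kept, "db4")[: len(prices)])
    X = np.column_stack(subseries)[:-1]                # today's sub-series values
    y = prices[1:]                                     # next-day price

    X_reduced = PCA(n_components=3).fit_transform(X)   # PCA preprocessing of regressors
    model = LinearRegression().fit(X_reduced, y)
    print("in-sample R^2:", round(model.score(X_reduced, y), 3))
    ```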

  18. Crude oil price forecasting based on hybridizing wavelet multiple linear regression model, particle swarm optimization techniques, and principal component analysis.

    PubMed

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelet and multiple linear regressions (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first selected to decompose an original time series into several subseries with different scales. Then, principal component analysis (PCA) is used in processing subseries data in MLR for crude oil price forecasting. Particle swarm optimization (PSO) is used to obtain the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily crude oil market, West Texas Intermediate (WTI), has been used as the case study. The time series prediction performance of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series. PMID:24895666

  19. Comparing Techniques for Certified Static Analysis

    NASA Technical Reports Server (NTRS)

    Cachera, David; Pichardie, David

    2009-01-01

    A certified static analysis is an analysis whose semantic validity has been formally proved correct with a proof assistant. The recent increasing interest in using proof assistants for mechanizing programming language metatheory has given rise to several approaches for certification of static analysis. We propose a panorama of these techniques and compare their respective strengths and weaknesses.

  20. Ion Beam Analysis Techniques in Interdisciplinary Applications

    SciTech Connect

    Respaldiza, Miguel A.; Ager, Francisco J.

    1999-12-31

    Ion beam analysis (IBA) techniques have emerged in recent years as one of the main applications of electrostatic accelerators. A short summary of the most widely used IBA techniques is given, as well as some examples of applications in interdisciplinary sciences.

  1. Ion beam analysis techniques in interdisciplinary applications

    SciTech Connect

    Respaldiza, Miguel A.; Ager, Francisco J.

    1999-11-16

    Ion beam analysis (IBA) techniques have emerged in recent years as one of the main applications of electrostatic accelerators. A short summary of the most widely used IBA techniques is given, as well as some examples of applications in interdisciplinary sciences.

  2. Relative error covariance analysis techniques and application

    NASA Technical Reports Server (NTRS)

    Wolff, Peter J.; Williams, Bobby G.

    1988-01-01

    A technique for computing the error covariance of the difference between two estimators derived from different (possibly overlapping) data arcs is presented. The relative error covariance is useful for predicting the achievable consistency between Kalman-Bucy filtered estimates generated from two (not necessarily disjoint) data sets. The relative error covariance analysis technique is then applied to a Venus Orbiter simulation.
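
    A minimal numerical sketch of the relative error covariance follows, using the standard identity that the covariance of the difference of two estimation errors is P1 + P2 - C12 - C12^T; the covariance and cross-covariance matrices below are placeholders, not values from the Venus Orbiter simulation.

    ```python
    # Sketch: relative error covariance of the difference between two estimates,
    # using Cov(e1 - e2) = P1 + P2 - C12 - C12^T. Matrices are placeholders.
    import numpy as np

    P1 = np.diag([4.0, 1.0, 0.25])          # error covariance of estimator 1
    P2 = np.diag([9.0, 2.0, 0.50])          # error covariance of estimator 2
    C12 = 0.5 * np.sqrt(P1) @ np.sqrt(P2)   # assumed cross-covariance (overlapping data arcs)

    P_rel = P1 + P2 - C12 - C12.T
    print("relative error standard deviations:", np.sqrt(np.diag(P_rel)))
    ```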

  3. A New Microcell Technique for NMR Analysis.

    ERIC Educational Resources Information Center

    Yu, Sophia J.

    1987-01-01

    Describes a new laboratory technique for working with small samples of compounds used in nuclear magnetic resonance (NMR) analysis. Demonstrates how microcells can be constructed for each experiment and samples can be recycled. (TW)

  4. A technique to reduce motion artifact for externally triggered cine-MRI(EC-MRI) based on detecting the onset of the articulated word with spectral analysis.

    PubMed

    Shimada, Yasuhiro; Nishimoto, Hironori; Kochiyama, Takanori; Fujimoto, Ichiro; Mano, Hiroaki; Masaki, Shinobu; Murase, Kenya

    2012-01-01

    One issue in externally triggered cine-magnetic resonance imaging (EC-MRI) for the dynamic observation of speech organs is motion artifact in the phase-encoding direction caused by unstable repetitions of speech during data acquisition. We propose a technique to reduce such artifact by rearranging the k-space data used to reconstruct MR images based on the analysis of recorded speech sounds. We recorded the subject's speech sounds during EC-MRI and used post hoc acoustical processing to reduce scanning noise and detect the onset of each utterance based on analysis of the recorded sounds. We selected each line of k-space from several data acquisition sessions and rearranged them to reconstruct a new series of dynamic MR images according to the analyzed time of utterance onset. Comparative evaluation showed significant reduction in motion artifact signal in the dynamic MR images reconstructed by the proposed method. The quality of the reconstructed images was sufficient to observe the dynamic aspects of speech production mechanisms.

  5. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  6. Bone feature analysis using image processing techniques.

    PubMed

    Liu, Z Q; Austin, T; Thomas, C D; Clement, J G

    1996-01-01

    In order to establish the correlation between bone structure and age, and information about age-related bone changes, it is necessary to study microstructural features of human bone. Traditionally, in bone biology and forensic science, the analysis of bone cross-sections has been carried out manually. Such a process is known to be slow, inefficient and prone to human error. Consequently, the results obtained so far have been unreliable. In this paper we present a new approach to quantitative analysis of cross-sections of human bones using digital image processing techniques. We demonstrate that such a system is able to extract various bone features consistently and is capable of providing more reliable data and statistics for bones. Consequently, we will be able to correlate features of bone microstructure with age and possibly also with age-related bone diseases such as osteoporosis. The development of knowledge-based computer vision systems for automated bone image analysis can now be considered feasible.

  7. Latent practice profiles of substance abuse treatment counselors: do evidence-based techniques displace traditional techniques?

    PubMed

    Smith, Brenda D; Liu, Junqing

    2014-04-01

    As more substance abuse treatment counselors begin to use evidence-based treatment techniques, questions arise regarding the continued use of traditional techniques. This study aims to (1) assess whether there are meaningful practice profiles among practitioners reflecting distinct combinations of cognitive-behavioral and traditional treatment techniques; and (2) if so, identify practitioner characteristics associated with the distinct practice profiles. Survey data from 278 frontline counselors working in community substance abuse treatment organizations were used to conduct latent profile analysis. The emergent practice profiles illustrate that practitioners vary most in the use of traditional techniques. Multinomial regression models suggest that practitioners with less experience, more education, and less traditional beliefs about treatment and substance abuse are least likely to mix traditional techniques with cognitive-behavioral techniques. Findings add to the understanding of how evidence-based practices are implemented in routine settings and have implications for training and support of substance abuse treatment counselors.

  8. Window technique for climate trend analysis

    NASA Astrophysics Data System (ADS)

    Szentimrey, Tamás; Faragó, Tibor; Szalai, Sándor

    1992-01-01

    Climatic characteristics are affected by various systematic and occasional impacts: besides the changes in the observing system (locations of the stations of the meteorological network, instruments, observing procedures), the possible local-scale and global natural and anthropogenic impacts on climatic conditions should be taken into account. Apart from the predictability problems, the phenomenological analysis of the climatic variability and the determination of past persistent climatic anomalies are significant problems, among other aspects, as evidence of the possible anomalous behavior of climate or for climate impact studies. In this paper, a special technique for the identification of such “shifts” in the observational series is presented. The existence of these significant shorter or longer term changes in the mean characteristics for the properly selected adjoining periods of time is the necessary condition for the formation of any more or less unidirectional climatic trends. Actually, the window technique is based on a complete set of orthogonal functions. The sensitivity of the proposed model on its main parameters is also investigated. This method is applied for hemispheric and Hungarian data series of the mean annual surface temperature.

  9. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication (f4) and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product of Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  10. Exact geometry solid-shell element based on a sampling surfaces technique for 3D stress analysis of doubly-curved composite shells

    NASA Astrophysics Data System (ADS)

    Kulikov, G. M.; Mamontov, A. A.; Plotnikova, S. V.; Mamontov, S. A.

    2015-11-01

    A hybrid-mixed ANS four-node shell element by using the sampling surfaces (SaS) technique is developed. The SaS formulation is based on choosing inside the nth layer I_n not equally spaced SaS parallel to the middle surface of the shell in order to introduce the displacements of these surfaces as basic shell variables. Such choice of unknowns with the consequent use of Lagrange polynomials of degree I_n - 1 in the thickness direction for each layer permits the presentation of the layered shell formulation in a very compact form. The SaS are located inside each layer at Chebyshev polynomial nodes that allows one to minimize uniformly the error due to the Lagrange interpolation. To implement the efficient analytical integration throughout the element, the enhanced ANS method is employed. The proposed hybrid-mixed four-node shell element is based on the Hu-Washizu variational equation and exhibits a superior performance in the case of coarse meshes. It could be useful for the 3D stress analysis of thick and thin doubly-curved shells since the SaS formulation gives the possibility to obtain numerical solutions with a prescribed accuracy, which asymptotically approach the exact solutions of elasticity as the number of SaS tends to infinity.
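
    The through-thickness ingredient of the SaS formulation is Lagrange interpolation through Chebyshev polynomial nodes; a minimal one-dimensional sketch of that interpolation follows, with the node count and sampled function chosen purely for illustration.

    ```python
    # Sketch: Lagrange interpolation through Chebyshev nodes across a layer thickness,
    # the 1-D ingredient of the SaS through-thickness approximation (illustrative only).
    import numpy as np

    def chebyshev_nodes(n):
        # Chebyshev polynomial nodes on [-1, 1]
        k = np.arange(1, n + 1)
        return np.cos((2 * k - 1) * np.pi / (2 * n))

    def lagrange_basis(nodes, i, z):
        terms = [(z - nodes[j]) / (nodes[i] - nodes[j]) for j in range(len(nodes)) if j != i]
        return np.prod(terms, axis=0)

    n_sas = 5
    nodes = chebyshev_nodes(n_sas)
    samples = np.sin(np.pi * nodes / 2)          # hypothetical displacement values at the SaS

    z = np.linspace(-1, 1, 201)                  # through-thickness coordinate
    u = sum(samples[i] * lagrange_basis(nodes, i, z) for i in range(n_sas))
    print("max interpolation error:", np.max(np.abs(u - np.sin(np.pi * z / 2))))
    ```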

  11. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  12. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  13. Emerging techniques for ultrasensitive protein analysis.

    PubMed

    Yang, Xiaolong; Tang, Yanan; Alt, Ryan R; Xie, Xiaoyu; Li, Feng

    2016-06-21

    Many important biomarkers for devastating diseases and biochemical processes are proteins present at ultralow levels. Traditional techniques, such as enzyme-linked immunosorbent assays (ELISA), mass spectrometry, and protein microarrays, are often not sensitive enough to detect proteins with concentrations below the picomolar level, thus requiring the development of analytical techniques with ultrahigh sensitivities. In this review, we highlight the recent advances in developing novel techniques, sensors, and assays for ultrasensitive protein analysis. Particular attention will be focused on three classes of signal generation and/or amplification mechanisms, including the uses of nanomaterials, nucleic acids, and digital platforms. PMID:26898911

  14. A review of sensitivity analysis techniques

    SciTech Connect

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of sensitivity techniques utilizes partial differentiation whereas the simplest approach requires varying parameter values one-at-a-time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
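
    Two of the measures surveyed, correlation analysis and standardized regression coefficients, can be sketched directly; the toy model, input distributions, and sample size below are assumptions for illustration only.

    ```python
    # Sketch: correlation- and regression-based sensitivity measures for a toy model
    # sampled by simple Monte Carlo (model and input distributions are illustrative).
    import numpy as np

    rng = np.random.default_rng(4)
    n = 2000
    X = rng.normal(size=(n, 3))                        # three independent input parameters
    y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.5, n)

    # Pearson correlation of each input with the output.
    correlations = [np.corrcoef(X[:, i], y)[0, 1] for i in range(3)]

    # Standardized regression coefficients from a least-squares fit on scaled variables.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

    print("correlation coefficients:", np.round(correlations, 2))
    print("standardized regression coefficients:", np.round(src, 2))
    ```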

  15. Quantifying sources of black carbon in western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    NASA Astrophysics Data System (ADS)

    Zhang, R.; Wang, H.; Hegg, D. A.; Qian, Y.; Doherty, S. J.; Dang, C.; Ma, P.-L.; Rasch, P. J.; Fu, Q.

    2015-11-01

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source-receptor relationships for atmospheric BC and its deposition to snow over western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over northwestern USA and western Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based positive matrix factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.
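
    The receptor-modeling idea behind the PMF analysis, factoring a non-negative samples-by-species concentration matrix into source profiles and source contributions, can be illustrated with a generic non-negative matrix factorization; note that this NMF stand-in omits the uncertainty weighting that defines PMF proper, and the data below are synthetic.

    ```python
    # Sketch of the receptor-modeling idea behind PMF: factor a non-negative matrix of
    # snow-impurity concentrations (samples x species) into source profiles and
    # contributions. Illustrated with scikit-learn NMF; true PMF additionally weights
    # residuals by measurement uncertainty. Data are synthetic.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(5)
    true_profiles = rng.random((2, 8))                 # two hypothetical sources, 8 species
    contributions = rng.random((40, 2))                # 40 snow samples
    concentrations = contributions @ true_profiles + 0.01 * rng.random((40, 8))

    model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
    G = model.fit_transform(concentrations)            # source contributions per sample
    F = model.components_                              # source chemical profiles
    print("reconstruction error:", round(model.reconstruction_err_, 3))
    ```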

  16. Quantifying sources of black carbon in western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    DOE PAGES

    Zhang, R.; Wang, H.; Hegg, D. A.; Qian, Y.; Doherty, S. J.; Dang, C.; Ma, P.-L.; Rasch, P. J.; Fu, Q.

    2015-11-18

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source–receptor relationships for atmospheric BC and its deposition to snow over western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over northwestern USA and western Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based positive matrix factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  17. Quantifying sources of black carbon in Western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    DOE PAGES

    Zhang, R.; Wang, H.; Hegg, D. A.; Qian, Y.; Doherty, S. J.; Dang, C.; Ma, P.-L.; Rasch, P. J.; Fu, Q.

    2015-05-04

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source-receptor relationships for atmospheric BC and its deposition to snow over Western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over the Northwest USA and West Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based Positive Matrix Factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  18. Quantifying sources of black carbon in Western North America using observationally based analysis and an emission tagging technique in the Community Atmosphere Model

    SciTech Connect

    Zhang, Rudong; Wang, Hailong; Hegg, D. A.; Qian, Yun; Doherty, Sarah J.; Dang, Cheng; Ma, Po-Lun; Rasch, Philip J.; Fu, Qiang

    2015-11-18

    The Community Atmosphere Model (CAM5), equipped with a technique to tag black carbon (BC) emissions by source regions and types, has been employed to establish source-receptor relationships for atmospheric BC and its deposition to snow over Western North America. The CAM5 simulation was conducted with meteorological fields constrained by reanalysis for year 2013 when measurements of BC in both near-surface air and snow are available for model evaluation. We find that CAM5 has a significant low bias in predicted mixing ratios of BC in snow but only a small low bias in predicted atmospheric concentrations over the Northwest USA and West Canada. Even with a strong low bias in snow mixing ratios, radiative transfer calculations show that the BC-in-snow darkening effect is substantially larger than the BC dimming effect at the surface by atmospheric BC. Local sources contribute more to near-surface atmospheric BC and to deposition than distant sources, while the latter are more important in the middle and upper troposphere where wet removal is relatively weak. Fossil fuel (FF) is the dominant source type for total column BC burden over the two regions. FF is also the dominant local source type for BC column burden, deposition, and near-surface BC, while for all distant source regions combined the contribution of biomass/biofuel (BB) is larger than FF. An observationally based Positive Matrix Factorization (PMF) analysis of the snow-impurity chemistry is conducted to quantitatively evaluate the CAM5 BC source-type attribution. While CAM5 is qualitatively consistent with the PMF analysis with respect to partitioning of BC originating from BB and FF emissions, it significantly underestimates the relative contribution of BB. In addition to a possible low bias in BB emissions used in the simulation, the model is likely missing a significant source of snow darkening from local soil found in the observations.

  19. Techniques for the Analysis of Human Movement.

    ERIC Educational Resources Information Center

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photographic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  20. Training Parents in Behavior Analysis Techniques.

    ERIC Educational Resources Information Center

    Perelman, Phyllis F.; Hanley, Edward M.

    This document discusses state-designed workshops which have provided training in behavior analysis techniques to parents. Through information gained from bimonthly meetings and frequent monitoring by workshop leaders and graduate students enrolled in the Special Education Area of the University of Vermont, parents have developed and implemented…

  1. Aerosol particle analysis by Raman scattering technique

    SciTech Connect

    Fung, K.H.; Tang, I.N.

    1992-10-01

    Laser Raman spectroscopy is a very versatile tool for chemical characterization of micron-sized particles. Such particles are abundant in nature, and in numerous energy-related processes. In order to elucidate the formation mechanisms and understand the subsequent chemical transformation under a variety of reaction conditions, it is imperative to develop analytical measurement techniques for in situ monitoring of these suspended particles. In this report, we outline our recent work on spontaneous Raman, resonance Raman and non-linear Raman scattering as a novel technique for chemical analysis of aerosol particles as well as supersaturated solution droplets.

  2. Applications Of Binary Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions where binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  3. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  4. Fibre based integral field unit constructional techniques

    NASA Astrophysics Data System (ADS)

    Murray, Graham J.

    2006-06-01

    Presented here is a selected overview of constructional techniques and principles that have been developed and implemented at the University of Durham in the manufacture of successful fibre-based integral field units. The information contained herein is specifically intended to highlight the constructional methods that have been devised to assemble an efficient fibre bundle. Potential pitfalls that need to be considered when embarking upon the construction of such a deceptively simple instrument are also discussed.

  5. Analysis techniques for residual acceleration data

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.

    1990-01-01

    Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by the use of a transformation matrix. Application of such analysis techniques to residual acceleration data provides more information than a time history alone and increases the effectiveness of post-flight analysis of low-gravity experiments.
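
    A minimal sketch of two of the operations described above, rotating body-frame residual accelerations into an experiment frame with a transformation matrix and Fourier-analyzing one data window, follows; the sampling rate, rotation angle, and signal content are illustrative assumptions.

    ```python
    # Sketch: transform residual acceleration data into an experiment frame with a
    # rotation matrix, then Fourier-analyze one window for dominant frequency content.
    import numpy as np

    fs = 100.0                                   # samples per second
    t = np.arange(0, 20, 1 / fs)
    accel_body = np.vstack([                     # body-frame residual acceleration (g)
        1e-4 * np.sin(2 * np.pi * 17 * t),
        2e-5 * np.ones_like(t),
        1e-5 * np.random.default_rng(6).normal(size=t.size),
    ])

    theta = np.deg2rad(30)                       # assumed rotation about the z-axis
    R = np.array([[np.cos(theta), -np.sin(theta), 0],
                  [np.sin(theta),  np.cos(theta), 0],
                  [0, 0, 1]])
    accel_exp = R @ accel_body                   # experiment-frame components

    spectrum = np.abs(np.fft.rfft(accel_exp[0]))
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)
    print("dominant frequency (Hz):", freqs[np.argmax(spectrum[1:]) + 1])
    ```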

  6. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.

  7. Laser Remote Sensing: Velocimetry Based Techniques

    NASA Astrophysics Data System (ADS)

    Molebny, Vasyl; Steinvall, Ove

    Laser-based velocity measurement is an area of the field of remote sensing where the coherent properties of laser radiation are the most exposed. Much of the published literature deals with the theory and techniques of remote sensing. We restrict our discussion to current trends in this area, gathered from recent conferences and professional journals. Remote wind sensing and vibrometry are promising in their new scientific, industrial, military, and biomedical applications, including improving flight safety, precise weapon correction, non-contact mine detection, optimization of wind farm operation, object identification based on its vibration signature, fluid flow studies, and vibrometry-associated diagnosis.

  8. Respiratory monitoring system based on the nasal pressure technique for the analysis of sleep breathing disorders: Reduction of static and dynamic errors, and comparisons with thermistors and pneumotachographs

    NASA Astrophysics Data System (ADS)

    Alves de Mesquita, Jayme; Lopes de Melo, Pedro

    2004-03-01

    Thermally sensitive devices—thermistors—have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices are not able to provide a good characterization of fast events, like hypopneas. The nasal pressure recording technique (NPR) has recently been suggested to quantify airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique would allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) describe the development of a flexible NPR device and (2) evaluate the performance of this device when compared to pneumotachographs (PNTs) and thermistors. After the design details are described, the system static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p<0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations and the results showed that the developed device response was as good as that of the PNT and around 100 times faster (τ=5.3 ms) than thermistors (τ=512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples, and confirmed the good features achieved in engineering tests. These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to a more accurate implementation of medical research projects and to improve the

  9. Automated quantitative analysis of in-situ NaI measured spectra in the marine environment using a wavelet-based smoothing technique.

    PubMed

    Tsabaris, Christos; Prospathopoulos, Aristides

    2011-10-01

    An algorithm for automated analysis of in-situ NaI γ-ray spectra in the marine environment is presented. A standard wavelet denoising technique is implemented for obtaining a smoothed spectrum, while the stability of the energy spectrum is achieved by taking advantage of the permanent presence of two energy lines in the marine environment. The automated analysis provides peak detection, net area calculation, energy autocalibration, radionuclide identification and activity calculation. The results of the algorithm performance, presented for two different cases, show that analysis of short-term spectra with poor statistical information is considerably improved and that incorporation of further advancements could allow the use of the algorithm in early-warning marine radioactivity systems. PMID:21742510
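
    A minimal sketch of the two core steps, wavelet denoising of a spectrum and subsequent peak detection, is given below using PyWavelets and SciPy; the `sym8` wavelet, universal soft threshold, and synthetic NaI-like spectrum are assumptions of the sketch rather than the algorithm's published settings.

    ```python
    # Sketch: wavelet denoising of a synthetic NaI gamma-ray spectrum followed by peak
    # detection. The wavelet, threshold rule, and spectrum are illustrative assumptions.
    import numpy as np
    import pywt
    from scipy.signal import find_peaks

    channels = np.arange(1024)
    spectrum = (200 * np.exp(-0.003 * channels)                      # continuum
                + 500 * np.exp(-0.5 * ((channels - 480) / 8) ** 2)   # 40K-like peak
                + 300 * np.exp(-0.5 * ((channels - 170) / 6) ** 2))  # 137Cs-like peak
    noisy = np.random.default_rng(7).poisson(spectrum).astype(float)

    # Soft-threshold the detail coefficients (universal threshold) and reconstruct.
    coeffs = pywt.wavedec(noisy, "sym8", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    threshold = sigma * np.sqrt(2 * np.log(noisy.size))
    denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    smoothed = pywt.waverec(denoised, "sym8")[: noisy.size]

    peaks, _ = find_peaks(smoothed, prominence=50)
    print("detected peak channels:", peaks)
    ```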

  10. Hybrid perturbation/Bubnov-Galerkin technique for nonlinear thermal analysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Balch, C. D.

    1983-01-01

    A two-step hybrid analysis technique to predict the nonlinear steady state temperature distribution in structures and solids is presented. The technique is based on the regular perturbation expansion and the classical Bubnov-Galerkin approximation. Approximation functions are first obtained by using the regular perturbation method; these functions are then selected as coordinate functions, and the classical Bubnov-Galerkin technique is used to compute their amplitudes. The potential of the proposed hybrid technique for the solution of nonlinear thermal problems is discussed. The effectiveness of this technique is demonstrated for problems involving conduction, convection, and radiation modes of heat transfer. It is indicated that the hybrid technique overcomes the two major drawbacks of the classical techniques: (1) the requirement of using a small parameter in the regular perturbation method; and (2) the arbitrariness in the choice of the coordinate functions in the Bubnov-Galerkin technique. The proposed technique extends the range of applicability of the regular perturbation method and enhances the effectiveness of the Bubnov-Galerkin technique.

  11. Application of spectral analysis techniques to the intercomparison of aerosol data - Part 4: Combined maximum covariance analysis to bridge the gap between multi-sensor satellite retrievals and ground-based measurements

    NASA Astrophysics Data System (ADS)

    Li, J.; Carlson, B. E.; Lacis, A. A.

    2014-04-01

    The development of remote sensing techniques has greatly advanced our knowledge of atmospheric aerosols. Various satellite sensors and the associated retrieval algorithms all add to the information of global aerosol variability, while well-designed surface networks provide time series of highly accurate measurements at specific locations. In studying the variability of aerosol properties, aerosol climate effects, and constraining aerosol fields in climate models, it is essential to make the best use of all of the available information. In the previous three parts of this series, we demonstrated the usefulness of several spectral decomposition techniques in the analysis and comparison of temporal and spatial variability of aerosol optical depth using satellite and ground-based measurements. Specifically, Principal Component Analysis (PCA) successfully captures and isolates seasonal and interannual variability from different aerosol source regions, Maximum Covariance Analysis (MCA) provides a means to verify the variability in one satellite dataset against Aerosol Robotic Network (AERONET) data, and Combined Principal Component Analysis (CPCA) enables parallel comparison among multi-satellite, multi-sensor datasets. As the final part of the study, this paper introduces a novel technique that integrates both multi-sensor datasets and ground observations, and thus effectively bridges the gap between these two types of measurements. The Combined Maximum Covariance Analysis (CMCA) decomposes the cross covariance matrix between the combined multi-sensor satellite data field and AERONET station data. We show that this new method not only confirms the seasonal and interannual variability of aerosol optical depth, aerosol source regions and events represented by different satellite datasets, but also identifies the strengths and weaknesses of each dataset in capturing the variability associated with sources, events or aerosol types. Furthermore, by examining the spread of
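
    At its core, (combined) maximum covariance analysis is a singular value decomposition of the cross-covariance matrix between two anomaly data sets; a minimal sketch with synthetic satellite and station anomaly matrices follows (the field sizes and record length are arbitrary assumptions).

    ```python
    # Sketch: the core of (combined) maximum covariance analysis -- an SVD of the
    # cross-covariance matrix between two anomaly data sets. Fields are synthetic.
    import numpy as np

    rng = np.random.default_rng(8)
    n_time = 120                                   # months
    satellite = rng.normal(size=(n_time, 500))     # combined multi-sensor AOD anomalies
    stations = rng.normal(size=(n_time, 30))       # station AOD anomalies

    # Remove time means, then decompose the cross-covariance matrix.
    S = satellite - satellite.mean(axis=0)
    A = stations - stations.mean(axis=0)
    C = S.T @ A / (n_time - 1)                     # (500 x 30) cross-covariance
    U, s, Vt = np.linalg.svd(C, full_matrices=False)

    frac = s**2 / np.sum(s**2)                     # squared covariance fraction per mode
    print("leading mode explains", round(100 * frac[0], 1), "% of squared covariance")
    # Expansion coefficients (time series) of the leading coupled mode:
    pc_satellite, pc_station = S @ U[:, 0], A @ Vt[0]
    ```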

  12. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (pUC19 plasmid from E. coli). DN...

  13. UPLC: a preeminent technique in pharmaceutical analysis.

    PubMed

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

    The pharmaceutical companies today are driven to create novel and more efficient tools to discover, develop, deliver and monitor the drugs. In this context the development of rapid chromatographic methods is crucial for the analytical laboratories. In the past decade, substantial technological advances have been made in enhancing particle chemistry performance, improving detector design and in optimizing the system, data processors and various controls of chromatographic techniques. When all was blended together, it resulted in the outstanding performance via ultra-high performance liquid chromatography (UPLC), which retains the principles of the HPLC technique. UPLC shows a dramatic enhancement in speed, resolution as well as the sensitivity of analysis by using particle sizes of less than 2 μm, and the system operates at higher pressure, while the mobile phase can be run at greater linear velocities as compared to HPLC. This technique is considered as a new focal point in the field of liquid chromatographic studies. This review focuses on the basic principle, instrumentation of UPLC and its advantages over HPLC; furthermore, this article emphasizes various pharmaceutical applications of this technique.

  14. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  15. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
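
    A minimal sketch of a single-pass threshold-and-count noise power estimator follows: count samples exceeding a fixed threshold and invert the Gaussian tail probability for the noise standard deviation; the threshold value and the noise-only synthetic data are assumptions of the sketch, not HRMS system parameters.

    ```python
    # Sketch: single-pass threshold-and-count noise power estimation. Count exceedances
    # of a fixed threshold and invert the Gaussian tail probability to estimate sigma.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(9)
    true_sigma = 2.5
    samples = rng.normal(0.0, true_sigma, size=1_000_000)    # noise-only channel

    threshold = 3.0
    count = np.count_nonzero(np.abs(samples) > threshold)    # single pass over the data
    p_exceed = count / samples.size                          # empirical exceedance rate

    # For zero-mean Gaussian noise, P(|x| > T) = 2 * (1 - Phi(T / sigma)).
    sigma_est = threshold / norm.isf(p_exceed / 2.0)
    print(f"estimated sigma = {sigma_est:.3f} (true {true_sigma})")
    ```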

  16. Advanced automated char image analysis techniques

    SciTech Connect

    Tao Wu; Edward Lester; Michael Cloke

    2006-05-15

    Char morphology is an important characteristic when attempting to understand coal behavior and coal burnout. In this study, an augmented algorithm has been proposed to identify char types using image analysis. On the basis of a series of image processing steps, a char image is singled out from the whole image, which then allows the important major features of the char particle to be measured, including size, porosity, and wall thickness. The techniques for automated char image analysis have been tested against char images taken from the ICCP Char Atlas as well as actual char particles derived from pyrolyzed char samples. Thirty different chars were prepared in a drop tube furnace operating at 1300°C, 1% oxygen, and 100 ms from 15 different world coals sieved into two size fractions (53-75 and 106-125 μm). The results from this automated technique are comparable with those from manual analysis, and the additional detail from the automated system has potential use in applications such as combustion modeling systems. Obtaining highly detailed char information with automated methods has traditionally been hampered by the difficulty of automatic recognition of individual char particles. 20 refs., 10 figs., 3 tabs.

  17. Further development of ultrasonic techniques for non-destructive evaluation based on Fourier analysis of signals from irregular and inhomogeneous structures

    NASA Technical Reports Server (NTRS)

    Miller, J. G.

    1979-01-01

    To investigate the use of Fourier analysis techniques, model systems had to be designed to test some of the general properties of the interaction of sound with an inhomogeneity. The first models investigated were suspensions of solid spheres in water. These systems allowed comparison between theoretical computation of the frequency dependence of the attenuation coefficient and measurement of the attenuation coefficient over a range of frequencies. Ultrasonic scattering processes in both suspensions of hard spheres in water and suspensions of hard spheres in polyester resin were investigated. The second model system was constructed to test the applicability of partial wave analysis to the description of an inhomogeneity in a solid, and to test the range of material properties over which the measurement systems were valid.

  18. Visual exploratory analysis of integrated chromosome 19 proteomic data derived from glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Meyer-Bäse, Uwe; Meyer-Bäse, Anke

    2015-05-01

    Chromosome 19 is known to be linked to neurodegeneration and many cancers. Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the presentation of novel approaches for the visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the expression patterns for chromosome 19 proteins.

  19. A comparison of geostatistically based inverse techniques for use in performance assessment analysis at the Waste Isolation Pilot Plant Site: Results from Test Case No. 1

    SciTech Connect

    Zimmerman, D.A.; Gallegos, D.P.

    1993-10-01

    The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A ``Geostatistics Test Problem`` is being developed to evaluate a number of inverse techniques that may be used for flow calculations in the WIPP performance assessment (PA). The Test Problem is actually a series of test cases, each being developed as a highly complex synthetic data set; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. The Test Problem analysis approach is to use a comparison of the probabilistic groundwater travel time (GWTT) estimates produced by each technique as the basis for the evaluation. Participants are given observations of head and transmissivity (possibly including measurement error) or other information such as drawdowns from pumping wells, and are asked to develop stochastic models of groundwater flow for the synthetic system. Cumulative distribution functions (CDFs) of groundwater flow (computed via particle tracking) are constructed using the head and transmissivity data generated through the application of each technique; one semi-analytical method generates the CDFs of groundwater flow directly. This paper describes the results from Test Case No. 1.

  20. Advanced NMR-based techniques for pore structure analysis of coal. Quarter report No. 4, 1 October 1992--30 December 1992

    SciTech Connect

    Smith, D.M.

    1992-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores but they also exhibit meso and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited because of this broad pore size range, microporosity, reactive nature of coal, samples must be completely dried, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal`s structure, it is apparent that better techniques are necessary. We believe that measurement of the NMR parameters of various gas phase and adsorbed phase NMR active probes can provide the resolution to this problem. We will investigate the dependence of the common NMR parameters such as chemical shifts and relaxation times of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules and the pore surfaces in coals. These molecules have been selected for their chemical and physical properties. A special NMR probe will be constructed which will allow the concurrent measurement of NMR properties and adsorption uptake at a variety of temperatures. All samples will be subjected to a suite of ``conventional`` pore structure analyses. These include nitrogen adsorption at 77 K with BET analysis, CO{sub 2} and CH{sub 4} adsorption at 273 K with D-R (Dubinin-Radushkevich) analysis, helium pycnometry, and small angle X-ray scattering as well as gas diffusion measurements.

  1. Flood alert system based on bayesian techniques

    NASA Astrophysics Data System (ADS)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring periods, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects on the city of Malaga (600,000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision making process. The joint use of Artificial Neural Networks (ANN) and BN has served to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit and additionally contribute uncertainty information to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also

  2. Artificial Intelligence based technique for BTS placement

    NASA Astrophysics Data System (ADS)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase of base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners; this decision is not foolproof against regulatory requirements. In this paper, an artificial intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique objectively takes neighbour and regulatory constraints into consideration while determining the cell site. Its application will lead to a quantitatively unbiased decision making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm, and the results obtained show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a genetic algorithm (GA) with a neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.

  3. Multiclass pesticide analysis in fruit-based baby food: A comparative study of sample preparation techniques previous to gas chromatography-mass spectrometry.

    PubMed

    Petrarca, Mateus H; Fernandes, José O; Godoy, Helena T; Cunha, Sara C

    2016-12-01

    With the aim of developing a new gas chromatography-mass spectrometry method to analyze 24 pesticide residues in baby foods at the level imposed by established regulation, two simple, rapid and environmentally friendly sample preparation techniques based on QuEChERS (quick, easy, cheap, effective, robust and safe) were compared - QuEChERS with dispersive liquid-liquid microextraction (DLLME) and QuEChERS with dispersive solid-phase extraction (d-SPE). Both sample preparation techniques achieved suitable performance criteria, including selectivity, linearity, acceptable recovery (70-120%) and precision (⩽20%). A higher enrichment factor was observed for DLLME, and consequently better limits of detection and quantification were obtained. Nevertheless, d-SPE provided a more effective removal of matrix co-extractives from extracts than DLLME, which contributed to lower matrix effects. Twenty-two commercial fruit-based baby food samples were analyzed by the developed method, with procymidone detected in one sample at a level above the legal limit established by the EU.

  4. Debonding damage analysis in composite-masonry strengthening systems with polymer- and mortar-based matrix by means of the acoustic emission technique

    NASA Astrophysics Data System (ADS)

    Verstrynge, E.; Wevers, M.; Ghiassi, B.; Lourenço, P. B.

    2016-01-01

    Different types of strengthening systems, based on fiber reinforced materials, are under investigation for external strengthening of historic masonry structures. A full characterization of the bond behavior and of the short- and long-term failure mechanisms is crucial to ensure effective design, compatibility with the historic substrate and durability of the strengthening solution. Therein, non-destructive techniques are essential for bond characterization, durability assessment and on-site condition monitoring. In this paper, the acoustic emission (AE) technique is evaluated for debonding characterization and localization on fiber reinforced polymer (FRP) and steel reinforced grout-strengthened clay bricks. Both types of strengthening systems are subjected to accelerated ageing tests under thermal cycles and to single-lap shear bond tests. During the reported experimental campaign, AE data from the accelerated ageing tests demonstrated the thermal incompatibility between brick and epoxy-bonded FRP composites, and debonding damage was successfully detected, characterized and located. In addition, a qualitative comparison is made with digital image correlation and infrared thermography, in view of efficient on-site debonding detection.

  5. Laser Scanning–Based Tissue Autofluorescence/Fluorescence Imaging (LS-TAFI), a New Technique for Analysis of Microanatomy in Whole-Mount Tissues

    PubMed Central

    Mori, Hidetoshi; Borowsky, Alexander D.; Bhat, Ramray; Ghajar, Cyrus M.; Seiki, Motoharu; Bissell, Mina J.

    2012-01-01

    Intact organ structure is essential in maintaining tissue specificity and cellular differentiation. Small physiological or genetic variations lead to changes in microanatomy that, if persistent, could have functional consequences and may easily be masked by the heterogeneity of tissue anatomy. Current imaging techniques rely on histological sections, which require sample manipulation and are essentially two dimensional. We have developed a method for three-dimensional imaging of whole-mount, unsectioned mammalian tissues to elucidate subtle and detailed micro- and macroanatomies in adult organs and embryos. We analyzed intact or dissected organ whole mounts with laser scanning–based tissue autofluorescence/fluorescence imaging (LS-TAFI). We obtained clear visualization of microstructures within murine mammary glands and mammary tumors and other organs without the use of immunostaining and without probes or fluorescent reporter genes. Combining autofluorescence with reflected light signals from chromophore-stained tissues allowed identification of individual cells within three-dimensional structures of whole-mounted organs. This technique could be useful for rapid diagnosis of human clinical samples and possibly for assessing the effect of subtle variations such as low-dose radiation. PMID:22542846

  6. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    SciTech Connect

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  7. Which Combinations of Techniques and Modes of Delivery in Internet-Based Interventions Effectively Change Health Behavior? A Meta-Analysis

    PubMed Central

    van Genugten, Lenneke; Webb, Thomas Llewelyn; van Empelen, Pepijn

    2016-01-01

    Background Many online interventions designed to promote health behaviors combine multiple behavior change techniques (BCTs), adopt different modes of delivery (MoD) (eg, text messages), and range in how usable they are. Research is therefore needed to examine the impact of these features on the effectiveness of online interventions. Objective This study applies Classification and Regression Trees (CART) analysis to meta-analytic data, in order to identify synergistic effects of BCTs, MoDs, and usability factors. Methods We analyzed data from Webb et al. This review included effect sizes from 52 online interventions targeting a variety of health behaviors and coded the use of 40 BCTs and 11 MoDs. Our research also developed a taxonomy for coding the usability of interventions. Meta-CART analyses were performed using the BCTs and MoDs as predictors and using treatment success (ie, effect size) as the outcome. Results Factors related to usability of the interventions influenced their efficacy. Specifically, subgroup analyses indicated that more efficient interventions (interventions that take little time to understand and use) are more likely to be effective than less efficient interventions. Meta-CART identified one synergistic effect: Interventions that included barrier identification/problem solving and provided rewards for behavior change reported an average effect size that was smaller (ḡ=0.23, 95% CI 0.08-0.44) than interventions that used other combinations of techniques (ḡ=0.43, 95% CI 0.27-0.59). No synergistic effects were found for MoDs or for MoDs combined with BCTs. Conclusions Interventions that take little time to understand and use were more effective than those that require more time. Few specific combinations of BCTs that contribute to the effectiveness of online interventions were found. Furthermore, no synergistic effects between BCTs and MoDs were found, even though MoDs had strong effects when analyzed univariately in the original study
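    A minimal sketch of a CART analysis on meta-analytic data is shown below. It is not the authors' Meta-CART implementation; it simply fits a regression tree to synthetic effect sizes with binary BCT indicators as predictors, weighting each study by the inverse of its (made-up) sampling variance as one crude way to mimic meta-analytic weighting. The predictor names are hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(1)
n_studies = 52

# Synthetic stand-in for the coded interventions: two binary BCT indicators,
# y is the study effect size (g), var_g is its sampling variance.
barrier_id = rng.integers(0, 2, n_studies)
rewards = rng.integers(0, 2, n_studies)
X = np.column_stack([barrier_id, rewards])
y = 0.4 - 0.2 * (barrier_id & rewards) + rng.normal(0, 0.1, n_studies)
var_g = rng.uniform(0.01, 0.05, n_studies)

# CART on effect sizes, weighting each study by the inverse of its variance,
# one simple way to mimic a meta-analytic weighting scheme.
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=5)
tree.fit(X, y, sample_weight=1.0 / var_g)
print(export_text(tree, feature_names=["barrier_identification", "rewards"]))
```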

  8. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    NASA Astrophysics Data System (ADS)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI, from launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established techniques and more speculative approaches; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  9. Statistical-techniques-based computer-aided diagnosis (CAD) using texture feature analysis: application in computed tomography (CT) imaging to fatty liver disease

    NASA Astrophysics Data System (ADS)

    Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae

    2012-09-01

    This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 × 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of the variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver ( p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest in the average gray level, relatively high in the skewness and the entropy, and relatively low in the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
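    The six histogram-based descriptors named above are standard first-order texture statistics, so a compact sketch of their computation over a region of analysis is given below; the exact normalizations used by the authors (for example of the relative smoothness term) are not stated in the abstract, so the ones here are common textbook choices and the 50 × 50 test ROA is synthetic.

```python
import numpy as np

def texture_features(roa, levels=256):
    """First-order statistical texture descriptors of a region of analysis (ROA).

    Returns the six features named in the paper, computed from the gray-level
    histogram: average gray level, average contrast (standard deviation),
    relative smoothness, skewness (third moment), uniformity and entropy.
    """
    hist, _ = np.histogram(roa, bins=levels, range=(0, levels))
    p = hist.astype(float) / hist.sum()
    g = np.arange(levels)

    mean = np.sum(g * p)                               # average gray level
    variance = np.sum((g - mean) ** 2 * p)
    contrast = np.sqrt(variance)                       # average contrast
    # relative smoothness, with the variance normalized by (L-1)^2 (one common choice)
    smoothness = 1.0 - 1.0 / (1.0 + variance / (levels - 1) ** 2)
    skewness = np.sum((g - mean) ** 3 * p) / (contrast ** 3 + 1e-12)
    uniformity = np.sum(p ** 2)
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return mean, contrast, smoothness, skewness, uniformity, entropy

# Hypothetical 50 x 50 pixel ROA
roa = np.random.default_rng(2).integers(60, 120, size=(50, 50))
print(texture_features(roa))
```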

  10. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price. DATES....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use...

  11. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    NASA Astrophysics Data System (ADS)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of information can result in several software problems related to the huge amount of data available. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potentialities of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered, 10 different regions of interest were selected, and two different approaches were followed. The first one (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier. The second approach (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Two strategies were followed: four classes were considered using Spring and thereafter seven classes were considered for OTB/Monteverdi. The SVM classifier produces slightly better results and presents a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.

  12. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    NASA Astrophysics Data System (ADS)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian motion is defined as P^2 = a^3, where P is the orbital period in units of years and a is the semi-major axis in units of Astronomical Units (AU). Keplerian motion works on small scales such as the size of the Solar System but not on large scales such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze Keplerian motion of systems based on stellar age to seek if Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods such as probabilistic, linear, and proximity based models. In probabilistic and statistical models of outliers, the parameters of a closed-form probability distribution are learned in order to detect the outliers. Linear models use regression analysis based techniques for detecting outliers. Proximity based models use distance based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density based algorithms such as kernel density estimation. In this work, we use unsupervised learning algorithms with only the proximity based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is whether the ratio of planetary mass to stellar mass is less than 0.001. In this work, we present our statistical analysis of the outliers thus detected.
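    A small sketch of the proximity-based part of such an analysis is given below, under the assumption that a k-nearest-neighbour mean distance is an acceptable outlier score; the synthetic catalogue, the feature choice (log period, log semi-major axis, Keplerian residual) and the neighbour count are illustrative guesses rather than the authors' settings, while the final mass-ratio check mirrors the validation criterion stated in the abstract.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical confirmed-exoplanet table: orbital period (years),
# semi-major axis (AU), planetary and stellar masses (solar masses).
rng = np.random.default_rng(3)
a = rng.uniform(0.05, 5.0, 300)
period = a ** 1.5 * rng.lognormal(0.0, 0.05, 300)   # roughly Keplerian with scatter
m_planet = rng.uniform(1e-6, 5e-3, 300)
m_star = rng.uniform(0.5, 1.5, 300)

# Feature: deviation from Keplerian motion, log10(P^2 / a^3) (zero if exact)
kepler_residual = np.log10(period ** 2 / a ** 3)

# Proximity-based outlier score: mean distance to the k nearest neighbours
# in (log period, log a, residual) space.
X = np.column_stack([np.log10(period), np.log10(a), kepler_residual])
nn = NearestNeighbors(n_neighbors=6).fit(X)          # 6 = self + 5 neighbours
dist, _ = nn.kneighbors(X)
score = dist[:, 1:].mean(axis=1)
outliers = np.argsort(score)[-10:]                   # 10 most isolated systems

# Validation criterion from the abstract: planetary-to-stellar mass ratio below 0.001
valid = m_planet[outliers] / m_star[outliers] < 1e-3
print(list(zip(outliers, valid)))
```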

  13. Developing techniques for cause-responsibility analysis of occupational accidents.

    PubMed

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents, determine social responsibility and the role of groups involved in work-related accidents. This study develops an occupational accidents cause tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for determination of responsible groups and responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for the determination of a detailed list of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  14. Analysis of diagnostic calorimeter data by the transfer function technique.

    PubMed

    Delogu, R S; Poggi, C; Pimazzoni, A; Rossi, G; Serianni, G

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing. PMID:26932104
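    A minimal sketch of the frequency-domain deconvolution underlying a transfer-function analysis is shown below; it assumes the calibrated impulse response is available as a sampled waveform and uses a simple Wiener-style regularization that is not necessarily the one adopted by the authors. The exponential impulse response and the two-pulse flux waveform are invented test data.

```python
import numpy as np

def reconstruct_flux(rear_temperature, impulse_response, eps=1e-3):
    """Recover the front-side flux waveform from a rear-side temperature trace.

    Frequency-domain (transfer-function) deconvolution via the FFT, with a
    Wiener-style regularization term to keep the division stable where the
    transfer function is small.
    """
    T = np.fft.rfft(rear_temperature)
    H = np.fft.rfft(impulse_response, n=len(rear_temperature))
    denom = np.abs(H) ** 2 + eps * np.max(np.abs(H)) ** 2
    return np.fft.irfft(T * np.conj(H) / denom, n=len(rear_temperature))

# Hypothetical calibration data: exponential thermal impulse response and a
# rear-side trace produced by two rectangular flux pulses (beamlets).
dt = 1e-3
t = np.arange(0.0, 2.0, dt)
h = np.exp(-t / 0.05)
h /= h.sum()
true_flux = np.zeros_like(t)
true_flux[200:260] = 1.0
true_flux[800:860] = 0.5
measured = np.convolve(true_flux, h)[: t.size]
recovered = reconstruct_flux(measured, h)
print(f"first pulse recovered near t = {np.argmax(recovered) * dt:.3f} s")
```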

  15. CT-based morphometric analysis of C1 laminar dimensions: C1 translaminar screw fixation is a feasible technique for salvage of atlantoaxial fusions

    PubMed Central

    Yew, Andrew; Lu, Derek; Lu, Daniel C.

    2015-01-01

    Background: Translaminar screw fixation has become an alternative in the fixation of the axial and subaxial cervical spine. We report utilization of this approach in the atlas as a salvage technique for atlantoaxial stabilization when C1 lateral mass screws are precluded. To assess the feasibility of translaminar fixation at the atlas, we have characterized the dimensions of the C1 lamina in the general adult population using computed tomography (CT)-based morphometry. Methods: A 46-year-old male with symptomatic atlantoaxial instability secondary to os odontoideum underwent bilateral C1 and C2 translaminar screw/rod fixation as C1 lateral mass fixation was precluded by an anomalous vertebral artery. The follow-up evaluation 2½ years postoperatively revealed an asymptomatic patient without recurrent neck/shoulder pain or clinical signs of instability. To better assess the feasibility of utilizing this approach in the general population, we retrospectively analyzed 502 consecutive cervical CT scans performed over a 3-month period in patients aged over 18 years at a single institution. Measurements of C1 bicortical diameter, bilateral laminar length, height, and angulation were performed. Laminar and screw dimensions were compared to assess instrumentation feasibility. Results: Review of CT imaging found that 75.9% of C1 lamina had a sufficient bicortical diameter, and 63.7% of C1 lamina had sufficient height to accept bilateral translaminar screw placement. Conclusions: CT-based measurement of atlas morphology in the general population revealed that a majority of C1 lamina had sufficient dimensions to accept translaminar screw placement. Although these screws appear to be a feasible alternative when lateral mass screws are precluded, further research is required to determine if they provide comparable fixation strength versus traditional instrumentation methods. PMID:26005585

  16. Advanced Techniques for Root Cause Analysis

    2000-09-19

    Five items make up this package, or can be used individually. The Chronological Safety Management Template utilizes a linear adaptation of the Integrated Safety Management System laid out in the form of a template that greatly enhances the ability of the analyst to perform the first step of any investigation, which is to gather all pertinent facts and identify causal factors. The Problem Analysis Tree is a simple three (3) level problem analysis tree which is easier for organizations outside of WSRC to use. Another part is the Systemic Root Cause Tree. One of the most basic and unique features of Expanded Root Cause Analysis is the Systemic Root Cause portion of the Expanded Root Cause Pyramid. The Systemic Root Causes are even more basic than the Programmatic Root Causes and represent Root Causes that cut across multiple (if not all) programs in an organization. The Systemic Root Cause portion contains 51 causes embedded at the bottom level of a three level Systemic Root Cause Tree that is divided into logical, organizationally based categories to assist the analyst. The Computer Aided Root Cause Analysis allows the analyst at each level of the Pyramid to a) obtain a brief description of the cause that is being considered, b) record a decision that the item is applicable, c) proceed to the next level of the Pyramid to see only those items at the next level of the tree that are relevant to the particular cause that has been chosen, and d) at the end of the process automatically print out a summary report of the incident, the causal factors as they relate to the safety management system, the probable causes, apparent causes, Programmatic Root Causes and Systemic Root Causes for each causal factor, and the associated corrective action.

  17. Effective learning techniques for military applications using the Personalized Assistant that Learns (PAL) enhanced Web-Based Temporal Analysis System (WebTAS)

    NASA Astrophysics Data System (ADS)

    LaMonica, Peter; Dziegiel, Roger; Liuzzi, Raymond; Hepler, James

    2009-05-01

    The Personalized Assistant that Learns (PAL) Program is a Defense Advanced Research Projects Agency (DARPA) research effort that is advancing technologies in the area of cognitive learning by developing cognitive assistants to support military users, such as commanders and decision makers. The Air Force Research Laboratory's (AFRL) Information Directorate leveraged several core PAL components and applied them to the Web-Based Temporal Analysis System (WebTAS) so that users of this system can have automated features, such as task learning, intelligent clustering, and entity extraction. WebTAS is a modular software toolset that supports fusion of large amounts of disparate data sets, visualization, project organization and management, pattern analysis and activity prediction, and includes various presentation aids. WebTAS is predominantly used by analysts within the intelligence community and with the addition of these automated features, many transition opportunities exist for this integrated technology. Further, AFRL completed an extensive test and evaluation of this integrated software to determine its effectiveness for military applications in terms of timeliness and situation awareness, and these findings and conclusions, as well as future work, will be presented in this report.

  18. Investigation of electroforming techniques, literature analysis report

    NASA Technical Reports Server (NTRS)

    Malone, G. A.

    1975-01-01

    A literature analysis is presented of reports, specifications, and documented experiences with the use of electroforming to produce copper and nickel structures for aerospace and other engineering applications. The literature period covered is from 1948 to 1974. Specific effort was made to correlate mechanical property data for the electrodeposited material with known electroforming solution compositions and operating conditions. From this survey, electrolytes are suggested for selection to electroform copper and nickel outer shells on regeneratively cooled thrust chamber liners, and other devices subject to thermal and pressure exposure, based on mechanical properties obtainable, performance under various thermal environments, and ease of process control for product reproducibility. Processes of potential value in obtaining sound bonds between electrodeposited copper and nickel and copper alloy substrates are also discussed.

  19. DCT-based cyber defense techniques

    NASA Astrophysics Data System (ADS)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content, in order to attack the end user. Most of the attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel, real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain, and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
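    To make the idea of operating on DCT coefficients concrete, the sketch below applies a fixed clipping threshold to the mid/high-frequency coefficients of each 8 × 8 block; this is only a generic stand-in for the paper's smart-threshold technique, and the block size, frequency mask and clipping limit are arbitrary illustrative choices.

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(block):
    return idct(idct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def clip_high_freq(image, limit=8.0):
    """Clip mid/high-frequency DCT coefficients of every 8x8 block to +/- limit.

    Embedded payloads often live in the mid/high-frequency coefficients;
    clipping them bounds how much information such an embedding can carry
    while leaving the perceptually important low-frequency content untouched.
    """
    out = image.astype(float).copy()
    h, w = image.shape
    for i in range(0, h - h % 8, 8):
        for j in range(0, w - w % 8, 8):
            c = dct2(out[i:i + 8, j:j + 8])
            mask = np.add.outer(np.arange(8), np.arange(8)) >= 6   # mid/high-frequency region
            c[mask] = np.clip(c[mask], -limit, limit)
            out[i:i + 8, j:j + 8] = idct2(c)
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.random.default_rng(4).integers(0, 256, (64, 64)).astype(np.uint8)
print(clip_high_freq(img).shape)
```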

  20. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  1. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  2. Neutron Activation Analysis: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    MacLellan, Ryan

    2011-04-01

    The role of neutron activation analysis in low-energy, low-background experiments is discussed in terms of comparable methods. Radiochemical neutron activation analysis is introduced. The procedure of instrumental neutron activation analysis is detailed, especially with respect to the measurement of trace amounts of natural radioactivity. The determination of reactor neutron spectrum parameters required for neutron activation analysis is also presented.

  3. Neutron Activation Analysis: Techniques and Applications

    SciTech Connect

    MacLellan, Ryan

    2011-04-27

    The role of neutron activation analysis in low-energy, low-background experiments is discussed in terms of comparable methods. Radiochemical neutron activation analysis is introduced. The procedure of instrumental neutron activation analysis is detailed, especially with respect to the measurement of trace amounts of natural radioactivity. The determination of reactor neutron spectrum parameters required for neutron activation analysis is also presented.

  4. A Quantitative Study of Gully Erosion Based on Object-Oriented Analysis Techniques: A Case Study in Beiyanzikou Catchment of Qixia, Shandong, China

    PubMed Central

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion of a small watershed gully system at Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder line of gullies from high spatial resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder line of gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract information on gullies; the average distance between field-measured points along the edge of the gullies and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion area and volume of the two gullies are 2141.6250 m2, 5074.1790 m3 and 1316.1250 m2, 1591.5784 m3, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626
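    The erosion-volume step can be illustrated compactly: assuming the shoulder line and the gully interior are available as raster masks over a DEM, the sketch below fits a plane to the rim elevations by least squares (a linear regression, as described above) and integrates the difference between that fitted surface and the present DEM. The toy DEM and masks are invented; they are not the Beiyanzikou data.

```python
import numpy as np

def erosion_volume(dem, rim_mask, gully_mask, cell_area_m2):
    """Fit the pre-erosion surface to the gully rim and integrate the volume.

    dem         : 2-D array of elevations (m)
    rim_mask    : True on cells along the extracted gully shoulder line
    gully_mask  : True on cells inside the gully
    The original surface is modelled as a plane z = a*x + b*y + c fitted by
    least squares to the rim elevations.
    """
    rows, cols = np.nonzero(rim_mask)
    A = np.column_stack([cols, rows, np.ones(rows.size)])
    coeff, *_ = np.linalg.lstsq(A, dem[rows, cols], rcond=None)

    gr, gc = np.nonzero(gully_mask)
    fitted = coeff[0] * gc + coeff[1] * gr + coeff[2]
    depth = np.clip(fitted - dem[gr, gc], 0.0, None)   # only count material removed
    area = gully_mask.sum() * cell_area_m2
    volume = depth.sum() * cell_area_m2
    return area, volume

# Hypothetical 1 m resolution DEM with a 2 m deep trough
dem = np.full((50, 50), 100.0)
dem[20:30, 10:40] -= 2.0
gully = np.zeros_like(dem, dtype=bool)
gully[20:30, 10:40] = True
rim = np.zeros_like(gully)
rim[19, 10:40] = True
rim[30, 10:40] = True
print(erosion_volume(dem, rim, gully, cell_area_m2=1.0))   # expect (300 m2, ~600 m3)
```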

  5. Using remote sensing techniques and field-based structural analysis to explore new gold and associated mineral sites around Al-Hajar mine, Asir terrane, Arabian Shield

    NASA Astrophysics Data System (ADS)

    Sonbul, Abdullah R.; El-Shafei, Mohamed K.; Bishta, Adel Z.

    2016-05-01

    Modern earth resource satellites provide huge amounts of digital imagery at different resolutions. These satellite imageries are considered one of the most significant sources of data for mineral exploration. Image processing techniques were applied to the exposed rocks around the Al-Aqiq area of the Asir terrane in the southern part of the Arabian Shield. The area under study has two sub-parallel N-S trending metamorphic belts of green-schist facies. The first belt is located southeast of Al-Aqiq, where the Al-Hajar Gold Mine is situated. It is essentially composed of metavolcanic and metasedimentary rocks, and it is intruded by different plutonic rocks of primarily diorite, syenite and porphyritic granite. The second belt is located northwest of Al-Aqiq, and it is composed of metavolcanic and metasedimentary rocks and is intruded by granite bodies. The current study aimed to distinguish the lithological units, detect and map the alteration zones, and extract the major fault lineaments around the Al-Hajar gold prospect. Digital satellite imageries, including Landsat 7 ETM+ multispectral and panchromatic and SPOT-5, were used in addition to field verification. Areas with spectral signatures similar to the prospect were identified in the nearby metamorphic belt; this was considered a target area and was inspected in the field. The relationships between the alteration zones, the mineral deposits and the structural elements were used to locate the ore-bearing zones in the subsurface. The metasedimentary units of the target area showed dextral-ductile, top-to-the-north shearing and the presence of a dominant mineralized quartz vein system. The area to the north of the Al-Hajar prospect also showed sub-parallel shear zones along which different types of alteration were detected. Field-based criteria such as hydrothermal breccia, jasper, iron gossans and porphyritic granite strongly indicate the presence of porphyry-type ore deposits in the Al-Hajar metamorphic belt that

  6. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
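    The uncertain-evidence idea can be shown on a toy two-node network. In the sketch below, the user-specified distribution over the evidence node is treated as virtual (soft) evidence, i.e. as a likelihood vector attached to the evidence node; the prior and conditional probability values are invented for illustration, and the computation is done directly with numpy rather than with the commercial package mentioned in the abstract.

```python
import numpy as np

# Toy two-node network: hypothesis H -> evidence E, both binary.
p_h = np.array([0.3, 0.7])                 # P(H = true), P(H = false)
p_e_given_h = np.array([[0.9, 0.1],        # rows: H state, cols: E state
                        [0.2, 0.8]])       # P(E | H)

def posterior_soft_evidence(likelihood_e):
    """Posterior of H when the evidence node's state is itself uncertain.

    likelihood_e is a user-specified weight vector over the states of E
    (virtual/soft evidence); this is equivalent to augmenting the network
    with a dummy child of E whose CPT encodes these weights.
    """
    # P(H | soft evidence) is proportional to P(H) * sum_e P(e | H) * L(e)
    unnorm = p_h * (p_e_given_h @ likelihood_e)
    return unnorm / unnorm.sum()

print(posterior_soft_evidence(np.array([1.0, 0.0])))   # hard evidence: E observed true
print(posterior_soft_evidence(np.array([0.7, 0.3])))   # evidence known only with 70/30 weight
```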

  7. Cochlear implant simulator for surgical technique analysis

    NASA Astrophysics Data System (ADS)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  8. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    PubMed Central

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the amount of retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them with a retention apparatus. Readings were subjected to the nonparametric Friedman two-way analysis of variance followed by Bonferroni correction methods and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by injection molding (3740 g) and anchorized techniques (2913 g) recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique, and the least retention was seen in the conventional molding technique. PMID:27382542
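    For readers unfamiliar with the statistical workflow, a short sketch of the Friedman test followed by Bonferroni-corrected pairwise Wilcoxon matched-pairs tests is given below; the force values are synthetic numbers drawn around the reported group means, not the study's raw data.

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

# Hypothetical dislodgement forces (g) for 10 patients, one array per
# processing technique; values are made up around the reported means.
rng = np.random.default_rng(5)
conventional = rng.normal(2468, 150, 10)
anchorized   = rng.normal(2913, 150, 10)
injection    = rng.normal(3740, 150, 10)

# Friedman two-way analysis of variance by ranks across the related samples
stat, p = friedmanchisquare(conventional, anchorized, injection)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")

# Pairwise Wilcoxon matched-pairs signed-rank tests with Bonferroni correction
pairs = [("conv vs anch", conventional, anchorized),
         ("conv vs inj",  conventional, injection),
         ("anch vs inj",  anchorized,  injection)]
alpha_adj = 0.05 / len(pairs)
for name, a, b in pairs:
    w, p_pair = wilcoxon(a, b)
    print(f"{name}: p = {p_pair:.4f}, "
          f"significant at Bonferroni-adjusted alpha {alpha_adj:.4f}: {p_pair < alpha_adj}")
```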

  9. A procedural analysis of correspondence training techniques

    PubMed Central

    Paniagua, Freddy A.

    1990-01-01

    A variety of names have been given to procedures used in correspondence training, some more descriptive than others. In this article I argue that a terminology more accurately describing actual procedures, rather than the conceptual function that those procedures are assumed to serve, would benefit the area of correspondence training. I identify two documented procedures during the reinforcement of verbalization phase and five procedures during the reinforcement of correspondence phase and suggest that those procedures can be classified, or grouped into nonoverlapping categories, by specifying the critical dimensions of those procedures belonging to a single category. I suggest that the names of such nonoverlapping categories should clearly specify the dimensions on which the classification is based in order to facilitate experimental comparison of procedures, and to be able to recognize when a new procedure (as opposed to a variant of one already in existence) is developed. Future research involving comparative analysis across and within procedures is discussed within the framework of the proposed classification. PMID:22478059

  10. Ray tracing analysis of inclined illumination techniques.

    PubMed

    Sinkó, József; Szabó, Gábor; Erdélyi, Miklós

    2014-08-11

    The reduction of out-of-focus signal is a general task in fluorescence microscopy and is especially important in the recently developed super-resolution techniques because of the degradation of the final image. Several illumination methods have been developed to provide a decreased out-of-focus signal level relative to common epifluorescence illumination. In this paper we examine the highly inclined and the total internal reflection illumination techniques using the ray tracing method. Two merit functions were introduced for the quantitative description of the excitation of the selected region. We studied the feasibility of the illumination methods and the corrections required by the imperfections of the optical elements.

  11. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  12. Passive analysis technique for packet link performance

    NASA Astrophysics Data System (ADS)

    Fairhurst, G.; Wan, P. S.

    1993-01-01

    The performance of a bearer link is usually assessed by bit error rate (BER) tests or measurement of the error free seconds (EFS). These require exclusive access to the link. An alternative technique is presented that measures performance by passive observation of the frames passing over a packet link. This may be used to estimate the performance of the link.
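    A minimal sketch of what such passive estimation might look like is given below: given only a record of which observed frames passed their checksum, it derives the frame error rate, an approximate bit error rate under an independent-bit-error assumption, and the fraction of error-free intervals. The frame length, interval size and simulated capture are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def link_stats(frame_ok, frame_len_bits, frames_per_interval=None):
    """Estimate link quality from passively observed frames.

    frame_ok       : boolean array, True if a frame's checksum passed
    frame_len_bits : nominal frame length in bits
    Returns the frame error rate, an approximate bit error rate (assuming
    independent bit errors), and the fraction of error-free intervals.
    """
    frame_ok = np.asarray(frame_ok, dtype=bool)
    fer = 1.0 - frame_ok.mean()
    # P(frame error) = 1 - (1 - BER)^n  =>  BER = 1 - (1 - FER)^(1/n)
    ber = 1.0 - (1.0 - fer) ** (1.0 / frame_len_bits)

    efs = None
    if frames_per_interval is not None:
        n_int = frame_ok.size // frames_per_interval
        blocks = frame_ok[: n_int * frames_per_interval].reshape(n_int, frames_per_interval)
        efs = blocks.all(axis=1).mean()      # fraction of error-free intervals
    return fer, ber, efs

# Hypothetical capture: 10,000 frames of 1024 bytes, roughly 0.5% corrupted
rng = np.random.default_rng(6)
observed = rng.random(10_000) > 0.005
print(link_stats(observed, frame_len_bits=1024 * 8, frames_per_interval=100))
```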

  13. Recent trends in particle size analysis techniques

    NASA Technical Reports Server (NTRS)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed in terms of three operating principles, including particle size and shape descriptions. Significant trends in recently developed particle size analysing equipment show that compact electronic circuitry and rapid data processing systems have mainly been adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  14. Noninvasive in vivo glucose sensing using an iris based technique

    NASA Astrophysics Data System (ADS)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane, and an insulin/dextrose protocol was used to control blood glucose concentration. To further help restrict eye movement, a developed ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations are within 20% of the reference values.
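    A sketch of an image-based PLS calibration of the kind described is shown below, using scikit-learn's PLSRegression on synthetic data standing in for flattened iris images and glucometer references; the number of latent components, the train/validation split and the 20% agreement check (a rough stand-in for a Clarke error grid zone) are illustrative choices rather than the study's actual settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Hypothetical data: each row is a flattened near-infrared iris image (or a
# feature vector derived from it); y is the reference blood glucose from the
# handheld glucometer (mg/dL).  All values are synthetic.
rng = np.random.default_rng(7)
n_images, n_pixels = 120, 400
X = rng.normal(size=(n_images, n_pixels))
true_weights = rng.normal(size=n_pixels) * 0.05
y = 100 + X @ true_weights * 50 + rng.normal(0, 5, n_images)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=8)       # number of latent variables chosen by the analyst
pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()

# Simple agreement check: fraction of predictions within 20% of the reference
within_20pct = np.mean(np.abs(y_pred - y_val) / y_val < 0.20)
print(f"fraction of predictions within 20% of reference: {within_20pct:.2f}")
```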

  15. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 7, April 1, 1993--June 31, 1993

    SciTech Connect

    Smith, D.M.

    1993-09-01

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultramicro pores but they also exhibit meso and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited because of this broad pore size range, microporosity, reactive nature of coal, samples must be completely dried, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal`s structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. We will investigate the dependence of the common NMR parameters such as chemical shifts and relaxation times of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules ({sup 129}Xe, {sup 3}He, {sup 2}H{sub 2},{sup 14}N{sub 2}, {sup 14}NH{sub 3}, {sup 15}N{sub 2},{sup 13}CH{sub 4}, {sup 13}CO{sub 2}) and the pore surfaces in coals.

  16. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 3, July 1, 1992--September 30, 1992

    SciTech Connect

    Smith, D.M.

    1992-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores but they also exhibit meso and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited because of this broad pore size range, microporosity, reactive nature of coal, samples must be completely dried, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal`s structure, it is apparent that better techniques are necessary. We believe that measurement of the NMR parameters of various gas phase and adsorbed phase NMR active probes can provide the resolution to this problem. We now have two suites of well-characterized microporous materials including oxides (zeolites and silica gel) and activated carbons from our industrial partner, Air Products in Allentown, PA. Our current work may be divided into three areas: small-angle X-ray scattering (SAXS), adsorption, and NMR.

  17. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 9, October 1, 1993--December 30, 1993

    SciTech Connect

    Smith, D.M.

    1993-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores but they also exhibit meso and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited because of this broad pore size range, microporosity, reactive nature of coal, samples must be completely dried, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal`s structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. We will investigate the dependence of the common NMR parameters such as chemical shifts and relaxation times of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules ({sup 129}Xe, {sup 3}He, {sup 14}N{sub 2}, {sup 14}NH{sub 3}, {sup 15}N{sub 2}, {sup 13}CH{sub 4}, {sup 13}CO{sub 2}) and the pore surface. Our current work may be divided into three areas: small-angle X-ray scattering (SAXS), adsorption, and NMR.

  18. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 8, 7/1/93--9/30/93

    SciTech Connect

    Smith, D.M.

    1993-12-31

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultramicro pores but they also exhibit meso and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited because of this broad pore size range, the microporosity, the reactive nature of coal, the need to dry samples completely, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. The dependence of the common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals is investigated. In particular, the interaction between several small molecules ({sup 129}Xe, {sup 3}He, {sup 14}N{sub 2}, {sup 14}NH{sub 3}, {sup 15}N{sub 2}, {sup 13}CH{sub 4}, {sup 13}CO{sub 2}) and the pore surfaces is studied.

  19. Advanced NMR-based techniques for pore structure analysis of coal. Quarterly report No. 6, January 1, 1993--March 31, 1993

    SciTech Connect

    Smith, D.M.

    1993-08-01

    One of the main problems in coal utilization is the inability to properly characterize its complex pore structure. Coals typically have micro/ultra-micro pores but they also exhibit meso and macroporosity. Conventional pore size techniques (adsorption/condensation, mercury porosimetry) are limited because of this broad pore size range, the microporosity, the reactive nature of coal, the need to dry samples completely, and network/percolation effects. Small angle scattering is limited because it probes both open and closed pores. Although one would not expect any single technique to provide a satisfactory description of a coal's structure, it is apparent that better techniques are necessary. Small angle scattering could be improved by combining scattering and adsorption measurements. Also, the measurement of NMR parameters of various gas phase and adsorbed phase NMR active probes can provide pore structure information. We will investigate the dependence of the common NMR parameters, such as chemical shifts and relaxation times, of several different nuclei and compounds on the pore structure of model microporous solids, carbons, and coals. In particular, we will study the interaction between several small molecules ({sup 129}Xe, {sup 3}He, {sup 2}H{sub 2}, {sup 14}N{sub 2}, {sup 14}NH{sub 3}, {sup 15}N{sub 2}, {sup 13}CH{sub 4}, {sup 13}CO{sub 2}) and the pore surfaces in coals.

  20. Fractal Spectrum Technique for Quantitative Analysis of Volcanic Particle Shapes

    NASA Astrophysics Data System (ADS)

    Maria, A. H.; Carey, S. N.

    2001-12-01

    The shapes of volcanic particles reflect numerous eruptive parameters (e.g. magma viscosity, volatile content, degree of interaction with water) and are useful for understanding fragmentation and transport processes associated with volcanic eruptions. However, quantitative analysis of volcanic particle shapes has proven difficult due to their morphological complexity and variability. Shape analysis based on fractal geometry has been successfully applied to a wide variety of particles and appears to be well suited for describing complex features. The technique developed and applied to volcanic particles in this study uses fractal data produced by dilation of the 2-D particle boundary to produce a full spectrum of fractal dimensions over a range of scales for each particle. Multiple fractal dimensions, which can be described as a fractal spectrum curve, are calculated by taking the first derivative of data points on a standard Richardson plot. Quantitative comparisons are carried out using multivariate statistical techniques such as cluster and principal components analysis. Compared with previous fractal methods that express shape in terms of only one or two fractal dimensions, use of multiple fractal dimensions results in more effective discrimination between samples. In addition, the technique eliminates the subjectivity associated with selecting linear segments on Richardson plots for fractal dimension calculation, and allows direct comparison of particles as long as instantaneous dimensions used as input to multivariate analyses are selected at the same scales for each particle. Applications to samples from well documented eruptions (e.g. Mt. St. Helens, Tambora, Surtsey) indicate that the fractal spectrum technique provides a useful means of characterizing volcanic particles and can be helpful for identifying the products of specific fragmentation processes (volatile exsolution, phreatomagmatic, quench granulation) and modes of volcanic deposition (tephra fall
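
    As a rough illustration of the boundary-dilation idea described above, the sketch below computes a spectrum of scale-dependent fractal dimensions for a synthetic 2-D particle outline by taking the local slope of a Richardson-type plot. The dilation radii, the synthetic outline, and the use of NumPy/SciPy are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: fractal-spectrum shape analysis of a 2-D particle outline via
# boundary dilation (Minkowski "sausage" method). All names are illustrative.
import numpy as np
from scipy import ndimage

def fractal_spectrum(boundary_mask, radii):
    """boundary_mask: 2-D bool array, True on the particle outline.
    radii: increasing dilation radii in pixels.
    Returns (log_radii, local_dimensions) -- the 'fractal spectrum'."""
    log_r, log_p = [], []
    for r in radii:
        # Dilate the 1-pixel boundary with a disk of radius r
        y, x = np.ogrid[-r:r + 1, -r:r + 1]
        disk = x**2 + y**2 <= r**2
        sausage = ndimage.binary_dilation(boundary_mask, structure=disk)
        # Perimeter estimate at scale r: dilated area divided by band width 2r
        log_r.append(np.log(r))
        log_p.append(np.log(sausage.sum() / (2.0 * r)))
    log_r, log_p = np.array(log_r), np.array(log_p)
    # Richardson-plot slope varies with scale; take the first derivative
    slope = np.gradient(log_p, log_r)
    # Instantaneous fractal dimension at each scale (D = 1 - slope)
    return log_r, 1.0 - slope

# Example: a jagged synthetic outline
img = np.zeros((256, 256), bool)
t = np.linspace(0, 2 * np.pi, 2000)
rad = 80 + 8 * np.sin(9 * t)
img[(128 + rad * np.sin(t)).astype(int), (128 + rad * np.cos(t)).astype(int)] = True

scales, dims = fractal_spectrum(img, radii=range(2, 20, 2))
print(dims)  # the kind of input vector fed to cluster / principal components analysis
```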

  1. Liquid Tunable Microlenses based on MEMS techniques

    PubMed Central

    Zeng, Xuefeng; Jiang, Hongrui

    2013-01-01

    The recent rapid development in microlens technology has provided many opportunities for miniaturized optical systems and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest because their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlenses generally relies on microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses is first discussed, followed by a description of the three main categories of tunable microlenses involving MEMS techniques: mechanically driven, electrically driven, and those integrated within microfluidic systems. PMID:24163480

  2. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O., Jr.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
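
    A minimal sketch of this uniform technique, under the assumption that SciPy's Pearson Type III distribution is fitted by the method of moments to the base-10 logarithms of hypothetical annual peaks (station-skew weighting and other Bulletin 17-style adjustments are omitted):

```python
# Minimal sketch of the uniform flood-frequency technique: fit the logs of
# annual peak discharges to a Pearson Type III distribution by the method of
# moments. Peak values are hypothetical; real studies use long gauged records.
import numpy as np
from scipy import stats

peaks_cfs = np.array([12000, 9800, 15400, 7600, 21000, 11000,
                      8900, 13500, 17800, 10200])   # hypothetical annual peaks
logq = np.log10(peaks_cfs)

mean, std = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)                 # sample (station) skew

# Pearson Type III fitted by moments to the log-transformed peaks
lp3 = stats.pearson3(skew, loc=mean, scale=std)

for T in (2, 10, 50, 100):                          # return periods in years
    q_T = 10 ** lp3.ppf(1 - 1.0 / T)                # non-exceedance probability
    print(f"{T:>4}-yr flood ≈ {q_T:,.0f} cfs")
```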

  3. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks that is noninvasive and nondestructive and can be applied in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  4. Bayesian Analysis of the Pattern Informatics Technique

    NASA Astrophysics Data System (ADS)

    Cho, N.; Tiampo, K.; Klein, W.; Rundle, J.

    2007-12-01

    Pattern informatics (PI) [Rundle et al., 2000; Tiampo et al., 2002; Holliday et al., 2005] is a technique that uses phase dynamics to quantify temporal variations in seismicity patterns. This technique has shown interesting results for forecasting earthquakes with magnitude greater than or equal to 5 in southern California from 2000 to 2010 [Rundle et al., 2002]. In this work, a Bayesian approach is used to obtain a modified version of the PI called Bayesian pattern informatics (BPI). This alternative method uses the PI result as a prior probability and models such as ETAS [Ogata, 1988, 2004; Helmstetter and Sornette, 2002] or BASS [Turcotte et al., 2007] to obtain the likelihood. Its result is similar to that obtained by the PI: the determination of regions, known as hotspots, that are most susceptible to the occurrence of events with M ≥ 5 during the forecast period. As an initial test, retrospective forecasts for the southern California region from 1990 to 2000 were made with both the BPI and the PI techniques, and the results are discussed in this work.
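
    A hedged sketch of the Bayesian update at the heart of the BPI idea: a PI hotspot map supplies the prior for each grid cell, a clustering-model rate forecast supplies a Poisson likelihood of at least one M ≥ 5 event, and the normalized product gives posterior hotspot probabilities. The arrays and the 80th-percentile hotspot cutoff are invented for illustration.

```python
# Hedged sketch of the BPI-style Bayesian update. Arrays are hypothetical;
# a real application would use the PI map and an ETAS/BASS-style forecast.
import numpy as np

pi_prior = np.array([[0.01, 0.05, 0.02],
                     [0.10, 0.30, 0.08],
                     [0.02, 0.06, 0.01]])      # PI hotspot intensities per cell
rate_forecast = np.array([[0.2, 1.1, 0.4],
                          [0.9, 2.5, 0.7],
                          [0.3, 0.8, 0.2]])    # expected M>=5 rate per cell

# Poisson likelihood of at least one M>=5 event during the forecast window
likelihood = 1.0 - np.exp(-rate_forecast)

posterior = pi_prior * likelihood
posterior /= posterior.sum()                   # normalize over the region

hotspots = posterior > np.percentile(posterior, 80)   # illustrative cutoff
print(np.round(posterior, 3))
print(hotspots)
```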

  5. Typology of Delivery Quality: Latent Profile Analysis of Teacher Engagement and Delivery Techniques in a School-Based Prevention Intervention, "Keepin' It REAL" Curriculum

    ERIC Educational Resources Information Center

    Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.

    2014-01-01

    Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula; however, less is known about the implementation quality of prevention content, especially among teachers who may…

  6. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small sample size problem which is encountered when dealing with analysis of space-flight data is examined. Because of the small amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on the considerations needed to choose the most appropriate test for a given type of analysis.

  7. Impact during equine locomotion: techniques for measurement and analysis.

    PubMed

    Burn, J F; Wilson, A; Nason, G P

    1997-05-01

    Impact is implicated in the development of several types of musculoskeletal injury in the horse. Characterisation of the impact experienced during strenuous exercise is an important first step towards understanding the mechanism of injury. Measurement and analysis of large, short-duration impacts is difficult. The measurement system must be able to record transient peaks and high frequencies accurately. The analysis technique must be able to characterise the impact signal in time and frequency. This paper presents a measurement system and analysis technique for the characterisation of large impacts. A piezo-electric accelerometer was securely mounted on the dorsal surface of the horse's hoof. Saddle-mounted charge amplifiers and a 20 m coaxial cable transferred these data to a PC-based logging system. Data were downloaded onto a UNIX workstation and analysed using a proprietary statistics package. The values of parameters calculated from the time series data were comparable to those of other authors. A wavelet decomposition showed that the frequency profile of the signal changed with time. While most spectral energy was seen at impact, a significant amount of energy was contained in the signal immediately following impact. Over 99% of this energy was contained in frequencies less than 1250 Hz. The sampling rate and the frequency response of a measurement system for recording impact should be chosen carefully to prevent loss or corruption of data. Time-scale analysis using a wavelet decomposition is a powerful technique which can be used to characterise impact data. The use of contour plots provides a highly visual representation of the time and frequency localisation of power during impact.
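
    The time-scale decomposition described above can be sketched with a discrete wavelet transform; the sampling rate, synthetic impact waveform, and choice of a Daubechies wavelet below are assumptions rather than the paper's exact processing chain.

```python
# Illustrative time-scale analysis of an impact-like signal using a multilevel
# discrete wavelet decomposition (PyWavelets). All parameters are synthetic.
import numpy as np
import pywt

fs = 10_000                                    # Hz; resolves transients well above 1250 Hz
t = np.arange(0, 0.1, 1 / fs)
# Synthetic hoof impact: sharply decaying oscillation on top of slow limb motion
accel = np.exp(-t / 0.004) * np.sin(2 * np.pi * 800 * t) + 0.2 * np.sin(2 * np.pi * 30 * t)

coeffs = pywt.wavedec(accel, wavelet='db4', level=5)

# Energy per scale: shows where in time-frequency the impact power is localised
for i, c in enumerate(coeffs):
    band = "approx" if i == 0 else f"detail {len(coeffs) - i}"
    print(f"{band:>9}: energy = {np.sum(c**2):8.3f}")
```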

  8. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... published a proposed rule in the Federal Register at 77 FR 40552 on July 10, 2012, to clarify and pinpoint a... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify and give a precise reference in the use of a price analysis technique in order to establish a...

  9. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  10. Soil Analysis using the semi-parametric NAA technique

    SciTech Connect

    Zamboni, C. B.; Silveira, M. A. G.; Medina, N. H.

    2007-10-26

    The semi-parametric Neutron Activation Analysis technique, using Au as a flux monitor, was applied to measure the element concentrations of Br, Ca, Cl, K, Mn and Na for soil characterization. The results were compared with those obtained using the Instrumental Neutron Activation Analysis technique and were found to be compatible. The viability, advantages, and limitations of using these two analytical methodologies are discussed.

  11. Trends and Techniques for Space Base Electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1979-01-01

    Simulations of various phosphorus and boron diffusions in SOS were completed, and a sputtering system, furnaces, and photolithography-related equipment were set up. Double-layer metal experiments initially utilized wet chemistry techniques. By incorporating ultrasonic etching of the vias, premetal cleaning with a modified buffered HF, phosphorus-doped vapox, and extended sintering, yields of 98% were obtained using the standard test pattern. A two-dimensional modeling program was written for simulating short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption used is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. Although the program is incomplete, the solution of the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2-D MOSFET simulation programs is summarized.

  12. Morphometric techniques for orientation analysis of karst in northern Florida

    SciTech Connect

    Jenkins, D.T.; Beck, B.F.

    1985-01-01

    Morphometric techniques for the analysis of karst landscape orientation data based on swallet catchment areas can be highly inadequate. The long axes of catchment areas may not coincide with structural control, especially in regions having very low relief. Better structural correlation was observed using multiple linear trend measurements of closed depressions rather than drainage basins. Trend analysis was performed on four areas, approximately 25 km² each, forming a sequence from the Suwannee River to the Cody Escarpment in northern Florida. This area is a karst plain, mantled by 12 to 25 meters of unconsolidated sands and clays. Structural control was examined by tabulating the azimuths of distinct linear trends as determined from depression shape based on 1:24,000 topographic maps. The topography was characterized by 1872 individual swallet catchment areas or 1457 closed depressions. The common geomorphic technique of analyzing orientation data in 10° increments beginning with 0° may yield incorrect peak width and placement. To correctly detect all significant orientation peaks, all possible combinations of peak width and placement must be tested. Fifty-five different plots were reviewed and tested for each area.
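
    The sensitivity to bin placement noted above can be illustrated with a short sketch that bins synthetic depression azimuths into 10° classes for every possible bin offset; the clustered trend directions are invented.

```python
# Hedged sketch: the apparent orientation peak of azimuth data depends on where
# the first 10-degree bin starts, so every bin placement is tested. Data are
# synthetic stand-ins for depression long-axis trends (0-180 degrees).
import numpy as np

rng = np.random.default_rng(8)
azimuths = np.concatenate([rng.normal(40, 6, 300),     # cluster near 040
                           rng.normal(120, 8, 250),    # cluster near 120
                           rng.uniform(0, 180, 200)]) % 180

bin_width = 10
for offset in range(bin_width):                        # all possible bin placements
    shifted = (azimuths - offset) % 180                # bins now start at `offset` deg
    counts, edges = np.histogram(shifted, bins=np.arange(0, 181, bin_width))
    k = int(np.argmax(counts))
    lo = (edges[k] + offset) % 180
    print(f"offset {offset}°: strongest class starts at {lo:.0f}° "
          f"({counts[k]} of {azimuths.size} depressions)")
```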

  13. Visualization techniques for malware behavior analysis

    NASA Astrophysics Data System (ADS)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via the Internet is a major security threat, so studying its behavior is important for identifying and classifying it. Using SSDT hooking, we can obtain malware behavior by running it in a controlled environment and capturing its interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be compared with those of other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.

  14. Recovering prehistoric woodworking skills using spatial analysis techniques

    NASA Astrophysics Data System (ADS)

    Kovács, K.; Hanke, K.

    2015-08-01

    Recovery of ancient woodworking skills can be achieved by the simultaneous documentation and analysis of tangible evidence such as the geometric parameters of prehistoric hand tools or the fine morphological characteristics of well-preserved wooden archaeological finds. During this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed within this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from the spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these tools was also inferred, and these woodworking skills could be quantified in the case of a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for related intangible heritage interpretations in the future.

  15. Structural analysis of box beams using symbolic manipulation technique

    NASA Astrophysics Data System (ADS)

    Sathyamoorthy, M.; Sirigiri, Ravindra

    1993-04-01

    The aeroelastic analysis of aircraft wings requires an accurate determination of the influence coefficients. In the past, energy methods have been commonly used to analyze box-type structures and the results have been found to agree well with the experiments. However, when analysis of large wing-type structures is desired, it becomes necessary to automate the energy method. In this article, a method has been developed based on symbolic manipulation as an automated technique to find solutions to box-type structures. Various manipulations required for the energy method have been automatically implemented in a computer program with solutions available at each stage in a symbolic form. The numerical results for several example problems have been compared with alternate theoretical as well as experimental results. Good agreement has been noted in all the cases considered in this article.

  16. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line...

  17. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line...

  18. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line...

  19. Uncertainty analysis technique for OMEGA Dante measurements

    NASA Astrophysics Data System (ADS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
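
    A hedged sketch of the Monte Carlo parameter-variation idea: each channel voltage is perturbed by its combined one-sigma Gaussian error, the perturbed set is unfolded, and the error bar comes from the spread of the resulting fluxes. The `unfold_flux` function below is a stand-in for the real Dante unfold algorithm, and all numbers are invented.

```python
# Hedged sketch of Monte Carlo error propagation through an unfold. The unfold
# here is a placeholder weighted sum, not the actual Dante spectral unfold.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 18
v_measured = rng.uniform(0.5, 5.0, n_channels)       # hypothetical channel voltages (V)
sigma_rel = np.full(n_channels, 0.07)                # combined calibration + unfold error

def unfold_flux(voltages):
    """Placeholder unfold: weighted sum standing in for the spectral unfold."""
    weights = np.linspace(1.0, 2.0, voltages.size)
    return np.dot(weights, voltages)

trials = np.empty(1000)                              # one thousand test voltage sets
for k in range(trials.size):
    perturbed = v_measured * (1.0 + sigma_rel * rng.standard_normal(n_channels))
    trials[k] = unfold_flux(perturbed)

flux_mean, flux_err = trials.mean(), trials.std(ddof=1)
print(f"flux = {flux_mean:.3f} ± {flux_err:.3f} (arb. units, 1 sigma)")
```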

  20. Uncertainty Analysis Technique for OMEGA Dante Measurements

    SciTech Connect

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  1. Uncertainty analysis technique for OMEGA Dante measurements

    SciTech Connect

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-15

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  2. Automated fluid analysis apparatus and techniques

    DOEpatents

    Szecsody, James E.

    2004-03-16

    An automated device that couples a pair of differently sized sample loops with a syringe pump and a source of degassed water. A fluid sample is mounted at an inlet port and delivered to the sample loops. A selected sample from the sample loops is diluted in the syringe pump with the degassed water and fed to a flow through detector for analysis. The sample inlet is also directly connected to the syringe pump to selectively perform analysis without dilution. The device is airtight and used to detect oxygen-sensitive species, such as dithionite in groundwater following a remedial injection to treat soil contamination.

  3. Accelerator based techniques for contraband detection

    NASA Astrophysics Data System (ADS)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays have the ability to penetrate various materials to large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and have the ability to image the object with an appreciable degree of reliability. Neutron-induced reactions such as (n, γ), (n, n'), (n, p) or proton-induced γ-resonance absorption are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed and their advantages, characteristics, and current progress are shown. Areas where use of these methods is currently under evaluation include detection of hidden explosives, illicit drug interdiction, chemical war agent identification, nuclear waste assay, nuclear weapons destruction and others.

  4. Analysis techniques for background rejection at the Majorana Demonstrator

    SciTech Connect

    Cuestra, Clara; Rielage, Keith Robert; Elliott, Steven Ray; Xu, Wenqin; Goett, John Jerome III

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  5. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    SciTech Connect

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H.; Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P.; Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L.; Avignone, F. T.; Baldenegro-Barrera, C. X.; Bertrand, F. E.; and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in {sup 76}Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  6. Comparison of Hydrogen Sulfide Analysis Techniques

    ERIC Educational Resources Information Center

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  7. Analysis techniques for offshore platform tieback systems

    SciTech Connect

    Griffith, D.L.

    1983-03-01

    The development of offshore fields using subsea drilling templates and tie-back equipment has become an accepted method of achieving early production. This technical paper gives an introduction to tie-back systems, presents a method of analyzing them, discusses the loading conditions that affect tie-back systems, and describes a method of accounting for loading conditions during analysis.

  8. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes, and growing requirements of quality and service have forced a change in the design and application of maintenance, as well as in the way it is considered within the managerial strategy. There are numerous maintenance activities that must be developed in a service company. As a result, the maintenance functions as a whole have to be outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibility, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on expert systems. By means of rules, the expert system uses the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between the different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, within rule-based expert systems.

  9. An analysis technique for microstrip antennas

    NASA Technical Reports Server (NTRS)

    Agrawal, P. K.; Bailey, M. C.

    1977-01-01

    The paper presents a combined numerical and empirical approach to the analysis of microstrip antennas over a wide range of frequencies. The method involves representing the antenna by a fine wire grid immersed in a dielectric medium and then using Richmond's reaction formulation (1974) to evaluate the piecewise sinusoidal currents on the grid segments. The calculated results are then modified to account for the finite dielectric discontinuity. The method is applied to round and square microstrip antennas.

  10. Identification of Tea Storage Times by Linear Discrimination Analysis and Back-Propagation Neural Network Techniques Based on the Eigenvalues of Principal Components Analysis of E-Nose Sensor Signals

    PubMed Central

    Yu, Huichun; Wang, Yongwei; Wang, Jun

    2009-01-01

    An electronic nose (E-nose) was employed to detect the aroma of green tea after different storage times. Dry leaves, beverages and residues of Longjing green tea were each measured with the E-nose. In order to decrease the data dimensionality and optimize the feature vector, the E-nose sensor response data were analyzed by principal components analysis (PCA) and the five main principal component values were extracted as the input for the discrimination analysis. The storage time (0, 60, 120, 180 and 240 days) was well discriminated by linear discrimination analysis (LDA) and was predicted by the back-propagation neural network (BPNN) method. The results showed that the discrimination and testing results based on the tea leaves were better than those based on tea beverages and tea residues. The mean errors of the tea leaf data were 9, 2.73, 3.93, 6.33 and 6.8 days, respectively. PMID:22408494
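
    A hedged sketch of the processing chain described above, with synthetic sensor responses standing in for the E-nose data: PCA reduces the response matrix to five principal components, which then feed a linear discriminant analysis of storage time (the BPNN regression step is omitted).

```python
# Hedged sketch of the PCA + LDA chain. The 10-sensor response matrix is a
# synthetic stand-in; a real study would use measured E-nose signals.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
storage_days = np.repeat([0, 60, 120, 180, 240], 20)        # class labels
# 100 samples x 10 sensors: responses drift with storage time plus noise
X = (storage_days[:, None] / 240.0) * rng.uniform(0.5, 2.0, 10) \
    + rng.normal(0, 0.15, (100, 10))

model = make_pipeline(PCA(n_components=5),                   # five main PCs
                      LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, storage_days, cv=5)
print(f"cross-validated storage-time accuracy: {scores.mean():.2f}")
```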

  11. Envelopment technique and topographic overlays in bite mark analysis

    PubMed Central

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    Aims and Objectives: The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Materials and Methods: Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. Results: The scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare the four sequential overlays, and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Conclusion: Through our study, we conclude that the third and fourth cuts were the best among the four cuts, and that inter- and intraoperator reliability were statistically significant at the 5% level, i.e., the 95% confidence interval (P < 0.05). PMID:26816458

  12. Point counting on the Macintosh. A semiautomated image analysis technique.

    PubMed

    Gatlin, C L; Schaberg, E S; Jordan, W H; Kuyatt, B L; Smith, W C

    1993-10-01

    In image analysis, point counting is used to estimate three-dimensional quantitative parameters from sets of measurements made on two-dimensional images. Point counting is normally conducted either by hand only or manually through a planimeter. We developed a semiautomated, Macintosh-based method of point counting. This technique could be useful for any point counting application in which the image can be digitized. We utilized this technique to demonstrate increased vacuolation in white matter tracts of rat brains, but it could be used on many other types of tissue. Volume fractions of vacuoles within the corpus callosum of rat brains were determined by analyzing images of histologic sections. A stereologic grid was constructed using the Claris MacDraw II software. The grid was modified for optimum line density and size in Adobe Photoshop, electronically superimposed onto the images and sampled using version 1.37 of NIH Image public domain software. This technique was further automated by the creation of a macro (small program) to create the grid, overlay the grid on a predetermined image, threshold the objects of interest and count thresholded objects at intersections of the grid lines. This method is expected to significantly reduce the amount of time required to conduct point counting and to improve the consistency of counts.
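
    The grid-based counting itself is simple enough to sketch: threshold the image, superimpose a regular grid, and count the grid intersections that fall on thresholded objects; the hit fraction estimates the area (volume) fraction. The image, threshold and grid spacing below are synthetic choices, not the published macro.

```python
# Hedged sketch of semiautomated point counting on a digitized image. The
# synthetic image contains two bright "vacuole" regions; the fraction of grid
# intersections hitting thresholded pixels estimates the volume fraction.
import numpy as np

rng = np.random.default_rng(2)
image = rng.normal(100, 10, (512, 512))        # synthetic grey-level background
image[200:260, 150:220] += 60                  # bright objects to be counted
image[380:420, 300:360] += 60

binary = image > 130                           # simple intensity threshold

spacing = 16                                   # grid line spacing in pixels
rows = np.arange(spacing // 2, image.shape[0], spacing)
cols = np.arange(spacing // 2, image.shape[1], spacing)
grid_hits = binary[np.ix_(rows, cols)]         # sample at grid intersections

hits, total = int(grid_hits.sum()), grid_hits.size
print(f"{hits} hits / {total} grid points -> volume fraction ≈ {hits / total:.3f}")
```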

  13. FDI and Accommodation Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by means of a trained backpropagation NN, which is then applied to residual generation. Process supervision is then applied to discriminate faults in process sensors and process plant parameters. A rule-based expert system is used to implement the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system which could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA, when applied to a heat exchanger.
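
    A hedged sketch of the residual-generation step: a neural network is trained on nominal process data to map the dynamics, and online a fault is flagged when the residual between measured and NN-predicted output exceeds a threshold. The toy process model, the use of scikit-learn's MLP, and the 3-sigma threshold are assumptions standing in for the paper's dynamic backpropagation network and rule-based decision logic.

```python
# Hedged sketch of NN-based residual generation for FDI on a toy static process.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

def plant(u, bias=0.0):
    """Toy heat-exchanger-like map; `bias` simulates a sensor drift fault."""
    return 2.0 * u[:, 0] - 0.5 * u[:, 1] + 0.1 * u[:, 0] * u[:, 1] + bias

U_train = rng.uniform(0, 1, (500, 2))                  # nominal operating data
y_train = plant(U_train) + rng.normal(0, 0.01, 500)

nn = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
nn.fit(U_train, y_train)                               # NN maps the process dynamics

threshold = 3 * np.std(y_train - nn.predict(U_train))  # residual alarm limit (assumed)

U_test = rng.uniform(0, 1, (50, 2))
y_faulty = plant(U_test, bias=0.5)                     # inject a sensor fault
residual = np.abs(y_faulty - nn.predict(U_test))
print("fault detected:", bool(np.mean(residual > threshold) > 0.5))
```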

  14. Techniques for analysing ground-based UV-visible long-term BrO and NO2 observations for satellite validation and trend analysis

    NASA Astrophysics Data System (ADS)

    Kreher, Karin; Johnston, Paul; Hay, Timothy; Liley, Ben; Thomas, Alan; Martinez-Aviles, Monica; Friess, Udo; Bodeker, Greg; Schofield, Robyn; van Roozendael, Michel

    NIWA operates a network of zenith-sky viewing DOAS (Differential Optical Absorption Spectroscopy) instruments to measure NO2 and BrO. The longest existing time series (1981-present) of NO2 has been measured at Lauder (45°S), New Zealand, and the trend of this long-term data set has been studied extensively. Here we present a summary of stratospheric NO2 trends observed at several Northern and Southern Hemisphere stations (including Lauder) and an update of our understanding of the observed hemispheric asymmetry. These trends provide an important anchor for the interpretation of NO2 trends measured by satellites. BrO observations are currently made by NIWA at two Southern Hemisphere sites, Lauder and Arrival Heights (78°S), with each data set spanning more than 15 years. The zenith-sky BrO observations are complemented with direct sun observations at Lauder since 2001 and with MAX-DOAS (Multi-Axis Differential Optical Absorption Spectroscopy) observations at Arrival Heights (78°S) since 1998. A retrieval technique to separate the tropospheric and stratospheric partial columns of BrO was developed for the combination of zenith-sky and direct sun measurements, with the zenith-sky observations providing predominantly the information on the stratospheric partial column and the direct sun observations providing the tropospheric contribution. This retrieval has now been applied to Lauder BrO UV-visible measurements for the whole time period (2001-present), and the updated results, including an upper limit of BrO in the troposphere and the stratospheric bromine loading, will be presented. The retrieval method has now also been extended so that it can be applied to zenith-sky data only. Furthermore, an independent retrieval algorithm has been developed, including a forward model capable of dealing with multiple scattering (Monte Carlo radiative transfer model), to enable us to retrieve altitude information in the boundary layer and lower troposphere. This retrieval method has

  15. Techniques for geothermal liquid sampling and analysis

    SciTech Connect

    Kindle, C.H.; Woodruff, E.M.

    1981-07-01

    A methodology has been developed that is particularly suited to liquid-dominated resources and adaptable to a variety of situations. It is intended to be a base methodology upon which variations can be made to meet specific needs or situations. The approach consists of recording flow conditions at the time of sampling, a specific insertable probe sampling system, a sample stabilization procedure, commercially available laboratory instruments, and data quality check procedures.

  16. Estimation of the diagnostic accuracy of the invA-gene-based PCR technique and a bacteriological culture for the detection of Salmonella spp. in caecal content from slaughtered pigs using Bayesian analysis.

    PubMed

    Mainar-Jaime, R C; Atashparvar, N; Chirino-Trejo, M

    2008-01-01

    The goal of this study was to estimate the accuracy of the invA-gene-based polymerase chain reaction (PCR) and a culture technique based on pre-enrichment with buffered peptone water, three selective enrichment media (selenite, tetrathionate and Rappaport-Vassiliadis broths) and four selective, solid media (Xylose-Lysine-Tergitol-4, Salmonella/Shigella, Hekton-Enteric and MacConkey), for the detection of Salmonella organisms from caecal samples from slaughter pigs. For this purpose a latent-class (Bayesian) approach was used. Two hundred and three slaughtered pigs were used after grouping them into two groups of 96 and 107 animals. Sensitivity (Se) was estimated to be 56% (95% probability interval 40, 76) for culture and 91% (81, 97) for PCR. The specificity (Sp) of the PCR was 88% (80, 95) while the Sp of the culture had been considered 100% in the statistical analysis as all culture-positive samples were confirmed by serotyping. PCR Se was not affected by the Salmonella serotypes present in the samples analysed. Accordingly, a minimum of 25.5% of the pigs was estimated to harbour Salmonella organisms in their faeces. It was concluded that bacteriology on caecal samples alone was a poor diagnostic method, and that the PCR method could be considered a cost-effective alternative to culture in Salmonella monitoring programmes. However, given the moderate Sp of this molecular technique, PCR-positive samples should be further confirmed through bacteriology.

  17. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    PubMed

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded speckled intensity distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their cores' structural properties. The second part of the procedure allows obtaining Young's interference fringes by Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture from the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography over an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.
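
    A hedged sketch of the first stage of the procedure: an analytic (monogenic) signal is built from a speckle intensity pattern with an FFT-implemented Riesz transform, and its pseudophase map is where the vortices would be located. The synthetic speckle pattern and the specific Riesz normalization below are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch: Riesz-transform pseudophase of a synthetic speckle pattern.
import numpy as np

rng = np.random.default_rng(4)
N = 256
# Synthetic speckle: intensity of a low-pass filtered random phasor field
pupil = np.zeros((N, N), complex)
pupil[:32, :32] = np.exp(1j * 2 * np.pi * rng.random((32, 32)))
speckle = np.abs(np.fft.fft2(pupil)) ** 2
speckle -= speckle.mean()                       # remove DC before the transform

kx = np.fft.fftfreq(N)[None, :]
ky = np.fft.fftfreq(N)[:, None]
k = np.hypot(kx, ky)
k[0, 0] = 1.0                                   # avoid division by zero at DC

S = np.fft.fft2(speckle)
r1 = np.real(np.fft.ifft2(-1j * kx / k * S))    # Riesz transform, x component
r2 = np.real(np.fft.ifft2(-1j * ky / k * S))    # Riesz transform, y component

pseudophase = np.arctan2(r2, r1)                # vortices sit at phase singularities
print(pseudophase.shape, float(pseudophase.min()), float(pseudophase.max()))
```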

  18. Comparison Of Four FFT-Based Frequency-Acquisition Techniques

    NASA Technical Reports Server (NTRS)

    Shah, Biren N.; Hinedi, Sami M.; Holmes, Jack K.

    1993-01-01

    The report presents a comparative theoretical analysis of four conceptual techniques for the initial estimation of the carrier frequency of a suppressed-carrier, binary-phase-shift-keyed radio signal. Each technique is effected by an open-loop analog/digital signal-processing subsystem that forms part of a Costas-loop phase-error detector functioning in a closed-loop manner overall.
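
    The report compares four specific techniques; the sketch below only illustrates the common open-loop principle, under invented parameters: squaring the complex baseband samples of a suppressed-carrier BPSK signal strips the ±1 data modulation, so an FFT of the squared signal shows a spectral line at twice the carrier frequency offset.

```python
# Hedged sketch of FFT-based open-loop frequency acquisition for BPSK.
import numpy as np

rng = np.random.default_rng(5)
fs = 100_000.0                                    # sample rate, Hz (assumed)
f_offset = 1234.0                                 # true carrier offset, Hz (assumed)
n = 8192
t = np.arange(n) / fs

bits = rng.choice([-1.0, 1.0], size=n)            # one symbol per sample (toy case)
signal = bits * np.exp(2j * np.pi * f_offset * t)
signal += rng.normal(0, 0.5, n) + 1j * rng.normal(0, 0.5, n)   # complex noise

squared = signal ** 2                             # squaring removes the BPSK modulation
spectrum = np.abs(np.fft.fft(squared))
freqs = np.fft.fftfreq(n, d=1 / fs)

f_hat = freqs[np.argmax(spectrum)] / 2.0          # spectral line sits at 2 * offset
print(f"estimated carrier offset: {f_hat:.1f} Hz (true {f_offset} Hz)")
```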

  19. Improvement of Rocket Engine Plume Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Smith, S. D.

    1982-01-01

    A nozzle plume flow field code was developed. The RAMP code, which was chosen as the basic code, is of modular construction and has the following capabilities: two-phase flow with a two-phase transonic solution; a two-phase, reacting gas (chemical equilibrium, reaction kinetics), supersonic inviscid nozzle/plume solution; and operation for inviscid solutions at both high and low altitudes. The following capabilities were added to the code: a direct interface with the JANNAF SPF code; a shock-capturing finite difference numerical operator; a two-phase, equilibrium/frozen boundary layer analysis; a variable oxidizer-to-fuel ratio transonic solution; an improved two-phase transonic solution; and a two-phase real gas semiempirical nozzle boundary layer expansion.

  20. Validation of Design and Analysis Techniques of Tailored Composite Structures

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure that such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed-form analysis based on a theoretical model of a single-cell tailored box beam, and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed-form code is consistently able to predict the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed-form analysis. Differences between the closed-form code representation and the wing box specimen caused large errors in the twist prediction. The closed-form analysis prediction of twist has not been validated by this test.

  1. Characteristics, Procedures, and Results of Two Job Analysis Techniques.

    ERIC Educational Resources Information Center

    Burnett, Michael F.; McCracken, J. David

    1982-01-01

    This article describes and compares two job analysis procedures, task inventory analysis and Position Analysis Questionnaire. It provides comparisons in terms of the characteristics of, the activities involved in, and the results derived from a study utilizing each of the techniques. (Author/CT)

  2. Evaluation of energy system analysis techniques for identifying underground facilities

    SciTech Connect

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C.

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  3. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  4. On the factors governing water vapor turbulence mixing in the convective boundary layer over land: Concept and data analysis technique using ground-based lidar measurements.

    PubMed

    Pal, Sandip

    2016-06-01

    The convective boundary layer (CBL) turbulence is the key process for exchanging heat, momentum, moisture and trace gases between the earth's surface and the lower part of the troposphere. The turbulence parameterization of the CBL is a challenging but important component in numerical models. In particular, correct estimation of CBL turbulence features, parameterization, and the determination of the contribution of eddy diffusivity are important for simulating convection initiation, and the dispersion of health hazardous air pollutants and greenhouse gases. In general, measurements of higher-order moments of water vapor mixing ratio (q) variability yield unique estimates of turbulence in the CBL. Using the high-resolution lidar-derived profiles of q variance, third-order moment, and skewness and analyzing concurrent profiles of vertical velocity, potential temperature, horizontal wind and time series of near-surface measurements of surface flux and meteorological parameters, a conceptual framework based on a bottom-up approach is proposed here for the first time for a robust characterization of the turbulent structure of the CBL over land so that our understanding of the processes governing CBL q turbulence could be improved. Finally, principal component analyses will be applied to the lidar-derived long-term data sets of q turbulence statistics to identify the meteorological factors and the dominant physical mechanisms governing the CBL turbulence features.

  5. On the factors governing water vapor turbulence mixing in the convective boundary layer over land: Concept and data analysis technique using ground-based lidar measurements.

    PubMed

    Pal, Sandip

    2016-06-01

    The convective boundary layer (CBL) turbulence is the key process for exchanging heat, momentum, moisture and trace gases between the earth's surface and the lower part of the troposphere. The turbulence parameterization of the CBL is a challenging but important component in numerical models. In particular, correct estimation of CBL turbulence features, parameterization, and the determination of the contribution of eddy diffusivity are important for simulating convection initiation, and the dispersion of health hazardous air pollutants and greenhouse gases. In general, measurements of higher-order moments of water vapor mixing ratio (q) variability yield unique estimates of turbulence in the CBL. Using the high-resolution lidar-derived profiles of q variance, third-order moment, and skewness and analyzing concurrent profiles of vertical velocity, potential temperature, horizontal wind and time series of near-surface measurements of surface flux and meteorological parameters, a conceptual framework based on a bottom-up approach is proposed here for the first time for a robust characterization of the turbulent structure of the CBL over land so that our understanding of the processes governing CBL q turbulence could be improved. Finally, principal component analyses will be applied to the lidar-derived long-term data sets of q turbulence statistics to identify the meteorological factors and the dominant physical mechanisms governing the CBL turbulence features. PMID:26950615
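
    The higher-order moment estimates mentioned in both records can be sketched directly: from a lidar time-height series of water vapor mixing ratio q, the fluctuations about the temporal mean at each range gate give profiles of variance, third-order moment and skewness. The q field below is synthetic.

```python
# Hedged sketch of per-altitude higher-order moments of q from a lidar-like
# time-height series. The moisture field is entirely synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_time, n_alt = 600, 80                  # e.g. 10-s profiles over ~100 min, 50-m gates
altitude = np.linspace(0.1, 4.0, n_alt)                      # km above ground
q_mean_profile = 8.0 * np.exp(-altitude / 2.5)               # g/kg, drying with height
q = q_mean_profile + rng.gamma(2.0, 0.3, (n_time, n_alt))    # positively skewed noise

q_prime = q - q.mean(axis=0)                                 # fluctuations per gate
variance = np.mean(q_prime**2, axis=0)
third_moment = np.mean(q_prime**3, axis=0)
skewness = third_moment / variance**1.5

k = int(np.argmin(np.abs(altitude - 1.0)))                   # near CBL mid-depth
print(f"at {altitude[k]:.2f} km: var={variance[k]:.3f} (g/kg)^2, "
      f"m3={third_moment[k]:.3f} (g/kg)^3, skew={skewness[k]:.2f}")
```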

  6. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George

    2013-01-01

    Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  7. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
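
    A hedged sketch of the descriptor stage shared by both records: rotation-invariant Zernike moment magnitudes are computed for a segmented binary region and compared with an enrolled template by Euclidean distance. The mahotas implementation, the toy masks, the radius/degree settings, and the decision threshold are all assumptions for illustration, not the patented pipeline.

```python
# Hedged sketch: Zernike moment descriptors for a segmented hand region,
# compared against an enrolled template. Masks and threshold are invented.
import numpy as np
import mahotas

def region_descriptor(mask, radius=120, degree=8):
    """Zernike moment magnitudes of a binary segment (rotation invariant)."""
    return mahotas.features.zernike_moments(mask.astype(np.uint8), radius, degree)

probe_mask = np.zeros((256, 256), bool)
probe_mask[60:200, 90:170] = True                   # toy "finger" segment
enrolled_mask = np.roll(probe_mask, 5, axis=1)      # same segment, slightly shifted

d = np.linalg.norm(region_descriptor(probe_mask) - region_descriptor(enrolled_mask))
print("match" if d < 0.05 else "no match", f"(distance = {d:.4f})")
```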

  8. Retrieval techniques and information content analysis to improve remote sensing of atmospheric water vapor, liquid water and temperature from ground-based microwave radiometer measurements

    NASA Astrophysics Data System (ADS)

    Sahoo, Swaroop

    Observations of profiles of temperature, humidity, and winds with sufficient accuracy and fine vertical and temporal resolution are needed to improve mesoscale weather prediction, track conditions in the lower to mid-troposphere, predict winds for renewable energy, inform the public of severe weather, and improve transportation safety. Comparing these thermodynamic variables, the absolute atmospheric temperature varies only by 15%; in contrast, total water vapor may change by up to 50% over several hours. In addition, numerical weather prediction (NWP) models are initialized using water vapor profile information, so improvements in their accuracy and resolution tend to improve the accuracy of NWP. Current water vapor profile observation systems are expensive and have insufficient spatial coverage to observe humidity in the lower to mid-troposphere. To address this important scientific need, the principal objective of this dissertation is to improve the accuracy, vertical resolution, and revisit time of tropospheric water vapor profiles retrieved from microwave and millimeter-wave brightness temperature measurements. This dissertation advances the state of knowledge of retrieval of atmospheric water vapor from microwave brightness temperature measurements. It focuses on optimizing two information sources of interest for water vapor profile retrieval, i.e., independent measurements and background data set size. From a theoretical perspective, it determines sets of frequencies in the ranges of 20-23, 85-90 and 165-200 GHz that are optimal for water vapor retrieval from ground-based and airborne radiometers. The maximum number of degrees of freedom for the selected frequencies for ground-based radiometers is 5-6, while the optimum vertical resolution is 0.5 to 1.5 km. On the other hand, the maximum number of degrees of freedom for airborne radiometers is 8-9, while the optimum vertical resolution is 0.2 to 0.5 km. From an experimental perspective, brightness

  9. Energy minimization versus pseudo force technique for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Hayduk, R. J.

    1980-01-01

    The effectiveness of using minimization techniques for the solution of nonlinear structural analysis problems is discussed and demonstrated by comparison with the conventional pseudo force technique. The comparison involves nonlinear problems with relatively few degrees of freedom. A survey of the state of the art in algorithms for unconstrained minimization reveals that extension of the technique to large-scale nonlinear systems is possible.
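    As a hedged illustration of the idea compared above, the sketch below finds the equilibrium of a small nonlinear spring system by directly minimizing its total potential energy with an unconstrained minimizer, rather than iterating pseudo forces. The two-spring layout, the cubic hardening law, and all numerical values are hypothetical and not taken from the report.

    ```python
    # Sketch: nonlinear structural analysis by minimizing total potential energy,
    # as an alternative to pseudo-force iteration. The two-spring system, the cubic
    # hardening law, and all numerical values are hypothetical illustrations.
    import numpy as np
    from scipy.optimize import minimize

    k1, k2 = 100.0, 80.0       # linear stiffnesses (N/m), assumed
    a1, a2 = 5.0e3, 4.0e3      # cubic hardening coefficients (N/m^3), assumed
    P = np.array([10.0, 6.0])  # external nodal loads (N), assumed

    def strain_energy(e, k, a):
        # Energy of one spring with force law f(e) = k*e + a*e**3
        return 0.5 * k * e**2 + 0.25 * a * e**4

    def total_potential(u):
        # Two springs in series: elongations are u1 and (u2 - u1)
        e1, e2 = u[0], u[1] - u[0]
        U = strain_energy(e1, k1, a1) + strain_energy(e2, k2, a2)
        return U - P @ u  # potential = strain energy minus work of the loads

    res = minimize(total_potential, x0=np.zeros(2), method="BFGS")
    print("equilibrium displacements:", res.x)
    ```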

  10. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    SciTech Connect

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.
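    One data analysis technique commonly associated with PVUSA performance reporting is a regression-based power rating; the abstract does not spell the model out, so the sketch below treats the regression form P = I(a + bI + cW + dT), the reference conditions, and all data as assumptions for illustration.

    ```python
    # Hedged sketch of a PVUSA-style regression rating: AC power is fit as
    # P = I*(a + b*I + c*W + d*T) and evaluated at assumed reference conditions.
    # The model form, reference conditions, and synthetic data are assumptions.
    import numpy as np

    def pvusa_rating(I, W, T, P):
        """I: irradiance (W/m^2), W: wind speed (m/s), T: ambient temp (C), P: AC power (kW)."""
        X = np.column_stack([I, I**2, I*W, I*T])      # P = a*I + b*I^2 + c*I*W + d*I*T
        coef, *_ = np.linalg.lstsq(X, P, rcond=None)  # ordinary least squares
        a, b, c, d = coef
        I0, W0, T0 = 1000.0, 1.0, 20.0                # assumed reference conditions
        return I0 * (a + b*I0 + c*W0 + d*T0)          # rated power at reference conditions

    rng = np.random.default_rng(0)
    I = rng.uniform(400, 1000, 500); W = rng.uniform(0, 8, 500); T = rng.uniform(5, 35, 500)
    P = I * (0.07 - 1e-5*I + 2e-4*W - 3e-4*T) + rng.normal(0, 0.5, 500)
    print("rated power (kW):", round(pvusa_rating(I, W, T, P), 1))
    ```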

  11. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on the utilization of an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a properly energized sample surface. The study investigated the possibility of applying HSI techniques for the classification of different types of wheat kernels: vitreous, yellow berry, and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies have been acquired by a laboratory device equipped with an HSI system working in the near-infrared range (1000-1700 nm). The hypercubes were analyzed by applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only when considering the entire investigated wavelength range, but also when selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed HSI-based procedures can be utilized for quality control purposes or for the definition of innovative wheat sorting logics.

  12. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation). The revised course is a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the necessary student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project course at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of review of pass / fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  13. Assessment of DIAL data collection and analysis techniques

    NASA Technical Reports Server (NTRS)

    Browell, E. V.; Woods, P. T.

    1986-01-01

    The key issues in all areas of Differential Absorption Lidar (DIAL) data collection and analysis techniques were examined. This included consideration of the practical and theoretical limitations of DIAL and the range of possible DIAL measurements.

  14. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 CFR 215.404-1 (2010-10-01), Federal Acquisition Regulations System, DEFENSE...: Proposal analysis techniques. Excerpt: ... reliability of its estimating and accounting systems. ...

  15. Bifurcation techniques for nonlinear dynamic analysis of compressor stall phenomena

    NASA Technical Reports Server (NTRS)

    Razavi, H. C.; Mehra, R. K.

    1985-01-01

    Compressor stall phenomena are analyzed from a nonlinear control theory viewpoint, based on bifurcation-catastrophe techniques. This new approach appears promising and offers insight into such well-known compressor instability problems as surge and rotating stall; furthermore, it suggests strategies for recovery from stall. Three interlocking dynamic nonlinear state-space models are developed. It is shown that the problem of rotating stall can be viewed as an (induced) bifurcation of the solution of the unstalled model. A hysteresis effect is shown to exist in the stall/recovery process. Surge cycles are observed to develop for some critical parameter values. It is shown that the oscillatory behavior is due to the development of limit cycles generated by Hopf bifurcation of the solutions. Both stable and unstable limit cycles are observed. To further illustrate the usefulness of the methodology, a partial computation of the domains of attraction of the equilibria is carried out, and a parameter sensitivity analysis is performed.

  16. Correlation of images: technique for mandible biomechanics analysis.

    PubMed

    Yachouh, Jacques; Domergue, Sophie; Loosli, Yannick; Goudot, Patrick

    2011-09-01

    Various experimental or physicomathematical methods can be used to calculate the biomechanical behavior of the mandible. In this study, we tested a new tool for the analysis of mandibular surface strain based on the correlation of images. Five fresh explanted human mandibles were placed in a loading device allowing replication of a physiologic biting exercise. Surfaces of the mandibles were prepared with white and black lacquer. Images were recorded by 2 cameras and analyzed with an algorithm to correlate those images. With the Limess Measurement & Software system and VIC 3D software, we obtained data output concerning deformations, strains, and principal strains. This allowed us to confirm strain distribution on the mandibular corpus and to focus on weak points. Image correlation is a new technique to study mandible biomechanics, which provides accurate measurements on a wide bone surface, with high-definition images and without modification of the structure.

  17. A Word-Based Compression Technique for Text Files.

    ERIC Educational Resources Information Center

    Vernor, Russel L., III; Weiss, Stephen F.

    1978-01-01

    Presents a word-based technique for storing natural language text in compact form. The compressed text consists of a dictionary and a text that is a combination of actual running text and pointers to the dictionary. This technique has shown itself to be effective for both text storage and retrieval. (VT)
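    A minimal sketch of the kind of word-based scheme described in the abstract: the running text is replaced by a dictionary of distinct tokens plus a stream of pointers into that dictionary. The token pattern and data structures are illustrative assumptions, not the authors' exact design.

    ```python
    # Word-based compression sketch: build a dictionary of distinct tokens and
    # encode the text as pointers (indices) into that dictionary.
    import re

    def compress(text):
        tokens = re.findall(r"\w+|\W", text)   # words plus single non-word characters
        dictionary, pointers, index = [], [], {}
        for tok in tokens:
            if tok not in index:
                index[tok] = len(dictionary)
                dictionary.append(tok)
            pointers.append(index[tok])
        return dictionary, pointers

    def decompress(dictionary, pointers):
        return "".join(dictionary[i] for i in pointers)

    text = "the cat sat on the mat and the cat slept"
    d, p = compress(text)
    assert decompress(d, p) == text
    print(len(d), "dictionary entries,", len(p), "pointers")
    ```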

  18. Investigations on landmine detection by neutron-based techniques.

    PubMed

    Csikai, J; Dóczi, R; Király, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of the anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m²/1.5 min; the reflection cross section of thermal neutrons rendered the determination of equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and, through them, the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires investigation of the angular distribution of the 6.13 MeV gamma-ray emitted in the ¹⁶O(n,n'γ) reaction. These results, in addition to the identification of landmines, render the improvement of the non-intrusive neutron methods possible.

  19. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.; Mahmood, Q.

    1978-01-01

    A sputtering system was developed to deposit aluminum and aluminum alloys by the dc sputtering technique. This system is designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation. This system is now ready for studying the deposition and annealing parameters upon double-level metal preparation. A technique recently applied for semiconductor analysis, the finite element method, was studied for use in the computer modeling of two-dimensional MOS transistor structures. It was concluded that the method has not been sufficiently well developed for confident use at this time. An algorithm was developed for implementing a computer study based upon the finite difference method. The program which was developed was modified and used to calculate redistribution data for boron and phosphorus which had been predeposited by ion implantation with range and straggle conditions. Data were generated for (111)-oriented SOS films with redistribution in N2, dry O2 and steam ambients.

  20. A Comparison of Ellipse-Fitting Techniques for Two and Three-Dimensional Strain Analysis, and Their Implementation in an Integrated Computer Program Designed for Field-Based Studies

    NASA Astrophysics Data System (ADS)

    Vollmer, F. W.

    2010-12-01

    A new computer program, EllipseFit 2, was developed to implement computational and graphical techniques for two and three-dimensional geological finite strain analysis. The program includes an integrated set of routines to derive three-dimensional strain from oriented digital photographs, with a graphical interface suitable for field-based structural studies. The intuitive interface and multi-platform deployment make it useful for structural geology teaching laboratories as well (the program is free). Images of oriented sections are digitized using center-point, five-point ellipse, or n-point polygon moment-equivalent ellipse fitting. The latter allows strain calculation from irregular polygons with sub-pixel accuracy (Steger, 1996; Mulchrone and Choudhury, 2004). Graphical strain ellipse techniques include center-to-center methods (Fry, 1979; Erslev, 1988; Erslev and Ge, 1990), with manual and automatic n-point ellipse-fitting. Graphical displays include axial length graphs, Rf/Φ graphs (Dunnet, 1969), logarithmic and hyperbolic polar graphs (Elliott, 1970; Wheeler, 1984) with automatic contouring, and strain maps. Best-fit ellipse calculations include harmonic and circular means, and eigenvalue (Shimamoto and Ikeda, 1976) and mean radial length (Mulchrone et al., 2003) shape-matrix calculations. Shape-matrix error analysis is done analytically (Mulchrone, 2005) and using bootstrap techniques (Efron, 1979). The initial data set can be unstrained to check variation in the calculated pre-strain fabric. Fitting of ellipse-section data to a best-fit ellipsoid (b*) is done using the shape-matrix technique of Shan (2008). Error analysis is done by calculating the section ellipses of b*, and comparing the misfits between calculated and observed section ellipses. Graphical displays of ellipsoid data include axial-ratio (Flinn, 1962) and octahedral strain magnitude (Hossack, 1968) graphs. Calculations were done to test and compare computational techniques. For two
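    As a rough, hedged sketch of the moment-based ellipse fitting mentioned above, the snippet below estimates an axial ratio (Rf) and long-axis orientation (phi) from digitized outline points via the second-moment matrix; the full polygon moment-equivalent formulation and error analysis of EllipseFit 2 are not reproduced.

    ```python
    # Simple moment-based best-fit ellipse for a digitized object outline.
    # This is an illustrative second-moment version, not the program's exact algorithm.
    import numpy as np

    def fit_ellipse(points):
        """points: (n, 2) array of x, y boundary coordinates."""
        xy = points - points.mean(axis=0)             # centre the outline
        cov = xy.T @ xy / len(xy)                     # second-moment (shape) matrix
        evals, evecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
        a, b = np.sqrt(evals[1]), np.sqrt(evals[0])   # long/short semi-axis proxies
        phi = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1]))  # long-axis orientation
        return a / b, phi                             # axial ratio Rf and orientation

    theta = np.linspace(0, 2 * np.pi, 200)
    outline = np.column_stack([2.0 * np.cos(theta), 1.0 * np.sin(theta)])  # 2:1 test ellipse
    print(fit_ellipse(outline))  # expected roughly (2.0, 0.0)
    ```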

  1. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea.

    PubMed

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-08-18

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  2. Comparison of laser transit anemometry data analysis techniques

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Gartrell, Luther R.

    1991-01-01

    Two techniques for the extraction of two-dimensional flow information from laser transit anemometry (LTA) data sets are presented and compared via a simulation study and experimental investigation. The methods are a probability density function (PDF) estimation technique and a marginal distribution analysis technique. The simulation study builds on the results of previous work and provides a quantification of the accuracy of both techniques for various LTA data acquisition scenarios. The experimental comparison consists of using an LTA system to survey the flow downstream of a turbulence generator in a small low-speed wind tunnel. The collected data sets are analyzed and compared.

  3. Polynomial optimization techniques for activity scheduling. Optimization based prototype scheduler

    NASA Technical Reports Server (NTRS)

    Reddy, Surender

    1991-01-01

    Polynomial optimization techniques for activity scheduling (optimization based prototype scheduler) are presented in the form of viewgraphs. The following subject areas are covered: agenda; need and viability of polynomial time techniques for SNC (Space Network Control); an intrinsic characteristic of the SN scheduling problem; expected characteristics of the schedule; optimization based scheduling approach; single resource algorithms; decomposition of multiple resource problems; prototype capabilities, characteristics, and test results; computational characteristics; some features of prototyped algorithms; and some related GSFC references.

  4. An Empirical Test of a Trait-Oriented Job Analysis Technique.

    ERIC Educational Resources Information Center

    Lopez, Felix M.; And Others

    1981-01-01

    Examined a trait-oriented job analysis technique based on 33 defined traits encompassing elements of the physical, mental, learned, motivational, and social domains of the work world. Discriminability tests showed support for the technique and indicated that job incumbents requiring a particular trait scored higher on measures of that trait.…

  5. High-level power analysis and optimization techniques

    NASA Astrophysics Data System (ADS)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  6. Wood lens design philosophy based on a binary additive manufacturing technique

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  7. The detection of bulk explosives using nuclear-based techniques

    SciTech Connect

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  8. Application of glyph-based techniques for multivariate engineering visualization

    NASA Astrophysics Data System (ADS)

    Glazar, Vladimir; Marunic, Gordana; Percic, Marko; Butkovic, Zlatko

    2016-01-01

    This article presents a review of glyph-based techniques for engineering visualization as well as practical application for the multivariate visualization process. Two glyph techniques, Chernoff faces and star glyphs, uncommonly used in engineering practice, are described, applied to the selected data set, run through the chosen optimization methods and user evaluated. As an example of how these techniques function, a set of data for the optimization of a heat exchanger with a microchannel coil is adopted for visualization. The results acquired by the chosen visualization techniques are related to the results of optimization carried out by the response surface method and compared with the results of user evaluation. Based on the data set from engineering research and practice, the advantages and disadvantages of these techniques for engineering visualization are identified and discussed.

  9. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    PubMed Central

    Al-Kadi, Mahmoud I.; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-01-01

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provides an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device. PMID:23686141

  10. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    PubMed

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provides an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.

  11. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be bookkept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
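    A minimal sketch of the statistical power budgeting idea: each attenuation mechanism is modeled as a random loss in dB, losses are summed over Monte Carlo trials, and the confidence level is the fraction of trials that still meet the required delivered power. The mechanisms, distributions, and numbers below are hypothetical, not SIM values.

    ```python
    # Monte Carlo optical power budget: sum random per-mechanism losses (dB) and
    # report the fraction of trials that meet the required delivered power.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    launched_dbm = 0.0     # assumed launched power, dBm
    required_dbm = -20.0   # assumed minimum power at the detector, dBm

    losses_db = (
        rng.normal(3.0, 0.5, n) +    # design / splitting efficiency (assumed)
        rng.normal(2.0, 0.8, n) +    # element misalignment (assumed)
        rng.uniform(0.5, 2.5, n) +   # coupling efficiency degradation (assumed)
        rng.normal(1.0, 0.3, n)      # material attenuation / diffraction (assumed)
    )
    delivered_dbm = launched_dbm - losses_db
    confidence = np.mean(delivered_dbm >= required_dbm)
    print(f"margin met in {100 * confidence:.1f}% of trials; "
          f"median margin {np.median(delivered_dbm - required_dbm):.1f} dB")
    ```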

  12. Techniques of trend analysis for monthly water quality data.

    USGS Publications Warehouse

    Hirsch, R.M.; Slack, J.R.; Smith, R.A.

    1982-01-01

    Some of the characteristics that complicate the analysis of water quality time series are non-normal distributions, seasonality, flow relatedness, missing values, values below the limit of detection, and serial correlation. Presented here are techniques that are suitable in the face of the complications listed above for the exploratory analysis of monthly water quality data for monotonic trends.-from Authors
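    A hedged sketch of a seasonal Kendall-type test of the kind suited to monthly water quality series: a Mann-Kendall statistic is computed within each month across years and the monthly statistics are summed. The published method's handling of ties, flow adjustment, and serial correlation is not reproduced here.

    ```python
    # Seasonal Kendall-style trend test sketch (no-ties variance approximation).
    import numpy as np

    def mann_kendall_s(x):
        x = np.asarray(x, dtype=float)
        x = x[~np.isnan(x)]                       # drop missing values
        s = 0.0
        for i in range(len(x) - 1):
            s += np.sum(np.sign(x[i + 1:] - x[i]))
        return s, len(x)

    def seasonal_kendall(values_by_month):
        """values_by_month: dict month -> yearly series (may contain NaN)."""
        S, var = 0.0, 0.0
        for series in values_by_month.values():
            s, n = mann_kendall_s(series)
            S += s
            var += n * (n - 1) * (2 * n + 5) / 18.0   # variance, assuming no ties
        z = (S - np.sign(S)) / np.sqrt(var) if var > 0 else 0.0  # continuity correction
        return S, z

    data = {m: 1.0 + 0.05 * np.arange(10) + np.random.default_rng(m).normal(0, 0.1, 10)
            for m in range(1, 13)}
    print(seasonal_kendall(data))   # large positive z indicates an upward trend
    ```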

  13. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  14. Basic Sequence Analysis Techniques for Use with Audit Trail Data

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2008-01-01

    Audit trail analysis can provide valuable insights to researchers and evaluators interested in comparing and contrasting designers' expectations of use and students' actual patterns of use of educational technology environments (ETEs). Sequence analysis techniques are particularly effective but have been neglected to some extent because of real…

  15. Biomechanical analysis of cross-country skiing techniques.

    PubMed

    Smith, G A

    1992-09-01

    The development of new techniques for cross-country skiing based on skating movements has stimulated biomechanical research aimed at understanding the various movement patterns, the forces driving the motions, and the mechanical factors affecting performance. Research methods have evolved from two-dimensional kinematic descriptions of classic ski techniques to three-dimensional analyses involving measurement of the forces and energy relations of skating. While numerous skiing projects have been completed, most have focused on either the diagonal stride or the V1 skating technique on uphill terrain. Current understanding of skiing mechanics is not sufficiently complete to adequately assess and optimize an individual skier's technique.

  16. Aerosol analysis techniques and results from micro pulse lidar

    NASA Technical Reports Server (NTRS)

    Hlavka, Dennis L.; Spinhirne, James D.; Campbell, James R.; Reagan, John A.; Powell, Donna

    1998-01-01

    The effect of clouds and aerosol on the atmospheric energy balance is a key global change problem. Full knowledge of aerosol distributions is difficult to obtain by passive sensing alone. Aerosol and cloud retrievals in several important areas can be significantly improved with active remote sensing by lidar. Micro Pulse Lidar (MPL) is an aerosol and cloud profilometer that provides a detailed picture of the vertical structure of boundary layer and elevated dust or smoke plume aerosols. MPL is a compact, fully eyesafe, ground-based, zenith pointing instrument capable of full-time, long-term unattended operation at 523 nm. In October of 1993, MPL began taking full-time measurements for the Atmospheric Radiation Measurement (ARM) program at its Southern Great Plains (SGP) site and has since expanded to ARM sites in the Tropical West Pacific (TWP) and the North Slope of Alaska (NSA). Other MPL's are moving out to some of the 60 world-wide Aerosol Robotic Network (AERONET) sites which are already equipped with automatic sun-sky scanning spectral radiometers providing total column optical depth measurements. Twelve additional MPL's have been purchased by NASA to add to the aerosol and cloud database of the EOS ground validation network. The original MPL vertical resolution was 300 meters but the newer versions have a vertical resolution of 30 meters. These expanding data sets offer a significant new resource for atmospheric radiation analysis. Under the direction of Jim Spinhirne, the MPL analysis team at NASA/GSFC has developed instrument correction and backscatter analysis techniques for ARM to detect cloud boundaries and analyze vertical aerosol structures. A summary of MPL applications is found in Hlavka (1997). With the aid of independent total column optical depth instruments such as the Multifilter Rotating Shadowband Radiometer (MFRSR) at the ARM sites or sun photometers at the AERONET sites, the MPL data can be calibrated, and time-resolved vertical profiles of

  17. Initial planetary base construction techniques and machine implementation

    NASA Technical Reports Server (NTRS)

    Crockford, William W.

    1987-01-01

    Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.

  18. Geotechnical Analysis of Paleoseismic Shaking Using Liquefaction Features: Part I. Major Updating of Analysis Techniques

    USGS Publications Warehouse

    Olson, Scott M.; Green, Russell A.; Obermeier, Stephen F.

    2003-01-01

    A new methodology is proposed for the geotechnical analysis of strength of paleoseismic shaking using liquefaction effects. The proposed method provides recommendations for selection of both individual and regionally located test sites, techniques for validation of field data for use in back-analysis, and use of a recently developed energy-based solution to back-calculate paleoearthquake magnitude and strength of shaking. The proposed method allows investigators to assess the influence of post-earthquake density change and aging. The proposed method also describes how the back-calculations from individual sites should be integrated into a regional assessment of paleoseismic parameters.

  19. A new data analysis technique for Spitzer transit observations

    NASA Astrophysics Data System (ADS)

    Morello, Giuseppe; Waldmann, Ingo Peter; Tinetti, Giovanna; Howarth, Ian D.; Micela, Giuseppina

    2015-08-01

    Observations of exoplanetary transits are a powerful tool to investigate the nature of planets around other stars. Transits are revealed through periodic drops in the apparent stellar brightness, due to the interposition of a planet between the star and the observer. Multi-wavelength observations can be used to characterize the atmospheres of exoplanets, through differences in the transit depths, typically at the level of one part in 10^4 in stellar flux for giant planets. Although this method has been successfully used to detect a list of molecules on several exoplanets, some controversial results are present in the literature. Instrumental systematics are often difficult to disentangle from the signal, and the use of different parameterizations of the systematics can affect the results. We present a blind source separation method, based on an Independent Component Analysis of individual pixel time series, to decorrelate the planetary signal without any prior instrument model or astrophysical information, hence ensuring a higher degree of objectivity. This method has been applied to a few Spitzer/IRAC light-curves of HD189733b and GJ436b, obtaining for the first time coherent and repeatable results over different epochs (Morello et al. 2014, ApJ, 786, 22, Morello et al. 2015, accepted by ApJ). We will present here the technique (Morello 2015, submitted), and the results of its application to different observations, in addition to the already published ones. A uniform re-analysis of other archive data with this technique will provide improved parameters for a list of exoplanets, and in particular some other results which are debated in the literature.
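    The sketch below illustrates, on synthetic data, the blind source separation idea described above: Independent Component Analysis applied to individual pixel time series separates a transit-like component from an instrumental systematic without a prior instrument model. The toy signals, pixel count, and the use of scikit-learn's FastICA are assumptions, not the authors' pipeline.

    ```python
    # ICA decorrelation of a toy transit signal from a toy systematic, using
    # individual pixel time series as the observed mixtures.
    import numpy as np
    from sklearn.decomposition import FastICA

    n_frames, n_pixels = 2000, 9
    rng = np.random.default_rng(2)
    t = np.linspace(-0.1, 0.1, n_frames)
    transit = 1.0 - 0.01 * (np.abs(t) < 0.04).astype(float)   # toy box-shaped transit
    systematic = 0.005 * np.sin(2 * np.pi * t / 0.03)         # toy instrumental signal

    # Each pixel records a different mixture of the two underlying signals.
    mixing = rng.uniform(0.5, 1.5, (n_pixels, 2))
    pixels = mixing @ np.vstack([transit, systematic])
    pixels += rng.normal(0, 1e-4, pixels.shape)

    ica = FastICA(n_components=2, random_state=0)
    sources = ica.fit_transform(pixels.T)      # shape (n_frames, n_components)

    # Take the component most correlated with the summed raw light curve as transit-like.
    total = pixels.sum(axis=0)
    corr = [abs(np.corrcoef(total, sources[:, k])[0, 1]) for k in range(2)]
    print("transit-like component index:", int(np.argmax(corr)))
    ```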

  20. Damage detection technique by measuring laser-based mechanical impedance

    SciTech Connect

    Lee, Hyeonseok; Sohn, Hoon

    2014-02-18

    This study proposes a method for measurement of mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) or structural health monitoring (SHM) since mechanical impedance is sensitive even to small-sized structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, which show deteriorated performance induced by the effects of a) Curie temperature limitations, b) electromagnetic interference (EMI), c) bonding layers, etc. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by shooting the pulse laser beam at the target structure, and is acquired by measuring the out-of-plane velocity using a laser vibrometer. The formation of the LMI response is observed through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen under a high-temperature environment.

  1. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is

  2. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.

  3. Efficient Plant Supervision Strategy Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; Rolle, Jose Luis Calvo; Castelo, Francisco Javier Perez

    Most nonlinear type-one and type-two control systems suffer from a lack of detectability when model-based techniques are applied to FDI (fault detection and isolation) tasks. In general, all types of processes suffer from a lack of detectability, also due to the ambiguity in discriminating among the process, sensors, and actuators in order to isolate any given fault. This work deals with a strategy to detect and isolate faults that combines massive neural-network-based functional approximation procedures with recursive rule-based techniques applied to a parity space approach.

  4. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special (unresolved-object) processing techniques were disappointing due, at least in part, to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.

  5. Analysis techniques used on field degraded photovoltaic modules

    SciTech Connect

    Hund, T.D.; King, D.L.

    1995-09-01

    Sandia National Laboratory's PV System Components Department performs comprehensive failure analysis of photovoltaic modules after extended field exposure at various sites around the world. A full spectrum of analytical techniques is used to help identify the causes of degradation. The techniques are used to make solder fatigue life predictions for PV concentrator modules, identify cell damage or current mismatch, and measure the adhesive strength of the module encapsulant.

  6. Diode laser based water vapor DIAL using modulated pulse technique

    NASA Astrophysics Data System (ADS)

    Pham, Phong Le Hoai; Abo, Makoto

    2014-11-01

    In this paper, we propose a diode-laser-based differential absorption lidar (DIAL) for measuring the lower-tropospheric water vapor profile using a modulated pulse technique. The transmitter is based on a single-mode diode laser and a tapered semiconductor optical amplifier with a peak power of 10 W in the 800 nm absorption band, and the receiver telescope diameter is 35 cm. The selected wavelengths are compared to reference wavelengths in terms of random and systematic errors. The key component of the modulated pulse technique, a macropulse, is generated with a repetition rate of 10 kHz, and the modulation within the macropulse is coded according to a pseudorandom sequence with a 100 ns chip width. As a result, we evaluate both the single-pulse modulation and the pseudorandom coded pulse modulation techniques. The water vapor profiles derived from these modulation techniques are compared with real observation data taken in summer in Japan.

  7. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE concerning the determination of high-molecular-weight compounds such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also covered. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of applying CE in food control laboratories, where analysis of the composition of food and food products is conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  8. Bond strength with custom base indirect bonding techniques.

    PubMed

    Klocke, Arndt; Shi, Jianmin; Kahl-Nieke, Bärbel; Bismayer, Ulrich

    2003-04-01

    Different types of adhesives for indirect bonding techniques have been introduced recently. But there is limited information regarding bond strength with these new materials. In this in vitro investigation, stainless steel brackets were bonded to 100 permanent bovine incisors using the Thomas technique, the modified Thomas technique, and light-cured direct bonding for a control group. The following five groups of 20 teeth each were formed: (1) modified Thomas technique with thermally cured base composite (Therma Cure) and chemically cured sealant (Maximum Cure), (2) Thomas technique with thermally cured base composite (Therma Cure) and chemically cured sealant (Custom I Q), (3) Thomas technique with light-cured base composite (Transbond XT) and chemically cured sealant (Sondhi Rapid Set), (4) modified Thomas technique with chemically cured base adhesive (Phase II) and chemically cured sealant (Maximum Cure), and (5) control group directly bonded with light-cured adhesive (Transbond XT). Mean bond strengths in groups 3, 4, and 5 were 14.99 +/- 2.85, 15.41 +/- 3.21, and 13.88 +/- 2.33 MPa, respectively, and these groups were not significantly different from each other. Groups 1 (mean bond strength 7.28 +/- 4.88 MPa) and 2 (mean bond strength 7.07 +/- 4.11 MPa) showed significantly lower bond strengths than groups 3, 4, and 5 and a higher probability of bond failure. Both the original (group 2) and the modified (group 1) Thomas technique were able to achieve bond strengths comparable to the light-cured direct bonded control group.

  9. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357
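    As a toy illustration of variability analysis, the snippet below computes one representative metric from three of the domains named above (statistical, geometric, informational) for a single time series. The specific metrics chosen are an assumption for illustration, not the paper's classification.

    ```python
    # Toy variability summary: one representative metric per example domain.
    import numpy as np

    def variability_summary(x):
        x = np.asarray(x, dtype=float)
        dx = np.diff(x)
        # Statistical: standard deviation and coefficient of variation
        sd = x.std(ddof=1)
        cv = sd / abs(x.mean())
        # Geometric: Poincare-plot style short- and long-term variability
        sd1 = np.sqrt(0.5) * dx.std(ddof=1)
        sd2 = np.sqrt(max(2 * sd**2 - 0.5 * dx.std(ddof=1)**2, 0.0))
        # Informational: Shannon entropy of the binned amplitude distribution
        counts, _ = np.histogram(x, bins=16)
        p = counts[counts > 0] / counts.sum()
        entropy = -np.sum(p * np.log2(p))
        return {"sd": sd, "cv": cv, "sd1": sd1, "sd2": sd2, "entropy_bits": entropy}

    rr = 0.8 + 0.05 * np.sin(np.linspace(0, 20, 500)) \
         + np.random.default_rng(3).normal(0, 0.01, 500)
    print(variability_summary(rr))
    ```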

  10. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  11. Sample preparation techniques in trace element analysis of water

    NASA Astrophysics Data System (ADS)

    Nagj, Marina; Jakšić, M.; Orlić, I.; Valković, V.

    1985-06-01

    Sample preparation techniques for the analysis of water for trace elements using X-ray emission spectroscopy are described. Fresh water samples for the analysis of transition metals were prepared by complexation with ammonium-pyrrolidine-dithiocarbamate (APDC) and filtering through a membrane filter. Analyses of water samples for halogenes was done on samples prepared by precipitation with AgNO 3 and subsequent filtration. Two techniques for seawater preparation for uranium determination are described, viz. precipitation with APDC in the presence of iron (II) as a carrier and complexation with APDC followed with adsorption on activated carbon. In all cases trace element levels at 10 -3 μg/g were measured.

  12. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    SciTech Connect

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
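    A hedged sketch of the comparison described: reduce feature dimensionality with one of the techniques (here PCA) and track a simple classifier's cross-validated accuracy as the number of retained dimensions grows. The synthetic data and the choice of classifier are placeholders, not the paper's object-code features.

    ```python
    # Accuracy versus number of retained dimensions, using PCA and a simple classifier.
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=600, n_features=60, n_informative=12,
                               random_state=0)
    for k in (2, 5, 10, 20, 40):
        Xk = PCA(n_components=k, random_state=0).fit_transform(X)
        acc = cross_val_score(GaussianNB(), Xk, y, cv=5).mean()
        print(f"{k:>2} dimensions: accuracy {acc:.3f}")
    ```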

  13. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  14. Emerging techniques for soil analysis via mid-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, and in particular: 1. Attenuated total reflectance (ATR): Attenuated total reflectance is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range as well as the absorbance of some soil constituents (e.g., calcium carbonate) interfere with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between ¹⁴N and ¹⁵N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species. 2. Photo-acoustic spectroscopy: Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are

  15. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    SciTech Connect

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh; Wang, Shaobu; Mackey, Patrick S.; Hines, Paul; Huang, Zhenyu

    2013-11-30

    Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.
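
    A minimal sketch of the kind of spectral clustering mentioned above, assuming the network is summarized by a symmetric bus-to-bus weight matrix (for example, admittance magnitudes or correlations between synchrophasor time series); the weighting scheme and the choice of k below are illustrative assumptions, not taken from the report.

        import numpy as np
        from scipy.linalg import eigh
        from sklearn.cluster import KMeans

        def spectral_clusters(weights, k):
            """Group buses given a symmetric bus-to-bus weight matrix
            (larger weight = electrically/topologically closer)."""
            w = np.asarray(weights, dtype=float)
            d = w.sum(axis=1)
            d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
            # Normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}
            lap = np.eye(len(w)) - d_inv_sqrt @ w @ d_inv_sqrt
            # Embed each bus using the k eigenvectors with the smallest eigenvalues
            _, vecs = eigh(lap, subset_by_index=[0, k - 1])
            vecs = vecs / (np.linalg.norm(vecs, axis=1, keepdims=True) + 1e-12)
            return KMeans(n_clusters=k, n_init=10).fit_predict(vecs)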

  16. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  17. Wavelet-based polarimetry analysis

    NASA Astrophysics Data System (ADS)

    Ezekiel, Soundararajan; Harrity, Kyle; Farag, Waleed; Alford, Mark; Ferris, David; Blasch, Erik

    2014-06-01

    Wavelet transformation has become a cutting-edge and promising approach in the field of image and signal processing. A wavelet is a waveform of effectively limited duration that has an average value of zero. Wavelet analysis is done by breaking up the signal into shifted and scaled versions of a mother wavelet. The key advantage of a wavelet is that it is capable of revealing smaller changes, trends, and breakdown points that are not revealed by other techniques such as Fourier analysis. The phenomenon of polarization has been studied for quite some time and is a very useful tool for target detection and tracking. Long Wave Infrared (LWIR) polarization is beneficial for detecting camouflaged objects and is a useful approach when identifying and distinguishing manmade objects from natural clutter. In addition, the Stokes Polarization Parameters, which are calculated from 0°, 45°, 90°, 135°, right-circular, and left-circular intensity measurements, provide spatial orientations of target features and suppress natural features. In this paper, we propose a wavelet-based polarimetry analysis (WPA) method to analyze Long Wave Infrared Polarimetry Imagery to discriminate targets such as dismounts and vehicles from background clutter. These parameters can be used for image thresholding and segmentation. Experimental results show the wavelet-based polarimetry analysis is efficient and can be used in a wide range of applications such as change detection, shape extraction, target recognition, and feature-aided tracking.
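
    As a concrete illustration of the Stokes parameters named above, the following sketch computes them from the six intensity images using their standard definitions; the degree-of-linear-polarization map at the end is a common derived product and is an illustrative choice, not something specified in the abstract.

        import numpy as np

        def stokes_parameters(i0, i45, i90, i135, i_rcp, i_lcp):
            """Stokes parameter images from six polarimetric intensity images
            (all arrays must share the same shape)."""
            s0 = i0 + i90                     # total intensity
            s1 = i0 - i90                     # 0 deg vs. 90 deg linear polarization
            s2 = i45 - i135                   # +45 deg vs. -45 deg linear polarization
            s3 = i_rcp - i_lcp                # right vs. left circular polarization
            # Degree of linear polarization, often used to suppress natural clutter
            dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
            return s0, s1, s2, s3, dolp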

  18. Recent Electrochemical and Optical Sensors in Flow-Based Analysis

    PubMed Central

    Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn

    2006-01-01

    Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen in this overview. A brief description of the fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA), was reviewed.

  19. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  20. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  1. 48 CFR 815.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Proposal analysis techniques. 815.404-1 Section 815.404-1 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 815.404-1 Proposal...

  2. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  3. Analysis of leaching data using asymptotic expansion techniques

    SciTech Connect

    Simonson, S.A.; Machiels, A.J.

    1983-01-01

    Asymptotic analysis constitutes a useful technique to determine the adjustable parameters appearing in mathematical models attempting to reproduce some experimental data. In particular, asymptotic expansions of a leach model proposed by A.J. Machiels and C. Pescatore are used to interpret leaching data from PNL 76-68 glass in terms of corrosion velocities and diffusion coefficients.

  4. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
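
    For readers unfamiliar with the technique, a minimal two-predictor sketch of commonality analysis is given below; it uses ordinary least squares R² values and the standard commonality identities, and is an illustration rather than the procedure from the article.

        import numpy as np

        def commonality_two_predictors(y, x1, x2):
            """Unique and common R^2 components for two predictors."""
            def r_squared(y, predictors):
                X = np.column_stack([np.ones(len(y))] + predictors)
                beta, *_ = np.linalg.lstsq(X, y, rcond=None)
                resid = y - X @ beta
                return 1.0 - resid.var() / y.var()

            r2_full = r_squared(y, [x1, x2])
            r2_x1 = r_squared(y, [x1])
            r2_x2 = r_squared(y, [x2])
            return {
                "unique_x1": r2_full - r2_x2,       # variance only x1 explains
                "unique_x2": r2_full - r2_x1,       # variance only x2 explains
                "common": r2_x1 + r2_x2 - r2_full,  # variance they share
            }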

  5. Comparative Analysis of Different LIDAR System Calibration Techniques

    NASA Astrophysics Data System (ADS)

    Miller, M.; Habib, A.

    2016-06-01

    With light detection and ranging (LiDAR) now being a crucial tool for engineering products and on-the-fly spatial analysis, it is necessary for the user community to have standardized calibration methods. The three methods in this study were developed and proven by the Digital Photogrammetry Research Group (DPRG) for airborne LiDAR systems and are as follows: Simplified, Quasi-Rigorous, and Rigorous. In lieu of using expensive control surfaces for calibration, these methods compare overlapping LiDAR strips to estimate the systematic errors. These systematic errors are quantified by these methods and include the lever arm biases, boresight biases, range bias and scan angle scale bias. These three methods together cover the possible flight configurations and levels of data availability. This paper tests the limits of the method with the most assumptions, the simplified calibration, by using data that violates the assumptions its mathematical model is based on, and compares the results to the quasi-rigorous and rigorous techniques. The overarching goal is to provide a LiDAR system calibration that does not require raw measurements and that can be carried out with minimal control and flight lines to reduce costs. This testing is unique because the terrain used for calibration does not contain gable roofs; all other LiDAR system calibration testing and development has been done with terrain containing features of high geometric integrity such as gable roofs.

  6. From Input to Output: Communication-Based Teaching Techniques.

    ERIC Educational Resources Information Center

    Tschirner, Erwin

    1992-01-01

    Communication-based teaching techniques are described that lead German language students from input to output in a stimulating and motivating learning environment. Input activities are most useful for presenting speech acts, vocabulary, and grammar; output activities, for fine-tuning those areas as well as for expanding students' productive…

  7. "Ayeli": Centering Technique Based on Cherokee Spiritual Traditions.

    ERIC Educational Resources Information Center

    Garrett, Michael Tlanusta; Garrett, J. T.

    2002-01-01

    Presents a centering technique called "Ayeli," based on Cherokee spiritual traditions as a way of incorporating spirituality into counseling by helping clients identify where they are in their journey, where they want to be, and how they can get there. Relevant Native cultural traditions and meanings are explored. (Contains 25 references.) (GCP)

  8. What Child Analysis Can Teach Us about Psychoanalytic Technique.

    PubMed

    Ablon, Steven Luria

    2014-01-01

    Child analysis has much to teach us about analytic technique. Children have an innate, developmentally driven sense of analytic process. Children in analysis underscore the importance of an understanding and belief in the therapeutic action of play, the provisional aspects of play, and that not all play will be understood. Each analysis requires learning a new play signature that is constantly reorganized. Child analysis emphasizes the emergence and integration of dissociated states, the negotiation of self-other relationships, the importance of co-creation, and the child's awareness of the analyst's sensibility. Child analysis highlights the robust nature of transference and how working through and repairing is related to the initiation of coordinated patterns of high predictability in the context of deep attachments. I will illustrate these and other ideas in the description of the analysis of a nine-year-old boy.

  9. Environmental Immunoassays: Alternative Techniques for Soil and Water Analysis

    USGS Publications Warehouse

    Aga, D.S.; Thurman, E.M.

    1996-01-01

    Analysis of soil and water samples for environmental studies and compliance testing can be formidable, time consuming, and costly. As a consequence, immunochemical techniques have become popular for environmental analysis because they are reliable, rapid, and cost effective. During the past 5 years, the use of immunoassays for environmental monitoring has increased substantially, and their use as an integral analytical tool in many environmental laboratories is now commonplace. This chapter will present the basic concept of immunoassays, recent advances in the development of immunochemical methods, and examples of successful applications of immunoassays in environmental analysis.

  10. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    NASA Astrophysics Data System (ADS)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation of protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. Analysis of these phosphorylation cascades will provide new insights into their physiological roles in many biological processes. Unfortunately, existing methods are limited in their ability to analyze cascade activity. Therefore, we suggest a microfluidic isoelectric focusing technique (μIEF) for the analysis of cascade activity. Using this technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. In addition, an inhibition assay for kinase activity and an analysis of a real sample have also been conducted. The results indicate that μIEF is an excellent means for studying phosphorylation cascade activity.

  11. Application of Image Enhancement Techniques to Comets: A Critical Analysis

    NASA Astrophysics Data System (ADS)

    Samarasinha, Nalin H.; Larson, S.; Beshore, E.

    2006-09-01

    Investigation and accurate interpretation of many cometary coma phenomena depend on identification of coma features and their spatial and temporal variations. In many cases, the coma features are only a few percent above the ambient coma, requiring the application of image enhancement techniques for easy identification and analysis. In the literature, a range of enhancement techniques has been used for the analysis of coma structures (e.g., Larson and Slaughter 1992, Schleicher and Farnham 2004). We use numerically simulated images to characterize the pros and cons of a number of widely used enhancement techniques. In particular, we will identify techniques which are suitable for making measurements post-enhancement as well as the nature of the measurements which are unaffected by the enhancements. An effort will be made to present the results in a quantifiable format rather than with qualitative statements. Finally, these enhancement techniques will be used to enhance and analyze the coma morphologies present in actual images of comet Hale-Bopp (C/1995 O1). NHS was supported by the NASA Planetary Atmospheres Program.

  12. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple-watermarking technique is proposed to solve this problem. The technique is based on image interlacing: a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extraction domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of the technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570

  14. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    PubMed Central

    Ibrahim, Mohamed M.; Abdel Kader, Neamat S.; Zorkany, M.

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple-watermarking technique is proposed to solve this problem. The technique is based on image interlacing: a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extraction domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of the technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570
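
    A minimal sketch of the kind of pipeline described, assuming PyWavelets is available: the watermark is scrambled with the Arnold (cat-map) transform and embedded additively into the level-3 approximation subband of a frame. The additive embedding rule, the Haar wavelet, and the strength alpha are illustrative assumptions; the paper's interlacing step and exact embedding rule are not reproduced here.

        import numpy as np
        import pywt

        def arnold_scramble(img, iterations=5):
            """Arnold (cat-map) scrambling of a square watermark image."""
            n = img.shape[0]
            out = img.copy()
            for _ in range(iterations):
                nxt = np.empty_like(out)
                for x in range(n):
                    for y in range(n):
                        nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
                out = nxt
            return out

        def embed_watermark(frame, watermark, alpha=0.05):
            """Additively embed a scrambled square watermark into the level-3
            approximation subband of a grayscale frame. `watermark` must have
            the same shape as that subband (e.g., 64x64 for a 512x512 frame)."""
            coeffs = pywt.wavedec2(np.asarray(frame, float), "haar", level=3)
            coeffs[0] = coeffs[0] + alpha * arnold_scramble(np.asarray(watermark, float))
            return pywt.waverec2(coeffs, "haar")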

  15. Hyphenated techniques for the analysis of heparin and heparan sulfate

    PubMed Central

    Yang, Bo; Solakyildirim, Kemal; Chang, Yuqing

    2011-01-01

    The elucidation of the structure of glycosaminoglycan has proven to be challenging for analytical chemists. Molecules of glycosaminoglycan have a high negative charge and are polydisperse and microheterogeneous, thus requiring the application of multiple analytical techniques and methods. Heparin and heparan sulfate are the most structurally complex of the glycosaminoglycans and are widely distributed in nature. They play critical roles in physiological and pathophysiological processes through their interaction with heparin-binding proteins. Moreover, heparin and low-molecular weight heparin are currently used as pharmaceutical drugs to control blood coagulation. In 2008, the health crisis resulting from the contamination of pharmaceutical heparin led to considerable attention regarding their analysis and structural characterization. Modern analytical techniques, including high-performance liquid chromatography, capillary electrophoresis, mass spectrometry, and nuclear magnetic resonance spectroscopy, played critical roles in this effort. A successful combination of separation and spectral techniques will clearly provide a critical advantage in the future analysis of heparin and heparan sulfate. This review focuses on recent efforts to develop hyphenated techniques for the analysis of heparin and heparan sulfate. PMID:20853165

  16. Rain Attenuation Analysis using Synthetic Storm Technique in Malaysia

    NASA Astrophysics Data System (ADS)

    Lwas, A. K.; Islam, Md R.; Chebil, J.; Habaebi, M. H.; Ismail, A. F.; Zyoud, A.; Dao, H.

    2013-12-01

    Generated rain attenuation time series play an important role in investigating rain fade characteristics in the absence of real fade measurements. A suitable conversion technique can be applied to measured rain rate time series to produce rain attenuation data and thereby characterize rain fades. This paper focuses on the applicability of the synthetic storm technique (SST) for converting measured rain rate data into rain attenuation time series. Its performance is assessed for time series generation over a tropical location, Kuala Lumpur, Malaysia. Preliminary analysis shows that the SST gives satisfactory estimates of the rain attenuation time series from the rain rate measurements over this region.
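
    A rough sketch of the synthetic storm idea, assuming a constant storm translation speed and ITU-R style power-law coefficients; the speed, path length, and coefficient values below are placeholders for illustration and are not taken from the paper.

        import numpy as np

        def synthetic_storm_attenuation(rain_rate, dt, speed_ms=10.0, path_km=5.0,
                                        k=0.0188, alpha=1.217):
            """Convert a rain-rate time series (mm/h, one sample every dt seconds)
            into a path-attenuation time series (dB)."""
            # Specific attenuation (dB/km) at the gauge via the power law k * R^alpha
            gamma = k * np.asarray(rain_rate, dtype=float) ** alpha
            seg_km = speed_ms * dt / 1000.0                 # path length swept per sample
            n_seg = max(int(round(path_km / seg_km)), 1)
            att = np.zeros(len(gamma))
            # The rain profile is advected along the link, so the attenuation at
            # time i integrates the lagged specific attenuation over the path
            for i in range(len(gamma)):
                for j in range(n_seg):
                    att[i] += gamma[max(i - j, 0)] * seg_km
            return att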

  17. Bispectrum-based feature extraction technique for devising a practical brain-computer interface

    NASA Astrophysics Data System (ADS)

    Shahid, Shahjahan; Prasad, Girijesh

    2011-04-01

    The extraction of distinctly separable features from the electroencephalogram (EEG) is one of the main challenges in designing a brain-computer interface (BCI). Existing feature extraction techniques for a BCI are mostly developed based on traditional signal processing techniques assuming that the signal is Gaussian and has linear characteristics. But the motor imagery (MI)-related EEG signals are highly non-Gaussian, non-stationary and have nonlinear dynamic characteristics. This paper proposes an advanced, robust but simple feature extraction technique for an MI-related BCI. The technique uses one of the higher order statistics methods, the bispectrum, and extracts the features of nonlinear interactions over several frequency components in MI-related EEG signals. Along with a linear discriminant analysis classifier, the proposed technique has been used to design an MI-based BCI. Three performance measures (classification accuracy, mutual information, and Cohen's kappa) have been evaluated and compared with a BCI using a contemporary power spectral density-based feature extraction technique. It is observed that the proposed technique extracts nearly recording-session-independent distinct features, resulting in significantly higher and more consistent MI task detection accuracy and Cohen's kappa. It is therefore concluded that bispectrum-based feature extraction is a promising technique for detecting different brain states.
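
    As a sketch of what a bispectrum-based feature might look like (the paper's exact feature set is not reproduced), the following estimates the bispectrum of one EEG channel by FFT averaging over segments and collapses it to a single scalar; the segment length, window, and low-frequency region are illustrative choices.

        import numpy as np

        def bispectral_feature(x, seg_len=256, fmax=64):
            """Direct (FFT-averaging) bispectrum estimate of a 1-D signal over
            the low-frequency region, collapsed to one scalar feature."""
            segs = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, seg_len)]
            window = np.hanning(seg_len)
            acc = np.zeros((fmax, fmax), dtype=complex)
            for s in segs:
                X = np.fft.fft(s * window)
                for f1 in range(fmax):
                    for f2 in range(fmax):
                        # B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)]
                        acc[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
            bispec = acc / max(len(segs), 1)
            return float(np.sum(np.log1p(np.abs(bispec))))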

  18. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    SciTech Connect

    DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.; THOMAS, EDWARD V.; WUNSCH, DONALD

    2001-09-01

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.

  19. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques.

    PubMed

    Pannek, Kerstin; Guzzetta, Andrea; Colditz, Paul B; Rose, Stephen E

    2012-10-01

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. PMID:22903761

  20. Graphene-based terahertz photodetector by noise thermometry technique

    SciTech Connect

    Wang, Ming-Jye; Wang, Ji-Wun; Wang, Chun-Lun; Chiang, Yen-Yu; Chang, Hsian-Hong

    2014-01-20

    We report the characteristics of a graphene-based terahertz (THz) photodetector based on the noise thermometry technique, obtained by measuring its noise power at frequencies from 4 to 6 GHz. A hot-electron system is generated in the graphene microbridge by THz photon pumping and creates extra noise power. The equivalent noise temperature and electron temperature increase rapidly in the low THz pumping regime and saturate gradually in the high THz power regime, which is attributed to a faster energy relaxation process involving stronger electron-phonon interaction. Based on this detector, a conversion efficiency of around 0.15 from THz power to noise power in the 4–6 GHz span has been achieved.

  1. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    SciTech Connect

    Charlton, William S

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
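
    A toy illustration of the Bayesian comparison step described above, assuming a precomputed library mapping candidate burnup values to expected isotopic ratio vectors, independent Gaussian measurement errors, and a flat prior; this is not the NOVA implementation.

        import numpy as np

        def burnup_posterior(measured_ratios, sigma, library):
            """Posterior over tabulated burnups given measured noble-gas isotopic
            ratios. library: dict mapping burnup -> expected ratio vector."""
            burnups = sorted(library)
            measured = np.asarray(measured_ratios, dtype=float)
            log_like = np.array([
                -0.5 * np.sum(((measured - np.asarray(library[b])) / sigma) ** 2)
                for b in burnups
            ])
            post = np.exp(log_like - log_like.max())   # flat prior: normalize likelihood
            post /= post.sum()
            return np.asarray(burnups), post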

  2. Large areas elemental mapping by ion beam analysis techniques

    NASA Astrophysics Data System (ADS)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a Gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) XYZ computer-controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large-range XYZ robotic stage is being used to produce elemental maps of large areas in samples such as paintings, ceramics, stones, fossils, and many other sample types. Due to its particular characteristics, this is a unique device in the sense of multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion-beam analysis with the advantage of automatic rastering.

  3. MEMS-based power generation techniques for implantable biosensing applications.

    PubMed

    Lueke, Jonathan; Moussa, Walied A

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  4. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    PubMed Central

    Lueke, Jonathan; Moussa, Walied A.

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient. PMID:22319362

  5. Small area analysis using micro-diffraction techniques

    SciTech Connect

    GOEHNER,RAYMOND P.; TISSOT JR.,RALPH G.; MICHAEL,JOSEPH R.

    2000-02-11

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction primarily is used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used for producing a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, thereby allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary, these small areas are readily assessable for analysis. Kossel produces a wide angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has

  6. Novel techniques and the future of skull base reconstruction.

    PubMed

    Meier, Joshua C; Bleier, Benjamin S

    2013-01-01

    The field of endoscopic skull base surgery has evolved considerably in recent years fueled largely by advances in both imaging and instrumentation. While the indications for these approaches continue to be extended, the ability to reconstruct the resultant defects has emerged as a rate-limiting obstacle. Postoperative failures with current multilayer grafting techniques remain significant and may increase as the indications for endoscopic resections continue to expand. Laser tissue welding represents a novel method of wound repair in which laser energy is applied to a chromophore doped biologic solder at the wound edge to create a laser weld (fig. 1). These repairs are capable of withstanding forces far exceeding those exerted by intracranial pressure with negligible collateral thermal tissue injury. Recent clinical trials have demonstrated the safety and feasibility of endoscopic laser welding while exposing the limitations of first generation hyaluronic acid based solders. Novel supersaturated gel based solders are currently being tested in clinical trials and appear to possess significantly improved viscoelastic properties. While laser tissue welding remains an experimental technique, continued success with these novel solder formulations may catalyze the widespread adoption of this technique for skull base repair in the near future.

  7. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    PubMed

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's disease, among others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences between cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems level.
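
    The published PoDA statistic is not reproduced here, but the hypothesis it rests on can be sketched as follows: for the SNPs mapped to one pathway, check whether cases look more like other cases than like controls. The similarity measure and genotype coding below are illustrative assumptions, not the paper's definitions.

        import numpy as np

        def within_vs_across_similarity(genotypes, labels, pathway_snps):
            """Mean gap between each case's similarity to other cases and its
            similarity to controls, over one pathway's SNPs.
            genotypes: (samples x SNPs) minor-allele counts; labels: 1 = case, 0 = control."""
            g = np.asarray(genotypes, dtype=float)[:, pathway_snps]
            g = (g - g.mean(axis=0)) / (g.std(axis=0) + 1e-12)   # standardize each SNP
            sim = g @ g.T / g.shape[1]                           # correlation-like similarity
            labels = np.asarray(labels)
            cases = np.where(labels == 1)[0]
            controls = np.where(labels == 0)[0]
            gaps = [sim[i, cases[cases != i]].mean() - sim[i, controls].mean()
                    for i in cases]
            return float(np.mean(gaps))   # > 0 hints at pathway-level class structure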

  8. System availability management technique for reliability and maintainability analysis

    NASA Technical Reports Server (NTRS)

    Davenport, G. K.

    1970-01-01

    A method for total system availability analysis is based on numerical prediction of the reliability, maintainability, and availability of each functional system. It incorporates these functional-system estimates into an overall mathematical model.
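
    The abstract does not spell out its mathematical model, but the flavor of rolling functional-system estimates into an overall figure can be sketched with the standard steady-state availability formula and a series (all-must-work) combination; the numbers below are made up for illustration.

        def availability(mtbf_hours, mttr_hours):
            """Steady-state availability of one functional system."""
            return mtbf_hours / (mtbf_hours + mttr_hours)

        def series_availability(subsystems):
            """Overall availability when every functional system must operate:
            the product of the individual availabilities."""
            total = 1.0
            for mtbf, mttr in subsystems:
                total *= availability(mtbf, mttr)
            return total

        # Three functional systems with hypothetical (MTBF, MTTR) estimates in hours
        print(series_availability([(500, 2), (1200, 8), (300, 1)]))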

  9. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  10. Scalable Analysis Techniques for Microprocessor Performance Counter Metrics

    SciTech Connect

    Ahn, D H; Vetter, J S

    2002-07-24

    Contemporary microprocessors provide a rich set of integrated performance counters that allow application developers and system architects alike the opportunity to gather important information about workload behaviors. These counters can capture instruction, memory, and operating system behaviors. Current techniques for analyzing data produced from these counters use raw counts, ratios, and visualization techniques to help users make decisions about their application source code. While these techniques are appropriate for analyzing data from one process, they do not scale easily to new levels demanded by contemporary computing systems. Indeed, the amount of data generated by these experiments is on the order of tens of thousands of data points. Furthermore, if users execute multiple experiments, then we add yet another dimension to this already knotty picture. This flood of multidimensional data can swamp efforts to harvest important ideas from these valuable counters. Very simply, this paper addresses these concerns by evaluating several multivariate statistical techniques on these datasets. We find that several techniques, such as statistical clustering, can automatically extract important features from this data. These derived results can, in turn, be fed directly back to an application developer, or used as input to a more comprehensive performance analysis environment, such as a visualization or an expert system.
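
    A compact sketch of the statistical-clustering step described, assuming scikit-learn is available: counter metrics are standardized, reduced with PCA, and grouped with k-means. The numbers of components and clusters are illustrative parameters, not values from the paper.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        def cluster_counter_data(counters, n_clusters=4):
            """Standardize, reduce, and cluster a (samples x counter-metrics)
            matrix so thousands of data points collapse into a few behavior groups."""
            counters = np.asarray(counters, dtype=float)
            z = StandardScaler().fit_transform(counters)
            reduced = PCA(n_components=min(5, z.shape[1])).fit_transform(z)
            return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(reduced)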

  11. Mathematical analysis techniques for modeling the space network activities

    NASA Technical Reports Server (NTRS)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.
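
    A toy linear program in the spirit of the modeling described, using SciPy's linprog; the variables, capacities, and objective are invented for illustration and do not represent actual TDRSS parameters.

        from scipy.optimize import linprog

        # Maximize total user traffic carried by two relay links, subject to
        # per-link capacities and a shared ground-terminal limit (all in Mbit/s).
        c = [-1.0, -1.0]                         # linprog minimizes, so negate to maximize
        A_ub = [[1, 0],                          # link 1 capacity
                [0, 1],                          # link 2 capacity
                [1, 1]]                          # shared ground-terminal limit
        b_ub = [300, 250, 400]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
        print(res.x, -res.fun)                   # allocated rates and total throughput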

  12. Preliminary assessment of aerial photography techniques for canvasback population analysis

    USGS Publications Warehouse

    Munro, R.E.; Trauger, D.L.

    1976-01-01

    Recent intensive research on the canvasback has focused attention on the need for more precise estimates of population parameters. During the 1972-75 period, various types of aerial photographing equipment were evaluated to determine the problems and potentials for employing these techniques in appraisals of canvasback populations. The equipment and procedures available for automated analysis of aerial photographic imagery were also investigated. Serious technical problems remain to be resolved, but some promising results were obtained. Final conclusions about the feasibility of operational implementation await a more rigorous analysis of the data collected.

  13. Simple technique for structural thermal-screening analysis. [LMFBR

    SciTech Connect

    Yang, C.C.; Dalcher, A.W.

    1982-01-01

    A closed-form solution to the one-dimensional transient heat conduction problem is suggested for the thermal screening analysis of arbitrary input transients. This formulation has been derived from a classical solution and applied to thermal screening analyses of nuclear structural components. Direct output in the form of computer plots is particularly useful not only for visualization of transient responses but also for the selection of umbrella transients used in the detailed analyses of nuclear components, especially for high-temperature applications. An example is given to demonstrate the usefulness of this technique in the design analysis of heat transport system equipment.
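
    The report's exact formulation is not given in the abstract, but the flavor of a closed-form treatment of arbitrary input transients can be sketched by superposing step responses of a semi-infinite wall (a Duhamel-type construction); the geometry and the piecewise-constant surface-temperature history below are illustrative simplifications.

        import numpy as np
        from scipy.special import erfc

        def wall_temperature(x, t, step_times, surface_temps, alpha, t_initial):
            """Temperature at depth x (m) and time t (s) in a semi-infinite wall
            whose surface temperature follows a piecewise-constant transient,
            built by superposing the classical step response
            erfc(x / (2 sqrt(alpha (t - t_i)))). alpha is thermal diffusivity (m^2/s)."""
            temp = t_initial
            previous = t_initial
            for t_i, ts in zip(step_times, surface_temps):
                if t <= t_i:
                    break
                temp += (ts - previous) * erfc(x / (2.0 * np.sqrt(alpha * (t - t_i))))
                previous = ts
            return temp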

  14. Gabor-based fusion technique for Optical Coherence Microscopy.

    PubMed

    Rolland, Jannick P; Meemon, Panomsak; Murali, Supraja; Thompson, Kevin P; Lee, Kye-sung

    2010-02-15

    We recently reported on an Optical Coherence Microscopy technique whose innovation intrinsically builds on a recently reported liquid-lens-based dynamic focusing optical probe with a 2 μm invariant lateral resolution by design throughout a 2 mm cubic full field of view [Murali et al., Optics Letters 34, 145-147, 2009]. In this paper we report on the image acquisition enabled by this optical probe when combined with an automatic data fusion method, developed and described here, to produce an in-focus high-resolution image throughout the imaging depth of the sample. An African frog tadpole (Xenopus laevis) was imaged with the novel probe and the Gabor-based fusion technique, demonstrating subcellular resolution over a 0.5 mm (lateral) x 0.5 mm (axial) region without the need, for the first time, for x-y translation stages, depth scanning, high-cost adaptive optics, or manual intervention. In vivo images of human skin are also presented.

  15. Comment on 'Comparative analysis of the isovolume calibration method for non-invasive respiratory monitoring techniques based on area transduction versus circumference transduction using the connected cylinders model' (2011 Physiol. Meas. 32 1265-74).

    PubMed

    Augousti, A T; Radosz, A

    2015-05-01

    An analysis introduced by the authors in 2011, which examined the robustness of the isovolume method for calibrating the respiratory inductive plethysmograph using the connected cylinders model, a particular case of Konno and Mead's generalized two-compartment model of respiration, is extended. It is demonstrated that extending this to a more physically realistic geometrical model, termed the connected prismatic elliptical segments model, does not enhance the earlier analysis, and that the analysis can easily be proven to cover all area-based transduction sensors, irrespective of the actual geometry of the compartments.

  16. Analysis of weldability testing techniques for HAZ liquation cracking

    NASA Astrophysics Data System (ADS)

    Lin, Wangen; Lippold, John C.; Baeslack, William A., III

    A new methodology for quantifying heat-affected zone (HAZ) liquation cracking susceptibility has been developed by considering the temperature-dependent, hot ductility of a material during welding in the context of current liquation cracking theories. This methodology defines a temperature field in the HAZ surrounding the weld pool, termed the thermal crack susceptible region (CSR) in which liquation cracking may occur. The thermal CSR is material-specific and represents a metallurgically-based quantification of weldability. Using this method, weldability test results can be more easily applied to actual welding conditions. This paper describes how this methodology can be utilized in conjunction with three widely used weldability testing techniques, namely, the Gleeble hot ductility test, and the spot and longitudinal type Varestraint tests. The method of data interpretation is addressed for each test technique and correlations among test techniques described.

  17. Cost-variance analysis by DRGs; a technique for clinical budget analysis.

    PubMed

    Voss, G B; Limpens, P G; Brans-Brabant, L J; van Ooij, A

    1997-02-01

    In this article it is shown how a cost accounting system based on DRGs can be valuable in determining changes in clinical practice and explaining alterations in expenditure patterns from one period to another. A cost-variance analysis is performed using data from the orthopedic department from the fiscal years 1993 and 1994. Differences between predicted and observed cost for medical care, such as diagnostic procedures, therapeutic procedures and nursing care are analyzed into different components: changes in patient volume, case-mix differences, changes in resource use and variations in cost per procedure. Using a DRG cost accounting system proved to be a useful technique for clinical budget analysis. Results may stimulate discussions between hospital managers and medical professionals to explain cost variations integrating medical and economic aspects of clinical health care. PMID:10165044
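
    A simplified sketch of such a cost-variance decomposition (the article separates resource use and cost per procedure; here they are folded into a single cost-per-case term): the difference between actual and budgeted cost is split into volume, case-mix, and cost-per-case components. The data structures and component definitions are illustrative assumptions.

        def cost_variance(n_budget, n_actual, budget, actual):
            """Split (actual - budgeted) cost into volume, case-mix, and
            cost-per-case components. budget/actual map each DRG to
            (patient share, mean cost per case) and share the same DRG keys."""
            drgs = budget.keys()
            base_cost = sum(budget[d][0] * budget[d][1] for d in drgs)  # budgeted cost per case
            volume = (n_actual - n_budget) * base_cost
            case_mix = n_actual * sum((actual[d][0] - budget[d][0]) * budget[d][1] for d in drgs)
            cost_per_case = n_actual * sum(actual[d][0] * (actual[d][1] - budget[d][1]) for d in drgs)
            return {"volume": volume, "case_mix": case_mix,
                    "cost_per_case": cost_per_case,
                    "total": volume + case_mix + cost_per_case}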

  18. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  19. Ground-based intercomparison of nitric acid measurement techniques

    NASA Astrophysics Data System (ADS)

    Fehsenfeld, Fred C.; Huey, L. Greg; Sueper, Donna T.; Norton, Richard B.; Williams, Eric J.; Eisele, Fred L.; Mauldin, R. Lee; Tanner, David J.

    1998-02-01

    An informal intercomparison of gas-phase nitric acid (HNO3) measuring techniques was carried out. The intercomparison involved two new chemical ionization mass spectrometers (CIMSs) that have been developed for the measurement of HNO3 along with an older, more established filter pack (FP) technique. The filter pack was composed of a teflon prefilter which collected aerosols followed by a nylon filter which collected the gas-phase HNO3. The study was carried out during the late winter and early spring of 1996 at a site located on the western edge of the Denver metropolitan area. Throughout the study the two CIMS techniques were in general agreement. However, under certain conditions the HNO3 levels obtained from the nylon filter of the FP gave values for the gas-phase concentration of HNO3 that were somewhat higher than that recorded by the two CIMS systems. The formation of ammonium nitrate (NH4NO3) containing aerosols is common during the colder months in this area. An analysis of these results suggests that the HNO3 collected by the nylon filter in the FP suffers an interference associated with the disproportionation of NH4NO3 from aerosols containing that compound that were initially collected on the teflon prefilter. This problem with the FP technique has been suggested from results obtained in previous intercomparisons.

  20. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  1. An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    PubMed Central

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  2. An image morphing technique based on optimal mass preserving mapping.

    PubMed

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2007-06-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L(2) mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  3. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search only by matching the descriptive address data. In addition, there are some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating a fuzzy technique with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, to search places based on their location, non-point representation of results, and display of search results ranked by their priority.
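
    The fuzzy nearness and overlay steps described in this record can be illustrated with a minimal Python sketch (illustrative only: the membership break-points, grid distances and function names below are assumptions, not the authors' implementation). Nearness to each reference place is expressed as a decreasing fuzzy membership of distance, and the memberships are combined with a fuzzy AND (minimum) overlay:

      import numpy as np

      def nearness(dist_m, full_at=200.0, zero_at=2000.0):
          # Illustrative decreasing membership: 1 when closer than `full_at` metres,
          # 0 beyond `zero_at`, linear in between (assumed break-points).
          return np.clip((zero_at - dist_m) / (zero_at - full_at), 0.0, 1.0)

      # Hypothetical distance maps (metres) from two reference places on a 2x2 grid.
      dist_to_school = np.array([[150.0, 900.0], [1800.0, 2500.0]])
      dist_to_park = np.array([[400.0, 300.0], [2200.0, 700.0]])

      mu_school = nearness(dist_to_school)   # fuzzy distance map 1
      mu_park = nearness(dist_to_park)       # fuzzy distance map 2

      # Fuzzy overlay: the minimum acts as a fuzzy AND ("near the school AND near the park").
      suitability = np.minimum(mu_school, mu_park)
      print(suitability)                     # cells ranked by how well they satisfy both criteria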

  4. Application of mass spectrometry-based proteomics techniques for the detection of protein doping in sports.

    PubMed

    Kay, Richard G; Creaser, Colin S

    2010-04-01

    Mass spectrometry-based proteomic approaches have been used to develop methodologies capable of detecting the abuse of protein therapeutics such as recombinant human erythropoietin and recombinant human growth hormone. Existing detection methods use antibody-based approaches that, although effective, suffer from long assay development times and specificity issues. The application of liquid chromatography with tandem mass spectrometry and selected reaction-monitoring-based analysis has demonstrated the ability to detect and quantify existing protein therapeutics in plasma. Furthermore, the multiplexing capability of selected reaction-monitoring analysis has also aided in the detection of multiple downstream biomarkers in a single analysis, requiring less sample than existing immunological techniques. The flexibility of mass spectrometric instrumentation has shown that the technique is capable of detecting the abuse of novel and existing protein therapeutics, and has a vital role in the fight to keep sports drug-free.

  5. The potential of electroanalytical techniques in pharmaceutical analysis.

    PubMed

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed, stimulated by the power of computers and microprocessors, which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied to trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and biological samples, provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. In order to increase the response and improve selectivity, solid electrodes are the focus of rapidly expanding research on surface modification. Permselectivity, chelation, catalysis, etc. may be considered appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture (e.g. for single-cell excretion analysis) and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the

  6. Wavelet-based techniques for the gamma-ray sky

    NASA Astrophysics Data System (ADS)

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias; Lee, Samuel K.

    2016-07-01

    We demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
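
    As a rough illustration of the scale-separation idea (not the authors' pipeline), the sketch below splits a mock two-dimensional count map into per-scale components with a discrete wavelet decomposition; it assumes the PyWavelets package (pywt) is available, and the wavelet choice and level count are arbitrary:

      import numpy as np
      import pywt  # PyWavelets, assumed available

      def scale_components(image, wavelet="haar", levels=3):
          # Split a 2-D map into components, one per wavelet scale, whose sum
          # reconstructs the input (the transform is linear and invertible).
          coeffs = pywt.wavedec2(image, wavelet, level=levels)
          components = []
          for i in range(len(coeffs)):
              kept = [np.zeros_like(coeffs[0])] + [
                  tuple(np.zeros_like(d) for d in detail) for detail in coeffs[1:]
              ]
              kept[i] = coeffs[i]            # keep one scale, zero the others
              components.append(pywt.waverec2(kept, wavelet))
          return components                  # [coarse approximation, ..., finest detail]

      sky = np.random.poisson(5.0, size=(64, 64)).astype(float)  # mock photon-count map
      parts = scale_components(sky)
      print(np.allclose(sum(parts), sky))    # True: nothing is lost in the separation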

  7. Metal trace analysis by PIXE and PDMS techniques

    NASA Astrophysics Data System (ADS)

    Dias da Cunha, K.; Barros Leite, C. V.

    2002-03-01

    The risk for the human health due to exposure to aerosols depends on the intake pattern, the mass concentration and the speciation of the elements present in airborne particles. In this work plasma desorption mass spectrometry (PDMS) was used as complementary technique to the particle-induced X-ray emission (PIXE) technique to characterize aerosol samples collected in the environment. The PIXE technique allows the identification of the elements present in the sample and to determine their mass concentrations. The mass spectrometry (PDMS) was used to identify the speciation of these elements present in the samples. The aerosol samples were collected using a 6-stage cascade impactor (CI) in two sites of Rio de Janeiro City. One is an island (Fundão Island) in the Guanabara Bay close to an industrial zone and the other, in Gávea, is a residential zone close to a lagoon and to the seashore. The mass median aerodynamic diameter (MMAD) measured indicated that the airborne particulates were in the fine fraction of the aerosols collected in both locations. In order to identify the contribution of the seawater particles from the Guanabara Bay in the aerosols, seawater samples were also collected at Fundão Island. The samples were analyzed by PIXE and PDMS techniques. The analysis of the results suggests that the aerosols are different in both sampling sites and also exist a contribution from the Guanabara Bay seawater particles to the aerosols collected in the Fundão Island. PIXE allows identification and quantification of the elements heavier than Na ( Z=11) while PDMS allows identification of organic and inorganic compounds present in the samples, as these techniques are used as complementary techniques they provide important information about the aerosols characterization.

  8. Hydrocarbon microseepage mapping using signature based target detection techniques

    NASA Astrophysics Data System (ADS)

    Soydan, Hilal; Koz, Alper; Şebnem Düzgün, H.; Aydin Alatan, A.

    2015-10-01

    In this paper, we compare conventional methods for detecting hydrocarbon seepage anomalies with signature-based detection algorithms. The Crosta technique [1] is selected as the baseline for the conventional approach in the experimental comparisons. The Crosta technique utilizes the characteristic bands of the searched target for a principal component transformation in order to determine the components characterizing the target of interest. The Desired Target Detection and Classification Algorithm (DTDCA), Spectral Matched Filter (SMF), and Normalized Correlation (NC) are employed for signature-based target detection. Signature-based target detection algorithms are applied to the whole spectrum, benefiting from the information stored in all spectral bands. The selected methods are applied to a multispectral Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image of the study region, with an atmospheric correction applied prior to running the algorithms. ASTER provides multispectral bands covering the visible, shortwave, and thermal infrared regions, which serves as a useful tool for the interpretation of areas with hydrocarbon anomalies. The exploration area is the Gemrik Anticline, located in South East Anatolia, Adıyaman, Bozova Oil Field, where microseeps can be observed with almost no vegetation cover. The spectral signatures collected with an Analytical Spectral Devices Inc. (ASD) spectrometer from the reference valley [2] have been utilized as input to the signature-based detection algorithms. The experiments have indicated that DTDCA and SMF outperform the Crosta technique by locating the microseepage patterns along the migration pathways with better contrast. On the other hand, NC has not been able to map the searched target with a visible distinction. It is concluded that the signature-based algorithms can be more effective than the conventional methods for the detection of microseepage-induced anomalies.
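
    A minimal numpy sketch of a spectral matched filter of the kind named above (a generic textbook formulation under assumed data shapes, not the authors' DTDCA/SMF code; the background statistics are estimated from the scene itself and the signature is invented):

      import numpy as np

      def spectral_matched_filter(cube, target):
          # cube: (rows, cols, bands) reflectance array; target: (bands,) reference signature.
          rows, cols, bands = cube.shape
          X = cube.reshape(-1, bands)
          mu = X.mean(axis=0)
          cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(bands)   # regularised background covariance
          w = np.linalg.solve(cov, target - mu)
          scores = (X - mu) @ w / ((target - mu) @ w)             # ~1 for a pure target pixel
          return scores.reshape(rows, cols)

      cube = np.random.rand(50, 60, 9)              # mock 9-band image
      target = np.linspace(0.2, 0.8, 9)             # assumed target (e.g. field) signature
      detection_map = spectral_matched_filter(cube, target)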

  9. Different techniques of multispectral data analysis for vegetation fraction retrieval

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectral analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus value sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
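
    Of the techniques listed, linear spectral unmixing is the most direct to sketch. The toy example below (the endmember spectra, band count and noise level are invented) recovers soil and vegetation fractions from a mixed pixel with non-negative least squares, assuming SciPy is available:

      import numpy as np
      from scipy.optimize import nnls  # assumed available

      # Hypothetical endmember reflectance spectra (one value per band).
      soil = np.array([0.20, 0.25, 0.30, 0.35])
      vegetation = np.array([0.05, 0.08, 0.04, 0.45])
      E = np.column_stack([soil, vegetation])

      def unmix(pixel):
          # Non-negative fractions of soil and vegetation, normalised to sum to 1.
          fractions, _ = nnls(E, pixel)
          return fractions / fractions.sum()

      mixed = 0.3 * soil + 0.7 * vegetation + np.random.normal(0.0, 0.005, 4)
      print(unmix(mixed))  # approximately [0.3, 0.7]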

  10. Diagnostic analysis of vibration signals using adaptive digital filtering techniques

    NASA Technical Reports Server (NTRS)

    Jewell, R. E.; Jones, J. H.; Paul, J. E.

    1983-01-01

    Signal enhancement techniques are described using recently developed digital adaptive filtering equipment. Adaptive filtering concepts are not new; however, as a result of recent advances in microprocessor-based electronics, hardware has been developed that has stable characteristics and supports filter orders exceeding 1000. Selected data processing examples are presented illustrating spectral line enhancement, adaptive noise cancellation, and transfer function estimation in the presence of corrupting noise.
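
    A minimal software sketch of one of the listed operations, adaptive noise cancellation with an LMS update (purely illustrative; the filter order, step size and test signals are assumptions, not the hardware described in the record):

      import numpy as np

      def lms_cancel(primary, reference, order=32, mu=0.005):
          # Subtract the part of `primary` that is linearly predictable from
          # `reference` (the noise pickup); the error output is the cleaned signal.
          w = np.zeros(order)
          out = np.zeros_like(primary)
          for n in range(order, len(primary)):
              x = reference[n - order:n][::-1]
              est = w @ x                    # current estimate of the interference
              err = primary[n] - est         # cleaned sample
              w += 2.0 * mu * err * x        # LMS weight update
              out[n] = err
          return out

      t = np.arange(20000) / 5000.0
      wanted = np.sin(2 * np.pi * 10 * t)
      noise = np.random.randn(t.size)
      primary = wanted + 0.8 * np.convolve(noise, [0.5, 0.3, 0.2], mode="same")
      cleaned = lms_cancel(primary, noise)   # converges toward `wanted` after adaptation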

  11. Food analysis: a continuous challenge for miniaturized separation techniques.

    PubMed

    Asensio-Ramos, María; Hernández-Borges, Javier; Rocco, Anna; Fanali, Salvatore

    2009-11-01

    One of the current trends of modern analytical chemistry is the miniaturization of the various tools daily used by a large number of researchers. Ultrafast separations, consumption of small amounts of both samples and reagents as well as a high sensitivity and automation are some of the most important goals desired to be achieved. For many years a large number of research laboratories and analytical instrument manufacturing companies have been investing their efforts in this field, which includes miniaturized extraction materials, sample pre-treatment procedures and separation techniques. Among the separation techniques, capillary electromigration methods (which also include CEC), microchip and nano-LC/capillary LC have received special attention. Besides their well-known advantages over other separation tools, the role of these miniaturized techniques in food analysis is still probably in an early stage. In fact, applications in this field carried out by CEC, microchip, nano-LC and capillary LC are only a few when compared with other more established procedures such as conventional GC or HPLC. The scope of this review is to gather and discuss the different applications of such miniaturized techniques in this field. Concerning CE, microchip-CE and CEC works, emphasis has been placed on articles published after January 2007.

  12. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  13. Dispersion analysis techniques within the space vehicle dynamics simulation program

    NASA Technical Reports Server (NTRS)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3 sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS process) and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified as follows: (A) Hand calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data. (B) Comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
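
    The root-sum-square combination described for the LEA processor can be shown with a small numpy sketch (the deviation values and error sources below are invented for illustration; independent single-error-source cases are assumed):

      import numpy as np

      # Hypothetical 3-sigma deviations at MECO for three single-error-source cases.
      # Columns: [altitude (m), velocity (m/s), flight-path angle (deg)]
      deviations = np.array([
          [120.0, 1.5, 0.02],   # e.g. thrust dispersion
          [80.0, 0.9, 0.01],    # e.g. Isp dispersion
          [40.0, 2.1, 0.03],    # e.g. winds
      ])

      # Root-sum-square combination of the individual cases.
      rss = np.sqrt((deviations ** 2).sum(axis=0))

      # Covariance matrix of the combined deviations (cases treated as independent).
      cov = sum(np.outer(d, d) for d in deviations)

      print(rss)
      print(np.sqrt(np.diag(cov)))  # the diagonal square roots reproduce the RSS values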

  14. Detection of arterial disorders by spectral analysis techniques.

    PubMed

    Ubeyli, Elif Derya

    2007-01-01

    This paper presents an integrated view of spectral analysis techniques for the detection of arterial disorders. The paper includes illustrative information about feature extraction from signals recorded from arteries. The short-time Fourier transform (STFT) and wavelet transform (WT) were used for spectral analysis of ophthalmic arterial (OA) Doppler signals. Using these spectral analysis methods, the variations in the shape of the Doppler spectra as a function of time were presented in the form of sonograms in order to obtain medical information. These sonograms were then used to compare the applied methods in terms of their frequency resolution and their effectiveness in the determination of OA stenosis. The author suggests that the content of the paper will assist readers in gaining a better understanding of the STFT and WT in the detection of arterial disorders. PMID:17502695
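
    A minimal sketch of the STFT half of the comparison, computing a sonogram from a mock Doppler-like signal with SciPy (the sampling rate, window length and signal are assumptions; the clinical processing in the record is not reproduced here):

      import numpy as np
      from scipy.signal import spectrogram  # assumed available

      fs = 20000.0                                   # assumed sampling rate, Hz
      t = np.arange(0, 2.0, 1.0 / fs)
      # Mock Doppler-like signal whose dominant frequency wanders over time.
      doppler = np.sin(2 * np.pi * (1500 + 500 * np.sin(2 * np.pi * 1.2 * t)) * t)

      f, seg_t, Sxx = spectrogram(doppler, fs=fs, nperseg=512, noverlap=384)
      # Sxx[f_index, time_index] is the sonogram: power versus time and frequency.
      dominant = f[np.argmax(Sxx, axis=0)]           # dominant frequency per time slice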

  15. Optical accelerometer based on grating interferometer with phase modulation technique.

    PubMed

    Zhao, Shuangshuang; Zhang, Juan; Hou, Changlun; Bai, Jian; Yang, Guoguang

    2012-10-10

    In this paper, an optical accelerometer based on grating interferometer with phase modulation technique is proposed. This device architecture consists of a laser diode, a sensing chip and an optoelectronic processing circuit. The sensing chip is a sandwich structure, which is composed of a grating, a piezoelectric translator and a micromachined silicon structure consisting of a proof mass and four cantilevers. The detected signal is intensity-modulated with phase modulation technique and processed with a lock-in amplifier for demodulation. Experimental results show that this optical accelerometer has acceleration sensitivity of 619 V/g and high-resolution acceleration detection of 3 μg in the linear region. PMID:23052079
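
    The phase-modulation and lock-in detection step can be sketched in a few lines (a generic software lock-in on synthetic data; the modulation frequency, modulation depth and noise level are assumptions, not the device's parameters):

      import numpy as np

      fs, f_mod = 100000.0, 1000.0                   # assumed sample rate and PZT modulation frequency
      t = np.arange(0, 0.1, 1.0 / fs)
      phase = 0.3 * np.sin(2 * np.pi * 5 * t)        # slow interferometric phase (the measurand)
      detected = 1.0 + 0.5 * np.cos(2 * np.pi * f_mod * t + phase) \
                 + 0.01 * np.random.randn(t.size)    # intensity-modulated detector signal

      # Lock-in demodulation: mix with quadrature references and low-pass filter (moving average).
      ref_i = np.cos(2 * np.pi * f_mod * t)
      ref_q = np.sin(2 * np.pi * f_mod * t)
      n_avg = int(fs / f_mod) * 10                   # average over 10 carrier periods
      win = np.ones(n_avg) / n_avg
      I = np.convolve(detected * ref_i, win, mode="same")
      Q = np.convolve(detected * ref_q, win, mode="same")
      recovered = np.arctan2(-Q, I)                  # tracks `phase` away from the record edges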

  16. An osmolyte-based micro-volume ultrafiltration technique.

    PubMed

    Ghosh, Raja

    2014-12-01

    This paper discusses a novel, simple, and inexpensive micro-volume ultrafiltration technique for protein concentration, desalting, buffer exchange, and size-based protein purification. The technique is suitable for processing protein samples in a high-throughput mode. It utilizes a combination of capillary action, and osmosis for drawing water and other permeable species from a micro-volume sample droplet applied on the surface of an ultrafiltration membrane. A macromolecule coated on the permeate side of the membrane functions as the osmolyte. The action of the osmolyte could, if required, be augmented by adding a supersorbent polymer layer over the osmolyte. The mildly hydrophobic surface of the polymeric ultrafiltration membrane used in this study minimized sample droplet spreading, thus making it easy to recover the retained material after separation, without sample interference and cross-contamination. High protein recoveries were observed in the micro-volume ultrafiltration experiments described in the paper. PMID:25284741

  17. Laser ablation in liquids as a new technique of sampling in elemental analysis of solid materials

    NASA Astrophysics Data System (ADS)

    Muravitskaya, E. V.; Rosantsev, V. A.; Belkov, M. V.; Ershov-Pavlov, E. A.; Klyachkovskaya, E. V.

    2009-02-01

    Laser ablation in liquid media is considered as a new sample preparation technique in the elemental composition analysis of materials using optical emission spectroscopy of inductively coupled plasma (ICP-OES). Solid samples are transformed into uniform colloidal solutions of nanosized analyte particles using laser radiation focused onto the sample surface. High homogeneity of the resulting solution allows performing the ICP-OES quantitative analysis especially for the samples, which are poorly soluble in acids. The technique is compatible with the conventional solution-based standards.

  18. Recording and analysis techniques for high-frequency oscillations

    PubMed Central

    Worrell, G.A.; Jerbi, K.; Kobayashi, K.; Lina, J.M.; Zelmann, R.; Le Van Quyen, M.

    2013-01-01

    In recent years, new recording technologies have advanced such that, at high temporal and spatial resolutions, high-frequency oscillations (HFO) can be recorded in human partial epilepsy. However, because of the deluge of multichannel data generated by these experiments, achieving the full potential of parallel neuronal recordings depends on the development of new data mining techniques to extract meaningful information relating to time, frequency and space. Here, we aim to bridge this gap by focusing on up-to-date recording techniques for measurement of HFO and new analysis tools for their quantitative assessment. In particular, we emphasize how these methods can be applied, what property might be inferred from neuronal signals, and potentially productive future directions. PMID:22420981

  19. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  20. Crankshaft stress analysis; Combination of finite element and classical analysis techniques

    SciTech Connect

    Heath, A.R.; McNamara, P.M.

    1990-07-01

    The conflicting legislative and customer pressures on engine design, for example, combining low friction and a high level of refinement, require sophisticated tools if competitive designs are to be realized. This is particularly true of crankshafts, probably the most analyzed of all engine components. This paper describes the hierarchy of methods used for crankshaft stress analysis with case studies. A computer-based analysis system is described that combines FE and classical methods to allow optimized designs to be produced efficiently. At the lowest level simplified classical techniques are integrated into the CAD-based design process. These methods give the rapid feedback necessary to perform concept design iterations. Various levels of FE analysis are available to carry out more detailed analyses of the crankshaft. The FE studies may feed information to or take information from the classical methods. At the highest level a method for including the load sharing effects of the flexible crankshaft within a flexible block interconnected by nonlinear oil films is described.

  1. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    The paper deals with the infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast verses time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides anomaly edge detection technique called the half-max technique which is also used to estimate width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, half-max technique and IR Contrast feature imaging application, which are based on models provided in this paper.
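
    The half-max width estimate mentioned above can be illustrated on a one-dimensional contrast profile (a generic sketch with an invented Gaussian profile, not the IR Contrast software itself):

      import numpy as np

      def half_max_width(x, profile):
          # Width of an indication at half of its peak contrast, with linear
          # interpolation at the rising and falling half-maximum crossings.
          half = profile.max() / 2.0
          above = profile >= half
          i_first = np.argmax(above)
          i_last = len(profile) - 1 - np.argmax(above[::-1])

          def cross(i_lo, i_hi):
              return np.interp(half, [profile[i_lo], profile[i_hi]], [x[i_lo], x[i_hi]])

          left = cross(i_first - 1, i_first)     # rising-edge crossing
          right = cross(i_last + 1, i_last)      # falling-edge crossing
          return right - left

      x = np.linspace(-20.0, 20.0, 401)          # position across the indication (mm)
      contrast = np.exp(-(x / 6.0) ** 2)         # mock normalized contrast profile
      print(half_max_width(x, contrast))         # ~10.0, i.e. 2 * 6 * sqrt(ln 2)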

  2. Vision based techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except at regions close to the FOE. Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.

  3. Nuclear and radiochemical techniques in chemical analysis. Final report

    SciTech Connect

    Finston, H.L.; Williams, E.T.

    1981-06-01

    The areas studied during the period of the contract included determination of cross sections for nuclear reactions, determination of neutron capture cross sections of radionuclides, application of special activation techniques and x-ray counting, elucidation of synergic solvent extraction mechanisms and development of new solvent extraction techniques, and the development of a PIXE analytical facility. The thermal neutron capture cross section of ²²Na was determined, and cross sections and energy levels were determined for ²⁰Ne(n,α)¹⁷O, ²⁰Ne(n,p)²⁰F, and ⁴⁰Ar(n,α)³⁷S. Inelastic scattering with 2 to 3 MeV neutrons followed by counting of the metastable states permits analysis of the following elements: In, Sr, Cd, Hg, and Pb. Bromine can be detected in the presence of a 500-fold excess of Na and/or K by thermal neutron activation and x-ray counting, and as little as 0.3 x 10⁻⁹ g of Hg can be detected by this technique. Medium-energy neutrons (10 to 160 MeV) have been used to determine Tl, Pb, and Bi by (n,xn) and (n,pxn) reactions. The reaction ¹⁹F(p,α)¹⁶O has been used to determine as little as 50 μmol of Freon-14. Mechanisms for synergic solvent extractions have been elucidated, and a new technique of homogeneous liquid-liquid solvent extraction has been developed in which the neutral complex is rapidly extracted into propylene carbonate by raising and lowering the temperature of the system. An external-beam PIXE system has been developed for trace element analyses of a variety of sample types. Various sample preparation techniques have been applied to a diverse range of samples including marine sediment, coral, coal, and blood.

  4. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    NASA Technical Reports Server (NTRS)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scale information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'Direct Transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, as to how high in frequency an accurate estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of the turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates. The prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable
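
    The slotting idea referred to above can be sketched compactly: products of sample pairs are accumulated into lag bins ("slots") and averaged, giving an autocorrelation estimate from unevenly sampled data whose transform yields a spectrum. The data rate, slot width and test signal below are assumptions for illustration only:

      import numpy as np

      def slotted_autocorrelation(t, u, max_lag, slot_width):
          # t, u: sample times and mean-removed velocities; lags are binned into slots.
          n_slots = int(max_lag / slot_width)
          sums = np.zeros(n_slots)
          counts = np.zeros(n_slots)
          for i in range(len(t)):
              lags = t[i:] - t[i]
              keep = lags < max_lag
              idx = (lags[keep] / slot_width).astype(int)
              np.add.at(sums, idx, u[i] * u[i:][keep])   # accumulate pair products per slot
              np.add.at(counts, idx, 1)
          return sums / np.maximum(counts, 1)            # slot-averaged autocorrelation

      rng = np.random.default_rng(0)
      t = np.cumsum(rng.exponential(1.0 / 500.0, size=5000))   # ~500 Hz mean data rate
      u = np.sin(2 * np.pi * 40 * t) + 0.2 * rng.standard_normal(t.size)
      R = slotted_autocorrelation(t, u - u.mean(), max_lag=0.1, slot_width=1e-3)
      spectrum = np.abs(np.fft.rfft(R))   # crude spectral estimate from the slotted correlation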

  5. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    PubMed Central

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  6. Methodologies and techniques for analysis of network flow data

    SciTech Connect

    Bobyshev, A.; Grigoriev, M.; /Fermilab

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

  7. A modal impedance technique for mid and high frequency analysis of an uncertain stiffened composite plate

    NASA Astrophysics Data System (ADS)

    Seçgin, A.; Kara, M.; Ozankan, A.

    2016-03-01

    A modal impedance technique is introduced for mid-frequency vibration analyses. The approach is mainly based on statistical energy analysis (SEA); however, the loss factors are determined not only from driving-point mobilities but also from transfer mobilities. The mobilities are computed by finite element modal analysis. The technique takes geometrical complexity and boundary conditions into account to handle their mid-frequency effects. It is applied to a stiffened composite plate having randomized mass, i.e., an uncertain plate. For verification, several numerical and experimental tests are performed. Internal damping of the subsystems is evaluated using power injection and is then fed to finite element software to perform the numerical analyses. Monte Carlo simulation is employed for the uncertainty analyses. To imitate plate mass heterogeneity, many small masses are used in both the numerical and experimental analyses. It is shown that the proposed technique can reliably be used for vibration analyses of uncertain complex structures from the mid- to high-frequency regions.

  8. Antimisting kerosene: Base fuel effects, blending and quality control techniques

    NASA Technical Reports Server (NTRS)

    Yavrouian, A. H.; Ernest, J.; Sarohia, V.

    1984-01-01

    The problems associated with blending of the AMK additive with Jet A, and the base fuel effects on AMK properties, are addressed. The results from the evaluation of some of the quality control techniques for AMK are presented. The principal conclusions of this investigation are: significant compositional differences exist for the base fuel (Jet A) within ASTM specification D1655; higher aromatic content of the base fuel was found to be beneficial for polymer dissolution at ambient (20 C) temperature; using static mixer technology, the antimisting additive (FM-9) is in-line blended with Jet A, producing AMK which has adequate fire-protection properties 15 to 20 minutes after blending; degradability of freshly blended and equilibrated AMK indicated that maximum degradability is reached after adequate fire protection is obtained; the results of AMK degradability, as measured by filter ratio, confirmed previous RAE data showing that power requirements to degrade freshly blended AMK are significantly higher than for equilibrated AMK; blending of the additive by using FM-9 concentrate in Jet A produces equilibrated AMK almost instantly; nephelometry offers a simple continuous monitoring capability and is used as a real-time quality control device for AMK; and trajectory (jet thrust) and pressure drop tests are useful laboratory techniques for evaluating AMK quality.

  9. Overview of independent component analysis technique with an application to synthetic aperture radar (SAR) imagery processing.

    PubMed

    Fiori, Simone

    2003-01-01

    We present an overview of independent component analysis, an emerging signal processing technique based on neural networks, with the aim to provide an up-to-date survey of the theoretical streams in this discipline and of the current applications in the engineering area. We also focus on a particular application, dealing with a remote sensing technique based on synthetic aperture radar imagery processing: we briefly review the features and main applications of synthetic aperture radar and show how blind signal processing by neural networks may be advantageously employed to enhance the quality of remote sensing data.
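
    A minimal blind-source-separation sketch in the spirit of the overview (generic FastICA on synthetic mixtures, assuming scikit-learn is available; it does not reproduce the SAR processing discussed in the record):

      import numpy as np
      from sklearn.decomposition import FastICA  # assumed available

      t = np.linspace(0, 1, 2000)
      s1 = np.sign(np.sin(2 * np.pi * 7 * t))        # square-wave source
      s2 = np.sin(2 * np.pi * 23 * t)                # sinusoidal source
      S = np.column_stack([s1, s2])

      A = np.array([[1.0, 0.6], [0.4, 1.0]])          # unknown mixing matrix
      X = S @ A.T                                     # observed mixtures (e.g. two channels)

      ica = FastICA(n_components=2, random_state=0)
      S_est = ica.fit_transform(X)                    # recovered sources, up to scale and order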

  10. [Computation techniques in the conformational analysis of carbohydrates].

    PubMed

    Gebst, A G; Grachev, A A; Shashkov, A S; Nifant'ev, N E

    2007-01-01

    A growing number of modern studies of carbohydrates is devoted to spatial mechanisms of their participation in the cell recognition processes and directed design of inhibitors of these processes. Any progress in this field is impossible without the development of theoretical conformational analysis of carbohydrates. In this review, we generalize literature data on the potentialities of using of different molecular-mechanic force fields, the methods of quantum mechanics, and molecular dynamics to study the conformation of glycoside bond. A possibility of analyzing the reactivity of carbohydrates with the computation techniques is also discussed in brief.

  11. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  12. Hybridizing experimental, numerical, and analytical stress analysis techniques

    NASA Astrophysics Data System (ADS)

    Rowlands, Robert E.

    2001-06-01

    Good measurements enjoy the advantage of conveying what actually occurs. However, recognizing that vast amounts of displacement, strain and/or stress-related information can now be recorded at high resolution, effective and reliable means of processing the data become important. It can therefore be advantageous to combine measured results with analytical and computational methods. This presentation describes such synergism and its application to engineering problems, including static and transient analysis, notched and perforated composites, and fracture of composites and fiber-filled cement. The experimental methods of moiré, thermoelasticity, and strain gages are emphasized. Numerical techniques utilized include pseudo finite-element and boundary-element concepts.

  13. Development of solution techniques for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Andrews, J. S.

    1974-01-01

    Nonlinear structural solution methods in the current research literature are classified according to the order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.

  14. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identifications of potential NASA data users other than those normally discussed, consideration affecting the clustering of minicomputers, low cost computer system for information retrieval and analysis, the testing of minicomputer based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  15. Comparative study of manual liquid-based cytology (MLBC) technique and direct smear technique (conventional) on fine-needle cytology/fine-needle aspiration cytology samples

    PubMed Central

    Pawar, Prajkta Suresh; Gadkari, Rasika Uday; Swami, Sunil Y.; Joshi, Anil R.

    2014-01-01

    Background: The liquid-based cytology technique enables cells to be suspended in a liquid medium and spread in a monolayer, allowing better morphological assessment. Automated techniques have been widely used, but are limited by cost and availability. Aim: The aim was to establish a manual liquid-based cytology (MLBC) technique on fine-needle aspiration cytology (FNAC) material and compare its results with the conventional technique. Materials and Methods: In this study, we examined cells trapped in the needle hubs used for the collection of FNAC samples. 50 cases were examined by the MLBC technique and compared with the conventional FNAC technique. By centrifugation, sediment was obtained and an imprint was taken on a defined area. Papanicolaou (Pap) and May-Grünwald Giemsa (MGG) staining was done. Direct smears and MLBC smears were compared for cellularity, background, cellular preservation, and nuclear preservation. Slides were diagnosed independently by two cytologists with more than 5 years’ experience. Standard error of proportion was used for statistical analysis. Results: Cellularity was lower in MLBC than in conventional smears, which is expected as remnant material in the needle hub was used. Nuclei overlap to a lesser extent and hemorrhage and necrosis were reduced, so cell morphology can be studied better with the MLBC technique. The P value obtained was <0.05. Conclusion: The MLBC technique gives results comparable to the conventional technique, with better morphology. In a setup where aspirators are learners, this technique will ensure adequacy because the remnant in the needle hub is processed. PMID:25210235

  16. Water-based technique to produce porous PZT materials

    NASA Astrophysics Data System (ADS)

    Galassi, C.; Capiani, C.; Craciun, F.; Roncari, E.

    2005-09-01

    Water-based colloidal processing of PZT materials was investigated in order to reduce costs and employ more environmentally friendly manufacturing. The technique addressed was the production of porous thick samples by so-called “starch consolidation”. PZT “soft” compositions were used. The “starch consolidation” process allows the green body to be obtained by raising the temperature of a suspension of PZT powder, soluble starch and water, cast into a metal mould. The influence of the processing parameters and composition on the morphology, pore volume, pore size distribution and piezoelectric properties is investigated. Zeta potential determination and titration with different deflocculants were essential tools for adjusting the slurry formulation.

  17. Foreign fiber detecting system based on multispectral technique

    NASA Astrophysics Data System (ADS)

    Li, Qi; Han, Shaokun; Wang, Ping; Wang, Liang; Xia, Wenze

    2015-08-01

    This paper presents a foreign fiber detecting system based on a multispectral technique. The absorption and reflectivity of foreign fibers differ under different wavelengths of light, so the image characteristics differ under different illumination. An improved contrast pyramid image fusion algorithm with adaptive enhancement is used to extract the foreign fibers from the cotton background. The experimental results show that a single light source can detect 6 kinds of foreign fiber in cotton, while multispectral detection can detect eight kinds.

  18. A review of parametric modelling techniques for EEG analysis.

    PubMed

    Pardey, J; Roberts, S; Tarassenko, L

    1996-01-01

    This review provides an introduction to the use of parametric modelling techniques for time series analysis, and in particular the application of autoregressive modelling to the analysis of physiological signals such as the human electroencephalogram. The concept of signal stationarity is considered and, in the light of this, both adaptive models, and non-adaptive models employing fixed or adaptive segmentation, are discussed. For non-adaptive autoregressive models, the Yule-Walker equations are derived and the popular Levinson-Durbin and Burg algorithms are introduced. The interpretation of an autoregressive model as a recursive digital filter and its use in spectral estimation are considered, and the important issues of model stability and model complexity are discussed.
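
    The Levinson-Durbin recursion mentioned above can be written in a few lines. The sketch below (a generic textbook implementation, not taken from the review) estimates AR coefficients from a sample autocorrelation sequence:

      import numpy as np

      def levinson_durbin(r, order):
          # Solve the Yule-Walker equations via the Levinson-Durbin recursion.
          # r: autocorrelation sequence r[0] ... r[order].
          # Returns (phi, err): coefficients with x[n] ~ sum_k phi[k-1] * x[n-k],
          # plus the final prediction-error variance.
          phi = np.zeros(order + 1)          # phi[0] is unused (kept for 1-based indexing)
          err = r[0]
          for k in range(1, order + 1):
              acc = r[k] - np.dot(phi[1:k], r[1:k][::-1])
              ref = acc / err                # reflection (partial correlation) coefficient
              new = phi.copy()
              new[k] = ref
              new[1:k] = phi[1:k] - ref * phi[1:k][::-1]
              phi, err = new, err * (1.0 - ref ** 2)
          return phi[1:], err

      # Example: recover a known AR(2) process x[n] = 0.75 x[n-1] - 0.5 x[n-2] + e[n].
      rng = np.random.default_rng(1)
      x = np.zeros(20000)
      for n in range(2, x.size):
          x[n] = 0.75 * x[n - 1] - 0.5 * x[n - 2] + rng.standard_normal()
      r = np.array([np.dot(x[: x.size - k], x[k:]) / x.size for k in range(3)])
      print(levinson_durbin(r, 2)[0])        # approximately [0.75, -0.5]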

  19. Sensitivity analysis techniques for models of human behavior.

    SciTech Connect

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.

  20. A new imaging technique for reliable migration velocity analysis

    SciTech Connect

    Duquet, B.; Ehinger, A.; Lailly, P.

    1994-12-31

    In case of severe lateral velocity variations prestack depth migration is not suitable for migration velocity analysis. The authors therefore propose to substitute prestack depth migration by prestack imaging by coupled linearized inversion (PICLI). Results obtained with the Marmousi model show the improvement offered by this method for migration velocity analysis. PICLI involves a huge amount of computation. Hence they have paid special attention both to the solution of the forward problem and to the optimization algorithm. To simplify the forward problem they make use of paraxial approximations of the wave equation. Efficiency in the optimization algorithm is obtained by an exact calculation of the gradient by means of the adjoint state technique and by an adequate preconditioning. Doing so the above mentioned improvement is obtained at reasonable cost.

  1. Evolutionary Based Techniques for Fault Tolerant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Lohn, Jason D.

    2006-01-01

    The use of SRAM-based Field Programmable Gate Arrays (FPGAs) is becoming more and more prevalent in space applications. Commercial-grade FPGAs are potentially susceptible to permanently debilitating Single-Event Latchups (SELs). Repair methods based on Evolutionary Algorithms may be applied to FPGA circuits to enable successful fault recovery. This paper presents the experimental results of applying such methods to repair four commonly used circuits (quadrature decoder, 3-by-3-bit multiplier, 3-by-3-bit adder, 440-7 decoder) into which a number of simulated faults have been introduced. The results suggest that evolutionary repair techniques can improve the process of fault recovery when used instead of or as a supplement to Triple Modular Redundancy (TMR), which is currently the predominant method for mitigating FPGA faults.

  2. Linear Frequency Estimation Technique for Reducing Frequency Based Signals

    PubMed Central

    Woodbridge, Jonathan; Bui, Alex; Sarrafzadeh, Majid

    2016-01-01

    This paper presents a linear frequency estimation (LFE) technique for data reduction of frequency-based signals. LFE converts a signal to the frequency domain by utilizing the Fourier transform and estimates both the real and imaginary parts with a series of vectors much smaller than the original signal size. The estimation is accomplished by selecting optimal points from the frequency domain and interpolating data between these points with a first-order approximation. The difficulty of such a problem lies in determining which points are most significant. LFE is unique in the fact that it is generic to a wide variety of frequency-based signals such as electromyography (EMG), voice, and electrocardiography (ECG). The only requirement is that spectral coefficients are spatially correlated. This paper presents the algorithm and results from both EMG and voice data. We conclude the paper with a description of how this method can be applied to pattern recognition, signal indexing, and compression.
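
    A toy sketch of the reduce-and-interpolate idea (illustrative only: the anchor-point selection here is a naive largest-magnitude rule plus the spectrum endpoints, not the optimal selection the paper describes):

      import numpy as np

      def lfe_compress(signal, n_points=32):
          # Keep a few anchor points of the spectrum; the rest will be rebuilt
          # by first-order (linear) interpolation at reconstruction time.
          spec = np.fft.rfft(signal)
          idx = np.unique(np.concatenate((
              [0, len(spec) - 1],
              np.argsort(np.abs(spec))[-(n_points - 2):],
          )))
          return idx, spec[idx], len(signal)

      def lfe_reconstruct(idx, values, n_samples):
          bins = np.arange(n_samples // 2 + 1)
          real = np.interp(bins, idx, values.real)   # piecewise-linear real part
          imag = np.interp(bins, idx, values.imag)   # piecewise-linear imaginary part
          return np.fft.irfft(real + 1j * imag, n=n_samples)

      t = np.linspace(0, 1, 1024, endpoint=False)
      emg_like = np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
      idx, vals, n = lfe_compress(emg_like, n_points=32)
      approx = lfe_reconstruct(idx, vals, n)         # far fewer stored values than 1024 samples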

  3. Application of transport phenomena analysis technique to cerebrospinal fluid.

    PubMed

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have proceeded in the past using mathematical analyses that predicted well phenomenologically but were not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained first using established equations of transport phenomena. Microscopic and molecular-level modeling techniques are then described using porous media theory and chemical kinetic theory and applied to cerebrospinal fluid (CSF) dynamics. Using techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow. PMID:24091435

  4. Homogenization techniques for the analysis of porous SMA

    NASA Astrophysics Data System (ADS)

    Sepe, V.; Auricchio, F.; Marfia, S.; Sacco, E.

    2016-05-01

    In this paper the mechanical response of porous Shape Memory Alloy (SMA) is modeled. The porous SMA is considered as a composite medium made of a dense SMA matrix with voids treated as inclusions. The overall response of this very special composite is deduced performing a micromechanical and homogenization analysis. In particular, the incremental Mori-Tanaka averaging scheme is provided; then, the Transformation Field Analysis procedure in its uniform and nonuniform approaches, UTFA and NUTFA respectively, are presented. In particular, the extension of the NUTFA technique proposed by Sepe et al. (Int J Solids Struct 50:725-742, 2013) is presented to investigate the response of porous SMA characterized by closed and open porosity. A detailed comparison between the outcomes provided by the Mori-Tanaka, the UTFA and the proposed NUTFA procedures for porous SMA is presented, through numerical examples for two- and three-dimensional problems. In particular, several values of porosity and different loading conditions, inducing pseudoelastic effect in the SMA matrix, are investigated. The predictions assessed by the Mori-Tanaka, the UTFA and the NUTFA techniques are compared with the results obtained by nonlinear finite element analyses. A comparison with experimental data available in literature is also presented.

  5. RBF-based technique for statistical demodulation of pathological tremor.

    PubMed

    Gianfelici, Francesco

    2013-10-01

    This paper presents an innovative technique based on the joint approximation capabilities of radial basis function (RBF) networks and the estimation capability of the multivariate iterated Hilbert transform (IHT) for the statistical demodulation of pathological tremor from electromyography (EMG) signals in patients with Parkinson's disease. We define a stochastic model of the multichannel high-density surface EMG by means of the RBF networks applied to the reconstruction of the stochastic process (characterizing the disease) modeled by the multivariate relationships generated by the Karhunen-Loéve transform in Hilbert spaces. Next, we perform a demodulation of the entire random field by means of the estimation capability of the multivariate IHT in a statistical setting. The proposed method is applied to both simulated signals and data recorded from three Parkinsonian patients and the results show that the amplitude modulation components of the tremor oscillation can be estimated with signal-to-noise ratio close to 30 dB with root-mean-square error for the estimates of the tremor instantaneous frequency. Additionally, the comparisons with a large number of techniques based on all the combinations of the RBF, extreme learning machine, backpropagation, support vector machine used in the first step of the algorithm; and IHT, empirical mode decomposition, multiband energy separation algorithm, periodic algebraic separation and energy demodulation used in the second step of the algorithm, clearly show the effectiveness of our technique. These results show that the proposed approach is a potential useful tool for advanced neurorehabilitation technologies that aim at tremor characterization and suppression. PMID:24808594

  6. Modern Micro and Nanoparticle-Based Imaging Techniques

    PubMed Central

    Ryvolova, Marketa; Chomoucka, Jana; Drbohlavova, Jana; Kopel, Pavel; Babula, Petr; Hynek, David; Adam, Vojtech; Eckschlager, Tomas; Hubalek, Jaromir; Stiborova, Marie; Kaiser, Jozef; Kizek, Rene

    2012-01-01

    The requirements for early diagnostics as well as effective treatment of insidious diseases such as cancer constantly increase the pressure on development of efficient and reliable methods for targeted drug/gene delivery as well as imaging of the treatment success/failure. One of the most recent approaches covering both the drug delivery and the imaging aspects benefits from the unique properties of nanomaterials. Therefore, a new field called nanomedicine is attracting continuously growing attention. Nanoparticles, including fluorescent semiconductor nanocrystals (quantum dots) and magnetic nanoparticles, have proven their excellent properties for in vivo imaging techniques in a number of modalities such as magnetic resonance and fluorescence imaging, respectively. In this article, we review the main properties and applications of nanoparticles in various in vitro imaging techniques, including microscopy and/or laser breakdown spectroscopy, and in vivo methods such as magnetic resonance imaging and/or fluorescence-based imaging. Moreover, the advantages of drug delivery performed by nanocarriers such as iron oxides, gold, biodegradable polymers, dendrimers, and lipid-based carriers such as liposomes or micelles are also highlighted. PMID:23202187

  7. A polarization-based Thomson scattering technique for burning plasmas

    NASA Astrophysics Data System (ADS)

    Parke, E.; Mirnov, V. V.; Den Hartog, D. J.

    2014-02-01

    The traditional Thomson scattering diagnostic is based on measurement of the wavelength spectrum of scattered light, where electron temperature measurements are inferred from thermal broadening of the spectrum. At sufficiently high temperatures, especially those predicted for ITER and other burning plasmas, relativistic effects cause a change in the degree of polarization (P) of the scattered light; for fully polarized incident laser light, the scattered light becomes partially polarized. The resulting reduction of polarization is temperature dependent and has been proposed by other authors as a potential alternative to the traditional spectral decomposition technique. Following the previously developed Stokes vector approach, we analytically calculate the degree of polarization for incoherent Thomson scattering. For the first time, we obtain exact results valid for the full range of incident laser polarization states, scattering angles, and electron temperatures. While previous work focused only on linear polarization, we show that circularly polarized incident light optimizes the degree of depolarization for a wide range of temperatures relevant to burning plasmas. We discuss the feasibility of a polarization based Thomson scattering diagnostic for ITER-like plasmas with both linearly and circularly polarized light and compare to the traditional technique.

  8. Application of image processing techniques to fluid flow data analysis

    NASA Technical Reports Server (NTRS)

    Giamati, C. C.

    1981-01-01

    The application of color coding techniques used in processing remote sensing imagery to analyze and display fluid flow data is discussed. A minicomputer-based color film recording and color CRT display system is described. High-quality, high-resolution images of two-dimensional data are produced on the film recorder. Three-dimensional data, in large volume, are used to generate color motion pictures in which time is used to represent the third dimension. Several applications and examples are presented. System hardware and software are described.

  9. Analysis of non-linearity in differential wavefront sensing technique.

    PubMed

    Duan, Hui-Zong; Liang, Yu-Rong; Yeh, Hsien-Chi

    2016-03-01

    An analytical model of the differential wavefront sensing (DWS) technique based on Gaussian beam propagation has been derived. The analytical model has been verified by comparison with the interference signals detected by a quadrant photodiode, which were calculated using a numerical method. Both the analytical model and the numerical simulation show a milliradian-level non-linearity effect in DWS detection. In addition, beam clipping has a strong influence on the non-linearity of DWS: the larger the beam clipping, the smaller the non-linearity. The beam walking effect, however, hardly influences DWS and can thus be ignored in the laser interferometer. PMID:26974079

  10. Statistical analysis of heartbeat data with wavelet techniques

    NASA Astrophysics Data System (ADS)

    Pazsit, Imre

    2004-05-01

    The purpose of this paper is to demonstrate the use of some methods of signal analysis, performed on ECG and in some cases blood pressure signals, for the classification of the health status of the heart of mice and rats. Spectral and wavelet analysis were performed on the raw signals. FFT-based coherence and phase were also calculated between the blood pressure and raw ECG signals. Finally, RR-intervals were deduced from the ECG signals and an analysis of the fractal dimensions was performed. The analysis was made on data from mice and rats. A correlation was found between the health status of the mice and rats and some of the statistical descriptors, most notably the phase of the cross-spectra between ECG and blood pressure, and the fractal properties and dimensions of the interbeat series (RR-interval fluctuations).
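    To make the cross-spectral step above concrete, the following sketch computes FFT-based coherence and phase between two signals with SciPy. It uses synthetic stand-ins for the ECG and blood pressure traces; the sampling rate, segment length and signal shapes are illustrative assumptions, not the paper's data.

    # Minimal sketch (Python): magnitude-squared coherence and cross-spectral phase
    import numpy as np
    from scipy.signal import coherence, csd

    fs = 1000.0                               # assumed sampling rate (Hz)
    t = np.arange(0, 60, 1.0 / fs)            # 60 s of synthetic data
    ecg = np.sin(2 * np.pi * 6.0 * t) + 0.3 * np.random.randn(t.size)
    bp = np.sin(2 * np.pi * 6.0 * t + 0.8) + 0.3 * np.random.randn(t.size)

    f, Cxy = coherence(ecg, bp, fs=fs, nperseg=4096)   # coherence spectrum
    _, Pxy = csd(ecg, bp, fs=fs, nperseg=4096)         # cross-spectral density
    phase = np.angle(Pxy)                              # phase spectrum (rad)
    k = np.argmax(Cxy)
    print("peak coherence at %.2f Hz, phase %.2f rad" % (f[k], phase[k]))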

  11. [Thinking on TCM literature evaluation methods and techniques based on mass information].

    PubMed

    Xie, Qi; Cui, Meng; Pan, Yan-li

    2007-08-01

    The necessity and feasibility of TCM literature evaluation based on mass TCM literature information are discussed in this paper. Beginning with a description of the current situation of mass TCM literature information research, the authors offer a tentative plan, together with methods and techniques, for evaluating scientific and technological TCM literature, and systematically analyze the key issues, such as subject selection, document screening and sorting, literature analysis, and development of a software analysis platform. The methodology and technology for constituting a mass-information-based TCM literature evaluation system are then systematically clarified.

  12. New method for speciation analysis of aluminium fluoride complexes by HPLC-FAAS hyphenated technique.

    PubMed

    Frankowski, M; Zioła-Frankowska, A; Siepak, J

    2010-03-15

    Speciation analysis of aluminium with the presented HPLC-FAAS hyphenated system takes 4 min. Use of a bifunctional column in the model analysis, together with modelling calculations in the Mineql program, led the authors to expect that the particular forms would elute in the following order: (1) AlF(2)(+) and AlF(4)(-), (2) AlF(2+) and AlF(3)(0), and (3) Al(3+). Based on the results obtained for model solutions, the presented method enables the determination of aluminium fluoride complexes and the Al(3+) speciation form. The study compares the variability of occurrence of aluminium fluoride complexes and the Al(3+) form determined from the results obtained with the HPLC-FAAS hyphenated technique with the trend defined by the Mineql calculation method. The method was successfully applied to soil samples. PMID:20152461

  13. Large area photodetector based on microwave cavity perturbation techniques

    SciTech Connect

    Braggio, C. Carugno, G.; Sirugudu, R. K.; Lombardi, A.; Ruoso, G.

    2014-07-28

    We present a preliminary study to develop a large area photodetector based on a semiconductor crystal placed inside a superconducting resonant cavity. Laser pulses are detected through a variation of the cavity impedance, as a consequence of the conductivity change in the semiconductor. A novel method, whereby the designed photodetector is simulated by finite element analysis, makes it possible to perform pulse-height spectroscopy on the reflected microwave signals. We measure an energy sensitivity of 100 fJ in the average mode without the use of low-noise electronics and suggest possible ways to further reduce the single-shot detection threshold, based on the results of the described method.

  14. An interactive tutorial-based training technique for vertebral morphometry.

    PubMed

    Gardner, J C; von Ingersleben, G; Heyano, S L; Chesnut, C H

    2001-01-01

    The purpose of this work was to develop a computer-based procedure for training technologists in vertebral morphometry. The utility of the resulting interactive, tutorial based training method was evaluated in this study. The training program was composed of four steps: (1) review of an online tutorial, (2) review of analyzed spine images, (3) practice in fiducial point placement and (4) testing. During testing, vertebral heights were measured from digital, lateral spine images containing osteoporotic fractures. Inter-observer measurement precision was compared between research technicians, and between technologists and radiologist. The technologists participating in this study had no prior experience in vertebral morphometry. Following completion of the online training program, good inter-observer measurement precision was seen between technologists, showing mean coefficients of variation of 2.33% for anterior, 2.87% for central and 2.65% for posterior vertebral heights. Comparisons between the technicians and radiologist ranged from 2.19% to 3.18%. Slightly better precision values were seen with height measurements compared with height ratios, and with unfractured compared with fractured vertebral bodies. The findings of this study indicate that self-directed, tutorial-based training for spine image analyses is effective, resulting in good inter-observer measurement precision. The interactive tutorial-based approach provides standardized training methods and assures consistency of instructional technique over time.

  15. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  16. Analysis techniques for airborne laser range safety evaluations

    NASA Astrophysics Data System (ADS)

    Ramsburg, M. S.; Jenkins, D. L.; Doerflein, R. D.

    1982-08-01

    Techniques to evaluate the safety of airborne laser operations on the range are reported. The objectives of the safety evaluations were to (1) protect civilian and military personnel from the hazards associated with lasers, (2) provide users with the least restrictive constraints in which to perform their mission while still maintaining an adequate degree of safety, and (3) develop a data base for the Navy in the event of suspected laser exposure or other related incidents involving military or civilian personnel. A microcomputer code, written in ANSI 77 FORTRAN, has been developed which provides safe flight profiles for airborne laser systems. The output of this code can also be used in establishing operating areas for ground-based lasers. Input to the code includes the output parameters, NOHD, and assigned buffer zone for the laser system, as well as parameters describing the geometry of the range.

  17. Evaluations of mosquito age grading techniques based on morphological changes.

    PubMed

    Hugo, L E; Quick-Miles, S; Kay, B H; Ryan, P A

    2008-05-01

    Evaluations were made of the accuracy and practicality of mosquito age grading methods based on changes to mosquito morphology, including the Detinova ovarian tracheation, midgut meconium, Polovodova ovariole dilatation, ovarian injection, and daily growth line methods. Laboratory-maintained Aedes vigilax (Skuse) and Culex annulirostris (Skuse) females of known chronological and physiological ages were used for these assessments. Application of the Detinova technique to laboratory-reared Ae. vigilax females in a blinded trial enabled the successful identification of nulliparous and parous females in 83.7-89.8% of specimens. The success rate for identifying nulliparous females increased to 87.8-98.0% when observations of ovarian tracheation were combined with observations of the presence of midgut meconium. However, application of the Polovodova method enabled only 57.5% of nulliparous, 1-parous, 2-parous, and 3-parous Ae. vigilax females to be correctly classified, and ovarian injections were found to be unfeasible. Poor correlation was observed between the number of growth lines per phragma and the calendar age of laboratory-reared Ae. vigilax females. In summary, morphological age grading methods that offer simple two-category predictions (ovarian tracheation and midgut meconium methods) were found to provide high-accuracy classifications, whereas methods that offer the separation of multiple age categories (ovariolar dilatation and growth line methods) were found to be extremely difficult and of low accuracy. The usefulness of the morphology-based methods is discussed in view of the availability of new mosquito age grading techniques based on cuticular hydrocarbon and gene transcription changes. PMID:18533427

  18. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    PubMed

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator", developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program when analyzing nanoparticles, and at the same time we compare it to more conventional nanoparticle analysis techniques. The techniques on which we concentrate here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved to be a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from the inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information of the particles, than the manual technique. However, particle shapes that are very different from spherical proved to be problematic also for the novel program. When compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the average data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample.
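    A common way to address the agglomerate problem mentioned above is a distance-transform plus watershed step after thresholding. The sketch below shows that generic recipe on a synthetic pair of touching discs; it is not the Nanoannotator algorithm, and the image, parameters and library choice (scikit-image) are assumptions for illustration.

    # Minimal sketch (Python): separating touching particles with watershed
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    # Synthetic binary image: two overlapping discs standing in for agglomerated particles
    yy, xx = np.ogrid[:100, :100]
    binary = ((yy - 40) ** 2 + (xx - 40) ** 2 < 15 ** 2) | ((yy - 60) ** 2 + (xx - 60) ** 2 < 15 ** 2)

    distance = ndi.distance_transform_edt(binary)          # distance to background
    coords = peak_local_max(distance, labels=binary, min_distance=10)
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    labels = watershed(-distance, markers, mask=binary)    # one label per particle
    print("separated regions:", labels.max())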

  19. Geospatial Products and Techniques at the Center for Transportation Analysis

    SciTech Connect

    Chin, Shih-Miao; Hwang, Ho-Ling; Peterson, Bruce E

    2008-01-01

    This paper highlights geospatial science-related innovations and developments conducted by the Center for Transportation Analysis (CTA) at the Oak Ridge National Laboratory. CTA researchers have been developing integrated inter-modal transportation solutions through innovative and cost-effective research and development for many years. Specifically, this paper profiles CTA-developed Geographic Information System (GIS) products that are publicly available. Examples of these GIS-related products include: the CTA Transportation Networks; GeoFreight system; and the web-based Multi-Modal Routing Analysis System. In addition, an application on assessment of railroad Hazmat routing alternatives is also discussed.

  20. Plasma and trap-based techniques for science with positrons

    NASA Astrophysics Data System (ADS)

    Danielson, J. R.; Dubin, D. H. E.; Greaves, R. G.; Surko, C. M.

    2015-01-01

    In recent years, there has been a wealth of new science involving low-energy antimatter (i.e., positrons and antiprotons) at energies ranging from 10² to less than 10⁻³ eV. Much of this progress has been driven by the development of new plasma-based techniques to accumulate, manipulate, and deliver antiparticles for specific applications. This article focuses on the advances made in this area using positrons. However, many of the resulting techniques are relevant to antiprotons as well. An overview is presented of relevant theory of single-component plasmas in electromagnetic traps. Methods are described to produce intense sources of positrons and to efficiently slow the typically energetic particles thus produced. Techniques are described to trap positrons efficiently and to cool and compress the resulting positron gases and plasmas. Finally, the procedures developed to deliver tailored pulses and beams (e.g., in intense, short bursts, or as quasimonoenergetic continuous beams) for specific applications are reviewed. The status of development in specific application areas is also reviewed. One example is the formation of antihydrogen atoms for fundamental physics [e.g., tests of invariance under charge conjugation, parity inversion, and time reversal (the CPT theorem), and studies of the interaction of gravity with antimatter]. Other applications discussed include atomic and materials physics studies and the study of the electron-positron many-body system, including both classical electron-positron plasmas and the complementary quantum system in the form of Bose-condensed gases of positronium atoms. Areas of future promise are also discussed. The review concludes with a brief summary and a list of outstanding challenges.

  1. A novel pulse height analysis technique for nuclear spectroscopic and imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, H. H.; Wang, C. Y.; Chou, H. P.

    2005-08-01

    The proposed pulse height analysis technique is based on the constant and linear relationship between pulse width and pulse height generated from the front-end electronics of nuclear spectroscopic and imaging systems. The present technique has been successfully implemented in the sump water radiation monitoring system of a nuclear power plant. The radiation monitoring system uses a NaI(Tl) scintillator to detect radioactive nuclides of radon daughters brought down by rain. The technique is also used in a nuclear medical imaging system, which uses a position-sensitive photomultiplier tube coupled with a scintillator. The proposed technique has greatly simplified the electronic design and made the system feasible for portable applications.
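    The core idea above, recovering pulse height from a time-over-threshold (pulse-width) measurement through an assumed linear relationship, is simple enough to sketch directly. The calibration constants and widths below are illustrative assumptions, not values from the paper.

    # Minimal sketch (Python): linear pulse-width-to-height calibration
    def height_from_width(width_ns, gain=0.85, offset=-12.0):
        # gain and offset would come from calibrating against pulses of known height
        return gain * width_ns + offset

    # Example: widths measured by a simple comparator-plus-counter front end
    for w in (40.0, 120.0, 300.0):
        print("%.0f ns -> estimated height %.1f (arb. units)" % (w, height_from_width(w)))

    Measuring a width needs little more than a comparator and a counter, which is presumably what lets the approach simplify the electronics enough for portable systems.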

  2. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Postflight test analysis data processing techniques for shuttle approach and landing test (ALT) navigation data are defined. Postflight test processor requirements are described, along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.

  3. Detecting Molecular Properties by Various Laser-Based Techniques

    SciTech Connect

    Hsin, Tse-Ming

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from the cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of the dimer geometry. Direct observation of vibrational peaks and their evolution for coumarin 153 in the electronic excited state was demonstrated by using fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibration peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with biomimetic containers (liposomes), allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interference from glass surfaces. The result showed a wide distribution of the enzyme reactivity. Protein structural variation is one of the major reasons responsible for this highly heterogeneous behavior.

  4. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    PubMed Central

    Wang, Chuji; Sahay, Peeyush

    2009-01-01

    Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high-sensitivity and high-selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc. are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503

  5. Electron Microprobe Analysis Techniques for Accurate Measurements of Apatite

    NASA Astrophysics Data System (ADS)

    Goldoff, B. A.; Webster, J. D.; Harlov, D. E.

    2010-12-01

    Apatite [Ca5(PO4)3(F, Cl, OH)] is a ubiquitous accessory mineral in igneous, metamorphic, and sedimentary rocks. The mineral contains halogens and hydroxyl ions, which can provide important constraints on fugacities of volatile components in fluids and other phases in igneous and metamorphic environments in which apatite has equilibrated. Accurate measurements of these components in apatite are therefore necessary. Analyzing apatite by electron microprobe (EMPA), which is a commonly used geochemical analytical technique, has often been found to be problematic and previous studies have identified sources of error. For example, Stormer et al. (1993) demonstrated that the orientation of an apatite grain relative to the incident electron beam could significantly affect the concentration results. In this study, a variety of alternative EMPA operating conditions for apatite analysis were investigated: a range of electron beam settings, count times, crystal grain orientations, and calibration standards were tested. Twenty synthetic anhydrous apatite samples that span the fluorapatite-chlorapatite solid solution series, and whose halogen concentrations were determined by wet chemistry, were analyzed. Accurate measurements of these samples were obtained with many EMPA techniques. One effective method includes setting a static electron beam to 10-15nA, 15kV, and 10 microns in diameter. Additionally, the apatite sample is oriented with the crystal’s c-axis parallel to the slide surface and the count times are moderate. Importantly, the F and Cl EMPA concentrations are in extremely good agreement with the wet-chemical data. We also present EMPA operating conditions and techniques that are problematic and should be avoided. J.C. Stormer, Jr. et al., Am. Mineral. 78 (1993) 641-648.

  6. Novel technique for coal pyrolysis and hydrogenation product analysis

    SciTech Connect

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general above that size uncertainties in rate constant and thermodynamic data do not allow products from mixed hydrocarbon pyrolyses to be predicted a priori using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  7. Transit Spectroscopy: new data analysis techniques and interpretation

    NASA Astrophysics Data System (ADS)

    Tinetti, Giovanna; Waldmann, Ingo P.; Morello, Giuseppe; Tessenyi, Marcell; Varley, Ryan; Barton, Emma; Yurchenko, Sergey; Tennyson, Jonathan; Hollis, Morgan

    2014-11-01

    Planetary science beyond the boundaries of our Solar System is today in its infancy. Until a couple of decades ago, the detailed investigation of the planetary properties was restricted to objects orbiting inside the Kuiper Belt. Today, we cannot ignore that the number of known planets has increased by two orders of magnitude nor that these planets resemble anything but the objects present in our own Solar System. A key observable for planets is the chemical composition and state of their atmosphere. To date, two methods can be used to sound exoplanetary atmospheres: transit and eclipse spectroscopy, and direct imaging spectroscopy. Although the field of exoplanet spectroscopy has been very successful in past years, there are a few serious hurdles that need to be overcome to progress in this area: in particular instrument systematics are often difficult to disentangle from the signal, data are sparse and often not recorded simultaneously causing degeneracy of interpretation. We will present here new data analysis techniques and interpretation developed by the “ExoLights” team at UCL to address the above-mentioned issues. Said techniques include statistical tools, non-parametric, machine-learning algorithms, optimized radiative transfer models and spectroscopic line-lists. These new tools have been successfully applied to existing data recorded with space and ground instruments, shedding new light on our knowledge and understanding of these alien worlds.

  8. New technique for structural analysis of low-relief basins

    SciTech Connect

    Berger, Z.

    1986-05-01

    A new technique for structural analysis of low-relief basins integrates Landsat data with other geologic data sets such as gravity, magnetic, subsurface, and production data. Five analytical steps are recommended, and examples are supported by surface and subsurface controls. These steps are: (1) analyzing exposed structures that form the basin margin; (2) recognizing structural trends within the basin; (3) recognizing buried and obscured structures within the basin; (4) constructing an exploration model; and (5) generating new leads for the entire region. Examples cited are from various low-relief basins such as the Powder River, and the Central Basin platform of west Texas. Surface expressions of buried and obscured structures are attributed to differential compaction, loading, structural reactivation, and other processes related to abnormal flows of ground and surface waters near the structures. These well-recognized processes occur under various climatic and surface conditions. Landsat data can be used in low-relief frontier areas as a reconnaissance tool to identify regional trends, structural types, and potentially prospective structures. These data can also be used in low-relief mature areas to locate subtle structures not identified by other exploration techniques.

  9. Techniques for the analysis of Cl- ion in TMAH

    NASA Astrophysics Data System (ADS)

    Ho, Bang-Chein; Chang, Mong-Ling; Lin, Yu-Ping

    1998-06-01

    Fabricating integrated circuits with increasingly smaller elements mandates that the chloride ion content in 2.38% TMAH be under the 50 ppb level. Previously, interference from TMAH inhibited the analysis of the chloride ion in TMAH by ion chromatography. In this study, we present a novel technique to eliminate this interference. By using the acid-type cation exchange resin Amberite IR-120, the interference from TMA+ and OH- is successfully eliminated: the TMA+ cation is retained in the exchange resin and the OH- from TMAH is neutralized by the H+ released from the resin. This relatively simple cation exchange preprocessing scheme not only eliminates the influence of TMAH, but also has the additional merit that the Cl- anion passes through the cation exchange resin intact. To further improve the detection limit, on-line preconcentration by ion chromatography has been coupled with the cation exchange preprocessing scheme; via this technique, a detection limit at the sub-1 ppb level is achieved with no need to add a standard.

  10. External sorting: I/O analysis and parallel processing techniques

    SciTech Connect

    Kwan, S.C.

    1986-01-01

    This thesis deals with the sorting of data sets that are much too large to fit in main memory, i.e., external sorting. The author focuses on two aspects of external sorting: I/O analysis and parallel processing techniques. Storage device models are defined and applied to analyze the I/O complexities of multi-way merge sort and tag sort (or key sort). It is shown that using a higher merge order, though it reduces the number of merge passes, causes excessive random I/O accesses and degrades the overall I/O performance of multi-way merge sort. Techniques are developed for producing long runs in merge sort and for rearranging the records in tag sort after their ranks are determined. A lower bound for the I/O access time for rearranging the records in tag sort is derived. Two methods are explored for implementing distribution sort on parallel computers. The first method, multi-pass distribution sort, determines the bucket ranges with one read pass over the input file, and uses subsequent passes to distribute the data into buckets and sort them. The distribution and sorting of the buckets are processed in parallel using a two-stage pipeline. The second method, one-pass distribution sort, coalesces the bucket partition, bucket distribution, and sort-bucket phases all together so that the input file needs to be processed only once.
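    To make the merge-order trade-off above concrete, the sketch below shows the k-way merge phase of an external sort: pre-sorted run files are merged with a heap, so each pass reads every record once, and a higher k means fewer passes but more files touched concurrently (hence more random I/O on real devices). The file names and the line-per-record format are illustrative assumptions.

    # Minimal sketch (Python): heap-based k-way merge of pre-sorted run files
    import heapq

    def kway_merge(run_paths, out_path):
        # Each run file is assumed to hold one record per line, already sorted
        files = [open(p) for p in run_paths]
        try:
            with open(out_path, "w") as out:
                # heapq.merge streams the runs lazily, keeping only one line per run in memory
                for line in heapq.merge(*files):
                    out.write(line)
        finally:
            for f in files:
                f.close()

    # Example: kway_merge(["run0.txt", "run1.txt", "run2.txt"], "sorted.txt")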

  11. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques.

    PubMed

    Parkash, Om; Shueb, Rafidah Hanim

    2015-10-19

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed.

  12. Protein elasticity probed with two synchrotron-based techniques.

    SciTech Connect

    Leu, B. M.; Alatas, A.; Sinn, H.; Alp, E. E.; Said, A.; Yavas, H.; Zhao, J.; Sage, J. T.; Sturhahn, W.; X-Ray Science Division; Hasylab; Northeastern Univ.

    2010-02-25

    Compressibility characterizes three interconnected properties of a protein: dynamics, structure, and function. The compressibility values available in the literature for the electron-carrying protein cytochrome c, as well as for other proteins, vary considerably. Here, we apply two synchrotron-based techniques - nuclear resonance vibrational spectroscopy and inelastic x-ray scattering - to measure the adiabatic compressibility of this protein. This is the first report of the compressibility of any material measured with this method. Unlike previously used methods, this novel approach probes the protein globally, at ambient pressure, does not require the separation of protein and solvent contributions to the total compressibility, and uses samples that contain the heme iron, as in the native state. We show, by comparing our results with molecular dynamics predictions, that the compressibility is almost independent of temperature. We discuss potential applications of this method to other materials beyond proteins.

  13. Validation techniques for fault emulation of SRAM-based FPGAs

    DOE PAGES

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation system does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  14. Validation techniques for fault emulation of SRAM-based FPGAs

    SciTech Connect

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation system does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  15. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    PubMed Central

    Parkash, Om; Hanim Shueb, Rafidah

    2015-01-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  16. Mars laser altimeter based on a single photon ranging technique

    NASA Technical Reports Server (NTRS)

    Prochazka, Ivan; Hamal, Karel; Sopko, B.; Pershin, S.

    1993-01-01

    The Mars 94/96 Mission will carry, among other things, the balloon probe experiment. The balloon, with the scientific cargo in the gondola underneath, will drift in the Mars atmosphere; its altitude will range from zero at night up to 5 km at noon. The gondola altitude will be accurately determined by an altimeter. As the balloon gondola mass is strictly limited, the altimeter's total mass and power consumption are critical; the maximum allowed is a few hundred grams and a few tens of milliwatts of average power consumption. We proposed, designed, and constructed a laser altimeter based on the single photon ranging technique. Topics covered include the following: principle of operation, altimeter construction, and ground tests.

  17. Radial Velocity Data Analysis with Compressed Sensing Techniques

    NASA Astrophysics Data System (ADS)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analyses, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
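    The idea of searching all periodicities at once can be illustrated with a generic sparse-recovery fit over a dictionary of sinusoids, which is the flavour of compressed sensing invoked above. The sketch below is not the authors' algorithm (in particular it ignores their Gaussian-process noise treatment); the data, the period grid, and the l1 solver choice (scikit-learn's Lasso) are assumptions.

    # Minimal sketch (Python): l1-penalised fit of radial velocities on a sinusoid dictionary
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 1000, 200))                 # irregular epochs (days)
    rv = 3.0 * np.sin(2 * np.pi * t / 61.0) + 0.5 * rng.standard_normal(t.size)

    periods = np.linspace(2.0, 200.0, 2000)                # trial periods (days)
    omega = 2 * np.pi / periods
    A = np.hstack([np.cos(np.outer(t, omega)), np.sin(np.outer(t, omega))])

    model = Lasso(alpha=0.05, max_iter=20000).fit(A, rv)   # sparse amplitudes
    power = np.hypot(model.coef_[:periods.size], model.coef_[periods.size:])
    print("strongest trial period: %.1f days" % periods[np.argmax(power)])

    Unlike a periodogram evaluated one frequency at a time, the sparse fit lets strong signals explain the data jointly, which is why such spectra show far fewer alias peaks.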

  18. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
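    The rule-to-network mapping mentioned above can be pictured with a toy example: a consensus-sequence rule becomes the initial weights and bias of a single unit, which gradient training can later refine. This is only an illustration in the spirit of that mapping, not the published KBANN code; the consensus string, threshold and one-hot encoding are assumptions.

    # Minimal sketch (Python): encoding a consensus-sequence rule as an initial neural unit
    import numpy as np

    BASES = "ACGT"

    def one_hot(seq):
        v = np.zeros(len(seq) * 4)
        for i, b in enumerate(seq):
            v[i * 4 + BASES.index(b)] = 1.0
        return v

    def rule_to_unit(consensus, required_matches):
        # +1 weight votes for each consensus base; the bias makes the unit fire
        # only when at least `required_matches` positions agree with the rule
        return one_hot(consensus), -(required_matches - 0.5)

    w, b = rule_to_unit("TATAAT", 5)
    x = one_hot("TATGAT")                       # 5 of 6 positions match
    print("unit fires:", bool(np.dot(w, x) + b > 0))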

  19. Static gas analysis by a transient flow technique

    SciTech Connect

    Leckey, J.H.; Boeckmann, M.D.

    1988-07-01

    A technique is presented for using a residual gas analyzer (RGA) to analyze small concentrations of heavy gases in lighter gases in a static volume of <10 cm³. Passing the gas sample through a control valve causes it to enter the RGA chamber in molecular flow. This procedure results in fractionation that causes enrichment of the heavier gas during the evacuation of the sample, giving rise to a significantly higher heavy gas signal near the end of the evacuation, while maintaining low pressures in the RGA chamber that are required for linearity. This heavy gas enrichment near the end of the evacuation results in a significant reduction in its detection limit. Specific examples are presented for the analysis of argon in hydrogen and are compared to a gas-flow model of the system.

  20. Trial application of a technique for human error analysis (ATHEANA)

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Parry, G.W.

    1996-10-01

    The new method for HRA, ATHEANA, has been developed based on a study of the operating history of serious accidents and an understanding of the reasons why people make errors. Previous publications associated with the project have dealt with the theoretical framework under which errors occur and the retrospective analysis of operational events. This is the first attempt to use ATHEANA in a prospective way, to select and evaluate human errors within the PSA context.

  1. Adolescent baseball pitching technique: lower extremity biomechanical analysis.

    PubMed

    Milewski, Matthew D; Õunpuu, Sylvia; Solomito, Matthew; Westwell, Melany; Nissen, Carl W

    2012-11-01

    Documentation of the lower extremity motion patterns of adolescent pitchers is an important part of understanding the pitching motion and the implication of lower extremity technique on upper extremity loads, injury and performance. The purpose of this study was to take the initial step in this process by documenting the biomechanics of the lower extremities during the pitching cycle in adolescent pitchers and to compare these findings with the published data for older pitchers. Three-dimensional motion analysis using a comprehensive lower extremity model was used to evaluate the fast ball pitch technique in adolescent pitchers. Thirty-two pitchers with a mean age of 12.4 years (range 10.5-14.7 years) and at least 2 years of experience were included in this study. The pitchers showed a mean of 49 ± 12° of knee flexion of the lead leg at foot contact. They tended to maintain this position through ball release, and then extended their knee during the follow through phase (ball release to maximal internal glenohumeral rotation). The lead leg hip rapidly progressed into adduction and flexion during the arm cocking phase with a range of motion of 40 ± 10° adduction and 30 ± 13° flexion. The lead hip mean peak adduction velocity was 434 ± 83°/s and flexion velocity was 456 ± 156°/s. Simultaneously, the trailing leg hip rapidly extended approaching to a mean peak extension of -8 ± 5° at 39% of the pitch cycle, which is close to passive range of motion constraints. Peak hip abduction of the trailing leg at foot contact was -31 ± 12°, which also approached passive range of motion constraints. Differences and similarities were also noted between the adolescent lower extremity kinematics and adult pitchers; however, a more comprehensive analysis using similar methods is needed for a complete comparison. PMID:22660979

  3. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication
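    The spectrographic measurements described above (waveforms, wideband spectrograms, and energy over a 0-8000 Hz range) can be reproduced in outline with standard signal-processing tools. The sketch below computes a short-window (wideband) spectrogram of a synthetic chewing burst; the sampling rate, window length and signal are illustrative assumptions, not the study's recordings.

    # Minimal sketch (Python): wideband spectrogram of a transient chewing-like sound
    import numpy as np
    from scipy.signal import spectrogram

    fs = 16000                                   # assumed sampling rate (Hz)
    t = np.arange(0, 1.0, 1.0 / fs)
    chew = np.random.randn(t.size) * np.exp(-((t - 0.3) / 0.05) ** 2)   # synthetic burst

    # A short analysis window gives the wideband (good time resolution) view
    f, seg_t, Sxx = spectrogram(chew, fs=fs, nperseg=128, noverlap=64)
    print("dominant band: %.0f Hz, burst near t = %.2f s"
          % (f[np.argmax(Sxx.mean(axis=1))], seg_t[np.argmax(Sxx.mean(axis=0))]))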

  4. Advanced spatio-temporal filtering techniques for photogrammetric image sequence analysis in civil engineering material testing

    NASA Astrophysics Data System (ADS)

    Liebold, F.; Maas, H.-G.

    2016-01-01

    The paper shows advanced spatial, temporal and spatio-temporal filtering techniques which may be used to reduce noise effects in photogrammetric image sequence analysis tasks and tools. As a practical example, the techniques are validated in a photogrammetric spatio-temporal crack detection and analysis tool applied in load tests in civil engineering material testing. The load test technique is based on monocular image sequences of a test object under varying load conditions. The first image of a sequence is defined as a reference image under zero load, wherein interest points are determined and connected in a triangular irregular network structure. For each epoch, these triangles are compared to the reference image triangles to search for deformations. The result of the feature point tracking and triangle comparison process is a spatio-temporally resolved strain value field, wherein cracks can be detected, located and measured via local discrepancies. The strains can be visualized as a color-coded map. In order to improve the measuring system and to reduce noise, the strain values of each triangle must be treated in a filtering process. The paper shows the results of various filter techniques in the spatial and in the temporal domain as well as spatio-temporal filtering techniques applied to these data. The best results were obtained by a bilateral filter in the spatial domain and by a spatio-temporal EOF (empirical orthogonal function) filtering technique.
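    Of the spatial filters compared above, the bilateral filter is the one reported to give the best results, and its principle is easy to sketch: each value is replaced by an average of its neighbours weighted by both spatial distance and strain similarity, so genuine strain jumps at cracks are preserved while noise is smoothed. The implementation below operates on a strain field rasterised to a regular grid; the kernel widths and the synthetic data are illustrative assumptions, not the authors' parameters.

    # Minimal sketch (Python): bilateral filtering of a 2-D strain map
    import numpy as np

    def bilateral_filter(img, sigma_s=2.0, sigma_r=0.02, radius=4):
        out = np.zeros_like(img)
        ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))   # distance weight
        padded = np.pad(img, radius, mode="reflect")
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                range_w = np.exp(-(patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))  # similarity weight
                w = spatial * range_w
                out[i, j] = np.sum(w * patch) / np.sum(w)
        return out

    strain = 0.005 * np.random.rand(64, 64)      # noisy background strain
    strain[:, 32:] += 0.05                       # step mimicking a crack edge
    smoothed = bilateral_filter(strain)          # noise reduced, step preserved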

  5. Shape-based segmentation and visualization techniques for evaluation of atherosclerotic plaques in coronary artery disease

    NASA Astrophysics Data System (ADS)

    Rinck, Daniel; Krüger, Sebastian; Reimann, Anja; Scheuering, Michael

    2006-03-01

    Multi-slice computed tomography (MSCT) has developed strongly in the emerging field of cardiovascular imaging. The manual analysis of atherosclerotic plaques in coronary arteries is a very time-consuming and labor-intensive process, and today only qualitative analysis is possible. In this paper we present a new shape-based segmentation and visualization technique for quantitative analysis of atherosclerotic plaques in coronary artery disease. The new technique takes into account several aspects of the vascular anatomy. It uses two surface representations, one for the contrast-filled vessel lumen and one for the vascular wall; the deviation between these two surfaces is defined as the plaque volume. These surface representations can be edited manually by the user. With this kind of representation it is possible to calculate sub-plaque volumes (such as the lipid-rich core, fibrous tissue, and calcified tissue) inside the suspicious area. A high-quality 3D visualization using Open Inventor is also possible.

  6. A borax fusion technique for quantitative X-ray fluorescence analysis.

    PubMed

    Van Willigen, J H; Kruidhof, H; Dahmen, E A

    1971-04-01

    A borax fusion technique to cast glass discs for quantitative X-ray analysis is described in detail. The method is based on the "nonwetting" properties of a Pt/Au alloy towards molten borax, on the favourable composition of the flux and finally on the favourable form of the casting mould. The critical points of the technique are stressed, resulting in a method which could be carried out successfully by inexperienced workers. In general the method compares favourably in speed and accuracy with wet-chemical methods.

  7. Novel failure analysis techniques using photon probing with a scanning optical microscope

    SciTech Connect

    Cole, E.I. Jr.; Soden, J.M.; Rife, J.L.; Barton, D.L.; Henderson, C.L.

    1993-12-31

    Three new failure analysis techniques for integrated circuits (ICs) have been developed using localized photon probing with a scanning optical microscope (SOM). The first two are light-induced voltage alteration (LIVA) imaging techniques that (1) localize open-circuited and damaged junctions and (2) image transistor logic states. The third technique uses the SOM to control logic states optically from the IC backside. LIVA images are produced by monitoring the voltage fluctuations of a constant current power supply as a laser beam is scanned over the IC. High selectivity for localizing defects has been demonstrated using the LIVA approach. Logic state mapping results, similar to previous work using biased optical beam induced current (OBIC) and laser probing approaches, have also been produced using LIVA. Application of the two LIVA-based techniques to backside failure analysis has been demonstrated using an infrared laser source. Optical logic state control is based upon earlier work examining transistor response to photon injection. The physics of each method and their applications for failure analysis are described.

  8. Skull base tumours part I: imaging technique, anatomy and anterior skull base tumours.

    PubMed

    Borges, Alexandra

    2008-06-01

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have contributed largely to improving the prognosis and lessening the morbidity and mortality of patients with skull base tumours, and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging has become indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology, and the radiologist is increasingly responsible for the fate of these patients. This review will focus on advances in imaging technique, their contribution to patient management, and the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division that takes into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient management will be discussed.

  9. Thermoeconomic analysis of a CHP system by iterative numerical techniques

    SciTech Connect

    Damshala, P.R.

    2000-07-01

    This paper deals with the determination of the thermoeconomic optimum conditions for a constant space heat load imposed on the air coil of a combined heating and power (CHP) system using iterative numerical techniques. From the thermodynamic relations and equations derived from the energy balance and heat exchanger characteristics, an objective function and constraining equations are obtained. A computer program is developed based on the Redlich-Kwong equation of state to estimate the thermodynamic properties of the refrigerant fluid R-22. Additional computer subroutines are developed to perform thermodynamic and thermoeconomic optimization. Optimum values of the operating variables are identified at thermodynamic and thermoeconomic optimum conditions. Results show that the total irreversibilities produced in the system and the cost of fuel consumption are minimum at thermodynamic optimum conditions, but the annual cost of owning and operating the system is minimum at the thermoeconomic optimum condition, which is 34% lower than at the thermodynamic optimum condition.

  10. Efficient geometric rectification techniques for spectral analysis algorithm

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near-real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented on iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.

  11. Sensitivity-analysis techniques: self-teaching curriculum

    SciTech Connect

    Iman, R.L.; Conover, W.J.

    1982-06-01

    This self teaching curriculum on sensitivity analysis techniques consists of three parts: (1) Use of the Latin Hypercube Sampling Program (Iman, Davenport and Ziegler, Latin Hypercube Sampling (Program User's Guide), SAND79-1473, January 1980); (2) Use of the Stepwise Regression Program (Iman, et al., Stepwise Regression with PRESS and Rank Regression (Program User's Guide) SAND79-1472, January 1980); and (3) Application of the procedures to sensitivity and uncertainty analyses of the groundwater transport model MWFT/DVM (Campbell, Iman and Reeves, Risk Methodology for Geologic Disposal of Radioactive Waste - Transport Model Sensitivity Analysis; SAND80-0644, NUREG/CR-1377, June 1980: Campbell, Longsine, and Reeves, The Distributed Velocity Method of Solving the Convective-Dispersion Equation, SAND80-0717, NUREG/CR-1376, July 1980). This curriculum is one in a series developed by Sandia National Laboratories for transfer of the capability to use the technology developed under the NRC funded High Level Waste Methodology Development Program.

  12. Ares Launch Vehicle Transonic Buffet Testing and Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Piatak, David J.; Sekula, Martin K.; Rausch, Russ D.

    2010-01-01

    It is necessary to define the launch vehicle buffet loads to ensure that structural components and vehicle subsystems possess adequate strength, stress, and fatigue margins when the vehicle structural dynamic response to buffet forcing functions is considered. In order to obtain these forcing functions, the accepted method is to perform wind-tunnel testing of a rigid model instrumented with hundreds of unsteady pressure transducers designed to measure the buffet environment across the desired frequency range. The buffet wind-tunnel test program for the Ares Crew Launch Vehicle employed 3.5 percent scale rigid models of the Ares I and Ares I-X launch vehicles, each instrumented with 256 unsteady pressure transducers. These models were tested at transonic conditions in the Transonic Dynamics Tunnel at NASA Langley Research Center. The ultimate deliverables of the Ares buffet test program are buffet forcing functions (BFFs) derived from integrating the measured fluctuating pressures on the rigid wind-tunnel models. These BFFs are then used as input to a multi-mode structural analysis to determine the vehicle response to buffet and the resulting buffet loads and accelerations. This paper discusses the development of the Ares I and I-X rigid buffet model test programs from the standpoint of model design, instrumentation system design, test implementation, and the data analysis techniques used to yield the final products, and presents normalized sectional buffet forcing function root-mean-squared levels.

  13. Spatiotemporal analysis of olive flowering using geostatistical techniques.

    PubMed

    Rojo, Jesús; Pérez-Badia, Rosa

    2015-02-01

    Analysis of flowering patterns in the olive (Olea europaea L.) is of considerable agricultural and ecological interest, and also provides valuable information for allergy sufferers, enabling identification of the major sources of airborne pollen at any given moment by interpreting the aerobiological data recorded in pollen traps. The present spatiotemporal analysis of olive flowering in central Spain combined geostatistical techniques with the application of a Geographic Information System, and compared results for flowering intensity with airborne pollen records. The results were used to obtain continuous phenological maps which captured the pattern of succession of olive flowering. The results also show that, although the highest airborne olive-pollen counts were recorded during the greatest flowering intensity of the groves closest to the pollen trap, the counts recorded at the start of the pollen season were not linked to local olive groves, which had not yet begun to flower. To detect the remote sources of olive pollen, several episodes of pollen recorded before the local flowering season were analysed using a HYSPLIT trajectory model, and the findings showed that western, southern and southwestern winds transported pollen grains into the study area from earlier-flowering groves located outside the territory.

  14. Parameter tuning of PVD process based on artificial intelligence technique

    NASA Astrophysics Data System (ADS)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Due to its previous adaptation to similar optimization problems, a genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from the GA's optimization result is expected to produce the desirable zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations. The results from the computational experiment were then compared with the actual results from the laboratory experiment. Based on the comparison, GA was shown to be reliable for optimizing the parameter combination before the parameter tuning is carried out on the RF magnetron sputtering machine. In order to verify the result of GA, the algorithm was also compared with other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that GA was reliable in solving this RF magnetron sputtering process parameter tuning problem, and GA showed better accuracy in the optimization based on the fitness evaluation.
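
    As a concrete illustration of the kind of optimization loop described above, the sketch below evolves a population of (RF power, deposition time, substrate temperature) candidates with selection, crossover and mutation. The parameter ranges, the surrogate fitness function and the GA settings are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of GA-based parameter tuning for a sputtering process.
# Ranges, fitness surrogate and GA settings are illustrative assumptions.
import random

BOUNDS = {"rf_power_W": (150, 400),
          "time_min": (30, 120),
          "substrate_T_C": (25, 300)}

def fitness(ind):
    # Placeholder surrogate for thin-film quality; in practice this would
    # come from experimental data or a trained process model.
    p, t, T = ind
    return -((p - 300) ** 2 / 1e4 + (t - 90) ** 2 / 1e3 + (T - 200) ** 2 / 1e4)

def random_individual():
    return [random.uniform(*BOUNDS[k]) for k in BOUNDS]

def crossover(a, b):
    # Uniform crossover: pick each gene from one of the two parents
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.1):
    return [random.uniform(*BOUNDS[k]) if random.random() < rate else v
            for v, k in zip(ind, BOUNDS)]

def ga(pop_size=25, generations=50):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 5]                 # keep the best 20%
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

print(ga())   # best (RF power, time, temperature) found by the sketch
```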

  15. Remaining Creep Life Assessment Techniques Based on Creep Cavitation Modeling

    NASA Astrophysics Data System (ADS)

    Ankit, Kumar

    2009-05-01

    The boiler and its components are built with an assumed nominal design life of about two to three decades of operation (one to two hundred thousand hours). These units are generally replaced, or their life is extended, at the end of this period. Under normal operating conditions, after the initial period of teething troubles, the reliability of these units remains fairly constant up to about two decades of normal operation. The failure rate then increases as a result of time-dependent material damage. Further running of these units may become uneconomical, and dangerous in some cases. In the following article, a step-by-step methodology to quantify creep cavitation based on statistical probability analysis and continuum damage mechanics is described. The concepts of creep cavity nucleation are also discussed, with special emphasis on the need for development of a model based on creep cavity growth kinetics.

  16. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. A key challenge of ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  17. Integration of geological remote-sensing techniques in subsurface analysis

    USGS Publications Warehouse

    Taranik, James V.; Trautwein, Charles M.

    1976-01-01

    Geological remote sensing is defined as the study of the Earth utilizing electromagnetic radiation which is either reflected or emitted from its surface in wavelengths ranging from 0.3 micrometre to 3 metres. The natural surface of the Earth is composed of a diversified combination of surface cover types, and geologists must understand the characteristics of surface cover types to successfully evaluate remotely-sensed data. In some areas landscape surface cover changes throughout the year, and analysis of imagery acquired at different times of year can yield additional geological information. Integration of different scales of analysis allows landscape features to be effectively interpreted. Interpretation of the static elements displayed on imagery is referred to as an image interpretation. Image interpretation is dependent upon: (1) the geologist's understanding of the fundamental aspects of image formation, and (2) his ability to detect, delineate, and classify image radiometric data; recognize radiometric patterns; and identify landscape surface characteristics as expressed on imagery. A geologic interpretation integrates surface characteristics of the landscape with subsurface geologic relationships. Development of a geologic interpretation from imagery is dependent upon: (1) the geologist's ability to interpret geomorphic processes from their static surface expression as landscape characteristics on imagery, (2) his ability to conceptualize the dynamic processes responsible for the evolution of interpreted geologic relationships (his ability to develop geologic models). The integration of geologic remote-sensing techniques in subsurface analysis is illustrated by development of an exploration model for ground water in the Tucson area of Arizona, and by the development of an exploration model for mineralization in southwest Idaho.

  18. Dynamic digital watermark technique based on neural network

    NASA Astrophysics Data System (ADS)

    Gu, Tao; Li, Xu

    2008-04-01

    An algorithm for dynamic watermarking based on a neural network is presented, which is more robust against false-authentication attacks and watermark-tampering operations than single-watermark embedding methods. (1) Five binary images used as watermarks are coded into a binary array. The total number of 0s and 1s is 5*N, where every 0 or 1 is enlarged fivefold by an information-enlarging technique and N is the original total number of the watermarks' binary bits. (2) The seed image pixel p(x,y) and its 3×3 neighbourhood pixels p(x-1,y-1), p(x-1,y), p(x-1,y+1), p(x,y-1), p(x,y+1), p(x+1,y-1), p(x+1,y), p(x+1,y+1) are chosen as one sample space. The value p(x,y) is used as the neural network target and the other eight pixel values are used as the neural network inputs. (3) To make the neural network learn the sample space, 5*N pixel values and their closely related pixel values are randomly chosen, with a password, from a color BMP-format image and used to train the neural network. (4) A four-layer neural network is constructed to describe the nonlinear mapping between inputs and outputs. (5) One bit from the array is embedded by adjusting the polarity between a chosen pixel value and the output value of the model. (6) A randomizer generates a number to determine the count of watermarks for retrieval. The randomly selected watermarks can be retrieved using the restored neural network output value, the corresponding image pixel value, and the restoration function, without knowing the original image or watermarks (the restored coded-watermark bit is 1 if o(x,y) (restored) > p(x,y) (reconstructed), else the coded-watermark bit is 0). The retrieved watermarks differ each time they are extracted. The proposed technique can offer more watermarking proofs than a single-watermark embedding algorithm. Experimental results show that the proposed technique is very robust against some image processing operations and JPEG lossy compression. Therefore, the algorithm can be used to protect the copyright of an important image.
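
    Step (5) above embeds each bit by adjusting the polarity between a chosen pixel value and the model's prediction for that pixel. The sketch below illustrates that polarity idea in isolation, using a simple 3×3 neighbourhood-mean predictor in place of the trained four-layer neural network; the margin value, the pixel coordinates and the predictor itself are illustrative assumptions, not the paper's scheme.

```python
# Minimal sketch of polarity-based bit embedding/extraction (step 5 above).
# A neighbourhood-mean predictor stands in for the trained neural network;
# margin, pixel position and predictor are illustrative assumptions.
import numpy as np

def predict(img, x, y):
    # Predictor output for pixel (x, y) from its 3x3 neighbourhood (centre excluded)
    patch = img[x - 1:x + 2, y - 1:y + 2].astype(float)
    return (patch.sum() - patch[1, 1]) / 8.0

def embed_bit(img, x, y, bit, margin=3):
    out = img.copy().astype(float)
    o = predict(out, x, y)
    # Adjust the pixel so its polarity relative to the prediction encodes the bit
    out[x, y] = min(o + margin, 255) if bit == 1 else max(o - margin, 0)
    return out.astype(np.uint8)

def extract_bit(img, x, y):
    return 1 if img[x, y] > predict(img, x, y) else 0

img = (np.random.rand(16, 16) * 255).astype(np.uint8)
marked = embed_bit(img, 5, 7, 1)
print(extract_bit(marked, 5, 7))  # -> 1
```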

  19. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    PubMed

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. An array of linear polarizers is introduced to achieve linear truncation of the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction based on two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method against a known public key attack is also provided.

  20. Data analysis techniques: a tool for cumulative exposure assessment.

    PubMed

    Lalloué, Benoît; Monnez, Jean-Marie; Padilla, Cindy; Kihal, Wahida; Zmirou-Navier, Denis; Deguen, Séverine

    2015-01-01

    Everyone is subject to environmental exposures from various sources, with negative health impacts (air, water and soil contamination, noise, etc.) or with positive effects (e.g. green space). Studies considering such complex environmental settings in a global manner are rare. We propose to use statistical factor and cluster analyses to create a composite exposure index with a data-driven approach, with a view to assessing the environmental burden experienced by populations. We illustrate this approach in a large French metropolitan area. The study was carried out in the Great Lyon area (France, 1.2 M inhabitants) at the census Block Group (BG) scale. We used as environmental indicators ambient air NO2 annual concentrations, noise levels and proximity to green spaces, industrial plants, polluted sites and road traffic. They were synthesized using Multiple Factor Analysis (MFA), a data-driven technique without a priori modeling, followed by hierarchical clustering to create BG classes. The first components of the MFA explained, respectively, 30, 14, 11 and 9% of the total variance. Clustering into five classes grouped: (1) a particular type of large BGs without population; (2) BGs of green residential areas, with lower negative exposures than average; (3) BGs of residential areas near midtown; (4) BGs close to industries; and (5) midtown urban BGs, with higher negative exposures than average and fewer green spaces. Other numbers of classes were tested in order to assess a variety of clusterings. We present an approach using statistical factor and cluster analysis techniques, which seem to be overlooked for assessing cumulative exposure in complex environmental settings. Although it cannot be applied directly for risk or health effect assessment, the resulting index can help to identify hot spots of cumulative exposure, to prioritize urban policies or to compare the environmental burden across study areas in an epidemiological framework.
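
    The pipeline described above, dimension reduction of the environmental indicators followed by hierarchical clustering of block groups, can be sketched as follows. PCA is used here only as a stand-in for Multiple Factor Analysis, which is not available in scikit-learn, and the indicator names, synthetic data and component counts are illustrative assumptions.

```python
# Minimal sketch of a data-driven cumulative exposure index: dimension
# reduction followed by hierarchical clustering of spatial units.
# PCA stands in for the MFA used in the paper; data are synthetic.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
indicators = ["no2", "noise", "green_prox", "industry_prox",
              "polluted_site_prox", "traffic_prox"]
X = rng.normal(size=(500, len(indicators)))           # one row per block group

Xs = StandardScaler().fit_transform(X)
scores = PCA(n_components=4).fit_transform(Xs)        # first factorial axes

labels = AgglomerativeClustering(n_clusters=5, linkage="ward").fit_predict(scores)
for k in range(5):
    print(f"class {k}: {np.sum(labels == k)} block groups")
```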

  1. A dynamic mechanical analysis technique for porous media

    PubMed Central

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in

  2. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    SciTech Connect

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. We introduce an emerging interferometric technique for measuring motion across a two-dimensional image, which could be called a snapshot 2d-VISAR. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (~0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF related targets (polycrystalline diamond, Be), and in Si and Al.

  3. Omics integrating physical techniques: aged Piedmontese meat analysis.

    PubMed

    Lana, Alessandro; Longo, Valentina; Dalmasso, Alessandra; D'Alessandro, Angelo; Bottero, Maria Teresa; Zolla, Lello

    2015-04-01

    Piedmontese meat tenderness increases when the ageing period after slaughter is extended up to 44 days. Classical physical analyses only partially explain this evidence, so in order to discover the reasons for the potential beneficial effects of prolonged ageing, we performed omics analyses in the Longissimus thoracis muscle, examining the main biochemical changes through mass spectrometry-based metabolomics and proteomics. We observed a progressive decline in myofibrillar structural integrity (underpinning meat tenderness) and impaired energy metabolism. Markers of autophagic responses (e.g. serine and glutathione metabolism) and nitrogen metabolism (urea cycle intermediates) accumulated until the end of the assayed period. Key metabolites such as glutamate, a mediator of the appreciated umami taste of the meat, were found to accumulate steadily until day 44. Finally, statistical analyses revealed that glutamate, serine and arginine could serve as good predictors of ultimate meat quality parameters, even though further studies are mandatory.

  4. Subcellular chemical and morphological analysis by stimulated Raman scattering microscopy and image analysis techniques.

    PubMed

    D'Arco, Annalisa; Brancati, Nadia; Ferrara, Maria Antonietta; Indolfi, Maurizio; Frucci, Maria; Sirleto, Luigi

    2016-05-01

    The visualization of heterogeneous morphology and the segmentation and quantification of image features are crucial points for nonlinear optics microscopy applications, spanning from imaging of living cells or tissues to biomedical diagnostics. In this paper, a methodology combining stimulated Raman scattering microscopy and image analysis techniques is presented. The basic idea is to join the potential of the vibrational contrast of stimulated Raman scattering and the strength of image analysis techniques in order to delineate subcellular morphology with chemical specificity. Validation tests on label-free imaging of polystyrene beads and of adipocyte cells are reported and discussed. PMID:27231626

  5. Subcellular chemical and morphological analysis by stimulated Raman scattering microscopy and image analysis techniques

    PubMed Central

    D’Arco, Annalisa; Brancati, Nadia; Ferrara, Maria Antonietta; Indolfi, Maurizio; Frucci, Maria; Sirleto, Luigi

    2016-01-01

    The visualization of heterogeneous morphology and the segmentation and quantification of image features are crucial points for nonlinear optics microscopy applications, spanning from imaging of living cells or tissues to biomedical diagnostics. In this paper, a methodology combining stimulated Raman scattering microscopy and image analysis techniques is presented. The basic idea is to join the potential of the vibrational contrast of stimulated Raman scattering and the strength of image analysis techniques in order to delineate subcellular morphology with chemical specificity. Validation tests on label-free imaging of polystyrene beads and of adipocyte cells are reported and discussed. PMID:27231626

  7. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    SciTech Connect

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-07-18

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available require a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or a large number of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as mass loading, flow rate and resin performance, is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is being evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of a possible synergistic effect in the presence of iron.

  8. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    PubMed Central

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-01

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144

  9. Exploring techniques for vision based human activity recognition: methods, systems, and evaluation.

    PubMed

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-25

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activity, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation towards the performance of human activity recognition.

  10. A Rapid, Fluorescence-Based Field Screening Technique for Organic Species in Soil and Water Matrices.

    PubMed

    Russell, Amber L; Martin, David P; Cuddy, Michael F; Bednar, Anthony J

    2016-06-01

    Real-time detection of hydrocarbon contaminants in the environment presents analytical challenges because traditional laboratory-based techniques are cumbersome and not readily field portable. In the current work, a method for rapid and semi-quantitative detection of organic contaminants, primarily crude oil, in natural water and soil matrices has been developed. Detection limits in the parts-per-million and parts-per-billion ranges were achieved using visual and digital detection methods, respectively. The extraction technique was modified from standard methodologies used for hydrocarbon analysis and provides a straightforward separation technique that can remove interference from complex natural constituents. For water samples this method is semi-quantitative, with recoveries ranging from 70% to 130%, while measurements of soil samples are more qualitative due to lower extraction efficiencies related to the limitations of field-deployable procedures.

  11. Dimensional Changes of Acrylic Resin Denture Bases: Conventional Versus Injection-Molding Technique

    PubMed Central

    Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam

    2014-01-01

    Objective: Acrylic resin denture bases undergo dimensional changes during polymerization. Injection molding techniques are reported to reduce these changes and thereby improve physical properties of denture bases. The aim of this study was to compare dimensional changes of specimens processed by conventional and injection-molding techniques. Materials and Methods: SR-Ivocap Triplex Hot resin was used for conventional pressure-packed and SR-Ivocap High Impact was used for injection-molding techniques. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. Results: After each water storage period, the acrylic specimens produced by injection exhibited less dimensional changes compared to those produced by the conventional technique. Curing shrinkage was compensated by water sorption with an increase in water storage time decreasing dimensional changes. Conclusion: Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used and SR-Ivocap injection procedure exhibited higher dimensional accuracy compared to conventional molding. PMID:25584050

  12. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    NASA Astrophysics Data System (ADS)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy must change, because the current imitational gravitational frontal isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water - a global deficit of up to 4-15 times - and adverse effects on soils and landscapes. Current methods of irrigation do not control the spread of water throughout the soil continuum. Preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost into the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for Sustainable Natural Resources Management. The new paradigm of irrigation is based on the intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. An individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder is positioned in the soil at a depth of 10 to 30 cm. The diameter of the cylinder is 1-2 cm. Within 5-10 min after injection the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The soil watering cylinder is positioned in the soil profile at a depth of 5-50 cm, and the diameter of the cylinder is 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection the structure of the soil in the cylinder recovers quickly because there is no compression from the stable adjoining volume of soil, and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential

  13. Research on technique of wavefront retrieval based on Foucault test

    NASA Astrophysics Data System (ADS)

    Yuan, Lvjun; Wu, Zhonghua

    2010-05-01

    During fine grinding of the best-fit sphere and the initial stage of polishing, the surface error of large-aperture aspheric mirrors is too large to test using a common interferometer. The Foucault test is widely used in fabricating large-aperture mirrors. However, the optical path is seriously disturbed by air turbulence, and changes of light and dark zones cannot be identified, which often lowers the tester's ability to judge and leads to mistakes in diagnosing the surface error of the whole mirror. To solve this problem, this research presents wavefront retrieval based on the Foucault test through digital image processing and quantitative calculation. Firstly, a real Foucault image is obtained by collecting a variety of images with a CCD and averaging them to eliminate air turbulence. Secondly, gray values are converted into surface error values through principle derivation, mathematical modeling, and software programming. Thirdly, the linear deviation introduced by defocus is removed by the least-squares method to obtain the real surface error. Finally, according to the real surface error, the wavefront map, gray contour map and corresponding pseudo-color contour map are plotted. The experimental results indicate that the three-dimensional wavefront map and two-dimensional contour map accurately and intuitively show the surface error over the whole mirror under test, and they are helpful for grasping the surface error as a whole. The technique can be used to guide the fabrication of large-aperture, long-focal-length mirrors during grinding and the initial stage of polishing the aspheric surface, which greatly improves fabrication efficiency and precision.
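
    The third step above, removing the linear deviation by least squares, can be sketched as below. The synthetic surface-error map and the choice of fitted terms (piston, tilt and a defocus term) are illustrative assumptions rather than the authors' exact implementation.

```python
# Minimal sketch of least-squares removal of tilt/defocus from a surface-error
# map. The synthetic map and the fitted terms are illustrative assumptions.
import numpy as np

def remove_tilt_defocus(err):
    ny, nx = err.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    xx = (xx - xx.mean()) / xx.max()
    yy = (yy - yy.mean()) / yy.max()
    # Basis: piston, x-tilt, y-tilt, defocus (x^2 + y^2)
    A = np.column_stack([np.ones(err.size), xx.ravel(), yy.ravel(),
                         (xx ** 2 + yy ** 2).ravel()])
    coeffs, *_ = np.linalg.lstsq(A, err.ravel(), rcond=None)
    return (err.ravel() - A @ coeffs).reshape(err.shape)

# Synthetic example: true figure error plus added tilt and defocus
ny, nx = 64, 64
y, x = np.mgrid[0:ny, 0:nx]
true_err = 0.05 * np.sin(2 * np.pi * x / 32)
measured = true_err + 0.3 * x / nx + 0.1 * ((x - 32) ** 2 + (y - 32) ** 2) / 1024
residual = remove_tilt_defocus(measured)
print("residual RMS:", residual.std())
```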

  14. Age estimation based on Kvaal's technique using digital panoramic radiographs

    PubMed Central

    Mittal, Samta; Nagendrareddy, Suma Gundareddy; Sharma, Manisha Lakhanpal; Agnihotri, Poornapragna; Chaudhary, Sunil; Dhillon, Manu

    2016-01-01

    Introduction: Age estimation is important for administrative and ethical reasons and also because of legal consequences. Dental pulp undergoes regression in size with increasing age due to secondary dentin deposition and can be used as a parameter of age estimation even beyond 25 years of age. Kvaal et al. developed a method for chronological age estimation based on the pulp size using periapical dental radiographs. There is a need for testing this method of age estimation in the Indian population using simple tools like digital imaging on living individuals not requiring extraction of teeth. Aims and Objectives: Estimation of the chronological age of subjects by Kvaal's method using digital panoramic radiographs and also testing the validity of regression equations as given by Kvaal et al. Materials and Methods: The study sample included a total of 152 subjects in the age group of 14-60 years. Measurements were performed on the standardized digital panoramic radiographs based on Kvaal's method. Different regression formulae were derived and the age was assessed. The assessed age was then correlated to the actual age of the patient using Student's t-test. Results: No significant difference between the mean of the chronological age and the estimated age was observed. However, the values of the mean age estimated by using regression equations as given previously in the study of Kvaal et al. significantly underestimated the chronological age in the present study sample. Conclusion: The results of the study give an inference for the feasibility of this technique by calculation of regression equations on digital panoramic radiographs. However, it negates the applicability of same regression equations as given by Kvaal et al. on the study population. PMID:27555738
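
    The regression step described above, deriving formulae that map radiographic pulp/tooth measurements to chronological age, can be sketched as follows. The predictor variables, synthetic data and model below are illustrative assumptions in the spirit of the Kvaal approach, not the published Kvaal coefficients.

```python
# Minimal sketch of deriving an age-estimation regression from pulp/tooth
# ratio measurements. Predictors and data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 152
age = rng.uniform(14, 60, n)
# Pulp/tooth ratios shrink with age (secondary dentin deposition) plus noise
ratio_width = 0.45 - 0.004 * age + rng.normal(0, 0.02, n)
ratio_length = 0.80 - 0.005 * age + rng.normal(0, 0.03, n)

X = np.column_stack([ratio_width, ratio_length])
model = LinearRegression().fit(X, age)
estimated = model.predict(X)
print("mean absolute error:", np.abs(estimated - age).mean().round(2))
```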

  15. Novel technique: a pupillometer-based objective chromatic perimetry

    NASA Astrophysics Data System (ADS)

    Rotenstreich, Ygal; Skaat, Alon; Sher, Ifat; Kolker, Andru; Rosenfeld, Elkana; Melamed, Shlomo; Belkin, Michael

    2014-02-01

    Evaluation of visual field (VF) is important for clinical diagnosis and patient monitoring. The current VF methods are subjective and require patient cooperation. Here we developed a novel objective perimetry technique based on the pupil response (PR) to multifocal chromatic stimuli in normal subjects and in patients with glaucoma and retinitis pigmentosa (RP). A computerized infrared video pupillometer was used to record PR to short- and long-wavelength stimuli (peak 485 nm and 620 nm, respectively) at light intensities of 15-100 cd-s/m2 at thirteen different points of the VF. The RP study included 30 eyes of 16 patients and 20 eyes of 12 healthy participants. The glaucoma study included 22 eyes of 11 patients and 38 eyes of 19 healthy participants. Significantly reduced PR was observed in RP patients in response to short-wavelength stimuli at 40 cd-s/m2 in nearly all perimetric locations (P <0.05). By contrast, RP patients demonstrated nearly normal PR to long-wavelength in majority of perimetric locations. The glaucoma group showed significantly reduced PR to long- and short-wavelength stimuli at high intensity in all perimetric locations (P <0.05). The PR of glaucoma patients was significantly lower than normal in response to short-wavelength stimuli at low intensity mostly in central and 20° locations (p<0.05). This study demonstrates the feasibility of using pupillometer-based chromatic perimetry for objectively assessing VF defects and retinal function and optic nerve damage in patients with retinal dystrophies and glaucoma. Furthermore, this method may be used to distinguish between the damaged cells underlying the VF defect.

  16. Fractographic ceramic failure analysis using the replica technique

    PubMed Central

    Scherrer, Susanne S.; Quinn, Janet B.; Quinn, George D.; Anselm Wiskott, H. W.

    2007-01-01

    Objectives To demonstrate the effectiveness of in vivo replicas of fractured ceramic surfaces for descriptive fractography as applied to the analysis of clinical failures. Methods The fracture surface topography of partially failed veneering ceramic of a Procera Alumina molar and an In Ceram Zirconia premolar were examined utilizing gold-coated epoxy poured replicas viewed using scanning electron microscopy. The replicas were inspected for fractographic features such as hackle, wake hackle, twist hackle, compression curl and arrest lines for determination of the direction of crack propagation and location of the origin. Results For both veneering ceramics, replicas provided an excellent reproduction of the fractured surfaces. Fine details including all characteristic fracture features produced by the interaction of the advancing crack with the material's microstructure could be recognized. The observed features are indicators of the local direction of crack propagation and were used to trace the crack's progression back to its initial starting zone (the origin). Drawbacks of replicas such as artifacts (air bubbles) or imperfections resulting from inadequate epoxy pouring were noted but not critical for the overall analysis of the fractured surfaces. Significance The replica technique proved to be easy to use and allowed an excellent reproduction of failed ceramic surfaces. It should be applied before attempting to remove any failed part remaining in situ as the fracture surface may be damaged during this procedure. These two case studies are intended as an introduction for the clinical researcher in using qualitative (descriptive) fractography as a tool for understanding fracture processes in brittle restorative materials and, secondarily, to draw conclusions as to possible design inadequacies in failed restorations. PMID:17270267

  17. Weighted graph based ordering techniques for preconditioned conjugate gradient methods

    NASA Technical Reports Server (NTRS)

    Clift, Simon S.; Tang, Wei-Pai

    1994-01-01

    We describe the basis of a matrix ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDE's. Several new matrix ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDE's, and compared with other matrix ordering techniques. A variation of RCM is shown to generally improve the quality of incomplete factorization preconditioners.
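
    A minimal sketch of the overall idea, reordering the matrix before building an incomplete-factorization preconditioner for a conjugate gradient solve, is shown below. Reverse Cuthill-McKee (RCM), mentioned in the abstract, stands in for the weighted graph orderings developed in the paper, and the anisotropic 2-D Laplacian test matrix is an assumption.

```python
# Minimal sketch: RCM reordering, then ILU-preconditioned CG on an assumed
# anisotropic 2-D Laplacian. Not the paper's weighted-graph orderings.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee
from scipy.sparse.linalg import spilu, cg, LinearOperator

def lap1d(n):
    # 1-D Laplacian (tridiagonal -1, 2, -1)
    return sp.diags([-np.ones(n - 1), 2.0 * np.ones(n), -np.ones(n - 1)], [-1, 0, 1])

n = 40
eps = 0.01   # anisotropy ratio (assumed)
A = (sp.kron(sp.identity(n), lap1d(n)) + eps * sp.kron(lap1d(n), sp.identity(n))).tocsr()
b = np.ones(A.shape[0])

# Reorder with reverse Cuthill-McKee to reduce bandwidth before factorizing
perm = reverse_cuthill_mckee(A, symmetric_mode=True)
Ap = A[perm, :][:, perm]

# Incomplete LU factorization of the reordered matrix as a CG preconditioner
ilu = spilu(Ap.tocsc(), drop_tol=1e-4, fill_factor=10)
M = LinearOperator(Ap.shape, matvec=ilu.solve)

x, info = cg(Ap, b[perm], M=M)
print("CG converged" if info == 0 else f"CG info = {info}")
```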

  18. Analysis of compressive fracture in rock using statistical techniques

    SciTech Connect

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.

  19. Probability techniques for reliability analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Wetherhold, Robert C.; Ucci, Anthony M.

    1994-01-01

    Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first order, second moment FPI methods; second order, second moment FPI methods; the simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservativism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
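
    The reliability formulation described above, the probability that an explicit function of random variables falls below a given constant, can be illustrated with a plain Monte Carlo estimate. The limit-state function g = strength - stress, the normal distributions and their parameters are illustrative assumptions, not the specific failure criteria evaluated in the report.

```python
# Minimal sketch of a simple Monte Carlo reliability estimate for a lamina,
# formulated as P(g(X) < 0) with g = strength - stress. Distributions and
# the failure criterion are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Random longitudinal strength and applied stress (MPa), assumed normal
strength = rng.normal(loc=1500.0, scale=120.0, size=n)
stress = rng.normal(loc=1000.0, scale=150.0, size=n)

g = strength - stress                 # limit-state function
p_fail = np.mean(g < 0.0)
reliability = 1.0 - p_fail
se = np.sqrt(p_fail * (1.0 - p_fail) / n)   # standard error of the estimate
print(f"P_f ~ {p_fail:.4f} +/- {se:.4f}, reliability ~ {reliability:.4f}")
```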

  20. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    SciTech Connect

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result over that if conventional source-sampling methods are used. However, this gain in reliability is substantially less than that observed in the model-problem results.

  1. Damage Detection and Analysis in CFRPs Using Acoustic Emission Technique

    NASA Astrophysics Data System (ADS)

    Whitlow, Travis Laron

    Real time monitoring of damage is an important aspect of life management of critical structures. Acoustic emission (AE) techniques allow for measurement and assessment of damage in real time. Acoustic emission parameters such as signal amplitude and duration were monitored during the loading sequences. Criteria that can indicate the onset of critical damage to the structure were developed. Tracking the damage as it happens gives a better analysis of the failure evolution that will allow for a more accurate determination of structural life. The main challenge is distinguishing between legitimate damage signals and "false positives" which are unrelated to damage growth. Such false positives can be related to electrical noise, friction, or mechanical vibrations. This research focuses on monitoring signals of damage growth in carbon fiber reinforced polymers (CFRPs) and separating the relevant signals from the false ones. In this Dissertation, acoustic emission signals from CFRP specimens were experimentally recorded and analyzed. The objectives of this work are: (1) perform static and fatigue loading of CFRP composite specimens and measure the associated AE signals, (2) accurately determine the AE parameters (energy, frequency, duration, etc.) of signals generated during failure of such specimens, (3) use fiber optic sensors to monitor the strain distribution of the damage zone and relate these changes in strain measurements to AE data.

  2. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Sklute, Elizabeth; Dyare, Melinda D

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
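
    A minimal sketch of two of the workflows named above, PLS calibration of elemental composition and PCA scores for grouping samples, is given below. The synthetic spectra, the three assumed oxide concentrations and the component counts are illustrative assumptions, not the actual LIBS data set.

```python
# Minimal sketch of PLS calibration and a PCA view for LIBS-like spectra.
# Spectra, concentrations and component counts are illustrative assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
n_samples, n_channels = 18, 2048
concentrations = rng.uniform(0, 50, size=(n_samples, 3))   # e.g. three oxide wt%

# Each element contributes a few emission "lines" to the spectrum, plus noise
lines = rng.integers(0, n_channels, size=(3, 4))
spectra = rng.normal(0, 0.05, size=(n_samples, n_channels))
for e in range(3):
    for ch in lines[e]:
        spectra[:, ch] += concentrations[:, e] * 0.02

pls = PLSRegression(n_components=5).fit(spectra, concentrations)
print("PLS R^2:", round(pls.score(spectra, concentrations), 3))

pcs = PCA(n_components=2).fit_transform(spectra)   # scores used for grouping
print("PC scores shape:", pcs.shape)
```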

  3. Application of activation techniques to biological analysis. [813 references

    SciTech Connect

    Bowen, H.J.M.

    1981-12-01

    Applications of activation analysis in the biological sciences are reviewed for the period of 1970 to 1979. The stages and characteristics of activation analysis are described, and its advantages and disadvantages enumerated. Most applications involve activation by thermal neutrons followed by either radiochemical or instrumental determination. Relatively little use has been made of activation by fast neutrons, photons, or charged particles. In vivo analyses are included, but those based on prompt gamma or x-ray emission are not. Major applications include studies of reference materials, and the elemental analysis of plants, marine biota, animal and human tissues, diets, and excreta. Relatively little use of it has been made in biochemistry, microbiology, and entomology, but it has become important in toxicology and environmental science. The elements most often determined are Ag, As, Au, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, Fe, Hg, I, K, Mn, Mo, Na, Rb, Sb, Sc, Se, and Zn, while few or no determinations of B, Be, Bi, Ga, Gd, Ge, H, In, Ir, Li, Nd, Os, Pd, Pr, Pt, Re, Rh, Ru, Te, Tl, or Y have been made in biological materials.

  4. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

    The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling of solid/shell connections (joints), which are significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their use. Such techniques currently used in practical applications were tested, especially to see which technique is best suited to the computer aided design (CAD) environment. Some basic thoughts regarding each technique are also discussed. As a consequence, some suggestions based on the results are given to help obtain reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  5. Molecular Techniques for Detection, Species Differentiation, and Phylogenetic Analysis of Microsporidia

    PubMed Central

    Franzen, Caspar; Müller, Andreas

    1999-01-01

    Microsporidia are obligate intracellular protozoan parasites that infect a broad range of vertebrates and invertebrates. These parasites are now recognized as one of the most common pathogens in human immunodeficiency virus-infected patients. For most patients with infectious diseases, microbiological isolation and identification techniques offer the most rapid and specific determination of the etiologic agent. This is not a suitable procedure for microsporidia, which are obligate intracellular parasites requiring cell culture systems for growth. Therefore, the diagnosis of microsporidiosis currently depends on morphological demonstration of the organisms themselves. Although the diagnosis of microsporidiosis and identification of microsporidia by light microscopy have greatly improved during the last few years, species differentiation by these techniques is usually impossible and transmission electron microscopy may be necessary. Immunofluorescent-staining techniques have been developed for species differentiation of microsporidia, but the antibodies used in these procedures are available only at research laboratories at present. During the last 10 years, the detection of infectious disease agents has begun to include the use of nucleic acid-based technologies. Diagnosis of infection caused by parasitic organisms is the last field of clinical microbiology to incorporate these techniques, and molecular techniques (e.g., PCR and hybridization assays) have recently been developed for the detection, species differentiation, and phylogenetic analysis of microsporidia. In this paper we review human microsporidial infections and describe and discuss these newly developed molecular techniques. PMID:10194459

  6. Adhesive-based bonding technique for PDMS microfluidic devices.

    PubMed

    Thompson, C Shea; Abate, Adam R

    2013-02-21

    We present a simple and inexpensive technique for bonding PDMS microfluidic devices. The technique uses only adhesive tape and an oven; plasma bonders and cleanroom facilities are not required. It also produces channels that are immediately hydrophobic, allowing formation of aqueous-in-oil emulsions.

  7. Anterolateral Ligament Reconstruction Technique: An Anatomic-Based Approach.

    PubMed

    Chahla, Jorge; Menge, Travis J; Mitchell, Justin J; Dean, Chase S; LaPrade, Robert F

    2016-06-01

    Restoration of anteroposterior laxity after an anterior cruciate ligament reconstruction has been predictable with traditional open and endoscopic techniques. However, anterolateral rotational stability has been difficult to achieve in a subset of patients, even with appropriate anatomic techniques. Therefore, differing techniques have attempted to address this rotational laxity by augmenting or reconstructing lateral-sided structures about the knee. In recent years, there has been a renewed interest in the anterolateral ligament as a potential contributor to residual anterolateral rotatory instability in anterior cruciate ligament-deficient patients. Numerous anatomic and biomechanical studies have been performed to further define the functional importance of the anterolateral ligament, highlighting the need for surgical techniques to address these injuries in the unstable knee. This article details our technique for an anatomic anterolateral ligament reconstruction using a semitendinosus tendon allograft. PMID:27656361

  8. Deformation grating fabrication technique based on the solvent-assisted microcontact molding.

    PubMed

    Dai, Xianglu; Xie, Huimin; Wang, Huaixi

    2014-10-20

    A deformation grating fabrication technique based on solvent-assisted microcontact molding (SAMIM) is reported in this paper. The fabrication process can be divided into three steps: imprinting a grating on a medium polymer substrate (MPS) by SAMIM, coating a thin metal film on the MPS, and transferring the film to the measured surface. In order to increase the stiffness of the elastic mold without decreasing its ability to form conformal contact, a reusable, glass-embedded polydimethylsiloxane (PDMS) mold is used. In addition, a characterization method based on the Fourier transform and phase analysis is proposed to check the quality of the fabricated grating. Experiments verify that the proposed technique can fabricate a high-frequency, large-area grating on different specimens, which can serve as a qualified deformation sensor for the moiré method. PMID:25402792
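
    As a rough illustration of the kind of Fourier-and-phase check described above (a generic sketch, not the authors' algorithm; the function name, pixel pitch and linear-phase criterion are assumptions), a 1-D intensity profile across a regular grating should show a single dominant spatial frequency and a nearly linear unwrapped phase:

        # Sketch: assess grating regularity from a 1-D intensity profile.
        import numpy as np
        from scipy.signal import hilbert

        def grating_quality(profile, pixel_pitch_um=1.0):
            """Return the dominant spatial frequency (cycles/um) and the RMS
            deviation of the unwrapped phase from a straight line; an ideal
            grating has linear phase, so large deviations suggest defects."""
            x = np.asarray(profile, dtype=float)
            x = x - x.mean()
            freqs = np.fft.rfftfreq(len(x), d=pixel_pitch_um)
            k = np.argmax(np.abs(np.fft.rfft(x))[1:]) + 1    # dominant harmonic, skip DC
            phase = np.unwrap(np.angle(hilbert(x)))           # analytic-signal phase
            n = np.arange(len(x))
            fit = np.polyval(np.polyfit(n, phase, 1), n)      # best linear phase
            return freqs[k], float(np.sqrt(np.mean((phase - fit) ** 2)))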

  9. Recent advancements in sensing techniques based on functional materials for organophosphate pesticides.

    PubMed

    Kumar, Pawan; Kim, Ki-Hyun; Deep, Akash

    2015-08-15

    The use of organophosphate pesticides (OPs) for pest control in agriculture has caused serious environmental problems throughout the world. OPs are highly toxic, with the potential to cause neurological disorders in humans. As the application of OPs has greatly increased in various agricultural activities, it has become imperative to accurately monitor their concentration levels for the protection of ecological systems and food supplies. Although many conventional methods are available for the detection of OPs, the development of portable sensors is necessary to facilitate routine analysis with more convenience. Potent alternative techniques based on functional materials include fluorescent nanomaterial-based sensors, molecularly imprinted polymer (MIP) sensors, electrochemical sensors, and biosensors. This review explores the basic features of these sensing approaches through evaluation of their performance. The discussion is extended further to describe the challenges and opportunities for these unique sensing techniques.

  10. A technique for the vibration signal analysis in vehicle diagnostics

    NASA Astrophysics Data System (ADS)

    Puchalski, Andrzej

    2015-05-01

    The method of utilising vibration acceleration signals in the on-line and off-line diagnostics of mechanical defects of internal combustion engines is presented in the paper. The monitored vibration signals of a spark ignition (SI) engine in various maintenance states of the valve system were investigated. The suggested technique is based on the mathematical methods of lower triangular-orthogonal (LQ) factorisation and singular value decomposition (SVD) of observation subspaces computed from the vibration time series after angular resampling, without any transformation into the frequency domain. The data-processing algorithm filters out redundant information, allows the selection of diagnostic features essential from the maintenance point of view, and generates an empirical model whose matrix residuals are assessed as 'zero' in the no-fault state. Statistical feature vectors, whose components are the averaged successive singular values of the residuals of the observation subspaces of the vibration signals, were then analysed. This procedure yields vectors of lower dimension whose components allow the classification of observations within all defined classes. On the basis of these vectors a scalar measure - sensitive to the kind of defect - was proposed and verified.
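
    The singular-value feature idea can be pictured with a minimal sketch (illustrative only; the Hankel-matrix size and normalization are assumptions, and the LQ factorisation and residual-generation steps of the actual method are omitted):

        # Sketch: leading singular values of an observation (Hankel) matrix built
        # from an angularly resampled vibration series, used as a feature vector.
        import numpy as np

        def svd_features(resampled_signal, rows=20, n_features=5):
            x = np.asarray(resampled_signal, dtype=float)
            H = np.lib.stride_tricks.sliding_window_view(x, rows).T  # rows x columns
            s = np.linalg.svd(H, compute_uv=False)
            return s[:n_features] / s.sum()    # normalized leading singular values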

  11. Image processing technique based on image understanding architecture

    NASA Astrophysics Data System (ADS)

    Kuvychko, Igor

    2000-12-01

    The effectiveness of image applications depends directly on their ability to resolve ambiguity and uncertainty in real images. That requires tight integration of low-level image processing with high-level knowledge-based reasoning, which is the solution of the image understanding problem. This article presents a generic computational framework for the solution of the image understanding problem -- the Spatial Turing Machine. Instead of a tape of symbols, it works with hierarchical networks dually represented as discrete and continuous structures. The dual representation provides a natural transformation of continuous image information into discrete structures, making it available for analysis. Such structures are data and algorithms at the same time and can perform graph and diagrammatic operations, which are the basis of intelligence. They can create derivative structures that play the role of context, or 'measurement device,' giving the ability to analyze and to run top-down algorithms. Symbols naturally emerge there, and symbolic operations work in combination with new simplified methods of computational intelligence. That makes images and scenes self-describing and provides flexible ways of resolving uncertainty. Classification of images truly invariant to any transformation could be done by matching their derivative structures. The proposed architecture does not require supercomputers, opening the way to new image technologies.

  12. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    SciTech Connect

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. The results also suggest that regression models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.
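
    Of the three techniques, control charting is the simplest to picture; the sketch below (illustrative only, not the NDMAS implementation; the three-sigma rule and variable names are assumptions) flags thermocouple readings whose residual against a code-predicted temperature drifts outside control limits:

        # Sketch: Shewhart-style check of measured vs. simulated temperatures.
        import numpy as np

        def control_chart_flags(measured, predicted, k=3.0):
            residuals = np.asarray(measured) - np.asarray(predicted)
            center, sigma = residuals.mean(), residuals.std(ddof=1)
            return np.abs(residuals - center) > k * sigma    # True = suspect reading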

  13. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised, to demonstrate their ability to develop the ethical analysis skills of…

  14. A Technique for the Analysis of Auto Exhaust.

    ERIC Educational Resources Information Center

    Sothern, Ray D.; And Others

    Developed for presentation at the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971, this outline explains a technique for separating the complex mixture of hydrocarbons contained in automotive exhausts. A Golay column and subambient temperature programming technique are…

  15. Asymptotic Waveform Evaluation (AWE) Technique for Frequency Domain Electromagnetic Analysis

    NASA Technical Reports Server (NTRS)

    Cockrell, C. R.; Beck, F. B.

    1996-01-01

    The Asymptotic Waveform Evaluation (AWE) technique is applied to a generalized frequency domain electromagnetic problem. Most frequency domain techniques in computational electromagnetics result in a matrix equation, which is solved at a single frequency. In the AWE technique, a Taylor series expansion around that frequency is applied to the matrix equation. The coefficients of the Taylor series are obtained in terms of the frequency derivatives of the matrices evaluated at the expansion frequency. The coefficients thus obtained are used to predict the frequency response of the system over a frequency range. The detailed derivation of the coefficients (called 'moments') is given, along with an illustration for the electric field integral equation (Method of Moments) technique. The radar cross section (RCS) frequency response of a square plate is presented using the AWE technique and is compared with the exact solution at various frequencies.
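
    The moment recursion can be sketched for a generic matrix equation Z(k) I(k) = V expanded about k0 (a minimal illustration, not the paper's implementation; the caller is assumed to supply the frequency derivatives of Z, and V is taken as frequency-independent):

        # Sketch: AWE moments m_q such that I(k) ~ sum_q m_q (k - k0)**q.
        import numpy as np

        def awe_moments(Z0, dZ, V, order):
            # dZ[i-1] is the i-th derivative of Z with respect to k, evaluated at k0.
            Z0_inv = np.linalg.inv(Z0)      # in practice a single LU factorization is reused
            moments = [Z0_inv @ V]          # zeroth moment: solution at the expansion point
            for q in range(1, order + 1):
                rhs = np.zeros_like(V, dtype=complex)
                fact = 1.0
                for i in range(1, min(q, len(dZ)) + 1):
                    fact *= i               # i!
                    rhs -= (dZ[i - 1] @ moments[q - i]) / fact
                moments.append(Z0_inv @ rhs)
            return moments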

  16. Blood culture technique based on centrifugation: clinical evaluation.

    PubMed Central

    Dorn, G L; Burson, G G; Haynes, J R

    1976-01-01

    A total of 1,000 blood samples from patients suspected of having a bacteremia were analyzed concurrently, where possible, by three methods: (i) Trypticase soy broth with sodium polyanethol sulfonate and a CO2 atmosphere; (ii) pour plates with either brain heart infusion agar or Sabouraud dextrose agar; and (iii) centrifugation of the suspected organism in a hypertonic solution. There were 176 positive cultures. The centrifugation technique recovered 73% of the positive cultures. The broth and pour plate techniques recovered 38 and 49%, respectively. The centrifugation technique showed an increased isolation rate for Pseudomonas, fungi, and gram-positive cocci. In general, for each organism the time required for the detection of a positive culture was shortest with the centrifugation technique. PMID:1270591

  17. A local technique based on vectorized surfaces for craniofacial reconstruction.

    PubMed

    Tilotta, Françoise M; Glaunès, Joan A; Richard, Frédéric J P; Rozenholc, Yves

    2010-07-15

    In this paper, we focus on the automation of facial reconstruction. Because they consider the whole head as the object of interest, the usual reconstruction techniques are global and involve a large number of parameters to be estimated. We present a local technique which aims at reaching a good trade-off between bias and variance, following the paradigm of non-parametric statistics. The estimation is localized on patches delimited by surface geodesics between anatomical points of the skull. The technique relies on a continuous representation of the individual surfaces embedded in the vectorial space of extended normal vector fields, which makes it possible to compute deformations and averages of surfaces. It consists of estimating the soft-tissue surface over patches. Using a homogeneous database described in [31], we obtain results on the chin and nasal regions with an average error below 1 mm, outperforming global reconstruction techniques.

  18. [Research progress on urban carbon fluxes based on eddy covariance technique].

    PubMed

    Liu, Min; Fu, Yu-Ling; Yang, Fang

    2014-02-01

    Land use change and fossil fuel consumption due to urbanization have had a significant effect on the global carbon cycle and climate change. Accurate estimation and understanding of the carbon budget and its characteristics are prerequisites for studying the carbon cycle and its driving mechanisms in urban systems. Based on the theory of the eddy covariance (EC) technique and the characteristics of the atmospheric boundary layer and carbon cycle in urban areas, this study systematically reviewed the principles of CO2 flux monitoring in urban systems with the EC technique, and then summarized the problems faced in urban CO2 flux monitoring and the methods for data processing and assessment. The main research progress on urban carbon fluxes measured with the EC technique was also illustrated. The results showed that the urban surface mostly acts as a net carbon source. The CO2 exchange between the urban surface and the atmosphere shows obvious diurnal, weekly and seasonal variation resulting from vehicle exhaust, domestic heating and vegetation respiration. However, great uncertainties still exist in urban flux measurement and its interpretation due to high spatial heterogeneity and the complex distribution of carbon sources/sinks in urban environments. Finally, we suggest that further research on the EC technique and data assessment in complex urban areas should be strengthened, and that models of the urban carbon cycle should be developed on a systems basis to investigate the influencing mechanisms and variability of the urban carbon cycle at the regional scale with spatial analysis techniques. PMID:24830264
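
    At its core the EC technique estimates the flux as the covariance of the fluctuations of vertical wind speed and CO2 concentration over an averaging period; a bare-bones sketch (ignoring the coordinate rotation, despiking and density corrections that real urban flux processing requires; units and variable names are assumptions) is:

        # Sketch: eddy covariance CO2 flux from high-frequency time series for one period.
        import numpy as np

        def eddy_covariance_flux(w, c, air_density=1.2):
            # w: vertical wind speed (m/s); c: CO2 mixing ratio per unit mass of air.
            w_prime = w - np.mean(w)       # Reynolds decomposition
            c_prime = c - np.mean(c)
            return air_density * np.mean(w_prime * c_prime)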

  20. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1978-01-01

    Furnaces and photolithography-related equipment were applied to experiments on double-layer metal. The double-layer metal activity emphasized wet chemistry techniques. By incorporating the following techniques: (1) ultrasonic etching of the vias; (2) a premetal clean using a modified buffered hydrogen fluoride; (3) phosphorus-doped vapor; and (4) extended sintering, yields of 98 percent were obtained using the standard test pattern. The two-dimensional modeling problems have stemmed alternately from instability and from excessive computation time to achieve convergence.

  1. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    NASA Astrophysics Data System (ADS)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy has to change because the current imitational gravitational frontal isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water - a global deficit of up to 4-15 times - and adverse effects on soils and landscapes. Current irrigation methods do not control the spread of water throughout the soil continuum. Preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost into the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for Sustainable Natural Resources Management. The new paradigm of irrigation is based on intra-soil pulse-discrete water supply into the soil continuum by injection in small discrete portions. An individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder is positioned at a depth of 10 to 30 cm and has a diameter of 1-2 cm. Within 5-10 min after injection the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The resulting soil watering cylinder lies at a depth of 5-50 cm with a diameter of 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection the soil structure inside the cylinder recovers quickly because there is no compression from the stable adjoining soil volume and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential

  2. Topology-based Feature Definition and Analysis

    SciTech Connect

    Weber, Gunther H.; Bremer, Peer-Timo; Gyulassy, Attila; Pascucci, Valerio

    2010-12-10

    Defining high-level features, detecting them, tracking them and deriving quantities based on them is an integral aspect of modern data analysis and visualization. In combustion simulations, for example, burning regions, which are characterized by high fuel consumption, are a possible feature of interest. Detecting these regions makes it possible to derive statistics about their size and track them over time. However, features of interest in scientific simulations are extremely varied, making it challenging to develop cross-domain feature definitions. Topology-based techniques offer an extremely flexible means for general feature definitions and have proven useful in a variety of scientific domains. This paper provides a brief introduction to topological structures such as the contour tree and the Morse-Smale complex and shows how to apply them to define features in different science domains such as combustion. The overall goal is to provide an overview of these powerful techniques and to start a discussion of how they can aid in the analysis of astrophysical simulations.

  3. Data Collection and Analysis Techniques for Evaluating the Perceptual Qualities of Auditory Stimuli

    SciTech Connect

    Bonebright, T.L.; Caudell, T.P.; Goldsmith, T.E.; Miner, N.E.

    1998-11-17

    This paper describes a general methodological framework for evaluating the perceptual properties of auditory stimuli. The framework provides analysis techniques that can ensure the effective use of sound for a variety of applications including virtual reality and data sonification systems. Specifically, we discuss data collection techniques for the perceptual qualities of single auditory stimuli including identification tasks, context-based ratings, and attribute ratings. In addition, we present methods for comparing auditory stimuli, such as discrimination tasks, similarity ratings, and sorting tasks. Finally, we discuss statistical techniques that focus on the perceptual relations among stimuli, such as Multidimensional Scaling (MDS) and Pathfinder Analysis. These methods are presented as a starting point for an organized and systematic approach for non-experts in perceptual experimental methods, rather than as a complete manual for performing the statistical techniques and data collection methods. It is our hope that this paper will help foster further interdisciplinary collaboration among perceptual researchers, designers, engineers, and others in the development of effective auditory displays.
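
    Among the statistical techniques mentioned, MDS is easy to illustrate: classical (Torgerson) scaling turns a matrix of pairwise dissimilarity ratings between stimuli into low-dimensional coordinates (a generic sketch, not the authors' procedure; the two-dimensional solution is an assumption):

        # Sketch: classical MDS of a stimulus dissimilarity matrix.
        import numpy as np

        def classical_mds(dissim, n_dims=2):
            D2 = np.asarray(dissim, dtype=float) ** 2
            n = D2.shape[0]
            J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
            B = -0.5 * J @ D2 @ J                     # double-centered Gram matrix
            w, V = np.linalg.eigh(B)
            idx = np.argsort(w)[::-1][:n_dims]        # largest eigenvalues first
            return V[:, idx] * np.sqrt(np.clip(w[idx], 0.0, None))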

  4. Figure analysis: A teaching technique to promote visual literacy and active learning.

    PubMed

    Wiles, Amy M

    2016-07-01

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is not a technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016. PMID:26891952

  5. Analysis of fluidized bed granulation process using conventional and novel modeling techniques.

    PubMed

    Petrović, Jelena; Chansanroj, Krisanin; Meier, Brigitte; Ibrić, Svetlana; Betz, Gabriele

    2011-10-01

    Various modeling techniques have been applied to analyze the fluidized-bed granulation process. The influence of various input parameters (product, inlet and outlet air temperature, consumption of liquid binder, granulation liquid-binder spray rate, spray pressure, drying time) on granulation output properties (granule flow rate, granule size determined using the light scattering method and sieve analysis, granule Hausner ratio, porosity and residual moisture) has been assessed. Both conventional and novel modeling techniques were used, such as screening tests, multiple regression analysis, self-organizing maps, artificial neural networks, decision trees and rule induction. Diverse testing of the developed models (internal and external validation) is discussed. Good correlation has been obtained between the predicted and the experimental data. It has been shown that nonlinear methods based on artificial intelligence, such as neural networks, are far better at generalization and prediction than conventional methods. The possibility of using self-organizing maps (SOMs), decision trees and rule induction to monitor and optimize the fluidized-bed granulation process has also been demonstrated. The findings can serve as guidance for implementing modeling techniques in the understanding and control of the fluidized-bed granulation process. PMID:21839830
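
    A neural-network model of the kind compared in the study can be sketched in a few lines (illustrative only; the input columns, file names and network size are assumptions, and proper external validation on separate batches is omitted):

        # Sketch: small feed-forward network relating process inputs (e.g., inlet air
        # temperature, spray rate, spray pressure, drying time) to a granule property.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        X = np.loadtxt("granulation_inputs.csv", delimiter=",")   # hypothetical data file
        y = np.loadtxt("granule_size.csv", delimiter=",")         # hypothetical targets
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000))
        model.fit(X, y)
        print("R^2 on training data:", model.score(X, y))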

  7. BEaST: brain extraction based on nonlocal segmentation technique.

    PubMed

    Eskildsen, Simon F; Coupé, Pierrick; Fonov, Vladimir; Manjón, José V; Leung, Kelvin K; Guizard, Nicolas; Wassef, Shafik N; Østergaard, Lasse Riis; Collins, D Louis

    2012-02-01

    Brain extraction is an important step in the analysis of brain images. The variability in brain morphology and the differences in intensity characteristics due to imaging sequences make the development of a general-purpose brain extraction algorithm challenging. To address this issue, we propose a new robust method (BEaST) designed to produce consistent and accurate brain extraction. The method is based on nonlocal segmentation embedded in a multi-resolution framework. A library of 80 priors is semi-automatically constructed from the NIH-sponsored MRI study of normal brain development, the International Consortium for Brain Mapping, and the Alzheimer's Disease Neuroimaging Initiative databases. In testing, a mean Dice similarity coefficient of 0.9834±0.0053 was obtained in leave-one-out cross validation when selecting only 20 priors from the library. Validation using the online Segmentation Validation Engine resulted in a top-ranking position with a mean Dice coefficient of 0.9781±0.0047. The robustness of BEaST is demonstrated on all baseline ADNI data, resulting in a very low failure rate. The segmentation accuracy of the method is better than that of two widely used publicly available methods and recent state-of-the-art hybrid approaches. BEaST provides results comparable to a recent label fusion approach, while being 40 times faster and requiring a much smaller library of priors.
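
    The Dice similarity coefficient quoted above is straightforward to compute from two binary brain masks (a generic sketch, not part of BEaST itself):

        # Sketch: Dice overlap between an automatic and a reference brain mask.
        import numpy as np

        def dice_coefficient(mask_a, mask_b):
            a = np.asarray(mask_a, dtype=bool)
            b = np.asarray(mask_b, dtype=bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())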

  8. Analysis of Piezoelectric Structural Sensors with Emergent Computing Techniques

    NASA Technical Reports Server (NTRS)

    Ramers, Douglas L.

    2005-01-01

    pressurizing the bottle on a test stand, and running sweeps of excitation frequencies for each of the piezo sensors and recording the resulting impedance. The sweeps were limited to 401 points by the available analyzer, and it was decided to perform individual sweeps over five different excitation frequency ranges. The frequency ranges used for the PZTs differed in two of the five ranges from those used for the SCP. The bottles were pressurized to empty (no water), 0 psig, 77 psig, 155 psig, and 227 psig, in nearly uniform increments of about 77 psi. One of each of the two types of piezo sensors was fastened onto the bottle surface at two locations: about midway between the ends on the cylindrical portion of the bottle, and at the very edge of one of the end domes. The data were collected in files by sensor type (2 cases), location (2 cases), frequency range (5 cases), and pressure (5 cases), to produce 100 data sets of 401 impedances. After familiarization with the piezo sensing technology and obtaining the data, the team developed a set of questions to try to answer regarding the data and made assignments of responsibilities. The next section lists the questions, and the remainder of the report describes the data analysis work performed by Dr. Ramers. This includes a discussion of the data, the approach to answering the questions using statistical techniques, the use of an emergent system to investigate the data where statistical techniques were not usable, conclusions regarding the data, and recommendations.

  9. Comparative analysis of data mining techniques for business data

    NASA Astrophysics Data System (ADS)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict which products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all of the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and suitability.

  10. Application of Material Characterization Techniques to Electrical Forensic Analysis

    SciTech Connect

    Mills, T.D.

    2003-03-11

    The application of forensic science techniques to electrical equipment failure investigation has not been widely documented in the engineering world. This paper is intended to share an example of using material characterization techniques to support an initial cause determination of an electrical component failure event. The resulting conclusion supported the initial cause determination and ruled out the possibility of design deficiencies. Thus, the qualification testing of the equipment was allowed to continue to successful completion.

  11. Breast volumetric analysis for aesthetic planning in breast reconstruction: a literature review of techniques

    PubMed Central

    Rozen, Warren Matthew; Spychal, Robert T.; Hunter-Smith, David J.

    2016-01-01

    Background Accurate volumetric analysis is an essential component of preoperative planning in both reconstructive and aesthetic breast procedures, towards achieving symmetrization and a patient-satisfactory outcome. Numerous comparative studies and reviews of individual techniques have been reported. However, a unifying review of all techniques comparing their accuracy, reliability, and practicality has been lacking. Methods A review of the published English literature dating from 1950 to 2015 was undertaken using databases such as PubMed, Medline, Web of Science, and EMBASE. Results Since Bouman’s first description of the water displacement method, a range of volumetric assessment techniques have been described: thermoplastic casting, direct anthropomorphic measurement, two-dimensional (2D) imaging, and computed tomography (CT)/magnetic resonance imaging (MRI) scans. However, most have proven unreliable, difficult to execute, and of limited practicability. The introduction of 3D surface imaging has revolutionized the field due to its ease of use, speed, accuracy, and reliability. However, its widespread use has been limited by its high cost and a lack of high-level evidence. Recent developments have unveiled the first web-based 3D surface imaging program, 4D imaging, and 3D printing. Conclusions Despite its importance, an accurate, reliable, and simple breast volumetric analysis tool was elusive until the introduction of 3D surface imaging technology. However, its high cost has limited its wide usage. Novel adjunct technologies, such as web-based 3D surface imaging programs, 4D imaging, and 3D printing, appear promising. PMID:27047788

  12. Non-destructive techniques based on eddy current testing.

    PubMed

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future.

  13. Copyright protection for multimedia data based on asymmetric cryptographic techniques

    NASA Astrophysics Data System (ADS)

    Herrigel, Alexander

    1998-09-01

    This paper presents a new approach for the copyright protection of digital multimedia data. The system applies cryptographic protocols and a public key technique for different purposes, namely encoding/decoding a digital watermark generated by any spread spectrum technique and the secure transfer of watermarked data from the sender to the receiver in a commercial business process. The public key technique is applied for the construction of a one-way watermark embedding and verification function to identify and prove the uniqueness of the watermark. In addition, our approach provides secure authentication of the owner who initiated the watermarking process for a specific data set. Legal dispute resolution is supported for multiple watermarking of digital data without revealing the confidential keying information.

  15. Regional flood frequency analysis using spatial proximity and basin characteristics: Quantile regression vs. parameter regression technique

    NASA Astrophysics Data System (ADS)

    Ahn, Kuk-Hyun; Palmer, Richard

    2016-09-01

    Despite the wide use of regression-based regional flood frequency analysis (RFFA) methods, the majority are based on either ordinary least squares (OLS) or generalized least squares (GLS). This paper proposes 'spatial proximity' based RFFA methods using the spatial lagged model (SLM) and spatial error model (SEM). The proposed methods are represented by two frameworks: the quantile regression technique (QRT) and the parameter regression technique (PRT). The QRT develops prediction equations for flood quantiles at average recurrence intervals (ARIs) of 2, 5, 10, 20, and 100 years, whereas the PRT predicts the three parameters of the selected distribution. The proposed methods are tested using data incorporating 30 basin characteristics from 237 basins in the Northeastern United States. Results show that the generalized extreme value (GEV) distribution properly represents flood frequencies at the study gages. Basin area, stream network, and precipitation seasonality are found to be the most effective explanatory variables in prediction modeling by the QRT and PRT. The 'spatial proximity' based RFFA methods provide reliable flood quantile estimates compared to simpler methods. Compared to the QRT, the PRT may be recommended due to its accuracy and computational simplicity. The results presented in this paper may serve as one possible guidepost for hydrologists interested in flood analysis at ungaged sites.
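
    In its simplest (OLS) form, the quantile regression technique amounts to a log-linear regression of a flood quantile on basin descriptors; the sketch below is illustrative only (the paper's spatial lagged and spatial error models would replace this plain least-squares fit, and the chosen descriptors are assumptions):

        # Sketch: log-linear QRT-style regression of the 100-year flood on basin area
        # and mean precipitation.
        import numpy as np

        def fit_qrt(area_km2, precip_mm, q100_cms):
            X = np.column_stack([np.ones(len(area_km2)),
                                 np.log(area_km2), np.log(precip_mm)])
            beta, *_ = np.linalg.lstsq(X, np.log(q100_cms), rcond=None)
            return beta        # log Q100 ~ b0 + b1*log(area) + b2*log(precip)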

  16. How complex climate networks complement eigen techniques for the statistical analysis of climatological data

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Petrova, Irina; Löw, Alexander; Marwan, Norbert; Kurths, Jürgen

    2015-04-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP) / maximum covariance analysis have been frequently used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network (CN) analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships as well as conceptual differences between both eigen and network approaches are derived and illustrated using global precipitation, evaporation and surface air temperature data sets. These results allow us to pinpoint that CN analysis can complement classical eigen techniques and provides additional information on the higher-order structure of statistical interrelationships in climatological data. Hence, CNs are a valuable supplement to the statistical toolbox of the climatologist, particularly for making sense out of very large data sets such as those generated by satellite observations and climate model intercomparison exercises.
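
    The shared starting point of the two approaches can be made concrete with a small sketch (illustrative only; the correlation threshold and number of EOFs are assumptions): both the EOF patterns and a basic network measure are derived from one and the same correlation matrix.

        # Sketch: EOFs and climate-network node degree from a single anomaly field.
        import numpy as np

        def eof_and_network(field, threshold=0.5, n_eofs=3):
            # field: array of shape (time, gridpoints).
            anomalies = field - field.mean(axis=0)
            corr = np.corrcoef(anomalies, rowvar=False)      # gridpoint correlation matrix
            eigvals, eigvecs = np.linalg.eigh(corr)
            eofs = eigvecs[:, ::-1][:, :n_eofs]              # leading EOF patterns
            adjacency = (np.abs(corr) > threshold) & ~np.eye(corr.shape[0], dtype=bool)
            degree = adjacency.sum(axis=1)                   # network connectivity per gridpoint
            return eofs, degree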

  18. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    SciTech Connect

    Hertz, P.R.

    1992-01-01

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. These studies show that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via the incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for the characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.
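
    For a single-exponential emitter the phase-resolved intensity follows the standard frequency-domain relation I_pr = I_ss * m * cos(phi_D - phi), with phase angle phi = arctan(w*tau) and demodulation m = (1 + (w*tau)^2)^(-1/2); a small sketch (generic frequency-domain fluorometry, not the instrumentation of this work) is:

        # Sketch: phase-resolved intensity of a single-lifetime emitter.
        import numpy as np

        def phase_resolved_intensity(i_ss, tau_ns, mod_freq_mhz, detector_phase_deg):
            w = 2.0 * np.pi * mod_freq_mhz * 1e6      # angular modulation frequency (rad/s)
            tau = tau_ns * 1e-9
            phi = np.arctan(w * tau)                  # emission phase shift
            m = 1.0 / np.sqrt(1.0 + (w * tau) ** 2)   # demodulation factor
            return i_ss * m * np.cos(np.deg2rad(detector_phase_deg) - phi)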

  19. Behavior Change Techniques in Popular Alcohol Reduction Apps: Content Analysis

    PubMed Central

    Garnett, Claire; Brown, James; West, Robert; Michie, Susan

    2015-01-01

    Background Mobile phone apps have the potential to reduce excessive alcohol consumption cost-effectively. Although hundreds of alcohol-related apps are available, there is little information about the behavior change techniques (BCTs) they contain, or the extent to which they are based on evidence or theory and how this relates to their popularity and user ratings. Objective Our aim was to assess the proportion of popular alcohol-related apps available in the United Kingdom that focus on alcohol reduction, identify the BCTs they contain, and explore whether BCTs or the mention of theory or evidence is associated with app popularity and user ratings. Methods We searched the iTunes and Google Play stores with the terms “alcohol” and “drink”, and the first 800 results were classified into alcohol reduction, entertainment, or blood alcohol content measurement. Of those classified as alcohol reduction, all free apps and the top 10 paid apps were coded for BCTs and for reference to evidence or theory. Measures of popularity and user ratings were extracted. Results Of the 800 apps identified, 662 were unique. Of these, 13.7% (91/662) were classified as alcohol reduction (95% CI 11.3-16.6), 53.9% (357/662) entertainment (95% CI 50.1-57.7), 18.9% (125/662) blood alcohol content measurement (95% CI 16.1-22.0) and 13.4% (89/662) other (95% CI 11.1-16.3). The 51 free alcohol reduction apps and the top 10 paid apps contained a mean of 3.6 BCTs (SD 3.4), with approximately 12% (7/61) not including any BCTs. The BCTs used most often were “facilitate self-recording” (54%, 33/61), “provide information on consequences of excessive alcohol use and drinking cessation” (43%, 26/61), “provide feedback on performance” (41%, 25/61), “give options for additional and later support” (25%, 15/61) and “offer/direct towards appropriate written materials” (23%, 14/61). These apps also rarely included any of the 22 BCTs frequently used in other health behavior change

  20. Error Analysis of non-TLD HDR Brachytherapy Dosimetric Techniques

    NASA Astrophysics Data System (ADS)

    Amoush, Ahmad

    The American Association of Physicists in Medicine Task Group Report43 (AAPM-TG43) and its updated version TG-43U1 rely on the LiF TLD detector to determine the experimental absolute dose rate for brachytherapy. The recommended uncertainty estimates associated with TLD experimental dosimetry include 5% for statistical errors (Type A) and 7% for systematic errors (Type B). TG-43U1 protocol does not include recommendation for other experimental dosimetric techniques to calculate the absolute dose for brachytherapy. This research used two independent experimental methods and Monte Carlo simulations to investigate and analyze uncertainties and errors associated with absolute dosimetry of HDR brachytherapy for a Tandem applicator. An A16 MicroChamber* and one dose MOSFET detectors† were selected to meet the TG-43U1 recommendations for experimental dosimetry. Statistical and systematic uncertainty analyses associated with each experimental technique were analyzed quantitatively using MCNPX 2.6‡ to evaluate source positional error, Tandem positional error, the source spectrum, phantom size effect, reproducibility, temperature and pressure effects, volume averaging, stem and wall effects, and Tandem effect. Absolute dose calculations for clinical use are based on Treatment Planning System (TPS) with no corrections for the above uncertainties. Absolute dose and uncertainties along the transverse plane were predicted for the A16 microchamber. The generated overall uncertainties are 22%, 17%, 15%, 15%, 16%, 17%, and 19% at 1cm, 2cm, 3cm, 4cm, and 5cm, respectively. Predicting the dose beyond 5cm is complicated due to low signal-to-noise ratio, cable effect, and stem effect for the A16 microchamber. Since dose beyond 5cm adds no clinical information, it has been ignored in this study. The absolute dose was predicted for the MOSFET detector from 1cm to 7cm along the transverse plane. The generated overall uncertainties are 23%, 11%, 8%, 7%, 7%, 9%, and 8% at 1cm, 2cm, 3cm