Sample records for pattern analysis techniques

  1. A technique for conducting point pattern analysis of cluster plot stem-maps

    Treesearch

    C.W. Woodall; J.M. Graham

    2004-01-01

    Point pattern analysis of forest inventory stem-maps may aid interpretation and inventory estimation of forest attributes. To evaluate the techniques and benefits of conducting point pattern analysis of forest inventory stem-maps, Ripley's K(t) was calculated for simulated tree spatial distributions and for over 600 USDA Forest Service Forest...

  2. Physical vs. photolithographic patterning of plasma polymers: an investigation by ToF-SSIMS and multivariate analysis

    PubMed Central

    Mishra, Gautam; Easton, Christopher D.; McArthur, Sally L.

    2009-01-01

    Physical and photolithographic techniques are commonly used to create chemical patterns for a range of technologies including cell culture studies, bioarrays and other biomedical applications. In this paper, we describe the fabrication of chemical micropatterns from commonly used plasma polymers. Atomic force microscopy (AFM) imaging, Time-of-Flight Static Secondary Ion Mass Spectrometry (ToF-SSIMS) imaging and multivariate analysis have been employed to visualize the chemical boundaries created by these patterning techniques and assess the spatial and chemical resolution of the patterns. ToF-SSIMS analysis demonstrated that well defined chemical and spatial boundaries were obtained from photolithographic patterning, while the resolution of physical patterning via a transmission electron microscopy (TEM) grid varied depending on the properties of the plasma system, including the substrate material. In general, physical masking allowed diffusion of the plasma species below the mask and bleeding of the surface chemistries. Multivariate analysis techniques including Principal Component Analysis (PCA) and Region of Interest (ROI) assessment were used to investigate the ToF-SSIMS images of a range of different plasma polymer patterns. In the most challenging case, where two strongly reacting polymers, allylamine and acrylic acid, were deposited, PCA confirmed the fabrication of micropatterns with defined spatial resolution. ROI analysis allowed for the identification of an interface between the two plasma polymers for patterns fabricated using the photolithographic technique, which had previously been overlooked. This study clearly demonstrated the versatility of photolithographic patterning for the production of multichemistry plasma polymer arrays and highlighted the need for complementary characterization and analytical techniques during the fabrication of plasma polymer micropatterns. PMID:19950941

  3. Fringe pattern demodulation with a two-frame digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-frame digital phase-locked loop for fringe pattern demodulation is presented. In this scheme, two fringe patterns with different spatial carrier frequencies are grabbed for an object. A digital phase-locked loop algorithm tracks and demodulates the phase difference between both fringe patterns by employing the wrapped phase components of one of the fringe patterns as a reference to demodulate the second fringe pattern. The desired phase information can be extracted from the demodulated phase difference. We tested the algorithm experimentally using real fringe patterns. The technique is shown to be suitable for noncontact measurement of objects with rapid surface variations, and it outperforms the Fourier fringe analysis technique in this aspect. Phase maps produced with this algorithm are noisy in comparison with phase maps generated with the Fourier fringe analysis technique.

  4. Spectroscopic vector analysis for fast pattern quality monitoring

    NASA Astrophysics Data System (ADS)

    Sohn, Younghoon; Ryu, Sungyoon; Lee, Chihoon; Yang, Yusin

    2018-03-01

    In the semiconductor industry, fast and effective measurement of pattern variation has been a key challenge for assuring mass-production quality. Pattern measurement techniques such as conventional CD-SEMs or Optical CDs have been extensively used, but these techniques are increasingly limited in terms of measurement throughput and time spent in modeling. In this paper we propose a time-effective pattern monitoring method based on a direct spectrum-based approach. In this technique, a wavelength band sensitive to a specific pattern change is selected from the spectroscopic ellipsometry signal scattered by the pattern to be measured, and the amplitude and phase variation in that wavelength band are analyzed as a measurement index of the pattern change. This pattern change measurement technique was applied to several process steps and its applicability verified. Owing to its fast and simple analysis, the method can be adapted to massive process variation monitoring, maximizing measurement throughput.

  5. Association mining of dependency between time series

    NASA Astrophysics Data System (ADS)

    Hafez, Alaaeldin

    2001-03-01

    Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and innovate data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series where two time series sequences could be completely different (in values, shapes, etc.), but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique, which can be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); and (c) use trend pattern vectors to predict future time series sequences.
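
The trend-sequence idea described in this abstract can be sketched in a few lines. The symbol alphabet ('U'/'D'/'S' for up/down/steady), the tolerance `eps`, and the fixed pattern length are illustrative assumptions, not the paper's notation:

```python
from collections import Counter

def trend_sequence(series, eps=0.0):
    """Convert a numeric time series into a trend symbol sequence:
    'U' (up), 'D' (down), 'S' (steady within eps)."""
    out = []
    for prev, cur in zip(series, series[1:]):
        d = cur - prev
        out.append('U' if d > eps else 'D' if d < -eps else 'S')
    return ''.join(out)

def frequent_trend_patterns(trends, length, min_support):
    """Count every trend subsequence of the given length and keep
    those meeting the minimum support (occurrence count)."""
    counts = Counter(trends[i:i + length]
                     for i in range(len(trends) - length + 1))
    return {p: c for p, c in counts.items() if c >= min_support}

# A short sawtooth series: rises twice, dips, repeats
series = [1, 2, 3, 2, 3, 4, 3, 4, 5, 4]
t = trend_sequence(series)                       # 'UUDUUDUUD'
patterns = frequent_trend_patterns(t, 3, 2)      # e.g. {'UUD': 3, ...}
```

A full implementation would mine maximal patterns of varying length (Apriori-style) and keep per-pattern vectors across multiple dependent sequences; this sketch shows only the trend-encoding and support-counting steps.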

  6. Point pattern analysis of FIA data

    Treesearch

    Chris Woodall

    2002-01-01

    Point pattern analysis is a branch of spatial statistics that quantifies the spatial distribution of points in two-dimensional space. Point pattern analysis was conducted on stand stem-maps from FIA fixed-radius plots to explore point pattern analysis techniques and to determine the ability of pattern descriptions to describe stand attributes. Results indicate that the...

  7. Application of pattern recognition techniques to crime analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  8. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.

  9. Analysis of enamel rod end patterns on tooth surface for personal identification--ameloglyphics.

    PubMed

    Manjunath, Krishnappa; Sivapathasundharam, Balasundharam; Saraswathi, Thillai R

    2012-05-01

    Ameloglyphics is the study of enamel rod end patterns on the tooth surface. Our aim was to perform an in vivo analysis of enamel rod end patterns on tooth surfaces for personal identification. In this study, the maxillary left canine and first premolar of 30 men and 30 women were included. The cellulose acetate peel technique was used to record enamel rod endings on the tooth surfaces. Photomicrographs of the acetate peel imprints were processed with VeriFinger Standard SDK v5.0 software to obtain enamel rod end patterns. All 120 enamel rod end patterns were subjected to visual and biometric analysis. Biometric analysis revealed that the enamel rod end pattern is unique to each tooth in an individual, showing both intra- and interindividual variation. Enamel rod end patterns were also distinct between the male and female subjects. Visual analysis showed that the wavy-branched subpattern was the predominant subpattern among the examined teeth. Hence, ameloglyphics is a reliable technique for personal identification. © 2012 American Academy of Forensic Sciences.

  10. Data Flow Analysis and Visualization for Spatiotemporal Statistical Data without Trajectory Information.

    PubMed

    Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S

    2018-03-01

    Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset, apply our technique to the data, and compare the derived trajectories with the originals. Finally, we present spatiotemporal trend analysis for statistical datasets including Twitter data, maritime search and rescue events, and syndromic surveillance.

  11. Fringe pattern demodulation with a two-dimensional digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-dimensional digital phase-locked loop (DPLL) for fringe pattern demodulation is presented. This algorithm is more suitable for demodulation of fringe patterns with varying phase in two directions than the existing DPLL techniques that assume that the phase of the fringe patterns varies only in one direction. The two-dimensional DPLL technique assumes that the phase of a fringe pattern is continuous in both directions and takes advantage of the phase continuity; consequently, the algorithm has better noise performance than the existing DPLL schemes. The two-dimensional DPLL algorithm is also suitable for demodulation of fringe patterns with low sampling rates, and it outperforms the Fourier fringe analysis technique in this aspect.

  12. Computer aided fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.

    The paper reviews the basic laws of fringe pattern interpretation. The different techniques that are currently utilized are presented using a common frame of reference stressing the fact that these techniques are different variations of the same basic principle. Digital and analog techniques are discussed. Currently available hardware is presented and the relationships between hardware and the operations of pattern fringe processing are pointed out. Examples are given to illustrate the ideas discussed in the paper.

  13. Development of neural network techniques for finger-vein pattern classification

    NASA Astrophysics Data System (ADS)

    Wu, Jian-Da; Liu, Chiung-Tsiung; Tsai, Yi-Jang; Liu, Jun-Ching; Chang, Ya-Wen

    2010-02-01

    A personal identification system using finger-vein patterns and neural network techniques is proposed in the present study. In the proposed system, the finger-vein patterns are captured by a device that can transmit near infrared through the finger and record the patterns for signal analysis and classification. The biometric system for verification consists of a combination of feature extraction using principal component analysis and pattern classification using both back-propagation network and adaptive neuro-fuzzy inference systems. Finger-vein features are first extracted by principal component analysis method to reduce the computational burden and removes noise residing in the discarded dimensions. The features are then used in pattern classification and identification. To verify the effect of the proposed adaptive neuro-fuzzy inference system in the pattern classification, the back-propagation network is compared with the proposed system. The experimental results indicated the proposed system using adaptive neuro-fuzzy inference system demonstrated a better performance than the back-propagation network for personal identification using the finger-vein patterns.
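
The PCA feature-extraction step described here can be sketched with a standard SVD-based projection. The sample count, feature dimensionality, and random data below are placeholders, not details from the paper:

```python
import numpy as np

def pca_features(X, n_components):
    """Project samples (rows of X) onto the top principal components.
    Returns the reduced features and the component matrix."""
    Xc = X - X.mean(axis=0)                     # center each feature
    # SVD of the centered data; rows of Vt are the principal directions
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]
    return Xc @ components.T, components

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))                   # 60 samples, 20 raw features
feats, comps = pca_features(X, n_components=5)  # reduced to 5 features
```

The reduced `feats` matrix would then feed a classifier (back-propagation network or neuro-fuzzy system in the paper's setup); the projection both cuts the computational burden and discards the low-variance dimensions where noise concentrates.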

  14. Automatic Generation of English-Japanese Translation Pattern Utilizing Genetic Programming Technique

    NASA Astrophysics Data System (ADS)

    Matsumura, Koki; Tamekuni, Yuji; Kimura, Shuhei

    There are many constructional differences between English and Japanese phrase templates, which often make translation difficult. Moreover, the phrase templates and sentences to be referred to are vast and varied, and it is not easy to prepare a corpus that covers them all. It is therefore very significant, from the viewpoint of both the translation success rate and the capacity of the pattern dictionary, to generate translation patterns for sentence patterns automatically. For the purpose of realizing automatic generation of translation patterns, this paper proposes a new method for generating translation patterns using the genetic programming (GP) technique. The technique tries to automatically generate translation patterns for various sentences not registered in the phrase template dictionary by applying genetic operations to the parsing trees of basic patterns. Each tree consists of a pair of English-Japanese sentences generated as the first-stage population. Analysis tree databases with 50, 100, 150, and 200 pairs were prepared as the first-stage population, and the system was applied to an English input of 1,555 sentences. As a result, the number of analysis trees increased from 200 to 517, and the accuracy rate of the translation patterns improved from 42.57% to 70.10%. Moreover, 86.71% of the generated translations were successful, with meanings that were acceptable and understandable. The proposed technique thus appears to offer a way to raise the translation success rate while reducing the size of the analysis tree database.

  15. Data Mining Techniques Applied to Hydrogen Lactose Breath Test.

    PubMed

    Rubio-Escudero, Cristina; Valverde-Fernández, Justo; Nepomuceno-Chamorro, Isabel; Pontes-Balanza, Beatriz; Hernández-Mendoza, Yoedusvany; Rodríguez-Herrera, Alfonso

    2017-01-01

    Our aim was to analyze a set of hydrogen breath test data by use of data mining tools and to identify new patterns of H2 production. We applied k-means clustering as the data mining technique to hydrogen breath test data sets from 2571 patients. Six different patterns were extracted upon analysis of the hydrogen breath test data. We have also shown the relevance of each of the samples taken throughout the test. Analysis of the hydrogen breath test data sets using data mining techniques has identified new patterns of hydrogen generation upon lactose absorption, showing the potential of applying data mining techniques to clinical data sets. These results offer promising data for future research on the relation between gut microbiota-produced hydrogen and clinical symptoms.
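
The k-means step used in this study can be sketched with a minimal Lloyd's algorithm. The synthetic "flat" and "rising" H2 response curves below are invented stand-ins for the breath-test samples, not the paper's data:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Minimal Lloyd's algorithm: returns cluster labels and centroids."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each sample to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centroids; keep the old one if a cluster empties
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Two synthetic response "patterns": flat curves vs steadily rising curves,
# each sampled at 6 time points per patient
flat = np.random.default_rng(1).normal(10, 1, size=(30, 6))
rising = np.random.default_rng(2).normal(10, 1, size=(30, 6)) + np.arange(6) * 8
X = np.vstack([flat, rising])
labels, centroids = kmeans(X, k=2)
```

With real data, k would be chosen by an elbow or silhouette criterion; the study's six extracted patterns correspond to running this with k=6 on the per-sample H2 curves.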

  16. Development of Pattern Recognition Techniques for the Evaluation of Toxicant Impacts to Multispecies Systems

    DTIC Science & Technology

    1993-06-18

    ...rule rather than the exception. In the Standardized Aquatic Microcosm and the Mixed Flask Culture (MFC) microcosms, multivariate analysis and clustering methods...experiments using two microcosm protocols. We use nonmetric clustering, a multivariate pattern recognition technique developed by Matthews and Hearne (1991...

  17. Using pattern recognition as a method for predicting extreme events in natural and socio-economic systems

    NASA Astrophysics Data System (ADS)

    Intriligator, M.

    2011-12-01

    Vladimir (Volodya) Keilis-Borok has pioneered the use of pattern recognition as a technique for analyzing and forecasting developments in natural as well as socio-economic systems. His work as a leading geophysicist on predicting earthquakes and landslides using this technique has been recognized around the world. Keilis-Borok has also been a world leader in the application of pattern recognition techniques to the analysis and prediction of socio-economic systems. He worked with Allan Lichtman of American University in using such techniques to predict presidential elections in the U.S. Keilis-Borok and I have worked together with others on the use of pattern recognition techniques to analyze and to predict socio-economic systems. We have used this technique to study the pattern of macroeconomic indicators that would predict the end of an economic recession in the U.S. We have also worked with officers in the Los Angeles Police Department to use this technique to predict surges of homicides in Los Angeles.

  18. Antenna analysis using neural networks

    NASA Technical Reports Server (NTRS)

    Smith, William T.

    1992-01-01

    Conventional computing schemes have long been used to analyze problems in electromagnetics (EM). The vast majority of EM applications require computationally intensive algorithms involving numerical integration and solutions to large systems of equations. The feasibility of using neural network computing algorithms for antenna analysis is investigated. The ultimate goal is to use a trained neural network algorithm to reduce the computational demands of existing reflector surface error compensation techniques. Neural networks are computational algorithms based on neurobiological systems. Neural nets consist of massively parallel interconnected nonlinear computational elements. They are often employed in pattern recognition and image processing problems. Recently, neural network analysis has been applied in the electromagnetics area for the design of frequency selective surfaces and beam forming networks. The backpropagation training algorithm was employed to simulate classical antenna array synthesis techniques. The Woodward-Lawson (W-L) and Dolph-Chebyshev (D-C) array pattern synthesis techniques were used to train the neural network. The inputs to the network were samples of the desired synthesis pattern. The outputs are the array element excitations required to synthesize the desired pattern. Once trained, the network is used to simulate the W-L or D-C techniques. Various sector patterns and cosecant-type patterns (27 total) generated using W-L synthesis were used to train the network. Desired pattern samples were then fed to the neural network. The outputs of the network were the simulated W-L excitations. A 20 element linear array was used. There were 41 input pattern samples with 40 output excitations (20 real parts, 20 imaginary). A comparison between the simulated and actual W-L techniques is shown for a triangular-shaped pattern. 
Dolph-Chebyshev is a different class of synthesis technique in that D-C is used for side lobe control as opposed to pattern shaping. The interesting thing about D-C synthesis is that the side lobes have the same amplitude. Five-element arrays were used. Again, 41 pattern samples were used for the input. Nine actual D-C patterns ranging from -10 dB to -30 dB side lobe levels were used to train the network. A comparison between simulated and actual D-C techniques for a pattern with -22 dB side lobe level is shown. The goal for this research was to evaluate the performance of neural network computing with antennas. Future applications will employ the backpropagation training algorithm to drastically reduce the computational complexity involved in performing EM compensation for surface errors in large space reflector antennas.

  19. Antenna analysis using neural networks

    NASA Astrophysics Data System (ADS)

    Smith, William T.

    1992-09-01

    Conventional computing schemes have long been used to analyze problems in electromagnetics (EM). The vast majority of EM applications require computationally intensive algorithms involving numerical integration and solutions to large systems of equations. The feasibility of using neural network computing algorithms for antenna analysis is investigated. The ultimate goal is to use a trained neural network algorithm to reduce the computational demands of existing reflector surface error compensation techniques. Neural networks are computational algorithms based on neurobiological systems. Neural nets consist of massively parallel interconnected nonlinear computational elements. They are often employed in pattern recognition and image processing problems. Recently, neural network analysis has been applied in the electromagnetics area for the design of frequency selective surfaces and beam forming networks. The backpropagation training algorithm was employed to simulate classical antenna array synthesis techniques. The Woodward-Lawson (W-L) and Dolph-Chebyshev (D-C) array pattern synthesis techniques were used to train the neural network. The inputs to the network were samples of the desired synthesis pattern. The outputs are the array element excitations required to synthesize the desired pattern. Once trained, the network is used to simulate the W-L or D-C techniques. Various sector patterns and cosecant-type patterns (27 total) generated using W-L synthesis were used to train the network. Desired pattern samples were then fed to the neural network. The outputs of the network were the simulated W-L excitations. A 20 element linear array was used. There were 41 input pattern samples with 40 output excitations (20 real parts, 20 imaginary).

  20. Proceedings of the Second Annual Symposium on Mathematical Pattern Recognition and Image Analysis Program

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1984-01-01

    Several papers addressing image analysis and pattern recognition techniques for satellite imagery are presented. Texture classification, image rectification and registration, spatial parameter estimation, and surface fitting are discussed.

  1. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  2. The Use of Satellite Observed Cloud Patterns in Northern Hemisphere 300 mb and 1000/300 mb Numerical Analysis.

    DTIC Science & Technology

    1984-02-01

    A quasi-objective statistical method for deriving 300 mb geopotential heights and 1000/300 mb thicknesses in the vicinity of extratropical cyclones with the aid of satellite imagery is presented. The technique utilizes satellite observed extratropical spiral cloud pattern parameters in conjunction...

  3. Synchronous Stroboscopic Electronic Speckle Pattern Interferometry

    NASA Astrophysics Data System (ADS)

    Soares, Oliverio D. D.

    1986-10-01

    Electronic Speckle Pattern Interferometry (E.S.P.I.), often called Electronic Holography, is a practical, powerful technique in non-destructive testing. The practical capabilities of the technique have been improved by fringe improvement and by control of the analysis in the time domain, in particular scanning of the vibration cycle, with the introduction of synchronized amplitude- and phase-modulated pulse illumination, microcomputer control, fibre-optic design, and moire evaluation techniques.

  4. Spatial patterns in vegetation fires in the Indian region.

    PubMed

    Vadrevu, Krishna Prasad; Badarinath, K V S; Anuradha, Eaturu

    2008-12-01

    In this study, we used fire count datasets derived from the Along Track Scanning Radiometer (ATSR) satellite to characterize spatial patterns in fire occurrences across highly diverse geographical, vegetation and topographic gradients in the Indian region. For characterizing the spatial patterns of fire occurrences, observed fire point patterns were tested against the hypothesis of a complete spatial random (CSR) pattern using three different techniques: quadrat analysis, nearest neighbor analysis and Ripley's K function. A hierarchical nearest neighbor technique was used to depict the 'hotspots' of fire incidents. Of the different states, the highest fire counts were recorded in Madhya Pradesh (14.77%), followed by Gujarat (10.86%), Maharashtra (9.92%), Mizoram (7.66%), Jharkhand (6.41%), etc. With respect to the vegetation categories, the highest number of fires was recorded in agricultural regions (40.26%), followed by tropical moist deciduous vegetation (12.72%), dry deciduous vegetation (11.40%), abandoned slash-and-burn secondary forests (9.04%), and tropical montane forests (8.07%), followed by others. Analysis of fire counts based on elevation and slope range suggested that the maximum number of fires occurred in low and medium elevation types and in very low to low slope categories. Results from the three spatial techniques suggested a clustered pattern in fire events compared to CSR. Most importantly, results from Ripley's K statistic suggested that fire events are highly clustered at a lag distance of 125 miles. The hierarchical nearest neighbor clustering technique identified significant clusters of fire 'hotspots' in different states in northeast and central India. The implications of these results for fire management and mitigation are discussed. Also, this study highlights the potential of spatial point pattern statistics in environmental monitoring and assessment studies with special reference to fire events in the Indian region.
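
The Ripley's K test against CSR used here (and in records 1 and 6 above) can be sketched as follows. This naive estimator ignores edge correction, and the uniform points stand in for fire locations; both are simplifying assumptions:

```python
import numpy as np

def ripleys_k(points, t, area):
    """Naive Ripley's K estimate at distance t (no edge correction):
    K(t) = area * (ordered pairs within t) / (n * (n - 1))."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    within = (d <= t).sum() - n        # drop the zero self-distances
    return area * within / (n * (n - 1))

# Under complete spatial randomness (CSR), K(t) is approximately pi * t^2;
# K above that line indicates clustering, below it regularity.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(1000, 2))   # CSR points in the unit square
k = ripleys_k(pts, t=0.05, area=1.0)
expected = np.pi * 0.05 ** 2
```

In practice K(t) is evaluated over a range of t with an edge correction (e.g. Ripley's isotropic correction) and compared against simulation envelopes of CSR realizations rather than the single theoretical value.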

  5. Changes in frontal plane dynamics and the loading response phase of the gait cycle are characteristic of severe knee osteoarthritis application of a multidimensional analysis technique.

    PubMed

    Astephen, J L; Deluzio, K J

    2005-02-01

    Osteoarthritis of the knee is related to many correlated mechanical factors that can be measured with gait analysis. Gait analysis results in large data sets. The analysis of these data is difficult due to the correlated, multidimensional nature of the measures. A multidimensional model that uses two multivariate statistical techniques, principal component analysis and discriminant analysis, was used to discriminate between the gait patterns of the normal subject group and the osteoarthritis subject group. Nine time varying gait measures and eight discrete measures were included in the analysis. All interrelationships between and within the measures were retained in the analysis. The multidimensional analysis technique successfully separated the gait patterns of normal and knee osteoarthritis subjects with a misclassification error rate of <6%. The most discriminatory feature described a static and dynamic alignment factor. The second most discriminatory feature described a gait pattern change during the loading response phase of the gait cycle. The interrelationships between gait measures and between the time instants of the gait cycle can provide insight into the mechanical mechanisms of pathologies such as knee osteoarthritis. These results suggest that changes in frontal plane loading and alignment and the loading response phase of the gait cycle are characteristic of severe knee osteoarthritis gait patterns. Subsequent investigations earlier in the disease process may suggest the importance of these factors to the progression of knee osteoarthritis.

  6. Predicting Effective Course Conduction Strategy Using Datamining Techniques

    ERIC Educational Resources Information Center

    Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.

    2017-01-01

    Data analysis techniques can be used to analyze patterns of data in different fields. Based on the results of such analyses, it is recommended that suggestions be provided to decision-making authorities. Data mining techniques can be used in the educational domain to improve the outcomes of educational sectors. The authors carried out this research…

  7. An algol program for dissimilarity analysis: a divisive-omnithetic clustering technique

    USGS Publications Warehouse

    Tipper, J.C.

    1979-01-01

    Clustering techniques are properly used to generate hypotheses about patterns in data. Of the hierarchical techniques, those which are divisive and omnithetic possess many theoretically optimal properties. One such method, dissimilarity analysis, is implemented here in ALGOL 60 and determined to be computationally competitive with most other methods. © 1979.

  8. Analysis of DNA Cytosine Methylation Patterns Using Methylation-Sensitive Amplification Polymorphism (MSAP).

    PubMed

    Guevara, María Ángeles; de María, Nuria; Sáez-Laguna, Enrique; Vélez, María Dolores; Cervera, María Teresa; Cabezas, José Antonio

    2017-01-01

    Different molecular techniques have been developed to study either the global level of methylated cytosines or methylation at specific gene sequences. One of them is the methylation-sensitive amplified polymorphism technique (MSAP), which is a modification of amplified fragment length polymorphism (AFLP). It has been used to study methylation of anonymous CCGG sequences in different fungi, plants, and animal species. The main variation of this technique resides in the use of isoschizomers with different methylation sensitivity (such as HpaII and MspI) as a frequent-cutter restriction enzyme. For each sample, MSAP analysis is performed using both EcoRI/HpaII- and EcoRI/MspI-digested samples. A comparative analysis between EcoRI/HpaII and EcoRI/MspI fragment patterns allows the identification of two types of polymorphisms: (1) methylation-insensitive polymorphisms that show common EcoRI/HpaII and EcoRI/MspI patterns but are detected as polymorphic amplified fragments among samples and (2) methylation-sensitive polymorphisms which are associated with the amplified fragments that differ in their presence or absence or in their intensity between EcoRI/HpaII and EcoRI/MspI patterns. This chapter describes a detailed protocol of this technique and discusses the modifications that can be applied to adjust the technology to different species of interest.
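    The lane comparison at the heart of MSAP can be illustrated with a toy presence/absence scoring. The band vectors and the helper function below are hypothetical, not part of the published protocol:

```python
# Hypothetical band scoring (1 = fragment present, 0 = absent) for one sample,
# read from the EcoRI/HpaII and EcoRI/MspI lanes of an MSAP gel.
def classify_fragments(hpaii, mspi):
    """Call each scored fragment by comparing the two isoschizomer lanes."""
    calls = []
    for h, m in zip(hpaii, mspi):
        if h == m:
            calls.append("methylation-insensitive")   # identical in both lanes
        else:
            calls.append("methylation-sensitive")     # lanes differ
    return calls

calls = classify_fragments([1, 1, 0, 1], [1, 0, 0, 0])
print(calls)
```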

  9. Cluster analysis and subgrouping to investigate inter-individual variability to non-invasive brain stimulation: a systematic review.

    PubMed

    Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour

    2018-01-12

    Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research in the attempt to investigate the issue of inter-individual variability: the issue of why some individuals respond, as traditionally expected, to non-invasive brain stimulation protocols and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responders or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques, while seven of these additionally utilised cluster analysis techniques. The results of this systematic review appear to indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and the criteria used to determine whether an individual is categorised as a responder or a non-responder. This systematic review provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques to provide a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.

  10. Two dimensional Fourier transform methods for fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Bhat, G.

    An overview of the use of FFTs for fringe pattern analysis is presented, with emphasis on fringe patterns containing displacement information. The techniques are illustrated via analysis of the displacement and strain distributions in the direction perpendicular to the loading, in a disk under diametral compression. The experimental strain distribution is compared to the theoretical, and the agreement is found to be excellent in regions where the elasticity solution models the actual problem well.
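    The Fourier-transform demodulation referred to above can be sketched for a synthetic carrier-fringe pattern. The carrier frequency and the phase (displacement) term below are illustrative choices:

```python
import numpy as np

N = 128
X, Y = np.meshgrid(np.arange(N), np.arange(N))
f0 = 16 / N                                   # carrier frequency (cycles/pixel)
phi = 2.0 * np.sin(2 * np.pi * X / N)         # "displacement" phase to recover
I = 0.5 + 0.5 * np.cos(2 * np.pi * f0 * X + phi)

# 2-D FFT, isolate the +f0 sideband, inverse transform, remove the carrier
F = np.fft.fft2(I)
fx = np.fft.fftfreq(N)
band = np.abs(fx - f0) < f0 / 2               # bandpass around the carrier
sig = np.fft.ifft2(F * band[None, :])
phase = np.angle(sig * np.exp(-2j * np.pi * f0 * X))

err = np.abs(phase - phi).max()
print(f"max phase recovery error: {err:.3f} rad")
```

    Because the sideband carries the phase term intact, the recovered map matches the imposed phase almost exactly for this clean synthetic pattern.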

  11. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum-based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex Speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved the dependence on cultivation time. Both whole-spectrum-based normalization techniques together with full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding any problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms.
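    The normalization-then-PCA step can be sketched as follows. The spectra are synthetic, and total-ion-current (whole-spectrum sum) normalization stands in for the two normalization techniques compared in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic full spectra: 2 cultivation conditions x 6 replicates x 500 m/z bins
base = rng.gamma(2.0, 1.0, 500)
shift = np.zeros(500)
shift[50:60] = 5.0                             # bins differing between conditions
group_a = base + rng.normal(0, 0.2, (6, 500))
group_b = base + shift + rng.normal(0, 0.2, (6, 500))
X = np.vstack([group_a, group_b])

# Whole-spectrum normalization: divide each spectrum by its total intensity
Xn = X / X.sum(axis=1, keepdims=True)

# PCA via SVD on the mean-centered normalized spectra
Xc = Xn - Xn.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]
print("condition A PC1 scores:", np.round(pc1[:6], 4))
print("condition B PC1 scores:", np.round(pc1[6:], 4))
```

    The two conditions separate cleanly along PC1 without any peak picking, which is the advantage of full-spectrum analysis the abstract describes.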

  12. Analysis of chemical signals in red fire ants by gas chromatography and pattern recognition techniques

    USDA-ARS?s Scientific Manuscript database

    The combination of gas chromatography and pattern recognition (GC/PR) analysis is a powerful tool for investigating complicated biological problems. Clustering, mapping, discriminant development, etc. are necessary to analyze realistically large chromatographic data sets and to seek meaningful relat...

  13. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  14. Application of optical correlation techniques to particle imaging velocimetry

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Edwards, Robert V.

    1988-01-01

    Pulsed laser sheet velocimetry yields nonintrusive measurements of velocity vectors across an extended 2-dimensional region of the flow field. The application of optical correlation techniques to the analysis of multiple-exposure laser light sheet photographs can reduce and/or simplify the data reduction time and hardware. Here, Matched Spatial Filters (MSF) are used in a pattern recognition system. MSFs are usually used to identify assembly-line parts; in this application, they are used to identify the iso-velocity vector contours in the flow. The patterns to be recognized are the recorded particle images in a pulsed laser light sheet photograph. Measurement of the direction of the particle image displacements between exposures yields the velocity vector. The particle image exposure sequence is designed such that the velocity vector direction is determined unambiguously. A global analysis technique is used, in contrast to the more common particle tracking algorithms and Young's fringe analysis technique.
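    The correlation idea underlying the matched-filter approach can be sketched digitally: cross-correlating a synthetic double-exposure pair recovers the displacement vector. The particle positions and the imposed shift below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
N, dy, dx = 64, 3, 5
frame1 = np.zeros((N, N))
ys = rng.integers(8, N - 8, 30)
xs = rng.integers(8, N - 8, 30)
frame1[ys, xs] = 1.0                           # first-exposure particle images
frame2 = np.roll(np.roll(frame1, dy, axis=0), dx, axis=1)   # uniform shift

# Correlation via FFT (a digital analogue of the optical matched spatial
# filter): the correlation peak sits at the displacement vector
corr = np.fft.ifft2(np.fft.fft2(frame2) * np.conj(np.fft.fft2(frame1))).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
print("estimated displacement:", peak)         # expected (3, 5)
```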

  15. Inferring common cognitive mechanisms from brain blood-flow lateralization data: a new methodology for fTCD analysis.

    PubMed

    Meyer, Georg F; Spray, Amy; Fairlie, Jo E; Uomini, Natalie T

    2014-01-01

    Current neuroimaging techniques with high spatial resolution constrain participant motion so that many natural tasks cannot be carried out. The aim of this paper is to show how a time-locked correlation analysis of cerebral blood flow velocity (CBFV) lateralization data, obtained with functional TransCranial Doppler (fTCD) ultrasound, can be used to infer cerebral activation patterns across tasks. In a first experiment we demonstrate that the proposed analysis method results in data that are comparable with the standard Lateralization Index (LI) for within-task comparisons of CBFV patterns, recorded during cued word generation (CWG) at two difficulty levels. In the main experiment we demonstrate that the proposed analysis method shows correlated blood-flow patterns for two different cognitive tasks that are known to draw on common brain areas, CWG, and Music Synthesis. We show that CBFV patterns for Music and CWG are correlated only for participants with prior musical training. CBFV patterns for tasks that draw on distinct brain areas, the Tower of London and CWG, are not correlated. The proposed methodology extends conventional fTCD analysis by including temporal information in the analysis of cerebral blood-flow patterns to provide a robust, non-invasive method to infer whether common brain areas are used in different cognitive tasks. It complements conventional high resolution imaging techniques.
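    The time-locked correlation idea can be sketched with synthetic lateralization time courses. The task names, sampling, and signal shapes below are illustrative assumptions, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 20, 400)                    # 20 s epoch-averaged time course
template = np.exp(-(t - 6) ** 2 / 4)           # shared activation time course
cwg = template + 0.1 * rng.normal(size=t.size)    # cued word generation
music = template + 0.1 * rng.normal(size=t.size)  # task drawing on shared areas
control = 0.1 * rng.normal(size=t.size)           # task drawing on distinct areas

r_shared = np.corrcoef(cwg, music)[0, 1]
r_distinct = np.corrcoef(cwg, control)[0, 1]
print(f"shared areas r = {r_shared:.2f}, distinct areas r = {r_distinct:.2f}")
```

    Tasks driven by a common activation time course correlate strongly, while unrelated tasks do not, which is the inference the methodology relies on.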

  16. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    PubMed

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices, like soils, sediments, sludge, and airborne particulate matter. In the first part of this review we reported an overview on these procedures and described the applications of chemometric uni- and bivariate techniques and of multivariate pattern recognition techniques based on variable reduction to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values and the calculation of element fractionation. We will describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied and the key findings obtained. The following topics will be critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA) and other less common techniques; modelling by multiple linear regression (MLR); investigation of spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; other multivariate techniques less commonly applied.

  17. Analysis of XFEL serial diffraction data from individual crystalline fibrils

    PubMed Central

    Wojtas, David H.; Ayyer, Kartik; Liang, Mengning; Mossou, Estelle; Romoli, Filippo; Seuring, Carolin; Beyerlein, Kenneth R.; Bean, Richard J.; Morgan, Andrew J.; Oberthuer, Dominik; Fleckenstein, Holger; Heymann, Michael; Gati, Cornelius; Yefanov, Oleksandr; Barthelmess, Miriam; Ornithopoulou, Eirini; Galli, Lorenzo; Xavier, P. Lourdu; Ling, Wai Li; Frank, Matthias; Yoon, Chun Hong; White, Thomas A.; Bajt, Saša; Mitraki, Anna; Boutet, Sebastien; Aquila, Andrew; Barty, Anton; Forsyth, V. Trevor; Chapman, Henry N.; Millane, Rick P.

    2017-01-01

    Serial diffraction data collected at the Linac Coherent Light Source from crystalline amyloid fibrils delivered in a liquid jet show that the fibrils are well oriented in the jet. At low fibril concentrations, diffraction patterns are recorded from single fibrils; these patterns are weak and contain only a few reflections. Methods are developed for determining the orientation of patterns in reciprocal space and merging them in three dimensions. This allows the individual structure amplitudes to be calculated, thus overcoming the limitations of orientation and cylindrical averaging in conventional fibre diffraction analysis. The advantages of this technique should allow structural studies of fibrous systems in biology that are inaccessible using existing techniques. PMID:29123682

  18. Study of sea ice in the Sea of Okhotsk and its influence on the Oyashio current

    NASA Technical Reports Server (NTRS)

    Watanabe, K.; Kuroda, R.; Hata, K.; Akagawa, M. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Two photographic techniques were applied to Skylab S190A multispectral pictures to extract oceanic patterns at the sea surface separately from cloud patterns: one is an image-masking technique, the other a stereographic analysis. Using surface data on water temperature, ocean current (by GEK), and microplankton, the extracted oceanic patterns were interpreted as areas where the concentration of phytoplankton was high.

  19. Unsupervised EEG analysis for automated epileptic seizure detection

    NASA Astrophysics Data System (ADS)

    Birjandtalab, Javad; Pouyan, Maziyar Baran; Nourani, Mehrdad

    2016-07-01

    Epilepsy is a neurological disorder which can, if not controlled, potentially cause unexpected death. Accurate automatic pattern recognition and data mining techniques are crucial to detect the onset of seizures and inform caregivers so they can help the patients. EEG signals are the preferred biosignals for diagnosis of epileptic patients. Most of the existing pattern recognition techniques used in EEG analysis leverage the notion of supervised machine learning algorithms. Since seizure data are heavily under-represented, such techniques are not always practical, particularly when labeled data are not sufficiently available or when disease progression is rapid and the corresponding EEG footprint pattern is not robust. Furthermore, EEG pattern change is highly individual dependent and requires experienced specialists to annotate the seizure and non-seizure events. In this work, we present an unsupervised technique to discriminate seizure and non-seizure events. We employ the power spectral density of EEG signals in different frequency bands as informative features to accurately cluster seizure and non-seizure events. Experimental results so far indicate more than 90% accuracy in clustering seizure and non-seizure events without any prior knowledge of the patient's history.
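    A minimal version of the described pipeline, PSD band-power features followed by unsupervised two-cluster grouping, can be sketched on synthetic epochs. The rhythmic 5 Hz "seizure" component and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 256
t = np.arange(0, 2.0, 1 / fs)                  # 2 s epochs

def band_power(sig, lo, hi):
    """Power in [lo, hi) Hz from a simple periodogram PSD estimate."""
    f = np.fft.rfftfreq(sig.size, 1 / fs)
    p = np.abs(np.fft.rfft(sig)) ** 2
    return p[(f >= lo) & (f < hi)].sum()

labels_true = np.array([0] * 10 + [1] * 10)    # 0 = non-seizure, 1 = seizure
epochs = [rng.normal(0, 1.0, t.size) + (6.0 * np.sin(2 * np.pi * 5 * t) if s else 0)
          for s in labels_true]
feats = np.array([[band_power(e, 4, 8), band_power(e, 8, 13)] for e in epochs])

# Minimal 2-means clustering on the band-power features (fully unsupervised)
c = feats[[0, -1]].astype(float)               # one initial centre from each end
for _ in range(20):
    assign = ((feats[:, None, :] - c[None]) ** 2).sum(-1).argmin(1)
    c = np.array([feats[assign == k].mean(0) for k in (0, 1)])

acc = max((assign == labels_true).mean(), (assign != labels_true).mean())
print(f"clustering accuracy: {acc:.0%}")
```

    The true labels are used only to score the clustering afterwards; the grouping itself never sees them.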

  20. Techniques for generation of control and guidance signals derived from optical fields, part 2

    NASA Technical Reports Server (NTRS)

    Hemami, H.; Mcghee, R. B.; Gardner, S. R.

    1971-01-01

    The development is reported of a high resolution technique for the detection and identification of landmarks from spacecraft optical fields. By making use of nonlinear regression analysis, a method is presented whereby a sequence of synthetic images produced by a digital computer can be automatically adjusted to provide a least squares approximation to a real image. The convergence of the method is demonstrated by means of a computer simulation for both elliptical and rectangular patterns. Statistical simulation studies with elliptical and rectangular patterns show that the computational techniques developed are able to at least match human pattern recognition capabilities, even in the presence of large amounts of noise. Unlike most pattern recognition techniques, this ability is unaffected by arbitrary pattern rotation, translation, and scale change. Further development of the basic approach may eventually allow a spacecraft or robot vehicle to be provided with an ability to very accurately determine its spatial relationship to arbitrary known objects within its optical field of view.

  1. Nondestructive evaluation of turbine blades vibrating in resonant modes

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Ahmadshahi, Mansour A.

    1991-12-01

    The paper presents the analysis of the strain distribution of turbine blades. The holographic moire technique is used in conjunction with computer analysis of the fringes. The application of computer fringe analysis technique reduces the number of holograms to be recorded to two. Stroboscopic illumination is used to record the patterns. Strains and stresses are computed.

  2. A perturbation analysis of a mechanical model for stable spatial patterning in embryology

    NASA Astrophysics Data System (ADS)

    Bentil, D. E.; Murray, J. D.

    1992-12-01

    We investigate a mechanical cell-traction mechanism that generates stationary spatial patterns. A linear analysis highlights the model's potential for these heterogeneous solutions. We use multiple-scale perturbation techniques to study the evolution of these solutions and compare our solutions with numerical simulations of the model system. We discuss some potential biological applications among which are the formation of ridge patterns, dermatoglyphs, and wound healing.

  3. Constellation of phase singularities in a speckle-like pattern for optical vortex metrology applied to biological kinematic analysis.

    PubMed

    Wang, Wei; Qiao, Yu; Ishijima, Reika; Yokozeki, Tomoaki; Honda, Daigo; Matsuda, Akihiro; Hanson, Steen G; Takeda, Mitsuo

    2008-09-01

    A novel technique for biological kinematic analysis is proposed that makes use of the pseudophase singularities in a complex signal generated from a speckle-like pattern. In addition to the information about the locations and the anisotropic core structures of the pseudophase singularities, we also detect the spatial structures of a cluster of phase singularities, which serves as a unique constellation characterizing the mutual position relation between the individual pseudophase singularities. Experimental results of in vivo measurements for a swimming fish along with its kinematic analysis are presented, which demonstrate the validity of the proposed technique.

  4. Fourier Analysis and the Rhythm of Conversation.

    ERIC Educational Resources Information Center

    Dabbs, James M., Jr.

    Fourier analysis, a common technique in engineering, breaks down a complex wave form into its simple sine wave components. Communication researchers have recently suggested that this technique may provide an index of the rhythm of conversation, since vocalizing and pausing produce a complex wave form pattern of alternation between two speakers. To…
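    The suggestion can be illustrated directly: treat vocalizing/pausing as an on/off time series and take its Fourier spectrum. The sampling rate and alternation period below are hypothetical:

```python
import numpy as np

fs = 10                                        # samples per second
t = np.arange(0, 120, 1 / fs)                  # two minutes of conversation
# Speaker A vocalizes in bursts, alternating with pauses every ~5 s (0.1 Hz)
talk_a = (np.sin(2 * np.pi * 0.1 * t) > 0).astype(float)

# Fourier analysis of the alternation pattern: the spectral peak gives the
# dominant rhythm of turn-taking
spec = np.abs(np.fft.rfft(talk_a - talk_a.mean()))
freqs = np.fft.rfftfreq(talk_a.size, 1 / fs)
dominant = freqs[spec.argmax()]
print(f"dominant alternation frequency: {dominant:.2f} Hz")
```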

  5. Identifying scales of pattern in ecological data: a comparison of lacunarity, spectral and wavelet analyses

    Treesearch

    Sari C. Saunders; Jiquan Chen; Thomas D. Drummer; Eric J. Gustafson; Kimberley D. Brosofske

    2005-01-01

    Identifying scales of pattern in ecological systems and coupling patterns to processes that create them are ongoing challenges. We examined the utility of three techniques (lacunarity, spectral, and wavelet analysis) for detecting scales of pattern of ecological data. We compared the information obtained using these methods for four datasets, including: surface...
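    Of the three techniques compared, lacunarity is the simplest to sketch. The gliding-box estimator below and the two synthetic maps (equal cover, different texture) are illustrative:

```python
import numpy as np

def lacunarity(grid, box):
    """Gliding-box lacunarity L(r) = E[m^2] / E[m]^2, m = mass in an r x r box."""
    n0 = grid.shape[0] - box + 1
    n1 = grid.shape[1] - box + 1
    masses = np.array([grid[i:i + box, j:j + box].sum()
                       for i in range(n0) for j in range(n1)])
    return (masses ** 2).mean() / masses.mean() ** 2

rng = np.random.default_rng(5)
random_map = (rng.random((64, 64)) < 0.0625).astype(float)   # scattered cover
clumped = np.zeros((64, 64))
clumped[:16, :16] = 1.0                        # same expected cover, one clump

print(f"random:  {lacunarity(random_map, 8):.2f}")
print(f"clumped: {lacunarity(clumped, 8):.2f}")
```

    The clumped map yields a much larger lacunarity at the same box size, which is the "gappiness" signal this analysis exploits for detecting scales of pattern.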

  6. Analysis of XFEL serial diffraction data from individual crystalline fibrils

    DOE PAGES

    Wojtas, David H.; Ayyer, Kartik; Liang, Mengning; ...

    2017-10-20

    Serial diffraction data collected at the Linac Coherent Light Source from crystalline amyloid fibrils delivered in a liquid jet show that the fibrils are well oriented in the jet. At low fibril concentrations, diffraction patterns are recorded from single fibrils; these patterns are weak and contain only a few reflections. Methods are developed for determining the orientation of patterns in reciprocal space and merging them in three dimensions. This allows the individual structure amplitudes to be calculated, thus overcoming the limitations of orientation and cylindrical averaging in conventional fibre diffraction analysis. In conclusion, the advantages of this technique should allow structural studies of fibrous systems in biology that are inaccessible using existing techniques.

  7. Analysis of XFEL serial diffraction data from individual crystalline fibrils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wojtas, David H.; Ayyer, Kartik; Liang, Mengning

    Serial diffraction data collected at the Linac Coherent Light Source from crystalline amyloid fibrils delivered in a liquid jet show that the fibrils are well oriented in the jet. At low fibril concentrations, diffraction patterns are recorded from single fibrils; these patterns are weak and contain only a few reflections. Methods are developed for determining the orientation of patterns in reciprocal space and merging them in three dimensions. This allows the individual structure amplitudes to be calculated, thus overcoming the limitations of orientation and cylindrical averaging in conventional fibre diffraction analysis. In conclusion, the advantages of this technique should allow structural studies of fibrous systems in biology that are inaccessible using existing techniques.

  8. Visual cluster analysis and pattern recognition methods

    DOEpatents

    Osbourn, Gordon Cecil; Martinez, Rubel Francisco

    2001-01-01

    A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques.

  9. Collected Notes on the Workshop for Pattern Discovery in Large Databases

    NASA Technical Reports Server (NTRS)

    Buntine, Wray (Editor); Delalto, Martha (Editor)

    1991-01-01

    These collected notes are a record of material presented at the Workshop. The core data analysis is addressed that have traditionally required statistical or pattern recognition techniques. Some of the core tasks include classification, discrimination, clustering, supervised and unsupervised learning, discovery and diagnosis, i.e., general pattern discovery.

  10. Pattern Prediction of Academic Success.

    ERIC Educational Resources Information Center

    Lunneborg, Clifford E.; Lunneborg, Patricia W.

    A technique of pattern analysis which emphasizes the development of more effective ways of scoring a given set of variables was formulated. To the original variables were successively added two-, three-, and four-variable patterns, and the increase in predictive efficiency was assessed. Randomly selected high school seniors who had participated in the…

  11. Syntactic/semantic techniques for feature description and character recognition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, R.C.

    1983-01-01

    The Pattern Analysis Branch, Mapping, Charting and Geodesy (MC/G) Division, of the Naval Ocean Research and Development Activity (NORDA) has been involved over the past several years in the development of algorithms and techniques for computer recognition of free-form handprinted symbols as they appear on Defense Mapping Agency (DMA) maps and charts. NORDA has made significant contributions to the automation of MC/G through advancing the state of the art in such information extraction techniques. In particular, new concepts in character (symbol) skeletonization, rugged feature measurements, and expert system-oriented decision logic have allowed the development of a very high performance Handprinted Symbol Recognition (HSR) system for identifying depth soundings from naval smooth sheets (accuracies greater than 99.5%). The study reported in this technical note is part of NORDA's continuing research and development in pattern and shape analysis as it applies to Navy and DMA ocean/environment problems. The issue addressed in this technical note deals with emerging areas of syntactic and semantic techniques in pattern recognition as they might apply to the free-form symbol problem.

  12. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable; cluster analysis, an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  13. PyMVPA: A python toolbox for multivariate pattern analysis of fMRI data.

    PubMed

    Hanke, Michael; Halchenko, Yaroslav O; Sederberg, Per B; Hanson, Stephen José; Haxby, James V; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability.
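    The flavor of a classifier-based multivariate analysis can be sketched without PyMVPA itself. Below, a leave-one-out nearest-mean decoder (illustrative data and classifier, not the PyMVPA API) recovers a multivoxel condition difference too weak to see reliably in any single voxel:

```python
import numpy as np

rng = np.random.default_rng(6)
n, v = 20, 100                                 # trials per condition, voxels
signal = 0.4 * rng.normal(0, 1, v)             # weak distributed activation pattern
cond_a = rng.normal(0, 1, (n, v))
cond_b = rng.normal(0, 1, (n, v)) + signal
X = np.vstack([cond_a, cond_b])
y = np.array([0] * n + [1] * n)

# Leave-one-out cross-validated nearest-mean classification (a minimal decoder)
correct = 0
for i in range(X.shape[0]):
    train = np.ones(X.shape[0], dtype=bool)
    train[i] = False
    m0 = X[train & (y == 0)].mean(axis=0)
    m1 = X[train & (y == 1)].mean(axis=0)
    pred = int(np.linalg.norm(X[i] - m1) < np.linalg.norm(X[i] - m0))
    correct += int(pred == y[i])
acc = correct / X.shape[0]
print(f"decoding accuracy: {acc:.0%}")
```

    Pooling evidence across all voxels is what gives multivariate decoding its sensitivity advantage over voxel-wise univariate tests.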

  14. PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data

    PubMed Central

    Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561

  15. Visual cluster analysis and pattern recognition template and methods

    DOEpatents

    Osbourn, Gordon Cecil; Martinez, Rubel Francisco

    1999-01-01

    A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques.

  16. Using Exploratory Spatial Data Analysis to Leverage Social Indicator Databases: The Discovery of Interesting Patterns

    ERIC Educational Resources Information Center

    Anselin, Luc; Sridharan, Sanjeev; Gholston, Susan

    2007-01-01

    With the proliferation of social indicator databases, the need for powerful techniques to study patterns of change has grown. In this paper, the utility of spatial data analytical methods such as exploratory spatial data analysis (ESDA) is suggested as a means to leverage the information contained in social indicator databases. The principles…

  17. Computer-assisted techniques to evaluate fringe patterns

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.

    1992-01-01

    Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
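
The phase-stepping (quasi-heterodyne) principle mentioned above can be sketched in a few lines of NumPy. The fringe profile, background, and modulation values below are invented for illustration; only the standard four-step arctangent formula is taken as given:

```python
import numpy as np

# Four phase-stepped fringe frames I_k = a + b*cos(phi + k*pi/2), k = 0..3.
x = np.linspace(0, 1, 256)
phi_true = 2 * np.pi * (x - 0.5) ** 2          # assumed test phase profile
a, b = 1.0, 0.5                                # background and modulation
I1, I2, I3, I4 = (a + b * np.cos(phi_true + k * np.pi / 2) for k in range(4))

# Standard four-step formula: phi = atan2(I4 - I2, I1 - I3).
# I4 - I2 = 2b*sin(phi) and I1 - I3 = 2b*cos(phi), so the background and
# modulation cancel out of the ratio.
phi = np.arctan2(I4 - I2, I1 - I3)

# phi_true stays inside (-pi, pi] here, so no unwrapping is needed.
err = np.abs(phi - phi_true).max()
print(err)   # effectively zero for noiseless frames
```

Real interferograms add noise and miscalibrated phase steps, which is where the supplementary carrier-fringe and Fourier-transform methods of the paper come in.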

  18. A forestry application simulation of man-machine techniques for analyzing remotely sensed data

    NASA Technical Reports Server (NTRS)

    Berkebile, J.; Russell, J.; Lube, B.

    1976-01-01

    The typical steps in the analysis of remotely sensed data for a forestry applications example are simulated. The example uses numerically-oriented pattern recognition techniques and emphasizes man-machine interaction.

  19. Verification and extension of the MBL technique for photo resist pattern shape measurement

    NASA Astrophysics Data System (ADS)

    Isawa, Miki; Tanaka, Maki; Kazumi, Hideyuki; Shishido, Chie; Hamamatsu, Akira; Hasegawa, Norio; De Bisschop, Peter; Laidler, David; Leray, Philippe; Cheng, Shaunee

    2011-03-01

In order to achieve pattern shape measurement with CD-SEM, the Model Based Library (MBL) technique is under development. In this study, several libraries, each consisting of a double-trapezoid model placed in an optimal layout, were used to measure various layout patterns. To verify the accuracy of MBL photoresist pattern shape measurement, CD-AFM measurements were carried out as a reference metrology. The two sets of results were compared, and we confirmed a linear correlation between them. To expand the application field of the MBL technique, it was then applied to end-of-line (EOL) shape measurement to demonstrate its capability. Finally, we confirmed the possibility that MBL could be applied to more local shape measurement, such as hot-spot analysis.

  20. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not for the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and our method are evaluated and compared.

  1. Determination of polychlorinated biphenyl levels in the serum of residents and in the homogenates of seafood from the New Bedford, Massachusetts Area: A comparison of exposure sources through pattern recognition techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burse, V.W.; Groce, D.F.; Caudill, S.P.

    1994-01-01

Gas chromatographic patterns of polychlorinated biphenyls (PCBs) found in the serum of New Bedford, MA residents with high serum PCBs were compared with patterns found in lobsters and bluefish taken from local waters, and in goats fed selected technical Aroclors (e.g., Aroclors 1016, 1242, 1254, or 1260), using Jaccard measures of similarity and principal component analysis. Patterns in humans were similar to patterns in lobsters, and both were most similar to those in the goat fed Aroclor 1254, as demonstrated by both pattern recognition techniques. However, patterns observed in humans, lobsters and bluefish all exhibited some presence of PCBs more characteristic of Aroclors 1016 and/or 1242 or 1260.

  2. Market segmentation for multiple option healthcare delivery systems--an application of cluster analysis.

    PubMed

    Jarboe, G R; Gates, R H; McDaniel, C D

    1990-01-01

    Healthcare providers of multiple option plans may be confronted with special market segmentation problems. This study demonstrates how cluster analysis may be used for discovering distinct patterns of preference for multiple option plans. The availability of metric, as opposed to categorical or ordinal, data provides the ability to use sophisticated analysis techniques which may be superior to frequency distributions and cross-tabulations in revealing preference patterns.

  3. Proceedings of the NASA Symposium on Mathematical Pattern Recognition and Image Analysis

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.

    1983-01-01

The application of mathematical and statistical analysis techniques to imagery obtained by remote sensors is described by Principal Investigators. Scene-to-map registration, geometric rectification, and image matching are among the pattern recognition aspects discussed.

  4. Real-time particulate mass measurement based on laser scattering

    NASA Astrophysics Data System (ADS)

    Rentz, Julia H.; Mansur, David; Vaillancourt, Robert; Schundler, Elizabeth; Evans, Thomas

    2005-11-01

    OPTRA has developed a new approach to the determination of particulate size distribution from a measured, composite, laser angular scatter pattern. Drawing from the field of infrared spectroscopy, OPTRA has employed a multicomponent analysis technique which uniquely recognizes patterns associated with each particle size "bin" over a broad range of sizes. The technique is particularly appropriate for overlapping patterns where large signals are potentially obscuring weak ones. OPTRA has also investigated a method for accurately training the algorithms without the use of representative particles for any given application. This streamlined calibration applies a one-time measured "instrument function" to theoretical Mie patterns to create the training data for the algorithms. OPTRA has demonstrated this algorithmic technique on a compact, rugged, laser scatter sensor head we developed for gas turbine engine emissions measurements. The sensor contains a miniature violet solid state laser and an array of silicon photodiodes, both of which are commercial off the shelf. The algorithmic technique can also be used with any commercially available laser scatter system.
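
As a generic analogue of the multicomponent analysis described above (OPTRA's trained algorithm and instrument function are not public), a composite angular-scatter pattern built from per-size-bin basis patterns can be unmixed by linear least squares. The Gaussian basis shapes and the weights below are invented stand-ins for Mie patterns convolved with an instrument function:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical basis: one angular scatter pattern per particle-size bin.
angles = np.linspace(0, np.pi / 2, 200)
n_bins = 4
basis = np.stack([np.exp(-((angles - 0.2 * (k + 1)) ** 2) / 0.01)
                  for k in range(n_bins)], axis=1)     # shape (200, 4)

# Composite pattern: a weak component overlapped by strong ones, plus noise.
true_w = np.array([5.0, 0.2, 3.0, 1.0])
composite = basis @ true_w + 0.001 * rng.standard_normal(angles.size)

# Recover the per-bin weights by linear least squares.
w, *_ = np.linalg.lstsq(basis, composite, rcond=None)
print(w)   # close to true_w, including the weak 0.2 component
```

The point of the example is that a weak bin's contribution survives the unmixing even when neighbouring bins carry much larger signals, provided the basis patterns are sufficiently distinct.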

  5. New X-Ray Technique to Characterize Nanoscale Precipitates in Aged Aluminum Alloys

    NASA Astrophysics Data System (ADS)

    Sitdikov, V. D.; Murashkin, M. Yu.; Valiev, R. Z.

    2017-10-01

This paper puts forward a new technique for the measurement of x-ray patterns that makes it possible to identify and determine precipitates (nanoscale phases) in metallic alloys of the matrix type. The minimum detection limit of precipitates in the matrix of the base material provided by this technique is as little as 1%. The identification of precipitates in x-ray patterns and their analysis are implemented in a transmission mode with a larger radiation area, longer holding time and higher diffractometer resolution than the conventional reflection mode. The presented technique has been successfully employed to identify and quantitatively describe precipitates formed in an Al alloy of the Al-Mg-Si system as a result of artificial aging. For the first time, x-ray phase analysis has been used to identify and measure precipitates formed during artificial aging of the alloy.

  6. Visual cluster analysis and pattern recognition template and methods

    DOEpatents

    Osbourn, G.C.; Martinez, R.F.

    1999-05-04

A method of clustering using a novel template to define a region of influence is disclosed. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques. 30 figs.

  7. Logic programming to predict cell fate patterns and retrodict genotypes in organogenesis.

    PubMed

    Hall, Benjamin A; Jackson, Ethan; Hajnal, Alex; Fisher, Jasmin

    2014-09-06

    Caenorhabditis elegans vulval development is a paradigm system for understanding cell differentiation in the process of organogenesis. Through temporal and spatial controls, the fate pattern of six cells is determined by the competition of the LET-23 and the Notch signalling pathways. Modelling cell fate determination in vulval development using state-based models, coupled with formal analysis techniques, has been established as a powerful approach in predicting the outcome of combinations of mutations. However, computing the outcomes of complex and highly concurrent models can become prohibitive. Here, we show how logic programs derived from state machines describing the differentiation of C. elegans vulval precursor cells can increase the speed of prediction by four orders of magnitude relative to previous approaches. Moreover, this increase in speed allows us to infer, or 'retrodict', compatible genomes from cell fate patterns. We exploit this technique to predict highly variable cell fate patterns resulting from dig-1 reduced-function mutations and let-23 mosaics. In addition to the new insights offered, we propose our technique as a platform for aiding the design and analysis of experimental data. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumway, R.H.; McQuarrie, A.D.

Robust statistical approaches to the problem of discriminating between regional earthquakes and explosions are developed. We compare linear discriminant analysis using descriptive features like amplitude and spectral ratios with signal discrimination techniques using the original signal waveforms and spectral approximations to the log likelihood function. Robust information theoretic techniques are proposed and all methods are applied to 8 earthquakes and 8 mining explosions in Scandinavia and to an event from Novaya Zemlya of unknown origin. It is noted that signal discrimination approaches based on discrimination information and Renyi entropy perform better in the test sample than conventional methods based on spectral ratios involving the P and S phases. Two techniques for identifying the ripple-firing pattern for typical mining explosions are proposed and shown to work well on simulated data and on several Scandinavian earthquakes and explosions. We use both cepstral analysis in the frequency domain and a time domain method based on the autocorrelation and partial autocorrelation functions. The proposed approach strips off underlying smooth spectral and seasonal spectral components corresponding to the echo pattern induced by two simple ripple-fired models. For two mining explosions, a pattern is identified whereas for two earthquakes, no pattern is evident.

  9. Quantization error of CCD cameras and their influence on phase calculation in fringe pattern analysis.

    PubMed

    Skydan, Oleksandr A; Lilley, Francis; Lalor, Michael J; Burton, David R

    2003-09-10

    We present an investigation into the phase errors that occur in fringe pattern analysis that are caused by quantization effects. When acquisition devices with a limited value of camera bit depth are used, there are a limited number of quantization levels available to record the signal. This may adversely affect the recorded signal and adds a potential source of instrumental error to the measurement system. Quantization effects also determine the accuracy that may be achieved by acquisition devices in a measurement system. We used the Fourier fringe analysis measurement technique. However, the principles can be applied equally well for other phase measuring techniques to yield a phase error distribution that is caused by the camera bit depth.
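
The bit-depth effect described above can be reproduced in a toy Fourier fringe analysis. The carrier frequency, band limits, and test phase below are assumptions for illustration, not the authors' setup; the point is only that a coarser quantizer raises the recovered phase error:

```python
import numpy as np

def quantize(signal, bits):
    """Map a [0, 1] signal onto 2**bits discrete camera levels."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

# Carrier fringe pattern with a smooth test phase (assumed, for illustration).
N = 1024
x = np.arange(N)
phi_true = 0.8 * np.sin(2 * np.pi * x / N)
carrier = 2 * np.pi * 64 * x / N                 # exactly 64 carrier fringes
I = 0.5 + 0.5 * np.cos(carrier + phi_true)       # normalised intensity

def fourier_phase(I):
    """Fourier fringe analysis: isolate the carrier lobe, return its phase."""
    F = np.fft.fft(I)
    mask = np.zeros(N)
    mask[32:128] = 1.0                           # band around the +64 peak
    analytic = np.fft.ifft(F * mask)
    return np.unwrap(np.angle(analytic)) - carrier

errors = {}
for bits in (4, 8):
    err = (fourier_phase(quantize(I, bits)) - phi_true)[64:-64]
    err -= err.mean()                            # ignore the constant offset
    errors[bits] = np.sqrt(np.mean(err ** 2))

print(errors)   # the 4-bit RMS phase error exceeds the 8-bit one
```

Because the synthetic signal is exactly periodic, the unquantized reconstruction is essentially error-free, so the residual phase error here isolates the quantization contribution.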

  10. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    NASA Astrophysics Data System (ADS)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
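
A minimal sketch of the PCA-plus-clustering pipeline described above, using only NumPy. The "profiles" are purely synthetic stand-ins for ADCP data (two assumed flow regimes plus noise), and the k-means implementation is a bare-bones Lloyd's loop rather than a production library:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented stand-in for ADCP current profiles: 300 profiles x 20 depth bins
# drawn from two flow regimes (purely synthetic, for illustration only).
depth = np.linspace(0, 1, 20)
regime_a = 0.5 * np.exp(-2 * depth)              # surface-intensified flow
regime_b = 0.3 * np.ones_like(depth)             # depth-uniform flow
X = np.vstack([regime_a + 0.05 * rng.standard_normal((150, 20)),
               regime_b + 0.05 * rng.standard_normal((150, 20))])

# Step 1 -- PCA via SVD of the mean-centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                           # project onto first two PCs

# Step 2 -- minimal Lloyd's k-means (k = 2) on the PC scores, seeded
# deterministically with one profile from each end of the dataset.
centers = scores[[0, -1]]
for _ in range(50):
    labels = np.argmin(np.linalg.norm(scores[:, None] - centers, axis=2),
                       axis=1)
    centers = np.array([scores[labels == k].mean(axis=0) for k in range(2)])

# Fraction of each regime assigned to a single cluster (near 1.0 when the
# regimes separate cleanly in PC space).
agreement = ((labels[:150] == labels[0]).mean(),
             (labels[150:] == labels[150]).mean())
print(agreement)
```

Fuzzy c-means and self-organizing maps, also named in the abstract, replace the hard argmin assignment with graded memberships or a topology-preserving grid, but the project-then-cluster structure is the same.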

  11. Pattern detection in forensic case data using graph theory: application to heroin cutting agents.

    PubMed

    Terrettaz-Zufferey, Anne-Laure; Ratle, Frédéric; Ribaux, Olivier; Esseiva, Pierre; Kanevski, Mikhail

    2007-04-11

    Pattern recognition techniques can be very useful in forensic sciences to point out to relevant sets of events and potentially encourage an intelligence-led style of policing. In this study, these techniques have been applied to categorical data corresponding to cutting agents found in heroin seizures. An application of graph theoretic methods has been performed, in order to highlight the possible relationships between the location of seizures and co-occurrences of particular heroin cutting agents. An analysis of the co-occurrences to establish several main combinations has been done. Results illustrate the practical potential of mathematical models in forensic data analysis.
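
The co-occurrence idea above can be sketched without a graph library: count how often pairs of cutting agents appear in the same seizure and treat the counts as edge weights of a weighted graph. The seizure records below are invented for illustration:

```python
from itertools import combinations
from collections import Counter

# Hypothetical seizure records: each lists the cutting agents detected.
seizures = [
    {"caffeine", "paracetamol"},
    {"caffeine", "paracetamol", "phenobarbital"},
    {"caffeine", "paracetamol"},
    {"phenobarbital", "lactose"},
    {"caffeine", "lactose"},
]

# Build a weighted co-occurrence graph: nodes are agents, edge weights count
# how often two agents were found together in one seizure.
edges = Counter()
for agents in seizures:
    for pair in combinations(sorted(agents), 2):
        edges[pair] += 1

# The heaviest edge is the most frequent cutting combination.
strongest = max(edges, key=edges.get)
print(strongest, edges[strongest])   # ('caffeine', 'paracetamol') 3
```

Attaching a location attribute to each record and filtering the counter by region would give the seizure-location/combination linkage the study investigates.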

  12. Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.

    PubMed

    Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X

    2015-01-01

    Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.

  13. A spot pattern test chart technique for measurement of geometric aberrations caused by an intervening medium—a novel method

    NASA Astrophysics Data System (ADS)

    Ganesan, A. R.; Arulmozhivarman, P.; Jesson, M.

    2005-12-01

Accurate surface metrology and transmission characteristics measurements have become vital to certify manufacturing excellence in the field of glass visors, windshields, menu boards and the transportation industry. We report a simple, cost-effective and novel technique for the measurement of geometric aberrations in transparent materials such as glass sheets, Perspex, etc. The technique makes use of an array of spots, which we call the spot pattern test chart technique, at the diffraction-limited imaging position with a large field of view. Performance features include variable angular dynamic range and angular sensitivity. Aberrations caused by transparent sheets introduced as an intervening medium in the line of sight are estimated in real time using the Zernike reconstruction method. A quantitative comparative analysis between a Shack-Hartmann wavefront sensor and the proposed new method is presented and the results are discussed.

  14. Cerebrovascular pattern improved by ozone autohemotherapy: an entropy-based study on multiple sclerosis patients.

    PubMed

    Molinari, Filippo; Rimini, Daniele; Liboni, William; Acharya, U Rajendra; Franzini, Marianno; Pandolfi, Sergio; Ricevuti, Giovanni; Vaiano, Francesco; Valdenassi, Luigi; Simonetti, Vincenzo

    2017-08-01

Ozone major autohemotherapy is effective in reducing the symptoms of multiple sclerosis (MS) patients, but its effects on the brain are still not clear. In this work, we monitored the changes in the cerebrovascular pattern of MS patients and normal subjects during major ozone autohemotherapy by using near-infrared spectroscopy (NIRS) as a functional and vascular technique. The NIRS signals are analyzed using a combination of time analysis, time-frequency analysis and nonlinear analysis of the intrinsic mode function signals obtained from the empirical mode decomposition technique. Our results show an improvement in the cerebrovascular pattern of all subjects, indicated by an increase in the entropy of the NIRS signals. Hence, we can conclude that ozone therapy increases brain metabolism and helps recovery from the lower activity levels that are predominant in MS patients.

  15. Conjecturing and Generalization Process on The Structural Development

    NASA Astrophysics Data System (ADS)

    Ni'mah, Khomsatun; Purwanto; Bambang Irawan, Edy; Hidayanto, Erry

    2017-06-01

This study aims to describe the conjecturing and generalization processes of structural development in thirty grade-8 middle-school children solving pattern problems. The data were processed using qualitative analysis techniques, applied to data obtained through direct observation, documentation, and interviews. The study builds on Mulligan et al. (2012), which identified five stages of structural development: prestructural, emergent, partial, structural, and advanced. The analysis revealed that the conjecturing and generalization processes are related. During the conjecturing process, the children appropriately formed hypotheses about pattern problems through two phases, numerical and symbolic, whereas during the generalization process they were able to relate the pattern rule found while conjecturing to other contexts.

  16. Social Learning Network Analysis Model to Identify Learning Patterns Using Ontology Clustering Techniques and Meaningful Learning

    ERIC Educational Resources Information Center

    Firdausiah Mansur, Andi Besse; Yusof, Norazah

    2013-01-01

Clustering on Social Learning Networks is still not widely explored, especially when the network focuses on an e-learning system. Conventional methods are not really suitable for e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…

  17. Cognitive approaches for patterns analysis and security applications

    NASA Astrophysics Data System (ADS)

    Ogiela, Marek R.; Ogiela, Lidia

    2017-08-01

This paper presents new opportunities for developing innovative solutions for semantic pattern classification and visual cryptography based on cognitive and bio-inspired approaches. Such techniques can be used to evaluate the meaning of analyzed patterns or encrypted information, and make it possible to incorporate that meaning into the classification task or encryption process. They also allow crypto-biometric solutions to extend personalized cryptography methodologies based on visual pattern analysis. In particular, the application of cognitive information systems to the semantic analysis of different patterns is presented, along with a novel application of such systems to visual secret sharing. Visual shares of divided information can be created by a threshold procedure, which may depend on personal abilities to recognize image details visible in the divided images.

  18. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

Cholera spatial epidemiology is the study of the spread and control of the disease's spatial pattern and epidemics. Previous studies have shown that multi-factorial causes such as human behaviour, ecology and other infectious risk factors influence the disease outbreaks. Thus, the spatial pattern of the outbreaks and its possible interrelated factors are crucial to explore in an in-depth study. This study focuses on the integration of geographical information system (GIS) and epidemiological techniques for exploratory analysis of the cholera spatial pattern and distribution in selected districts of Sabah. The Spatial Statistics and Pattern tools in ArcGIS and Microsoft Excel were utilized to map and analyze the reported cholera cases and other data used, while a cohort study, an epidemiological technique, was applied to investigate multiple outcomes of disease exposure. The general spatial pattern of cholera was highly clustered, showing that the disease spreads easily from a place or person to others, especially within 1500 meters of an infected person or location. Although the cholera outbreaks in the districts are not critical, the disease could become endemic in crowded areas, unhygienic environments, and close to contaminated water. It is also strongly believed that the coastal water of the study areas has a possible relationship with cholera transmission and phytoplankton bloom, since these areas recorded higher case numbers. GIS demonstrates a vital spatial epidemiological technique for determining the distribution pattern and elucidating hypothesis generation for the disease. Future research will apply advanced geo-analysis methods and other disease risk factors to produce a significant local-scale predictive risk model of the disease in Malaysia.

  19. Pattern-recognition techniques applied to performance monitoring of the DSS 13 34-meter antenna control assembly

    NASA Technical Reports Server (NTRS)

    Mellstrom, J. A.; Smyth, P.

    1991-01-01

    The results of applying pattern recognition techniques to diagnose fault conditions in the pointing system of one of the Deep Space network's large antennas, the DSS 13 34-meter structure, are discussed. A previous article described an experiment whereby a neural network technique was used to identify fault classes by using data obtained from a simulation model of the Deep Space Network (DSN) 70-meter antenna system. Described here is the extension of these classification techniques to the analysis of real data from the field. The general architecture and philosophy of an autonomous monitoring paradigm is described and classification results are discussed and analyzed in this context. Key features of this approach include a probabilistic time-varying context model, the effective integration of signal processing and system identification techniques with pattern recognition algorithms, and the ability to calibrate the system given limited amounts of training data. Reported here are recognition accuracies in the 97 to 98 percent range for the particular fault classes included in the experiments.

  20. Cultural and environmental influences on temporal-spectral development patterns of corn and soybeans

    NASA Technical Reports Server (NTRS)

    Crist, E. P.

    1982-01-01

    A technique for evaluating crop temporal-spectral development patterns is described and applied to the analysis of cropping practices and environmental conditions as they affect reflectance characteristics of corn and soybean canopies. Typical variations in field conditions are shown to exert significant influences on the spectral development patterns, and thereby to affect the separability of the two crops.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyu, Sungnam, E-mail: blueden@postech.ac.kr; Hwang, Woonbong, E-mail: whwang@postech.ac.kr

Patterning techniques are essential to many research fields such as chemistry, biology, medicine, and micro-electromechanical systems. In this letter, we report a simple, fast, and low-cost superhydrophobic patterning method using a superhydrophilic template. The technique is based on contact stamping of the surface during hydrophobic dip coating. Surface characteristics were measured using scanning electron microscopy and energy-dispersive X-ray spectroscopic analysis. The results showed that the hydrophilic template, which was in contact with the stamp, was not affected by the hydrophobic solution. The resolution study was conducted using a stripe-shaped stamp. The patterned line width was linearly proportional to the width of the stamp line, with a constant narrowing effect. A surface with regions of four different types of wetting was fabricated to demonstrate the patterning performance.

  2. An image analysis of TLC patterns for quality control of saffron based on soil salinity effect: A strategy for data (pre)-processing.

    PubMed

    Sereshti, Hassan; Poursorkh, Zahra; Aliakbarzadeh, Ghazaleh; Zarre, Shahin; Ataolahi, Sahar

    2018-01-15

    Quality of saffron, a valuable food additive, could considerably affect the consumers' health. In this work, a novel preprocessing strategy for image analysis of saffron thin layer chromatographic (TLC) patterns was introduced. This includes performing a series of image pre-processing techniques on TLC images such as compression, inversion, elimination of general baseline (using asymmetric least squares (AsLS)), removing spots shift and concavity (by correlation optimization warping (COW)), and finally conversion to RGB chromatograms. Subsequently, an unsupervised multivariate data analysis including principal component analysis (PCA) and k-means clustering was utilized to investigate the soil salinity effect, as a cultivation parameter, on saffron TLC patterns. This method was used as a rapid and simple technique to obtain the chemical fingerprints of saffron TLC images. Finally, the separated TLC spots were chemically identified using high-performance liquid chromatography-diode array detection (HPLC-DAD). Accordingly, the saffron quality from different areas of Iran was evaluated and classified. Copyright © 2017 Elsevier Ltd. All rights reserved.
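
The AsLS baseline-removal step mentioned above follows the Eilers-style asymmetric least squares scheme. A minimal dense-matrix sketch is shown below, with assumed smoothing and asymmetry parameters, applied to a synthetic chromatogram rather than a real TLC trace:

```python
import numpy as np

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: minimise
    sum_i w_i (y_i - z_i)^2 + lam * sum (second difference of z)^2,
    with small weights p above the baseline and 1-p below it."""
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)            # second-difference operator
    w = np.ones(n)
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + lam * D.T @ D, w * y)
        w = np.where(y > z, p, 1 - p)            # asymmetric reweighting
    return z

# Synthetic chromatogram: two peaks on a slowly varying baseline (assumed).
x = np.linspace(0, 1, 400)
baseline = 0.5 + 0.3 * x
peaks = (np.exp(-((x - 0.3) ** 2) / 0.0005)
         + 0.7 * np.exp(-((x - 0.7) ** 2) / 0.0005))
y = baseline + peaks

corrected = y - asls_baseline(y)
print(np.abs(corrected[:40]).max())   # near zero away from the peaks
```

The asymmetry (p much less than 0.5) is what makes the fitted curve hug the underside of the signal, so peaks survive subtraction while the drift is removed; production implementations use sparse matrices for long signals.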

  3. Comparison of infinite and wedge fringe settings in Mach Zehnder interferometer for temperature field measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haridas, Divya; P, Vibin Antony; Sajith, V.

    2014-10-15

An interferometric method, which utilizes the interference of coherent light beams, is used to determine the temperature distribution in the vicinity of a vertical heater plate. The optical components are arranged so as to obtain wedge fringe and infinite fringe patterns, and the isotherms obtained in each case were compared. In the wedge fringe setting, image processing techniques have been used to obtain isotherms by digital subtraction of the initial parallel fringe pattern from the deformed fringe pattern. The experimental results obtained are compared with theoretical correlations. The merits and demerits of the fringe analysis techniques are discussed on the basis of the experimental results.

  4. Movement coordination patterns between the foot joints during walking.

    PubMed

    Arnold, John B; Caravaggi, Paolo; Fraysse, François; Thewlis, Dominic; Leardini, Alberto

    2017-01-01

In 3D gait analysis, kinematics of the foot joints are usually reported via isolated time histories of joint rotations and no information is provided on the relationship between rotations at different joints. The aim of this study was to identify movement coordination patterns in the foot during walking by expanding an existing vector coding technique according to an established multi-segment foot and ankle model. A graphical representation is also described to summarise the coordination patterns of joint rotations across multiple patients. Three-dimensional multi-segment foot kinematics were recorded in 13 adults during walking. A modified vector coding technique was used to identify coordination patterns between foot joints involving calcaneus, midfoot, metatarsus and hallux segments. According to the type and direction of joint rotations, these were classified as in-phase (same direction), anti-phase (opposite directions), proximal or distal joint dominant. In early stance, 51 to 75% of walking trials showed proximal-phase coordination between foot joints comprising the calcaneus, midfoot and metatarsus. In-phase coordination was more prominent in late stance, reflecting synergy in the simultaneous inversion occurring at multiple foot joints. Conversely, a distal-phase coordination pattern was identified for sagittal plane motion of the ankle relative to the midtarsal joint, highlighting the critical role of arch shortening to locomotor function in push-off. This study has identified coordination patterns between movement of the calcaneus, midfoot, metatarsus and hallux by expanding an existing vector coding technique for assessing and classifying coordination patterns of foot joint rotations during walking. This approach provides a different perspective in the analysis of multi-segment foot kinematics, and may be used for the objective quantification of the alterations in foot joint coordination patterns due to lower limb pathologies or following injuries.
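
A minimal sketch of the vector coding step: the coupling angle between frame-to-frame changes of two joint-angle series is computed with atan2 and classified into the four coordination patterns. The 45-degree sector convention used here is a common one; the study's exact sector definitions may differ:

```python
import numpy as np

def coupling_phase(proximal, distal):
    """Vector coding: coupling angle between successive frames of two
    joint angle series, classified into four coordination patterns."""
    gamma = np.degrees(np.arctan2(np.diff(distal), np.diff(proximal))) % 360
    bins = []
    for g in gamma:
        # Classification by 45-degree sectors centred on the diagonals/axes.
        if 22.5 <= g < 67.5 or 202.5 <= g < 247.5:
            bins.append("in-phase")          # both joints, same direction
        elif 112.5 <= g < 157.5 or 292.5 <= g < 337.5:
            bins.append("anti-phase")        # both joints, opposite directions
        elif g < 22.5 or 157.5 <= g < 202.5 or g >= 337.5:
            bins.append("proximal-dominant") # mostly proximal joint moving
        else:
            bins.append("distal-dominant")   # mostly distal joint moving
    return gamma, bins

# Illustration: two joints rotating together equally for ten frames.
t = np.linspace(0, 1, 11)
proximal = 10 * t
distal = 10 * t
gamma, bins = coupling_phase(proximal, distal)
print(set(bins))   # {'in-phase'}: equal motion gives gamma = 45 degrees
```

Histogramming `bins` over the stance phase, per trial, yields the kind of summary the study's graphical representation aggregates across patients.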

  5. Fan fault diagnosis based on symmetrized dot pattern analysis and image matching

    NASA Astrophysics Data System (ADS)

    Xu, Xiaogang; Liu, Haixiao; Zhu, Hao; Wang, Songling

    2016-07-01

    To detect the mechanical failure of fans, a new diagnostic method based on the symmetrized dot pattern (SDP) analysis and image matching is proposed. Vibration signals of 13 kinds of running states are acquired on a centrifugal fan test bed and reconstructed by the SDP technique. The SDP pattern templates of each running state are established. An image matching method is performed to diagnose the fault. In order to improve the diagnostic accuracy, the single template, multiple templates and clustering fault templates are used to perform the image matching.
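
The SDP transform itself is compact: each sample maps to a polar dot whose radius encodes amplitude and whose angular offset encodes a lagged sample, mirrored around several symmetry axes. The lag, angular gain, and mirror count below are illustrative defaults, not the settings used in the paper:

```python
import numpy as np

def sdp_points(x, lag=1, g=36.0, mirrors=6):
    """Symmetrized dot pattern: map a waveform to polar dots (radius,
    angle in degrees) with `mirrors`-fold mirror symmetry."""
    xn = (x - x.min()) / (x.max() - x.min())     # normalise to [0, 1]
    r = xn[:-lag]                                # radius from sample n
    dtheta = g * xn[lag:]                        # angle from sample n + lag
    pts = []
    for k in range(mirrors):
        base = 360.0 * k / mirrors
        pts.append(np.column_stack([r, base + dtheta]))  # one arm
        pts.append(np.column_stack([r, base - dtheta]))  # its mirror image
    return np.vstack(pts)

# Signals with different frequency content yield visibly different patterns,
# which is what makes template matching on the SDP images possible.
t = np.linspace(0, 1, 500)
pts_a = sdp_points(np.sin(2 * np.pi * 5 * t))
pts_b = sdp_points(np.sin(2 * np.pi * 20 * t))
print(pts_a.shape)   # (5988, 2): 499 dots per arm, 12 arms
```

In a diagnosis pipeline of this kind, the dot patterns would be rasterised to images and compared against the per-fault-state templates by an image matching score.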

  6. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.

  7. Surface inspection: Research and development

    NASA Technical Reports Server (NTRS)

    Batchelder, J. S.

    1987-01-01

    Surface inspection techniques are used for process learning, quality verification, and postmortem analysis in manufacturing for a spectrum of disciplines. First, trends in surface analysis are summarized for integrated circuits, high density interconnection boards, and magnetic disks, emphasizing on-line applications as opposed to off-line or development techniques. Then, a closer look is taken at microcontamination detection from both a patterned defect and a particulate inspection point of view.

  8. Isolation and measurement of the features of arrays of cell aggregates formed by dielectrophoresis using the user-specified Multi Regions Masking (MRM) technique

    NASA Astrophysics Data System (ADS)

    Yusvana, Rama; Headon, Denis; Markx, Gerard H.

    2009-08-01

    The use of dielectrophoresis for the construction of artificial skin tissue with skin cells in follicle-like 3D cell aggregates in well-defined patterns is demonstrated. To analyse the patterns produced and to study their development after their formation a Virtual Instrument (VI) system was developed using the LabVIEW IMAQ Vision Development Module. A series of programming functions (algorithms) was used to isolate the features on the image (in our case, the patterned aggregates) and separate them from all other unwanted regions on the image. The image was subsequently converted into a binary version, covering only the desired microarray regions which could then be analysed by computer for automatic object measurements. The analysis utilized the simple and easy-to-use User-Specified Multi-Regions Masking (MRM) technique, which allows one to concentrate the analysis on the desired regions specified in the mask. This simplified the algorithms for the analysis of images of cell arrays having similar geometrical properties. By having a collection of scripts containing masks of different patterns, it was possible to quickly and efficiently develop sets of custom virtual instruments for the offline or online analysis of images of cell arrays in the database.

  9. Comments on the use of network structures to analyse commercial companies’ evolution and their impact on economic behaviour

    NASA Astrophysics Data System (ADS)

    Costea, Carmen

    2006-10-01

    Network analysis studies the development of the social structure of relationships around a group or an institutional body, and how it affects beliefs and behaviours. Causal constraints require special and deeper attention to the social structure. The purpose of this paper is to offer a new approach to the idea that this reality should be primarily conceived and investigated from the perspective of the properties of relations between and within units, rather than the properties of these units themselves. The relationship may refer to the exchange of products, labour, information and money. By mapping these relationships, network analysis can help to uncover the emergent and informal communication patterns of commercial companies, which may be compared to the formal communication structures. These emergent patterns can be used to explain institutional and individual behaviours. Network analysis techniques focus on the communication structure of an organization, which can be subdivided and handled with different approaches. Structural features that can be analysed through the use of network analysis techniques are, for example, the (formal and informal) communication patterns in an organization or the identification of specific groups within an organization. Special attention may be given to specific aspects of communication patterns.

  10. Principal Component Analysis in the Spectral Analysis of the Dynamic Laser Speckle Patterns

    NASA Astrophysics Data System (ADS)

    Ribeiro, K. M.; Braga, R. A., Jr.; Horgan, G. W.; Ferreira, D. D.; Safadi, T.

    2014-02-01

    Dynamic laser speckle is an optical interference phenomenon observed when a changing surface is illuminated with coherent light. When the changes are caused by biological material, the dynamic speckle pattern is known as biospeckle. Usually, these patterns of optical interference evolving in time are analyzed by graphical or numerical methods; analysis in the frequency domain has also been an option, but it involves large computational requirements, which demands new approaches to filter the images in time. Principal component analysis (PCA) works by statistically decorrelating the data and can therefore be used as a data filter. In this context, the present work evaluated the PCA technique for filtering biospeckle image data in time, aiming to reduce computation time and improve the robustness of the filtering. Sixty-four biospeckle images of a maize seed were recorded over time. The images were arranged in a data matrix and statistically decorrelated by the PCA technique, and the reconstructed signals were analyzed using the routine graphical and numerical biospeckle methods. Results showed the potential of the PCA tool for filtering dynamic laser speckle data, with the definition of markers of principal components related to the biological phenomena and with the advantage of fast computational processing.
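    The filtering step, i.e. arrange the frames in a matrix, decorrelate with PCA, and reconstruct from selected components, can be sketched with NumPy. The data layout and the choice of which components to keep are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def pca_filter(frames, keep):
    """Filter a stack of speckle frames in time with PCA.
    frames: (T, H, W) array of T frames; keep: indices of principal
    components to retain. Returns the reconstructed stack."""
    T, H, W = frames.shape
    X = frames.reshape(T, H * W).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centered (time x pixel) matrix decorrelates the frames in time
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    S_kept = np.zeros_like(S)
    S_kept[keep] = S[keep]          # zero out the discarded components
    Xrec = (U * S_kept) @ Vt + mean
    return Xrec.reshape(T, H, W)

rng = np.random.default_rng(0)
frames = rng.random((64, 8, 8))     # stand-in for 64 biospeckle frames
smooth = pca_filter(frames, keep=[0, 1, 2])
print(smooth.shape)  # (64, 8, 8)
```

    Keeping only the leading components retains the slow, correlated activity (the biological signal of interest) while discarding the decorrelated residue.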

  11. Structural Pattern Recognition Techniques for Data Retrieval in Massive Fusion Databases

    NASA Astrophysics Data System (ADS)

    Vega, J.; Murari, A.; Rattá, G. A.; Castro, P.; Pereira, A.; Portas, A.

    2008-03-01

    Diagnostics of present day reactor class fusion experiments, like the Joint European Torus (JET), generate thousands of signals (time series and video images) in each discharge. There is a direct correspondence between the physical phenomena taking place in the plasma and the set of structural shapes (patterns) that they form in the signals: bumps, unexpected amplitude changes, abrupt peaks, periodic components, high intensity zones or specific edge contours. A major difficulty related to data analysis is the identification, in a rapid and automated way, of a set of discharges with comparable behavior, i.e. discharges with "similar" patterns. Pattern recognition techniques are efficient tools to search for similar structural forms within the database in a fast and intelligent way. To this end, classification systems must be developed to be used as indexation methods to directly fetch the most similar patterns.
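    Retrieval of "similar" structural forms can be illustrated with a toy similarity search: slide a query pattern along a signal and score each window with normalized cross-correlation. This stands in for, and is far simpler than, the classification systems the paper proposes.

```python
import math

def most_similar(signal, pattern):
    """Return the offset where a reference pattern best matches a signal,
    scored by normalized cross-correlation (invariant to amplitude and
    offset, so a small bump matches a large one of the same shape)."""
    m = len(pattern)
    pm = sum(pattern) / m
    pz = [p - pm for p in pattern]
    pn = math.sqrt(sum(v * v for v in pz)) or 1.0
    best, best_score = 0, -2.0
    for i in range(len(signal) - m + 1):
        w = signal[i:i + m]
        wm = sum(w) / m
        wz = [v - wm for v in w]
        wn = math.sqrt(sum(v * v for v in wz)) or 1.0
        score = sum(a * b for a, b in zip(pz, wz)) / (pn * wn)
        if score > best_score:
            best, best_score = i, score
    return best

# A flat trace with one 'bump'; the query is a scaled bump template
trace = [0.0] * 30
for j, v in enumerate([1.0, 3.0, 1.0]):
    trace[12 + j] = v
print(most_similar(trace, [0.5, 1.5, 0.5]))  # 12
```

    Indexing a large database would replace this brute-force scan with a classifier over precomputed signal features, as the abstract indicates.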

  12. Moire technique utilization for detection and measurement of scoliosis

    NASA Astrophysics Data System (ADS)

    Zawieska, Dorota; Podlasiak, Piotr

    1993-02-01

    The moire projection method enables non-contact measurement of the shape or deformation of different surfaces and constructions by fringe pattern analysis. Fringe map acquisition over the whole surface of the object under test is one of the main advantages compared with 'point by point' methods. The computer analyzes the shape of the whole surface, and the user can then select individual points or cross-sections of the object map. In this paper, a few typical examples of the application of the moire technique to different medical problems are presented. We also present equipment in which the moire pattern analysis is done in real time using the phase-stepping method with a CCD camera.
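    The phase-stepping calculation reduces to a per-pixel arctangent over several fringe images shifted by a known phase increment. A minimal sketch of the common four-step variant with 90-degree shifts (the paper does not state the number of steps, so this is an assumption):

```python
import math

def phase_map(I0, I1, I2, I3):
    """Recover the wrapped phase from four fringe images shifted by
    90 degrees each: I_k = A + B*cos(phi + k*pi/2). Inputs are
    equal-length intensity lists; returns phase in radians."""
    return [math.atan2(i3 - i1, i0 - i2)
            for i0, i1, i2, i3 in zip(I0, I1, I2, I3)]

# Synthetic fringes with a known linear phase ramp
A, B = 2.0, 1.0
true_phi = [0.1 * n for n in range(10)]
steps = [[A + B * math.cos(p + k * math.pi / 2) for p in true_phi]
         for k in range(4)]
phi = phase_map(*steps)
print(max(abs(p - t) for p, t in zip(phi, true_phi)) < 1e-9)  # True
```

    The recovered phase is proportional to surface height in a moire projection setup; unwrapping and calibration (not shown) convert it to millimetres.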

  13. Identification of unique repeated patterns, location of mutation in DNA finger printing using artificial intelligence technique.

    PubMed

    Mukunthan, B; Nagaveni, N

    2014-01-01

    In genetic engineering, the conventional techniques and algorithms employed by forensic scientists to identify individuals on the basis of their DNA profiles involve complex computational steps and mathematical formulae, and identifying the location of a mutation in a genomic sequence remains an exigent laboratory task. The novel approach proposed here can address problems that have no algorithmic solution, or whose available solutions are too complex to find. Combining bioinformatics with neural network techniques results in an efficient DNA pattern analysis algorithm with high prediction accuracy.

  14. Techniques for spatio-temporal analysis of vegetation fires in the tropical belt of Africa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brivio, P.A.; Ober, G.; Koffi, B.

    1995-12-31

    Biomass burning of forests and savannas is a phenomenon of continental or even global proportions, capable of causing large scale environmental changes. Satellite space observations, in particular from NOAA-AVHRR GAC data, are the only source of information allowing one to document burning patterns at regional and continental scale and over long periods of time. This paper presents some techniques, such as clustering and rose-diagrams, useful in the spatio-temporal analysis of satellite-derived fire maps to characterize the evolution of spatial patterns of vegetation fires at regional scale. An automatic clustering approach is presented which enables one to describe and parameterize the spatial distribution of fire patterns at different scales. The problem of the geographical distribution of vegetation fires with respect to some location of interest, point or line, is also considered. In particular, rose-diagrams are used to relate fire patterns to some reference point, such as experimental sites of tropospheric chemistry measurements. Different temporal datasets covering the tropical belt of Africa, including both Northern and Southern Hemisphere dry seasons, were analyzed using these techniques and showed very promising results when compared with data from rain chemistry studies at different sampling sites in the equatorial forest.
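    A rose-diagram is essentially a histogram of compass bearings around a reference site. A schematic sketch, with made-up fire coordinates and an assumed clockwise-from-north sector convention:

```python
import math

def rose_sectors(points, ref, n_sectors=8):
    """Count locations per compass sector around a reference site.
    points: (x, y) map coordinates; ref: reference (x, y). Bearings
    are measured clockwise from north (the +y axis)."""
    counts = [0] * n_sectors
    width = 360.0 / n_sectors
    for x, y in points:
        bearing = math.degrees(math.atan2(x - ref[0], y - ref[1])) % 360.0
        counts[int(bearing // width) % n_sectors] += 1
    return counts

fires = [(1, 1), (2, 3), (-1, 2), (0, -5)]  # hypothetical fire pixels
print(rose_sectors(fires, ref=(0, 0), n_sectors=4))  # [2, 0, 1, 1]
```

    Plotting each sector count as a wedge of proportional radius yields the rose-diagram used to relate fire patterns to a measurement site.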

  15. Contrast-enhanced magnetic resonance imaging of pulmonary lesions: description of a technique aiming clinical practice.

    PubMed

    Koenigkam-Santos, Marcel; Optazaite, Elzbieta; Sommer, Gregor; Safi, Seyer; Heussel, Claus Peter; Kauczor, Hans-Ulrich; Puderbach, Michael

    2015-01-01

    To propose a technique for the evaluation of pulmonary lesions using contrast-enhanced MRI; to assess morphological patterns of enhancement and correlate quantitative analysis with histopathology. Thirty-six patients were prospectively studied. Volumetric-interpolated T1W images were obtained during consecutive breath holds after bolus-triggered contrast injection. The volume coverage of the first three acquisitions was limited (for higher temporal resolution), and the last acquisition was obtained at the 4th minute. Two radiologists individually evaluated the patterns of enhancement. Region-of-interest-based signal intensity (SI)-time curves were created to assess quantitative parameters. Readers agreed moderately to substantially concerning the lesions' enhancement pattern. SI-time curves could be created for all lesions. In comparison to benign lesions, malignant lesions showed higher values of maximum enhancement, early peak, slope and 4th-minute enhancement. An early peak >15% showed 100% sensitivity for detecting malignancy; maximum enhancement >40% showed 100% specificity. The proposed technique is robust, simple to perform and can be applied in a clinical scenario. It allows visual evaluation of enhancement pattern/progression together with the creation of SI-time curves and assessment of derived quantitative parameters. Perfusion analysis was highly sensitive for detecting malignancy, in accordance with what is recommended by the most recent guidelines on imaging evaluation of pulmonary lesions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  16. Pattern Analysis of Suicide Mortality Surveillance Data in Urban South Africa

    ERIC Educational Resources Information Center

    Burrows, Stephanie; Laflamme, Lucie

    2008-01-01

    The typical circumstances of suicide occurrence in post-apartheid urban South Africa are described. Data comprise suicide cases from all geographical locations (urban municipalities) where an injury surveillance system has full coverage. Typical patterns were identified by means of a classification technique applied to eight variables descriptive…

  17. Investigation of a Moire Based Crack Detection Technique for Propulsion Health Monitoring

    NASA Technical Reports Server (NTRS)

    Woike, Mark R.; Abudl-Aziz, Ali; Fralick, Gustave C.; Wrbanek, John D.

    2012-01-01

    The development of techniques for the health monitoring of the rotating components in gas turbine engines is of major interest to NASA's Aviation Safety Program. As part of this on-going effort several experiments utilizing a novel optical Moire-based concept along with external blade tip clearance and shaft displacement instrumentation were conducted on a simulated turbine engine disk as a means of demonstrating a potential optical crack detection technique. A Moire pattern results from the overlap of two repetitive patterns with slightly different periods. With this technique, it is possible to detect very small differences in spacing and hence radial growth in a rotating disk due to a flaw such as a crack. The experiment involved etching a circular reference pattern on a subscale engine disk that had a 50.8 mm (2 in.) long notch machined into it to simulate a crack. The disk was operated at speeds up to 12 000 rpm and the Moire pattern due to the shift with respect to the reference pattern was monitored as a means of detecting the radial growth of the disk due to the defect. In addition, blade displacement data were acquired using external blade tip clearance and shaft displacement sensors as a means of confirming the data obtained from the optical technique. The results of the crack detection experiments and the associated analysis are presented in this paper.

  18. Isoschizomers and amplified fragment length polymorphism for the detection of specific cytosine methylation changes.

    PubMed

    Ruiz-García, Leonor; Cabezas, Jose Antonio; de María, Nuria; Cervera, María-Teresa

    2010-01-01

    Different molecular techniques have been developed to study either the global level of methylated cytosines or methylation at specific gene sequences. One of them is a modification of the Amplified Fragment Length Polymorphism (AFLP) technique that has been used to study methylation of anonymous CCGG sequences in different fungi, plant and animal species. The main variation of this technique is based on the use of isoschizomers with different methylation sensitivity (such as HpaII and MspI) as a frequent cutter restriction enzyme. For each sample, AFLP analysis is performed using both EcoRI/HpaII and EcoRI/MspI digested samples. Comparative analysis between EcoRI/HpaII and EcoRI/MspI fragment patterns allows the identification of two types of polymorphisms: (1) "Methylation-insensitive polymorphisms" that show common EcoRI/HpaII and EcoRI/MspI patterns but are detected as polymorphic amplified fragments among samples; and (2) "Methylation-sensitive polymorphisms" that are associated with amplified fragments differing in their presence or absence or in their intensity between EcoRI/HpaII and EcoRI/MspI patterns. This chapter describes a detailed protocol of this technique and discusses modifications that can be applied to adjust the technology to different species of interest.

  19. Classification and Validation of Behavioral Subtypes of Learning-Disabled Children.

    ERIC Educational Resources Information Center

    Speece, Deborah L.; And Others

    1985-01-01

    Using the Classroom Behavior Inventory, teachers rated the behaviors of 63 school-identified, learning-disabled first and second graders. Hierarchical cluster analysis techniques identified seven distinct behavioral subtypes. Internal validation techniques indicated that the subtypes were replicable and had profile patterns different from a sample…

  20. Restoration of out-of-focus images based on circle of confusion estimate

    NASA Astrophysics Data System (ADS)

    Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto

    2002-11-01

    In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by typical CCD/CMOS sensors. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer pattern analysis; 2) image restoration. Blur estimation is based on a block-wise edge detection technique. This edge detection is carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm gives sharp images, reducing ringing and crisping artifacts over a wider region of frequencies. Experimental results show the effectiveness of the method, both subjectively and numerically, by comparison with other techniques found in the literature.
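    Extracting the green samples of the Bayer mosaic and measuring an edge statistic on them is the flavour of the blur-estimation step. A toy sketch; the GRBG layout and the mean-absolute-gradient sharpness measure are assumptions, not the paper's algorithm.

```python
def bayer_green_sharpness(bayer):
    """Take the green samples of a GRBG Bayer mosaic (green sits where
    row+col is even) and return the mean absolute horizontal gradient,
    a crude sharpness score: sharp edges give large gradients, defocus
    spreads them out and lowers the score."""
    H, W = len(bayer), len(bayer[0])
    green = [[bayer[r][c] for c in range(W) if (r + c) % 2 == 0]
             for r in range(H)]
    grad_sum = sum(abs(row[i + 1] - row[i])
                   for row in green for i in range(len(row) - 1))
    n = sum(len(row) - 1 for row in green)
    return grad_sum / n

sharp_edges = bayer_green_sharpness([[0, 0, 8, 8]] * 4)  # hard step edge
soft_edges = bayer_green_sharpness([[0, 2, 4, 6]] * 4)   # blurred ramp
print(sharp_edges > soft_edges)  # True
```

    A block-wise version of such a statistic, mapped to a circle-of-confusion radius, would drive the inverse filter described in the abstract.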

  1. Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Lauzon, N.; Lence, B. J.

    2002-12-01

    This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. 
Indeed, for the same database, different sets of clusters can be established with these calibration procedures. A simple method for analyzing uncertainties associated with the Kohonen neural network and fuzzy c-means is developed here. The method combines the results from several sets of clusters, either from the Kohonen neural network or fuzzy c-means, so as to provide an overall diagnosis as to the identification of outliers, shifts and trends. The results indicate an improvement in the performance for identifying anomalies when the method of combining cluster sets is used, compared with when only one cluster set is used.
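    A minimal fuzzy c-means, applied to a synthetic series with a planted shift, illustrates how clustering can flag such anomalies. This is a 1-D toy with deterministic initialization, not the calibration procedure studied in the paper; real hydrometric work would cluster richer feature vectors.

```python
def fuzzy_cmeans(data, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means. Returns (centers, memberships).
    Centers are initialized at spread-out order statistics so the
    demo is deterministic."""
    srt = sorted(data)
    centers = [srt[(len(srt) - 1) * j // max(c - 1, 1)] for j in range(c)]
    for _ in range(iters):
        U = []
        for x in data:
            # membership ~ inverse-distance weighting between centers
            d = [max(abs(x - v), 1e-12) for v in centers]
            row = [1.0 / sum((d[j] / d[k]) ** (2 / (m - 1)) for k in range(c))
                   for j in range(c)]
            U.append(row)
        # weighted-mean center update
        centers = [sum(U[i][j] ** m * data[i] for i in range(len(data))) /
                   sum(U[i][j] ** m for i in range(len(data)))
                   for j in range(c)]
    return centers, U

# A series with a sudden shift in level at index 10
series = [10.0] * 10 + [12.0] * 10
centers, U = fuzzy_cmeans(series)
labels = [max(range(2), key=row.__getitem__) for row in U]
shift_at = next(i for i in range(1, len(labels)) if labels[i] != labels[i - 1])
print(shift_at)  # 10
```

    Outliers would show up analogously as points with large distance to every cluster center, and trends as a gradual drift of memberships over time.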

  2. Most Frequent Errors in Judo Uki Goshi Technique and the Existing Relations among Them Analysed through T-Patterns

    PubMed Central

    Gutiérrez, Alfonso; Prieto, Iván; Cancela, José M.

    2009-01-01

    The purpose of this study is to provide a tool, based on the knowledge of technical errors, which helps to improve the teaching and learning process of the Uki Goshi technique. With this aim, we set out to determine the most frequent errors made by 44 students when performing this technique and how these mistakes relate. In order to do so, an observational analysis was carried out using the OSJUDO-UKG instrument and the data were registered using Match Vision Studio (Castellano, Perea, Alday and Hernández, 2008). The results, analyzed through descriptive statistics, show that the absence of a correct initial unbalancing movement (45.5%), the lack of proper right-arm pull (56.8%), not blocking the faller's body (Uke) against the thrower's hip -Tori- (54.5%) and throwing the Uke through the Tori's side (72.7%) are the most usual mistakes. Through the sequential analysis of T-Patterns obtained with the THÈME program (Magnusson, 1996, 2000) we have concluded that not blocking the body with the Tori's hip provokes the Uke's throw through the Tori's side during the final phase of the technique (95.8%), and positioning the right arm on the dorsal region of the Uke's back during the Tsukuri entails the absence of a subsequent pull of the Uke's body (73.3%). Key Points In this study, the most frequent errors in the performance of the Uki Goshi technique have been determined and the existing relations among these mistakes have been shown through T-Patterns. The SOBJUDO-UKG is an observation instrument for detecting mistakes in the aforementioned technique. The results show that those mistakes related to the initial unbalancing movement and the main driving action of the technique are the most frequent. The use of T-Patterns turns out to be effective in order to obtain the most important relations among the observed errors. PMID:24474885

  3. Spatial and temporal air quality pattern recognition using environmetric techniques: a case study in Malaysia.

    PubMed

    Syed Abdul Mutalib, Sharifah Norsukhairin; Juahir, Hafizan; Azid, Azman; Mohd Sharif, Sharifah; Latif, Mohd Talib; Aris, Ahmad Zaharin; Zain, Sharifuddin M; Dominick, Doreena

    2013-09-01

    The objective of this study is to identify spatial and temporal patterns in the air quality at three selected Malaysian air monitoring stations based on an eleven-year database (January 2000-December 2010). Four statistical methods, Discriminant Analysis (DA), Hierarchical Agglomerative Cluster Analysis (HACA), Principal Component Analysis (PCA) and Artificial Neural Networks (ANNs), were selected to analyze the datasets of five air quality parameters, namely: SO2, NO2, O3, CO and particulate matter with a diameter size of below 10 μm (PM10). The three selected air monitoring stations share the characteristic of being located in highly urbanized areas and are surrounded by a number of industries. The DA results show that spatial characterizations allow successful discrimination between the three stations, while HACA shows the temporal pattern from the monthly and yearly factor analysis which correlates with severe haze episodes that have happened in this country at certain periods of time. The PCA results show that the major source of air pollution is mostly due to the combustion of fossil fuel in motor vehicles and industrial activities. The spatial pattern recognition (S-ANN) results show a better prediction performance in discriminating between the regions, with an excellent percentage of correct classification compared to DA. This study presents the necessity and usefulness of environmetric techniques for the interpretation of large datasets aiming to obtain better information about air quality patterns based on spatial and temporal characterizations at the selected air monitoring stations.

  4. Phase and amplitude modification of a laser beam by two deformable mirrors using conventional 4f image encryption techniques

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Rzasa, John Robertson; Davis, Christopher C.

    2017-08-01

    The image encryption and decryption technique using lens components and random phase screens has attracted a great deal of research interest in the past few years. In general, the optical encryption technique can translate a positive image into an image with a nearly white speckle pattern that is impossible to decrypt directly. However, with the right keys, i.e. the conjugated random phase screens, the white-noise speckle pattern can be decoded into the original image. We find that the fundamental ideas in image encryption can be borrowed and applied to carry out beam corrections through turbulent channels. Based on our detailed analysis, we show that by using two deformable mirrors arranged in a similar fashion to the image encryption technique, a large number of controllable phase and amplitude distribution patterns can be generated from a collimated Gaussian beam. Such a result can be further coupled with wavefront sensing techniques to achieve laser beam correction against turbulence distortions. In application, our approach leads to a new type of phase conjugation mirror that could be beneficial for directed energy systems.
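    The underlying double random phase encoding scheme is easy to sketch with FFTs: one random phase mask in the input plane, one in the Fourier plane, and decryption with the conjugate key. A textbook sketch of the 4f encryption idea, not the authors' deformable-mirror setup:

```python
import numpy as np

def drpe_encrypt(img, m1, m2):
    """Double random phase encoding in a 4f geometry. img: real
    non-negative 2-D array; m1, m2: random phase masks in [0, 1)."""
    field = img * np.exp(2j * np.pi * m1)            # input-plane mask
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(2j * np.pi * m2))

def drpe_decrypt(cipher, m2):
    """Undo the Fourier-plane mask with its conjugate; the input-plane
    mask drops out when taking the modulus of a non-negative image."""
    return np.abs(np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-2j * np.pi * m2)))

rng = np.random.default_rng(42)
img = rng.random((32, 32))           # stand-in for a positive image
m1, m2 = rng.random((2, 32, 32))     # the two "keys"
cipher = drpe_encrypt(img, m1, m2)
recovered = drpe_decrypt(cipher, m2)
print(np.allclose(recovered, img))   # True
```

    The cipher field is speckle-like and statistically white; in the authors' scheme the two deformable mirrors play the role of the two (now controllable) phase screens.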

  5. Embedded expert system for space shuttle main engine maintenance

    NASA Technical Reports Server (NTRS)

    Pooley, J.; Thompson, W.; Homsley, T.; Teoh, W.; Jones, J.; Lewallen, P.

    1987-01-01

    The SPARTA Embedded Expert System (SEES) is an intelligent health monitoring system that directs analysis by placing confidence factors on possible engine status and then recommends a course of action to an engineer or engine controller. The technique can prevent catastrophic failures or costly rocket engine downtime due to false alarms. Further, the SEES has potential as an on-board flight monitor for reusable rocket engine systems. The SEES methodology synergistically integrates vibration analysis, pattern recognition and communications theory techniques with an artificial intelligence technique - the Embedded Expert System (EES).

  6. Health Lifestyles: Audience Segmentation Analysis for Public Health Interventions.

    ERIC Educational Resources Information Center

    Slater, Michael D.; Flora, June A.

    This paper is concerned with the application of market research techniques to segment large populations into homogeneous units in order to improve the reach, utilization, and effectiveness of health programs. The paper identifies seven distinctive patterns of health attitudes, social influences, and behaviors using cluster analytic techniques in a…

  7. A new technique for ordering asymmetrical three-dimensional data sets in ecology.

    PubMed

    Pavoine, Sandrine; Blondel, Jacques; Baguette, Michel; Chessel, Daniel

    2007-02-01

    The aim of this paper is to tackle the problem that arises from asymmetrical data cubes formed by two crossed factors fixed by the experimenter (factor A and factor B, e.g., sites and dates) and a factor which is not controlled for (the species). The entries of this cube are densities in species. We approach this kind of data by the comparison of patterns, that is to say by analyzing first the effect of factor B on the species-factor A pattern, and second the effect of factor A on the species-factor B pattern. The analysis of patterns instead of individual responses requires a correspondence analysis. We use a method we call Foucart's correspondence analysis to coordinate the correspondence analyses of several independent matrices of species x factor A (respectively B) type, corresponding to each modality of factor B (respectively A). Such coordination makes it possible to evaluate the effect of factor B (respectively A) on the species-factor A (respectively B) pattern. The results obtained by such a procedure are much more insightful than those resulting from a classical single correspondence analysis applied to the global matrix that is obtained by simply unrolling the data cube, juxtaposing for example the individual species x factor A matrices through modalities of factor B. This is because a single global correspondence analysis combines three effects of factors in a way that cannot be determined from factorial maps (factor A, factor B, and factor A x factor B interaction) whereas the applications of Foucart's correspondence analysis clearly discriminate two different issues. Using two data sets, we illustrate that this technique proves to be particularly powerful in the analyses of ecological convergence which include several distinct data sets and in the analyses of spatiotemporal variations of species distributions.

  8. Quartz preferred orientation in naturally deformed mylonitic rocks (Montalto shear zone-Italy): a comparison of results by different techniques, their advantages and limitations

    NASA Astrophysics Data System (ADS)

    Fazio, Eugenio; Punturo, Rosalda; Cirrincione, Rosolino; Kern, Hartmut; Pezzino, Antonino; Wenk, Hans-Rudolf; Goswami, Shalini; Mamtani, Manish A.

    2017-10-01

    In the geologic record, quartz c-axis patterns are widely adopted in the investigation of crystallographic preferred orientations (CPO) of naturally deformed rocks. To this aim, in the present work, four different methods for measuring quartz c-axis orientations in naturally sheared rocks were applied and compared: the classical universal stage technique, the computer-integrated polarization microscopy method (CIP), time-of-flight (TOF) neutron diffraction analysis, and electron backscatter diffraction (EBSD). Microstructural analysis and CPO patterns of quartz, together with those obtained for feldspars and micas in mylonitic granitoid rocks, have then been considered to solve structural and geological questions related to the Montalto crustal scale shear zone (Calabria, southern Italy). Results obtained by applying the different techniques are discussed, and the advantages as well as limitations of each method are highlighted. Importantly, our findings suggest that the patterns obtained by means of the different techniques are quite similar. In particular, for such mylonites, sub-simple shear (40% simple shear vs. 60% pure shear) was inferred from shape analysis of porphyroclasts. A general tendency towards an asymmetric c-maximum near the Z direction (normal to foliation), suggesting dominant basal slip, consistent with fabric patterns related to dynamic recrystallization under greenschist facies, is recognized. Rhombohedral slip was likely active, as documented by the pole figures of positive and negative rhombs (TOF), which also reveal potential mechanical Dauphiné twinning. Results showed that the most complete CPO characterization of deformed rocks is given by TOF (from which other quartz crystallographic axes can also be obtained, and various mineral phases may be investigated).
However, its use is restricted by the fact that (a) there are very few TOF facilities around the world and (b) any domainal reference is lost, since TOF is a bulk-type analysis. EBSD is a widely used technique that allows the user excellent microstructural control while covering a large number of investigated grains. CIP and universal stage (US) are inexpensive techniques compared with the other kinds of investigation, and even if they might be considered obsolete and/or time-consuming, they have the advantage of providing a fine, grain-by-grain "first round" inspection of the investigated rock fabric.

  9. Comparing rainfall patterns between regions in Peninsular Malaysia via a functional data analysis technique

    NASA Astrophysics Data System (ADS)

    Suhaila, Jamaludin; Jemain, Abdul Aziz; Hamdan, Muhammad Fauzee; Wan Zin, Wan Zawiah

    2011-12-01

Normally, rainfall data are collected on a daily, monthly or annual basis in the form of discrete observations. The aim of this study is to convert these rainfall values into a smooth curve or function that can represent the continuous rainfall process at each region via a technique known as functional data analysis. Since rainfall data show a periodic pattern in each region, the Fourier basis is introduced to capture these variations. Eleven basis functions with five harmonics are used to describe the unimodal rainfall pattern for stations in the East, while five basis functions representing two harmonics are needed to describe the rainfall pattern in the West. Based on the fitted smooth curve, the wet and dry periods as well as the maximum and minimum rainfall values can be determined. Different rainfall patterns are observed among the studied regions based on the smooth curve. Using functional analysis of variance, the test results indicated that significant differences exist in the functional means between the regions. The largest differences in the functional means are found between the East and Northwest regions; these differences are probably due to the effects of topography and geographical location, and are mostly influenced by the monsoons. Therefore, the same inputs or approaches might not be useful in modeling the hydrological process for different regions.
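
The core smoothing step, representing a periodic series with a truncated Fourier basis fitted by least squares, can be sketched as follows (a minimal illustration on synthetic monthly totals, not the study's data or code; note that a basis of 2k + 1 functions corresponds to k harmonics, matching the 11-basis/5-harmonic and 5-basis/2-harmonic counts above):

```python
import numpy as np

def fourier_design(t, period, n_harmonics):
    """Design matrix for a truncated Fourier basis: a constant plus one
    sine/cosine pair per harmonic (2*n_harmonics + 1 columns)."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2 * np.pi * k * t / period
        cols.append(np.sin(w))
        cols.append(np.cos(w))
    return np.column_stack(cols)

def fit_fourier_curve(t, y, period, n_harmonics):
    """Least-squares fit of the Fourier coefficients; returns a callable
    smooth curve that can be evaluated at any time point."""
    X = fourier_design(np.asarray(t, float), period, n_harmonics)
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return lambda tt: fourier_design(np.asarray(tt, float), period, n_harmonics) @ coef

# Synthetic monthly totals with one wet season (unimodal, period = 12 months)
t = np.arange(12.0)
y = 100 + 80 * np.sin(2 * np.pi * t / 12)
curve = fit_fourier_curve(t, y, period=12, n_harmonics=2)
```

From the fitted curve, wet and dry periods can be read off as the maxima and minima of `curve` over one period.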

  10. Empirical OPC rule inference for rapid RET application

    NASA Astrophysics Data System (ADS)

    Kulkarni, Anand P.

    2006-10-01

A given technological node (45 nm, 65 nm) can be expected to process thousands of individual designs. Iterative methods applied at the node consume valuable days in determining the proper placement of OPC features, and in manufacturing and testing mask correspondence to wafer patterns in a trial-and-error fashion for each design. Repeating this fabrication process for each individual design is time-consuming and expensive. We present a novel technique which sidesteps the requirement to iterate through the model-based OPC analysis and pattern verification cycle on subsequent designs at the same node. Our approach relies on inferring rules from a correct pattern at the wafer surface as it relates to the OPC and pre-OPC pattern layout files. We begin with an offline phase where we obtain a "gold standard" design file that has been fab-tested at the node, with a prepared, post-OPC layout file that corresponds to the intended on-wafer pattern. We then run an offline analysis to infer the rules to be used in this method. During the analysis, our method implicitly identifies contextual OPC strategies for optimal placement of RET features on any design at that node. Using these strategies, we can apply OPC to subsequent designs at the same node with accuracy comparable to the original design file but significantly smaller expected runtimes. The technique promises to offer a rapid and accurate complement to existing RET application strategies.

  11. Meteor tracking via local pattern clustering in spatio-temporal domain

    NASA Astrophysics Data System (ADS)

    Kukal, Jaromír.; Klimt, Martin; Švihlík, Jan; Fliegel, Karel

    2016-09-01

Reliable meteor detection is one of the crucial disciplines in astronomy. A variety of imaging systems is used for meteor path reconstruction. The traditional approach is based on analysis of 2D image sequences obtained from a double-station video observation system. Precise localization of the meteor path is difficult due to atmospheric turbulence and other factors causing spatio-temporal fluctuations of the image background. The proposed technique performs non-linear preprocessing of image intensity using the Box-Cox transform, as recommended in our previous work. Both symmetric and asymmetric spatio-temporal differences are designed to be robust in the statistical sense. The resulting local patterns are processed by a data whitening technique, and the obtained vectors are classified via cluster analysis and a Self-Organizing Map (SOM).
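
The two preprocessing steps named here, a Box-Cox intensity transform followed by whitening of the local pattern vectors, can be sketched as follows (illustrative only, on synthetic data; the paper's actual difference operators and clustering stages are not reproduced):

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox power transform of positive intensities; lam = 0 gives log."""
    x = np.asarray(x, float)
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def whiten(X):
    """ZCA whitening: center the pattern vectors, then rotate/scale so the
    empirical covariance becomes the identity matrix."""
    X = np.asarray(X, float)
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)
    W = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T  # symmetric whitening matrix
    return Xc @ W

# Synthetic correlated "local pattern" vectors standing in for image features
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ np.array([[2.0, 0, 0], [0.5, 1.0, 0], [0, 0, 3.0]])
Z = whiten(box_cox(np.abs(X) + 1.0, lam=0.5))
```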

  12. Elephants in space and time

    Treesearch

    Samuel A. Cushman; Michael Chase; Curtice Griffin

    2005-01-01

    Autocorrelation in animal movements can be both a serious nuisance to analysis and a source of valuable information about the scale and patterns of animal behavior, depending on the question and the techniques employed. In this paper we present an approach to analyzing the patterns of autocorrelation in animal movements that provides a detailed picture of seasonal...

  13. Pattern recognition and expert image analysis systems in biomedical image processing (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Oosterlinck, A.; Suetens, P.; Wu, Q.; Baird, M.; F. M., C.

    1987-09-01

This paper gives an overview of pattern recognition techniques (P.R.) used in biomedical image processing and problems related to the different P.R. solutions. The use of knowledge-based systems to overcome P.R. difficulties is also described. This is illustrated by a common example of a biomedical image processing application.

  14. Analysis of Acoustic Emission Parameters from Corrosion of AST Bottom Plate in Field Testing

    NASA Astrophysics Data System (ADS)

    Jomdecha, C.; Jirarungsatian, C.; Suwansin, W.

Field testing of aboveground storage tanks (AST) to monitor corrosion of the bottom plate is presented in this chapter. AE testing data from ten ASTs with different sizes, materials, and products were employed to monitor the bottom-plate condition. AE sensors of 30 and 150 kHz were used to monitor the corrosion activity on up to 24 channels, including guard sensors. Acoustic emission (AE) parameters were analyzed to explore the AE parameter patterns of occurring corrosion compared to laboratory results. Amplitude, count, duration, and energy were the main parameters analyzed. A pattern recognition technique combined with statistical analysis was implemented to eliminate electrical and environmental noise. The results showed specific AE patterns of corrosion activities related to the empirical results. In addition, a planar location algorithm was utilized to locate the significant AE events from corrosion. Both the parameter patterns and the AE event locations can be used to interpret and locate corrosion activities. Finally, a basic statistical grading technique was used to evaluate the bottom-plate condition of the ASTs.

  15. [Methods of a posteriori identification of food patterns in Brazilian children: a systematic review].

    PubMed

    Carvalho, Carolina Abreu de; Fonsêca, Poliana Cristina de Almeida; Nobre, Luciana Neri; Priore, Silvia Eloiza; Franceschini, Sylvia do Carmo Castro

    2016-01-01

The objective of this study is to provide guidance for identifying dietary patterns using the a posteriori approach, and to analyze the methodological aspects of the studies conducted in Brazil that identified the dietary patterns of children. Articles were selected from the Latin American and Caribbean Literature on Health Sciences, Scientific Electronic Library Online and PubMed databases. The key words were: Dietary pattern; Food pattern; Principal Components Analysis; Factor analysis; Cluster analysis; Reduced rank regression. We included studies that identified dietary patterns of children using the a posteriori approach. Seven studies published between 2007 and 2014 were selected, six of which were cross-sectional and one a cohort. Five studies used the food frequency questionnaire for dietary assessment; one used a 24-hour dietary recall and the other a food list. The exploratory method used in most publications was principal components factor analysis, followed by cluster analysis. The sample size of the studies ranged from 232 to 4231, the values of the Kaiser-Meyer-Olkin test from 0.524 to 0.873, and Cronbach's alpha from 0.51 to 0.69. Few Brazilian studies have identified dietary patterns of children using the a posteriori approach, and principal components factor analysis was the technique most used.

  16. Image analysis library software development

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Bryant, J.

    1977-01-01

    The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.

  17. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates.

    PubMed

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.
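
The local-similarity idea, scoring the best co-varying (and possibly time-shifted) subinterval rather than the whole series, can be illustrated with a simplified dynamic-programming sketch (positive associations only; the published eLSA algorithm, its permutation-based significance testing, and its replicate handling are not reproduced here):

```python
import numpy as np

def local_similarity(x, y, max_delay=3):
    """Simplified local similarity score (positive association only): the
    highest running sum of x[i]*y[j] along diagonals offset by at most
    `max_delay`, reset at zero like a local alignment. Both series are
    z-normalized first, and the score is scaled by the series length."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    P = np.zeros((n + 1, n + 1))
    best = 0.0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            if abs(i - j) <= max_delay:
                P[i, j] = max(0.0, P[i - 1, j - 1] + x[i - 1] * y[j - 1])
                best = max(best, P[i, j])
    return best / n

# Series b lags series a by two steps; ordinary correlation at lag 0 misses this
rng = np.random.default_rng(1)
a = rng.normal(size=60)
b = np.roll(a, 2) + 0.1 * rng.normal(size=60)
s_matched = local_similarity(a, b)
s_random = local_similarity(a, rng.normal(size=60))
```

The time-delayed pair scores far higher than an unrelated series, which is exactly the kind of association ordinary zero-lag correlation cannot surface.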

  18. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates

    PubMed Central

    2011-01-01

Background The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. Results We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. Conclusions The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa. PMID:22784572

  19. Gait Analysis Laboratory

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Complete motion analysis laboratory has evolved out of analyzing walking patterns of crippled children at Stanford Children's Hospital. Data is collected by placing tiny electrical sensors over muscle groups of child's legs and inserting step-sensing switches in soles of shoes. Miniature radio transmitters send signals to receiver for continuous recording of abnormal walking pattern. Engineers are working to apply space electronics miniaturization techniques to reduce size and weight of telemetry system further as well as striving to increase signal bandwidth so analysis can be performed faster and more accurately using a mini-computer.

  20. A preliminary computer pattern analysis of satellite images of mature extratropical cyclones

    NASA Technical Reports Server (NTRS)

    Burfeind, Craig R.; Weinman, James A.; Barkstrom, Bruce R.

    1987-01-01

    This study has applied computerized pattern analysis techniques to the location and classification of features of several mature extratropical cyclones that were depicted in GOES satellite images. These features include the location of the center of the cyclone vortex core and the location of the associated occluded front. The cyclone type was classified in accord with the scheme of Troup and Streten. The present analysis was implemented on a personal computer; results were obtained within approximately one or two minutes without the intervention of an analyst.

  1. Sinusoidal modulation analysis for optical system MTF measurements.

    PubMed

    Boone, J M; Yu, T; Seibert, J A

    1996-12-01

The modulation transfer function (MTF) is a commonly used metric for defining the spatial resolution characteristics of imaging systems. While the MTF is defined in terms of how an imaging system demodulates the amplitude of a sinusoidal input, this approach has not been in general use to measure MTFs in the medical imaging community because producing sinusoidal x-ray patterns is technically difficult. However, for optical systems such as charge coupled devices (CCD), which are rapidly becoming a part of many medical digital imaging systems, the direct measurement of modulation at discrete spatial frequencies using a sinusoidal test pattern is practical. A commercially available optical test pattern containing spatial frequencies ranging from 0.375 cycles/mm to 80 cycles/mm was used to determine the MTF of a CCD-based optical system. These results were compared with the angulated slit method of Fujita [H. Fujita, D. Tsai, T. Itoh, K. Doi, J. Morishita, K. Ueda, and A. Ohtsuka, "A simple method for determining the modulation transfer function in digital radiography," IEEE Trans. Medical Imaging 11, 34-39 (1992)]. The use of a semiautomated profiled iterated reconstruction technique (PIRT) is introduced, where the shift factor between successive pixel rows (due to angulation) is optimized iteratively by least-squares error analysis rather than by hand measurement of the slit angle. PIRT was used to find the slit angle for the Fujita technique and to find the sine-pattern angle for the sine-pattern technique. Computer simulation of PIRT for the case of the slit image (a line spread function) demonstrated that it produced a more accurate angle determination than "hand" measurement, and there is a significant difference between the errors in the two techniques (Wilcoxon Signed Rank Test, p < 0.001). The sine-pattern method and the Fujita slit method produced comparable MTF curves for the CCD camera evaluated.
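
The sine-pattern measurement itself reduces to reading off the modulation of each imaged sinusoid and dividing by the known input modulation of the test pattern; a toy numerical sketch (synthetic profiles and an assumed input modulation, not the paper's data):

```python
import numpy as np

def modulation(profile):
    """Modulation (Michelson contrast) of a sinusoidal intensity profile:
    (max - min) / (max + min)."""
    p = np.asarray(profile, float)
    return (p.max() - p.min()) / (p.max() + p.min())

def mtf_from_sine_patterns(profiles, m_input):
    """MTF value at each test frequency: measured output modulation
    divided by the input modulation of the printed sine pattern."""
    return [modulation(p) / m_input for p in profiles]

# Simulated imaged sine patterns: blur leaves the low frequency nearly
# untouched (MTF ~ 1) but attenuates the high frequency (MTF ~ 0.25)
x = np.linspace(0, 1, 1000)
low = 0.5 + 0.4 * np.sin(2 * np.pi * 5 * x)    # output modulation ~ 0.8
high = 0.5 + 0.1 * np.sin(2 * np.pi * 40 * x)  # output modulation ~ 0.2
mtf = mtf_from_sine_patterns([low, high], m_input=0.8)
```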

  2. Wave Propagation Measurements on Two-Dimensional Lattice.

    DTIC Science & Technology

    1985-09-15

    of boundaries, lattice member connectivities, and structural defects on these parameters. Perhaps, statistical energy analysis or pattern recognition techniques would also be of benefit in such efforts.

  3. HPLC fingerprint analysis combined with chemometrics for pattern recognition of ginger.

    PubMed

    Feng, Xu; Kong, Weijun; Wei, Jianhe; Ou-Yang, Zhen; Yang, Meihua

    2014-03-01

Ginger, the fresh rhizome of Zingiber officinale Rosc. (Zingiberaceae), has been used worldwide; however, for a long time there has been no internationally approved standard for its quality control. The objective was to establish an effective combined method and pattern recognition technique for the quality control of ginger. A simple, accurate and reliable method based on high-performance liquid chromatography with photodiode array (HPLC-PDA) detection was developed for establishing the chemical fingerprints of 10 batches of ginger from different markets in China. The method was validated in terms of precision, reproducibility and stability, and the relative standard deviations were all less than 1.57%. On the basis of this method, the fingerprints of the 10 batches of ginger samples were obtained, which showed 16 common peaks. Coupled with similarity evaluation software, the similarities between each fingerprint of the sample and the simulative mean chromatogram were in the range of 0.998-1.000. Then, chemometric techniques including similarity analysis, hierarchical clustering analysis and principal component analysis were applied to classify the ginger samples. Consistent results were obtained showing that the ginger samples could be successfully classified into two groups. This study revealed that the HPLC-PDA method was simple, sensitive and reliable for fingerprint analysis and, moreover, for pattern recognition and quality control of ginger.
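
The similarity-to-mean computation reported here (each sample fingerprint versus the simulative mean chromatogram) is commonly a congruence (cosine) coefficient over the common-peak areas; a hedged sketch on synthetic peak areas, not the study's chromatograms or software:

```python
import numpy as np

def congruence(a, b):
    """Congruence (cosine) coefficient between two fingerprint vectors of
    common-peak areas; 1.0 means identical relative peak profiles."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def similarity_to_mean(fingerprints):
    """Similarity of each sample fingerprint to the simulative mean
    chromatogram (the element-wise mean of all fingerprints)."""
    F = np.asarray(fingerprints, float)
    mean = F.mean(axis=0)
    return [congruence(row, mean) for row in F]

# Ten synthetic batches sharing 16 common peaks with small batch variation
rng = np.random.default_rng(7)
base = rng.uniform(1.0, 10.0, size=16)
batches = [base * (1 + 0.01 * rng.normal(size=16)) for _ in range(10)]
sims = similarity_to_mean(batches)
```

Because the coefficient is scale-invariant, batches with the same relative peak profile score near 1.0 regardless of injection amount.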

  4. Principal component analysis of the cytokine and chemokine response to human traumatic brain injury.

    PubMed

    Helmy, Adel; Antoniades, Chrystalina A; Guilfoyle, Mathew R; Carpenter, Keri L H; Hutchinson, Peter J

    2012-01-01

There is a growing realisation that neuro-inflammation plays a fundamental role in the pathology of Traumatic Brain Injury (TBI). This has led to the search for biomarkers that reflect these underlying inflammatory processes using techniques such as cerebral microdialysis. The interpretation of such biomarker data has been limited by the statistical methods used. When analysing data of this sort, the multiple putative interactions between mediators need to be considered, as well as the timing of production and the high degree of statistical covariance in the levels of these mediators. Here we present a cytokine and chemokine dataset from human brain following traumatic brain injury, and use principal component analysis and partial least squares discriminant analysis to demonstrate the pattern of production following TBI, distinct phases of the humoral inflammatory response, and the differing patterns of response in brain and in peripheral blood. This technique has the added advantage of making no assumptions about the Relative Recovery (RR) of microdialysis-derived parameters. Taken together, these techniques can be used on complex microdialysis datasets to summarise the data succinctly and generate hypotheses for future study.
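
PCA on such a strongly co-varying mediator panel amounts to an SVD of the centered data matrix: a few components summarise many correlated measurements. A small synthetic sketch (hypothetical latent-factor data, not the study's measurements):

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the centered data matrix: returns component scores
    and the fraction of total variance explained by each component."""
    X = np.asarray(X, float)
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = s**2 / np.sum(s**2)
    return Xc @ Vt[:n_components].T, var[:n_components]

# Twelve measured "mediators" driven by two latent processes plus noise,
# mimicking a panel of co-regulated cytokines and chemokines
rng = np.random.default_rng(3)
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 12))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 12))
scores, explained = pca(X, n_components=2)
```

Two components recover nearly all the variance of the twelve-variable panel, which is the succinct summary the abstract describes.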

  5. Brazilian Road Traffic Fatalities: A Spatial and Environmental Analysis

    PubMed Central

    de Andrade, Luciano; Vissoci, João Ricardo Nickenig; Rodrigues, Clarissa Garcia; Finato, Karen; Carvalho, Elias; Pietrobon, Ricardo; de Souza, Eniuce Menezes; Nihei, Oscar Kenji; Lynch, Catherine; de Barros Carvalho, Maria Dalva

    2014-01-01

Background Road traffic injuries (RTI) are a major public health epidemic killing thousands of people daily. Low and middle-income countries, such as Brazil, have the highest annual rates of road traffic fatalities. In order to improve road safety, this study mapped road traffic fatalities on a Brazilian highway to determine the main environmental factors affecting road traffic fatalities. Methods and Findings Four techniques were utilized to identify and analyze RTI hotspots. We used spatial analysis by points by applying a kernel density estimator, and wavelet analysis to identify the main hot regions. Additionally, built environment analysis and principal component analysis were conducted to verify patterns contributing to crash occurrence in the hotspots. Between 2007 and 2009, 379 crashes were notified, with 466 fatalities on BR277. A higher incidence of crashes occurred on sections of highway with double lanes (ratio 2:1). The hotspot analysis demonstrated that both the eastern and western regions had higher incidences of crashes when compared to the central region. Through the built environment analysis, we have identified five different patterns, demonstrating that specific environmental characteristics are associated with different types of fatal crashes. Patterns 2 and 4 are constituted mainly by predominantly urban characteristics and have frequent fatal pedestrian crashes. Patterns 1, 3 and 5 display mainly rural characteristics and have a higher prevalence of vehicular collisions. In the built environment analysis, the variables length of road in urban area, limited lighting, double-lane roadways, and fewer auxiliary lanes were associated with a higher incidence of fatal crashes. Conclusions By combining different techniques of analysis, we have identified numerous hotspots and environmental characteristics, which governmental or regulatory agencies could use to plan strategies to reduce RTI and support life-saving policies. PMID:24498051
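
The kernel-density hotspot step can be illustrated in one dimension along the highway axis (synthetic crash positions on a hypothetical kilometre scale, not the BR277 data; the study's analysis is spatial and more elaborate):

```python
import numpy as np

def kernel_density(crash_km, grid_km, bandwidth):
    """Gaussian kernel density estimate of crash intensity along a
    one-dimensional highway axis (positions in km)."""
    crash_km = np.asarray(crash_km, float)[:, None]
    grid = np.asarray(grid_km, float)[None, :]
    z = (grid - crash_km) / bandwidth
    k = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    return k.sum(axis=0) / (len(crash_km) * bandwidth)

# Two synthetic crash clusters, near km 50 (larger) and km 300 (smaller)
rng = np.random.default_rng(5)
crashes = np.concatenate([50 + 5 * rng.normal(size=120),
                          300 + 5 * rng.normal(size=60)])
grid = np.linspace(0, 400, 801)          # 0.5 km evaluation grid
density = kernel_density(crashes, grid, bandwidth=10.0)
hotspot_km = grid[np.argmax(density)]    # location of the strongest hotspot
```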

  6. Spatial patterns of soil moisture connected to monthly-seasonal precipitation variability in a monsoon region

    Treesearch

    Yongqiang Liu

    2003-01-01

The relations between monthly-seasonal soil moisture and precipitation variability are investigated by identifying the coupled patterns of the two hydrological fields using singular value decomposition (SVD). SVD is a technique of principal component analysis similar to empirical orthogonal functions (EOF). However, it is applied to two variables simultaneously and is...
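
Applied to two fields at once, SVD (maximum covariance) analysis decomposes their cross-covariance matrix into paired patterns whose expansion coefficients co-vary maximally; a synthetic sketch with one shared mode (illustrative field sizes, not the study's grids):

```python
import numpy as np

def svd_coupled_patterns(X, Y, n_modes=1):
    """Maximum covariance analysis: SVD of the cross-covariance matrix of
    two anomaly fields yields paired spatial patterns whose expansion
    coefficients have maximal covariance."""
    Xa = X - X.mean(axis=0)
    Ya = Y - Y.mean(axis=0)
    C = Xa.T @ Ya / (len(X) - 1)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    return U[:, :n_modes], Vt[:n_modes].T, s[:n_modes]

# Soil-moisture (8 points) and precipitation (6 points) anomaly fields
# sharing a single coupled mode driven by one time series
rng = np.random.default_rng(11)
t = rng.normal(size=(300, 1))
px = rng.normal(size=(1, 8))   # true soil-moisture pattern
py = rng.normal(size=(1, 6))   # true precipitation pattern
X = t @ px + 0.1 * rng.normal(size=(300, 8))
Y = t @ py + 0.1 * rng.normal(size=(300, 6))
u, v, s = svd_coupled_patterns(X, Y)
```

The leading singular vectors recover the planted patterns up to sign, which is how coupled soil-moisture/precipitation modes are identified.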

  7. Integrating Remote Sensing Data with Directional Two- Dimensional Wavelet Analysis and Open Geospatial Techniques for Efficient Disaster Monitoring and Management.

    PubMed

    Lin, Yun-Bin; Lin, Yu-Pin; Deng, Dong-Po; Chen, Kuan-Wei

    2008-02-19

In Taiwan, earthquakes have long been recognized as a major cause of landslides, which are widespread and are followed by floods brought by typhoons. Distinguishing between landslide spatial patterns in different disturbance regimes is fundamental for disaster monitoring, management, and land-cover restoration. To circumscribe landslides, this study adopts the normalized difference vegetation index (NDVI), which can be determined by simply applying mathematical operations to near-infrared and visible-red spectral data immediately after remotely sensed data are acquired. In real-time disaster monitoring, the NDVI is more effective than using land-cover classifications generated from remotely sensed data, as land-cover classification tasks are extremely time-consuming. Directional two-dimensional (2D) wavelet analysis has an advantage over traditional spectrum analysis in that it determines localized variations along a specific direction when identifying dominant modes of change, and where those modes are located in multi-temporal remotely sensed images. Open geospatial techniques comprise a series of solutions developed based on Open Geospatial Consortium specifications that can be applied to encode data for interoperability and develop an open geospatial service for sharing data. This study presents a novel approach and framework that uses directional 2D wavelet analysis of real-time NDVI images to effectively identify landslide patterns and share the resulting patterns via open geospatial techniques. As a case study, this study analyzed NDVI images derived from SPOT HRV images before and after the Chi-Chi earthquake (7.3 on the Richter scale) that hit the Chenyulan basin in Taiwan, as well as images after two large typhoons (Xangsane and Toraji), to delineate the spatial patterns of landslides caused by major disturbances. Disturbed spatial patterns of landslides that followed these events were successfully delineated using 2D wavelet analysis, and the results of landslide pattern recognition were distributed simultaneously to other agents using Geography Markup Language. Real-time information allows successive platforms (agents) to work with local geospatial data for disaster management. Furthermore, the proposed approach is suitable for detecting landslides in various regions on continental, regional, and local scales using remotely sensed data at various resolutions derived from SPOT HRV, IKONOS, and QuickBird multispectral images.
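
The NDVI itself is the simple band ratio (NIR - red)/(NIR + red), and landslide scars show up as a drop in NDVI between acquisition dates; a minimal sketch with made-up reflectance values (the wavelet and GML stages are not reproduced):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and
    visible-red reflectance; guards against zero denominators."""
    nir = np.asarray(nir, float)
    red = np.asarray(red, float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Two pixels: vegetation keeps high NIR / low red; a fresh landslide scar
# loses that contrast after the event
before = ndvi(np.array([0.5, 0.5]), np.array([0.1, 0.1]))
after = ndvi(np.array([0.5, 0.3]), np.array([0.1, 0.3]))
scar = (before - after) > 0.3   # simple change mask flagging vegetation loss
```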

  8. Pattern Recognition Using Artificial Neural Network: A Review

    NASA Astrophysics Data System (ADS)

    Kim, Tai-Hoon

Among the various frameworks in which pattern recognition has been traditionally formulated, the statistical approach has been most intensively studied and used in practice. More recently, artificial neural network techniques have been receiving increasing attention. The design of a recognition system requires careful attention to the following issues: definition of pattern classes, sensing environment, pattern representation, feature extraction and selection, cluster analysis, classifier design and learning, selection of training and test samples, and performance evaluation. In spite of almost 50 years of research and development in this field, the general problem of recognizing complex patterns with arbitrary orientation, location, and scale remains unsolved. New and emerging applications, such as data mining, web searching, retrieval of multimedia data, face recognition, and cursive handwriting recognition, require robust and efficient pattern recognition techniques. The objective of this review paper is to summarize and compare some of the well-known methods used in various stages of a pattern recognition system using ANN and to identify research topics and applications which are at the forefront of this exciting and challenging field.

  9. Assessment of water quality monitoring for the optimal sensor placement in lake Yahuarcocha using pattern recognition techniques and geographical information systems.

    PubMed

    Jácome, Gabriel; Valarezo, Carla; Yoo, Changkyoo

    2018-03-30

    Pollution and the eutrophication process are increasing in lake Yahuarcocha and constant water quality monitoring is essential for a better understanding of the patterns occurring in this ecosystem. In this study, key sensor locations were determined using spatial and temporal analyses combined with geographical information systems (GIS) to assess the influence of weather features, anthropogenic activities, and other non-point pollution sources. A water quality monitoring network was established to obtain data on 14 physicochemical and microbiological parameters at each of seven sample sites over a period of 13 months. A spatial and temporal statistical approach using pattern recognition techniques, such as cluster analysis (CA) and discriminant analysis (DA), was employed to classify and identify the most important water quality parameters in the lake. The original monitoring network was reduced to four optimal sensor locations based on a fuzzy overlay of the interpolations of concentration variations of the most important parameters.

  10. Streamflow characterization using functional data analysis of the Potomac River

    NASA Astrophysics Data System (ADS)

    Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.

    2013-12-01

    Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding has caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics in order to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.

  11. An analysis of fracture trace patterns in areas of flat-lying sedimentary rocks for the detection of buried geologic structure. [Kansas and Texas

    NASA Technical Reports Server (NTRS)

    Podwysocki, M. H.

    1974-01-01

    Two study areas in a cratonic platform underlain by flat-lying sedimentary rocks were analyzed to determine if a quantitative relationship exists between fracture trace patterns and their frequency distributions and subsurface structural closures which might contain petroleum. Fracture trace lengths and frequency (number of fracture traces per unit area) were analyzed by trend surface analysis and length frequency distributions also were compared to a standard Gaussian distribution. Composite rose diagrams of fracture traces were analyzed using a multivariate analysis method which grouped or clustered the rose diagrams and their respective areas on the basis of the behavior of the rays of the rose diagram. Analysis indicates that the lengths of fracture traces are log-normally distributed according to the mapping technique used. Fracture trace frequency appeared higher on the flanks of active structures and lower around passive reef structures. Fracture trace log-mean lengths were shorter over several types of structures, perhaps due to increased fracturing and subsequent erosion. Analysis of rose diagrams using a multivariate technique indicated lithology as the primary control for the lower grouping levels. Groupings at higher levels indicated that areas overlying active structures may be isolated from their neighbors by this technique while passive structures showed no differences which could be isolated.

  12. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as the time series of global terrestrial water storage change or sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In the last decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and more recently independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables even after removing cyclic components, e.g. the seasonal cycles. In this paper, we present a new decomposition method, complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i). (iii) Dominant non-stationary patterns are recognized as independent complex patterns that can be used to represent the space and time amplitude and phase propagations. We present the results of CICA on simulated and real cases, e.g. for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. Forootan (PhD-2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD Thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm Forootan and Kusche (JoG-2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86 (7), 477-497, doi: 10.1007/s00190-011-0532-5
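Step (i), forming the complex data set via a Hilbert transformation, can be sketched as below. The analytic-signal construction (suppress negative frequencies, invert) is standard; the naive O(N²) DFT is used only to keep the sketch self-contained, and the test signal is a pure cosine whose Hilbert transform is the corresponding sine.

```python
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def analytic_signal(x):
    """Complex series: real part is x, imaginary part is the Hilbert
    transform of x (obtained by zeroing negative frequencies)."""
    N = len(x)
    X = dft(x)
    H = [0.0] * N
    H[0] = 1.0
    if N % 2 == 0:
        H[N // 2] = 1.0
        for k in range(1, N // 2):
            H[k] = 2.0
    else:
        for k in range(1, (N + 1) // 2):
            H[k] = 2.0
    return idft([X[k] * H[k] for k in range(N)])

# a pure cosine: its Hilbert transform is the corresponding sine
N = 64
x = [math.cos(2 * math.pi * 4 * n / N) for n in range(N)]
z = analytic_signal(x)
```

The complex series `z` is what the fourth-order-cumulant ICA of step (ii) would then decompose; that step is beyond a short sketch.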

  13. Configural Frequency Analysis as a Statistical Tool for Developmental Research.

    ERIC Educational Resources Information Center

    Lienert, Gustav A.; Oeveste, Hans Zur

    1985-01-01

    Configural frequency analysis (CFA) is suggested as a technique for longitudinal research in developmental psychology. Stability and change in answers to multiple choice and yes-no item patterns obtained with repeated measurements are identified by CFA and illustrated by developmental analysis of an item from Gorham's Proverb Test. (Author/DWH)
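The core CFA computation, comparing observed configuration frequencies with expectations under item independence, can be sketched as follows. The yes/no answer data are fabricated, and the simple z approximation stands in for the exact binomial test usually used; large positive z flags a "type", large negative an "antitype".

```python
import math
from collections import Counter
from itertools import product

def cfa(records):
    """First-order configural frequency analysis for binary items."""
    n = len(records)
    k = len(records[0])
    obs = Counter(records)
    # marginal probability of a 1 for each item
    p = [sum(r[j] for r in records) / n for j in range(k)]
    results = {}
    for config in product((0, 1), repeat=k):
        e = n
        for j, bit in enumerate(config):
            e *= p[j] if bit else 1 - p[j]
        o = obs.get(config, 0)
        z = (o - e) / math.sqrt(e * (1 - e / n)) if e > 0 else float('nan')
        results[config] = (o, round(e, 2), round(z, 2))
    return results

# hypothetical answers to three yes/no items, with consistent response
# patterns (1,1,1) and (0,0,0) over-represented across repeated measurements
records = [(1, 1, 1)] * 30 + [(0, 0, 0)] * 30 + \
          [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0), (0, 1, 1), (1, 0, 1)] * 5
results = cfa(records)
```

Here the two consistent configurations come out as types (z ≈ 6), while every mixed configuration is mildly under-represented.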

  14. Atlas of computerized blood flow analysis in bone disease.

    PubMed

    Gandsman, E J; Deutsch, S D; Tyson, I B

    1983-11-01

    The role of computerized blood flow analysis in routine bone scanning is reviewed. Cases illustrating the technique include proven diagnoses of toxic synovitis, Legg-Perthes disease, arthritis, avascular necrosis of the hip, fractures, benign and malignant tumors, Paget's disease, cellulitis, osteomyelitis, and shin splints. Several examples also show the use of the technique in monitoring treatment. The use of quantitative data from the blood flow, bone uptake phase, and static images suggests specific diagnostic patterns for each of the diseases presented in this atlas. Thus, this technique enables increased accuracy in the interpretation of the radionuclide bone scan.

  15. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature and promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.
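Two of the simplest domains in that classification can be illustrated directly: a statistical measure (coefficient of variation) and an informational one (Shannon entropy of the binned amplitude distribution). The heart-rate-like series below are fabricated for the example.

```python
import math
from statistics import mean, stdev
from collections import Counter

def coeff_variation(series):
    """Statistical-domain measure: dispersion relative to the mean."""
    return stdev(series) / mean(series)

def shannon_entropy(series, bins=8):
    """Informational-domain measure: entropy (bits) of the binned
    amplitude distribution."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0
    counts = Counter(min(int((x - lo) / width), bins - 1) for x in series)
    n = len(series)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# fabricated heart-rate-like series: constant vs. varying
flat = [60.0] * 256
varied = [60 + 5 * math.sin(i / 3) + (i % 7) for i in range(256)]
```

A constant signal scores zero on both measures; any physiological variability raises both, though the two domains capture different aspects of it.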

  16. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature and promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  17. Methods for trend analysis: Examples with problem/failure data

    NASA Technical Reports Server (NTRS)

    Church, Curtis K.

    1989-01-01

    Statistics play an important role in quality control and reliability. Consequently, the NASA standard on Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of the working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard, along with some different techniques, and to identify patterns in the data. Techniques used for trend estimation are: regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
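Kendall's rank correlation coefficient, one of the trend techniques named above, is easy to sketch: correlate the series against time by counting concordant and discordant pairs. The monthly problem-report counts are invented, and ties are simply not counted (the tau-a convention).

```python
def kendall_tau(series):
    """Kendall's rank correlation of a series against time (the statistic
    behind the Mann-Kendall trend test): tau near +1 means a rising trend,
    near -1 a falling one, near 0 no monotonic trend."""
    n = len(series)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            if series[j] > series[i]:
                concordant += 1
            elif series[j] < series[i]:
                discordant += 1
    pairs = n * (n - 1) / 2
    return (concordant - discordant) / pairs

# hypothetical monthly problem-report counts
rising = [2, 3, 3, 5, 6, 8, 9, 12]
noisy_flat = [4, 6, 3, 5, 4, 6, 3, 5]
```

Being rank-based, tau is robust to the small samples and occasional zero counts the abstract mentions, which is one reason it appears alongside the regression fits.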

  18. Array magnetics modal analysis for the DIII-D tokamak based on localized time-series modelling

    DOE PAGES

    Olofsson, K. Erik J.; Hanson, Jeremy M.; Shiraki, Daisuke; ...

    2014-07-14

    Here, time-series analysis of magnetics data in tokamaks is typically done using block-based fast Fourier transform methods. This work presents the development and deployment of a new set of algorithms for magnetic probe array analysis. The method is based on an estimation technique known as stochastic subspace identification (SSI). Compared with the standard coherence approach or the direct singular value decomposition approach, the new technique exhibits several beneficial properties. For example, the SSI method does not require that frequencies are orthogonal with respect to the timeframe used in the analysis. Frequencies are obtained directly as parameters of localized time-series models. The parameters are extracted by solving small-scale eigenvalue problems. Applications include maximum-likelihood regularized eigenmode pattern estimation, detection of neoclassical tearing modes (including locked mode precursors), automatic clustering of modes, and magnetics-pattern characterization of sawtooth pre- and postcursors, edge harmonic oscillations and fishbones.

  19. Fast discrimination of traditional Chinese medicine according to geographical origins with FTIR spectroscopy and advanced pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Li, Ning; Wang, Yan; Xu, Kexin

    2006-08-01

    Combined with Fourier transform infrared (FTIR) spectroscopy and three kinds of pattern recognition techniques, 53 traditional Chinese medicine danshen samples were rapidly discriminated according to geographical origins. The results showed that discrimination using FTIR spectroscopy, ascertained by principal component analysis (PCA), was feasible. An effective model was built by employing Soft Independent Modeling of Class Analogy (SIMCA) and PCA, and 82% of the samples were discriminated correctly. Through use of the artificial neural network (ANN)-based back propagation (BP) network, the origins of danshen were completely classified.

  20. Using factor analysis to identify neuromuscular synergies during treadmill walking

    NASA Technical Reports Server (NTRS)

    Merkle, L. A.; Layne, C. S.; Bloomberg, J. J.; Zhang, J. J.

    1998-01-01

    Neuroscientists are often interested in grouping variables to facilitate understanding of a particular phenomenon. Factor analysis is a powerful statistical technique that groups variables into conceptually meaningful clusters, but remains underutilized by neuroscience researchers presumably due to its complicated concepts and procedures. This paper illustrates an application of factor analysis to identify coordinated patterns of whole-body muscle activation during treadmill walking. Ten male subjects walked on a treadmill (6.4 km/h) for 20 s during which surface electromyographic (EMG) activity was obtained from the left side sternocleidomastoid, neck extensors, erector spinae, and right side biceps femoris, rectus femoris, tibialis anterior, and medial gastrocnemius. Factor analysis revealed that 65% of the variance of the seven muscles sampled aligned with two orthogonal factors, labeled 'transition control' and 'loading'. These two factors describe coordinated patterns of muscular activity across body segments that would not be evident by evaluating individual muscle patterns. The results show that factor analysis can be effectively used to explore relationships among muscle patterns across all body segments to increase understanding of the complex coordination necessary for smooth and efficient locomotion. We encourage neuroscientists to consider using factor analysis to identify coordinated patterns of neuromuscular activation that would be obscured using more traditional EMG analyses.
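The grouping idea can be illustrated with a much cruder stand-in for factor analysis: correlate the muscle signals, then split them by the sign of their loadings on the leading eigenvector (power iteration). This is not the rotated two-factor solution of the paper; the EMG envelopes, latent factor, and noise level are all fabricated.

```python
import math
import random
from statistics import mean

def correlation_matrix(signals):
    """Pairwise Pearson correlations of the EMG envelopes."""
    def corr(a, b):
        ma, mb = mean(a), mean(b)
        num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        den = math.sqrt(sum((u - ma) ** 2 for u in a) *
                        sum((v - mb) ** 2 for v in b))
        return num / den
    k = len(signals)
    return [[corr(signals[i], signals[j]) for j in range(k)] for i in range(k)]

def split_by_leading_factor(R, iters=200):
    """Power iteration for the leading eigenvector of R; the sign of each
    loading assigns the muscle to one of two groups."""
    k = len(R)
    v = [1.0] * k
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(k)) for i in range(k)]
        n = math.sqrt(sum(x * x for x in w))
        v = [x / n for x in w]
    return [1 if x >= 0 else 0 for x in v]

# fabricated EMG envelopes: muscles 0-2 follow a latent factor, 3-5 oppose it
random.seed(3)
factor = [random.gauss(0, 1) for _ in range(200)]
muscles = [[f + 0.2 * random.gauss(0, 1) for f in factor] for _ in range(3)] + \
          [[-f + 0.2 * random.gauss(0, 1) for f in factor] for _ in range(3)]
groups = split_by_leading_factor(correlation_matrix(muscles))
```

The sketch recovers the two built-in muscle groups (up to a global sign flip); real factor analysis additionally estimates loadings, unique variances, and rotated factors.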

  1. Concrete Condition Assessment Using Impact-Echo Method and Extreme Learning Machines

    PubMed Central

    Zhang, Jing-Kui; Yan, Weizhong; Cui, De-Mi

    2016-01-01

    The impact-echo (IE) method is a popular non-destructive testing (NDT) technique widely used for measuring the thickness of plate-like structures and for detecting certain defects inside concrete elements or structures. However, the IE method is not effective for full condition assessment (i.e., defect detection, defect diagnosis, defect sizing and location), because the simple frequency spectrum analysis involved in the existing IE method is not sufficient to capture the IE signal patterns associated with different conditions. In this paper, we attempt to enhance the IE technique and enable it for full condition assessment of concrete elements by introducing advanced machine learning techniques for performing comprehensive analysis and pattern recognition of IE signals. Specifically, we use wavelet decomposition for extracting signatures or features out of the raw IE signals and apply extreme learning machine, one of the recently developed machine learning techniques, as classification models for full condition assessment. To validate the capabilities of the proposed method, we build a number of specimens with various types, sizes, and locations of defects and perform IE testing on these specimens in a lab environment. Based on analysis of the collected IE signals using the proposed machine learning based IE method, we demonstrate that the proposed method is effective in performing full condition assessment of concrete elements or structures. PMID:27023563
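The wavelet-decomposition feature step can be sketched with the Haar wavelet, the simplest case; the paper does not specify the wavelet family, so that choice, the decaying-oscillation test signal, and the per-level energy features are all assumptions made for illustration.

```python
import math

def haar_step(x):
    """One level of the orthonormal Haar transform: pairwise averages
    (approximation) and pairwise differences (detail)."""
    s = math.sqrt(2)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

def wavelet_energy_features(signal, levels=3):
    """Energy of the detail coefficients at each level: a compact
    signature of an impact-echo waveform for a downstream classifier."""
    feats = []
    a = list(signal)
    for _ in range(levels):
        a, d = haar_step(a)
        feats.append(sum(c * c for c in d))
    return feats

# fabricated IE signal: a decaying oscillation
sig = [math.exp(-n / 40) * math.sin(2 * math.pi * n / 8) for n in range(64)]
feats = wavelet_energy_features(sig)
```

Feature vectors like `feats` (one energy per scale) are the kind of input a classifier such as an extreme learning machine would then map to condition labels.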

  2. Metacarpophalangeal pattern profile analysis: useful diagnostic tool for differentiating between dyschondrosteosis, Turner syndrome, and hypochondroplasia.

    PubMed

    Laurencikas, E; Sävendahl, L; Jorulf, H

    2006-06-01

    To assess the value of the metacarpophalangeal pattern profile (MCPP) analysis as a diagnostic tool for differentiating between patients with dyschondrosteosis, Turner syndrome, and hypochondroplasia. Radiographic and clinical data from 135 patients between 1 and 51 years of age were collected and analyzed. The study included 25 patients with hypochondroplasia (HCP), 39 with dyschondrosteosis (LWD), and 71 with Turner syndrome (TS). Hand pattern profiles were calculated and compared with those of 110 normal individuals. Pearson correlation coefficient (r) and multivariate discriminant analysis were used for pattern profile analysis. Pattern variability index, a measure of dysmorphogenesis, was calculated for LWD, TS, HCP, and normal controls. Our results demonstrate that patients with LWD, TS, or HCP have distinct pattern profiles that are significantly different from each other and from those of normal controls. Discriminant analysis yielded correct classification of normal versus abnormal individuals in 84% of cases. Classification of the patients into LWD, TS, and HCP groups was successful in 75%. The correct classification rate was higher (85%) when differentiating two pathological groups at a time. Pattern variability index was not helpful for differential diagnosis of LWD, TS, and HCP. Patients with LWD, TS, or HCP have distinct MCPPs and can be successfully differentiated from each other using advanced MCPP analysis. Discriminant analysis is to be preferred over Pearson correlation coefficient because it is a more sensitive and specific technique. MCPP analysis is a helpful tool for differentiating between syndromes with similar clinical and radiological abnormalities.

  3. Integration of Scale Invariant Generator Technique and S-A Technique for Characterizing 2-D Patterns for Information Retrieve

    NASA Astrophysics Data System (ADS)

    Cao, L.; Cheng, Q.

    2004-12-01

    The scale invariant generator technique (SIG) and spectrum-area analysis technique (S-A) were developed independently relevant to the concept of generalized scale invariance (GSI). The former was developed for characterizing the parameters involved in the GSI for characterizing and simulating multifractal measures, whereas the latter was for identifying scaling breaks for decomposition of superimposed multifractal measures caused by multiple geophysical processes. A natural integration of these two techniques may yield a new technique to serve two purposes: on the one hand, it can enrich the power of S-A by increasing the interpretability of decomposed patterns in some applications of S-A; on the other hand, it can provide a means to test the uniqueness of multifractality of measures, which is essential for application of the SIG technique in more complicated environments. The implementation of the proposed technique has been done as a Dynamic Link Library (DLL) in Visual C++. The program can be readily used for method validation and application in different fields.

  4. Linkage analysis with multiplexed short tandem repeat polymorphisms using infrared fluorescence and M13 tailed primers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oetting, W.S.; Lee, H.K.; Flanders, D.J.

    The use of short tandem repeat polymorphisms (STRPs) as marker loci for linkage analysis is becoming increasingly important due to their large numbers in the human genome and their high degree of polymorphism. Fluorescence-based detection of the STRP pattern with an automated DNA sequencer has improved the efficiency of this technique by eliminating the need for radioactivity and producing a digitized autoradiogram-like image that can be used for computer analysis. In an effort to simplify the procedure and to reduce the cost of fluorescence STRP analysis, we have developed a technique known as multiplexing STRPs with tailed primers (MSTP) using primers that have a 19-bp extension, identical to the sequence of an M13 sequencing primer, on the 5′ end of the forward primer in conjunction with multiplexing several primer pairs in a single polymerase chain reaction (PCR) amplification. The banding pattern is detected with the addition of the M13 primer-dye conjugate as the sole primer conjugated to the fluorescent dye, eliminating the need for direct conjugation of the infrared fluorescent dye to the STRP primers. The use of MSTP for linkage analysis greatly reduces the number of PCR reactions. Up to five primer pairs can be multiplexed together in the same reaction. At present, a set of 148 STRP markers spaced at an average genetic distance of 28 cM throughout the autosomal genome can be analyzed in 37 sets of multiplexed amplification reactions. We have automated the analysis of these patterns for linkage using software that both detects the STRP banding pattern and determines their sizes. This information can then be exported in a user-defined format from a database manager for linkage analysis. 15 refs., 2 figs., 4 tabs.

  5. [Identification of mycobacteria by matrix-assisted laser desorption-ionization time-of-flight mass spectrometry--using reference strains and clinical isolates of Mycobacterium].

    PubMed

    Niitsuma, Katsunao; Saito, Miwako; Koshiba, Shizuko; Kaneko, Michiyo

    2014-05-01

    Matrix-assisted laser desorption-ionization time-of-flight mass spectrometry (MALDI-TOF MS) is playing an increasingly important role in clinical microbiology because it is rapid and inexpensive. Identification is based on whether the peak pattern of the measured mass spectrum matches the reference mass spectral pattern of a species. We currently use MALDI-TOF MS for rapid and accurate identification of inactivated reference strains and clinical isolates of Mycobacterium, because improved pretreatment techniques have reduced the risk of infection to the operator posed by former inspection methods. Compared with the initial inspection technique, the pretreatment stage of the current MALDI-TOF MS technique improves the analysis of inactivated acid-fast bacteria that are often used as reference strains for clinical isolates. We compared the concordance rate for identification, based on score values (SV) and peak pattern spectra, between MALDI-TOF MS and conventional methods such as diffusion/amplification, and we also evaluated spectra after a fixed number of days of culture. In an examination of 158 strains of clinically isolated Mycobacterium tuberculosis complex (MTC), the identification concordance rate by pattern matching was 99.4% at the genus level and 94.9% when the species level was included. For 37 strains of nontuberculous mycobacteria (NTM), the concordance rate at the genus level was 94.6%. The reference strain M. bovis BCG (Tokyo strain) was judged by pattern matching to be MTC, suggesting that it and M. tuberculosis are closely related species with high DNA homology. The peak pattern spectra of the clinical isolates and reference strains, excluding the NTM M. gordonae strain JATA33-01, were consistent with the peak pattern characteristics of each isolate; however, a comparison between the peak patterns of the reference strains and those of six clinically isolated M. gordonae strains revealed similar mass-to-charge ratios, which may indicate few polymorphisms. The reproducibility of the method was good, although the SV decreased by approximately 0.3 with the number of days of culture storage. In addition, expansion of the database and genotype analysis of clinical isolates to determine regional specificity are expected as data accumulate.

  6. fMRI activation patterns in an analytic reasoning task: consistency with EEG source localization

    NASA Astrophysics Data System (ADS)

    Li, Bian; Vasanta, Kalyana C.; O'Boyle, Michael; Baker, Mary C.; Nutter, Brian; Mitra, Sunanda

    2010-03-01

    Functional magnetic resonance imaging (fMRI) is used to model brain activation patterns associated with various perceptual and cognitive processes as reflected by the hemodynamic (BOLD) response. While many sensory and motor tasks are associated with relatively simple activation patterns in localized regions, higher-order cognitive tasks may produce activity in many different brain areas involving complex neural circuitry. We applied a recently proposed probabilistic independent component analysis technique (PICA) to determine the true dimensionality of the fMRI data and used EEG localization to identify the common activated patterns (mapped as Brodmann areas) associated with a complex cognitive task like analytic reasoning. Our preliminary study suggests that a hybrid GLM/PICA analysis may reveal additional regions of activation (beyond simple GLM) that are consistent with electroencephalography (EEG) source localization patterns.

  7. A Science Mapping Analysis of 'Communication' WoS Subject Category (1980-2013)

    ERIC Educational Resources Information Center

    Montero-Díaz, Julio; Cobo, Manuel-Jesús; Gutiérrez-Salcedo, María; Segado-Boj, Francisco; Herrera-Viedma, Enrique

    2018-01-01

    The communication research field has an extraordinary growth pattern, indeed bigger than that of other research fields. In order to extract knowledge from such a volume of output, intelligent techniques are needed. In this way, using bibliometric techniques, the evolution of the conceptual, social and intellectual aspects of this research field can be analysed, and…

  8. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals.

    PubMed

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A

    2016-03-05

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
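The representation idea, bundling four EEG channels into one quaternion per sample, can be sketched as follows. The actual QSA feature extraction differs; the rotation-angle descriptor, the channel signals, and the window length here are invented for illustration.

```python
import math

class Quat:
    """Minimal quaternion; one sample bundles four EEG channels."""
    def __init__(self, w, x, y, z):
        self.w, self.x, self.y, self.z = w, x, y, z

    def conjugate(self):
        return Quat(self.w, -self.x, -self.y, -self.z)

    def __mul__(self, o):  # Hamilton product
        return Quat(self.w*o.w - self.x*o.x - self.y*o.y - self.z*o.z,
                    self.w*o.x + self.x*o.w + self.y*o.z - self.z*o.y,
                    self.w*o.y - self.x*o.z + self.y*o.w + self.z*o.x,
                    self.w*o.z + self.x*o.y - self.y*o.x + self.z*o.w)

    def rotation_angle(self):
        """Angle of the rotation this quaternion represents (scale-invariant)."""
        v = math.sqrt(self.x**2 + self.y**2 + self.z**2)
        return 2.0 * math.atan2(v, self.w)

def qsa_features(samples):
    """For consecutive 4-channel samples, the angle of the relative
    quaternion q_t* x q_{t+1}: a rough descriptor of how the joint
    channel state rotates from sample to sample."""
    qs = [Quat(*s) for s in samples]
    return [(qs[t].conjugate() * qs[t + 1]).rotation_angle()
            for t in range(len(qs) - 1)]

# fabricated 4-channel EEG window
eeg = [(math.sin(t / 5.0), math.cos(t / 7.0), (0.1 * t) % 1.0, 0.5)
       for t in range(20)]
feats = qsa_features(eeg)
```

Sequences of such descriptors are the kind of feature vector that the DT/SVM/KNN classifiers mentioned above would consume.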

  9. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals

    PubMed Central

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.

    2016-01-01

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029

  10. Pairwise Classifier Ensemble with Adaptive Sub-Classifiers for fMRI Pattern Analysis.

    PubMed

    Kim, Eunwoo; Park, HyunWook

    2017-02-01

    The multi-voxel pattern analysis technique is applied to fMRI data for classification of high-level brain functions using pattern information distributed over multiple voxels. In this paper, we propose a classifier ensemble for multiclass classification in fMRI analysis, exploiting the fact that specific neighboring voxels can contain spatial pattern information. The proposed method converts the multiclass classification to a pairwise classifier ensemble, and each pairwise classifier consists of multiple sub-classifiers using an adaptive feature set for each class-pair. Simulated and real fMRI data were used to verify the proposed method. Intra- and inter-subject analyses were performed to compare the proposed method with several well-known classifiers, including single and ensemble classifiers. The comparison results showed that the proposed method can be generally applied to multiclass classification in both simulations and real fMRI analyses.
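The conversion from multiclass to a pairwise (one-vs-one) ensemble can be sketched as below. Toy nearest-centroid sub-classifiers stand in for the paper's adaptive sub-classifiers, and the two-voxel "activation patterns" and condition labels are fabricated.

```python
from collections import Counter

def pairwise_vote(pairwise_predictions):
    """Combine one-vs-one pairwise decisions into a multiclass label
    by majority vote."""
    return Counter(pairwise_predictions).most_common(1)[0][0]

def centroid_pair_classifier(train, a, b):
    """Toy sub-classifier for the class pair (a, b): nearest class
    centroid in feature space."""
    def centroid(label):
        pts = [x for x, y in train if y == label]
        return [sum(col) / len(pts) for col in zip(*pts)]
    ca, cb = centroid(a), centroid(b)
    def classify(x):
        da = sum((xi - ci) ** 2 for xi, ci in zip(x, ca))
        db = sum((xi - ci) ** 2 for xi, ci in zip(x, cb))
        return a if da <= db else b
    return classify

# fabricated 2-voxel "activation patterns" for three conditions
train = [([0, 0], 'rest'), ([0.1, -0.1], 'rest'),
         ([1, 0], 'faces'), ([1.1, 0.2], 'faces'),
         ([0, 1], 'houses'), ([-0.2, 1.1], 'houses')]
classes = ['rest', 'faces', 'houses']
pairs = [(a, b) for i, a in enumerate(classes) for b in classes[i + 1:]]
clfs = {p: centroid_pair_classifier(train, *p) for p in pairs}

x_new = [0.9, 0.1]  # resembles the 'faces' patterns
label = pairwise_vote([clfs[p](x_new) for p in pairs])
```

The paper's refinement is that each pair gets its own adaptive feature set (specific neighboring voxels) rather than the shared features used here.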

  11. A Pressure Plate-Based Method for the Automatic Assessment of Foot Strike Patterns During Running.

    PubMed

    Santuz, Alessandro; Ekizos, Antonis; Arampatzis, Adamantios

    2016-05-01

    The foot strike pattern (FSP, description of how the foot touches the ground at impact) is recognized to be a predictor of both performance and injury risk. The objective of the current investigation was to validate an original foot strike pattern assessment technique based on the numerical analysis of foot pressure distribution. We analyzed the strike patterns during running of 145 healthy men and women (85 male, 60 female). The participants ran on a treadmill with an integrated pressure plate at three different speeds: preferred (shod and barefoot 2.8 ± 0.4 m/s), faster (shod 3.5 ± 0.6 m/s) and slower (shod 2.3 ± 0.3 m/s). A custom-designed algorithm allowed the automatic footprint recognition and FSP evaluation. Incomplete footprints were simultaneously identified and corrected by the software itself. The widely used technique of analyzing high-speed video recordings was checked for its reliability and used to validate the numerical technique. The automatic numerical approach showed good conformity with the reference video-based technique (ICC = 0.93, p < 0.01). The great improvement in data throughput and the increased completeness of results allow the use of this software as a powerful feedback tool in a simple experimental setup.
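The paper's algorithm is custom, but the underlying classification convention can be sketched with the strike index: the initial centre-of-pressure location as a percentage of foot length from the heel, split at thirds into rear/mid/forefoot. The pressure-plate row counts below are illustrative assumptions.

```python
def strike_index(first_contact_row, foot_length_rows):
    """Initial centre-of-pressure location as a percentage of foot length,
    measured from the heel (row 0) on the pressure plate."""
    return 100.0 * first_contact_row / foot_length_rows

def classify_fsp(si):
    """Rear/mid/forefoot split at the conventional thirds of foot length."""
    if si < 100 / 3:
        return 'rearfoot'
    if si <= 200 / 3:
        return 'midfoot'
    return 'forefoot'
```

For example, first contact at row 5 of a 40-row footprint (strike index 12.5%) classifies as a rearfoot strike.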

  12. Handling Dynamic Weights in Weighted Frequent Pattern Mining

    NASA Astrophysics Data System (ADS)

    Ahmed, Chowdhury Farhan; Tanbeer, Syed Khairuzzaman; Jeong, Byeong-Soo; Lee, Young-Koo

    Even though weighted frequent pattern (WFP) mining is more effective than traditional frequent pattern mining because it can consider different semantic significances (weights) of items, existing WFP algorithms assume that each item has a fixed weight. But in real world scenarios, the weight (price or significance) of an item can vary with time. Reflecting these changes in item weight is necessary in several mining applications, such as retail market data analysis and web click stream analysis. In this paper, we introduce the concept of a dynamic weight for each item, and propose an algorithm, DWFPM (dynamic weighted frequent pattern mining), that makes use of this concept. Our algorithm can address situations where the weight (price or significance) of an item varies dynamically. It exploits a pattern growth mining technique to avoid the level-wise candidate set generation-and-test methodology. Furthermore, it requires only one database scan, so it is eligible for use in stream data mining. An extensive performance analysis shows that our algorithm is efficient and scalable for WFP mining using dynamic weights.
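The dynamic-weight idea can be sketched by letting each transaction contribute the mean current weight of an itemset's items, a simplified form of weighted support; the retail items, per-period prices, and the restriction to 1- and 2-itemsets are all invented for the example (DWFPM itself uses a pattern-growth tree, not this brute-force enumeration).

```python
from itertools import combinations
from collections import defaultdict

def weighted_support(transactions, weights_by_period):
    """Weighted support of 1- and 2-itemsets when item weights vary per
    period: each transaction adds the mean weight of the itemset's items
    at the transaction's time."""
    support = defaultdict(float)
    for period, items in transactions:
        w = weights_by_period[period]
        items = sorted(items)
        for r in (1, 2):
            for itemset in combinations(items, r):
                support[itemset] += sum(w[i] for i in itemset) / len(itemset)
    return dict(support)

# hypothetical retail data: the price (weight) of 'ssd' drops over time
weights_by_period = {
    'q1': {'ssd': 3.0, 'mouse': 1.0, 'cable': 0.5},
    'q2': {'ssd': 2.0, 'mouse': 1.0, 'cable': 0.5},
}
transactions = [
    ('q1', ['ssd', 'mouse']),
    ('q1', ['cable']),
    ('q2', ['ssd', 'cable']),
    ('q2', ['ssd', 'mouse']),
]
support = weighted_support(transactions, weights_by_period)
```

With a fixed-weight algorithm, every 'ssd' occurrence would count at one price; here the q2 occurrences correctly count at the reduced weight.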

  13. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA. Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  14. Joint Utility of Event-Dependent and Environmental Crime Analysis Techniques for Violent Crime Forecasting

    ERIC Educational Resources Information Center

    Caplan, Joel M.; Kennedy, Leslie W.; Piza, Eric L.

    2013-01-01

    Violent crime incidents occurring in Irvington, New Jersey, in 2007 and 2008 are used to assess the joint analytical capabilities of point pattern analysis, hotspot mapping, near-repeat analysis, and risk terrain modeling. One approach to crime analysis suggests that the best way to predict future crime occurrence is to use past behavior, such as…

  15. Analysis of 3D Modeling Software Usage Patterns for K-12 Students

    ERIC Educational Resources Information Center

    Wu, Yi-Chieh; Liao, Wen-Hung; Chi, Ming-Te; Li, Tsai-Yen

    2016-01-01

    In response to the recent trend in maker movement, teachers are learning 3D techniques actively and bringing 3D printing into the classroom to enhance variety and creativity in designing lectures. This study investigates the usage pattern of a 3D modeling software, Qmodel Creator, which is targeted at K-12 students. User logs containing…

  16. Definition of spatial patterns of bark beetle Ips typographus (L.) outbreak spreading in Tatra Mountains (Central Europe), using GIS

    Treesearch

    Rastislav Jakus; Wojciech Grodzki; Marek Jezik; Marcin Jachym

    2003-01-01

    The spread of bark beetle outbreaks in the Tatra Mountains was explored by using both terrestrial and remote sensing techniques. Both approaches have proven to be useful for studying spatial patterns of bark beetle population dynamics. The terrestrial methods were applied on existing forestry databases. Vegetation change analysis (image differentiation), digital...

  17. Rapid Discrimination for Traditional Complex Herbal Medicines from Different Parts, Collection Time, and Origins Using High-Performance Liquid Chromatography and Near-Infrared Spectral Fingerprints with Aid of Pattern Recognition Methods

    PubMed Central

    Fu, Haiyan; Fan, Yao; Zhang, Xu; Lan, Hanyue; Yang, Tianming; Shao, Mei; Li, Sihan

    2015-01-01

    As an effective method, the fingerprint technique, which emphasizes the whole composition of samples, has already been used in various fields, especially in identifying and assessing the quality of herbal medicines. High-performance liquid chromatography (HPLC) and near-infrared (NIR) spectroscopy, with their unique characteristics of reliability, versatility, precision, and simple measurement, play an important role among fingerprint techniques. In this paper, a supervised pattern recognition method based on the PLSDA algorithm applied to HPLC and NIR data has been established to identify Hibiscus mutabilis L. and Berberidis radix, two common kinds of herbal medicines. By comparing principal component analysis (PCA), linear discriminant analysis (LDA), and particularly partial least squares discriminant analysis (PLSDA) with different preprocessing of the NIR spectral variables, the PLSDA model showed excellent performance on the analysis of samples as well as chromatograms. Most importantly, this pattern recognition method using HPLC and NIR can identify different collection parts, collection times, and different origins or various species belonging to the same genera of herbal medicines, which makes it a promising approach for the identification of complex information on herbal medicines. PMID:26345990

  18. Synthesis of samarium doped gadolinium oxide nanorods, its spectroscopic and physical properties

    NASA Astrophysics Data System (ADS)

    Boopathi, G.; Gokul Raj, S.; Ramesh Kumar, G.; Mohan, R.; Mohan, S.

    2018-06-01

    One-dimensional samarium-doped gadolinium oxide [Sm:Gd2O3] nanorods have been synthesized successfully through a co-precipitation technique in aqueous solution. The as-synthesized and calcined products were characterized using powder X-ray diffraction, Fourier transform Raman spectroscopy, thermogravimetric/differential thermal analysis, scanning electron microscopy with energy-dispersive X-ray analysis, transmission electron microscopy, Fourier transform infrared spectroscopy, ultraviolet-visible spectrometry, photoluminescence spectroscopy and X-ray photoelectron spectroscopy. The results obtained are discussed in detail.

  19. Differentiation of tea varieties using UV-Vis spectra and pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Palacios-Morillo, Ana; Alcázar, Ángela.; de Pablos, Fernando; Jurado, José Marcos

    2013-02-01

    Tea, one of the most consumed beverages all over the world, is of great importance to the economies of a number of countries. Several methods have been developed to classify tea varieties or origins based on pattern recognition techniques applied to chemical data, such as metal profiles, amino acids, catechins and volatile compounds. Some of these analytical methods are tedious and expensive to apply in routine work. The use of UV-Vis spectral data, which is highly influenced by the chemical composition, as discriminant variables can be an alternative to these methods. UV-Vis spectra of methanol-water extracts of tea have been obtained in the interval 250-800 nm. Absorbances were used as input variables. Principal component analysis was used to reduce the number of variables, and several pattern recognition methods, such as linear discriminant analysis, support vector machines and artificial neural networks, were applied in order to differentiate the most common tea varieties. A successful classification model was built by combining principal component analysis and a multilayer perceptron artificial neural network, allowing differentiation between tea varieties. This rapid and simple methodology can be applied to classification problems in the food industry, saving economic resources.
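    The PCA dimensionality-reduction step can be sketched with plain numpy; the synthetic "spectra" below and the nearest-centroid rule (a lightweight stand-in for the study's multilayer perceptron) are illustrative assumptions:

```python
# Sketch: PCA on spectra, then classification in PC space.
# Synthetic two-variety data; nearest-centroid replaces the paper's MLP.
import numpy as np

rng = np.random.default_rng(0)
# Two "varieties" with different baseline spectra: 20 samples x 100 wavelengths.
base_a = np.sin(np.linspace(0, 3, 100))
base_b = np.cos(np.linspace(0, 3, 100))
X = np.vstack([base_a + 0.05 * rng.standard_normal((10, 100)),
               base_b + 0.05 * rng.standard_normal((10, 100))])
y = np.array([0] * 10 + [1] * 10)

# PCA via SVD of the mean-centred data matrix; keep the first two PCs.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Nearest-centroid classification in the reduced PC space.
centroids = np.array([scores[y == k].mean(axis=0) for k in (0, 1)])
pred = np.argmin(np.linalg.norm(scores[:, None, :] - centroids, axis=2), axis=1)
print((pred == y).mean())  # training accuracy on this toy data
```

    Reducing hundreds of correlated absorbance variables to a few principal components before classification is the design choice the abstract describes; the downstream classifier can then be swapped for LDA, an SVM or a neural network.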

  20. Use of 16S-23S rRNA spacer-region (SR)-PCR for identification of intestinal clostridia.

    PubMed

    Song, Yuli; Liu, Chengxu; Molitoris, Denise; Tomzynski, Thomas J; Mc Teague, Maureen; Read, Erik; Finegold, Sydney M

    2002-12-01

    The suitability of a species identification technique based on PCR analysis of 16S-23S rRNA spacer region (SR) polymorphism for human intestinal Clostridium species was evaluated. This SR-PCR based technique is highly reproducible and successfully differentiated the strains tested, which included 17 ATCC type strains of Clostridium and 152 human stool Clostridium isolates, at the species or intraspecies level. Ninety-eight of 152 stool isolates, including C. bifermentans, C. butyricum, C. cadaveris, C. orbiscindens, C. paraputrificum, C. perfringens, C. ramosum, C. scindens, C. spiroforme, C. symbiosum and C. tertium, were identified to species level by SR-PCR patterns that were identical to those of their corresponding ATCC type strains. The other 54 stool isolates were distributed among ten SR-PCR patterns that are unique and possibly represent ten novel Clostridium species or subspecies. The species identification obtained by SR-PCR pattern analysis completely agreed with that obtained by 16S rRNA sequencing, and led to identification that clearly differed from that obtained by cellular fatty acid analysis for 23/152 strains (15%). These results indicate that SR-PCR provides an accurate and rapid molecular method for the identification of human intestinal Clostridium species.

  1. T-pattern analysis for the study of temporal structure of animal and human behavior: a comprehensive review.

    PubMed

    Casarrubea, M; Jonsson, G K; Faulisi, F; Sorbera, F; Di Giovanni, G; Benigno, A; Crescimanno, G; Magnusson, M S

    2015-01-15

    A basic tenet in the realm of modern behavioral sciences is that behavior consists of patterns in time. For this reason, investigations of behavior deal with sequences that are not easily perceivable by the unaided observer. This problem calls for improved means of detection, data handling and analysis. This review focuses on the analysis of the temporal structure of behavior carried out by means of a multivariate approach known as T-pattern analysis. Using this technique, recurring sequences of behavioral events, usually hard to detect, can be unveiled and carefully described. T-pattern analysis has been successfully applied in the study of various aspects of human or animal behavior such as behavioral modifications in neuro-psychiatric diseases, route-tracing stereotypy in mice, interaction between human subjects and animal or artificial agents, hormonal-behavioral interactions, patterns of behavior associated with emesis and, in our laboratories, exploration and anxiety-related behaviors in rodents. After describing the theory and concepts of T-pattern analysis, this review will focus on the application of the analysis to the study of the temporal characteristics of behavior in different species from rodents to human beings. This work could represent a useful background for researchers who intend to employ such a refined multivariate approach to the study of behavior.
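    The core building block of T-pattern detection is the "critical interval": testing whether one event tends to be followed by another within a characteristic time window. Full T-pattern analysis grows hierarchical pattern trees and tests the interval statistically; the sketch below, with made-up event times, only illustrates the windowed co-occurrence count that the test is built on:

```python
# Sketch of the critical-interval idea behind T-patterns: count how often
# event B occurs within a fixed window after event A. Event times are
# hypothetical; real T-pattern analysis adds a statistical test and
# recursive pattern construction on top of counts like this.

def follows_within(a_times, b_times, d1, d2):
    """Number of A occurrences followed by at least one B in (t+d1, t+d2]."""
    return sum(
        any(t + d1 < tb <= t + d2 for tb in b_times)
        for t in a_times
    )

a_times = [1.0, 5.0, 9.0, 20.0]   # e.g. "rearing" onsets (seconds)
b_times = [1.8, 5.7, 9.9, 30.0]   # e.g. "sniffing" onsets

print(follows_within(a_times, b_times, 0.0, 1.0))  # 3 of 4 As followed by a B
```

    When such a count is significantly higher than expected under independence, the pair (A, B) becomes a candidate T-pattern, which can itself serve as an event in longer patterns.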

  2. MULTIVARIATE ANALYSIS OF DRINKING BEHAVIOUR IN A RURAL POPULATION

    PubMed Central

    Mathrubootham, N.; Bashyam, V.S.P.; Shahjahan

    1997-01-01

    This study was carried out to examine drinking patterns in a rural population using multivariate techniques. 386 current users identified in a community were assessed with regard to their drinking behaviours using a structured interview. For the purposes of the study, the questions were condensed into 46 meaningful variables. In bivariate analysis, 14 variables, including dependent variables such as dependence, MAST and CAGE scores (measuring alcoholic status), Q.F. Index and troubled drinking, were found to be significant. Using these variables, multivariate techniques such as ANOVA, correlation, regression analysis and factor analysis were applied with SPSS PC+ and on an HCL Magnum mainframe computer running the FOCUS package under UNIX. Results revealed that a number of factors, such as drinking style, duration of drinking, pattern of abuse, Q.F. Index and various problems, influenced drinking, and some of them set up a vicious circle. Factor analysis revealed three main factors: abuse, dependence and social drinking. Dependence could be divided into low and moderate dependence. The implications and practical applications of these tests are also discussed. PMID:21584077

  3. Analysis of ELA-DQB exon 2 polymorphism in Argentine Creole horses by PCR-RFLP and PCR-SSCP.

    PubMed

    Villegas-Castagnasso, E E; Díaz, S; Giovambattista, G; Dulout, F N; Peral-García, P

    2003-08-01

    The second exon of equine leucocyte antigen (ELA)-DQB genes was amplified from genomic DNA of 32 Argentine Creole horses by PCR. Amplified DNA was analysed by PCR-restriction fragment length polymorphism (RFLP) and PCR-single-strand conformation polymorphism (SSCP). The PCR-RFLP analysis revealed two HaeIII patterns, four RsaI patterns, five MspI patterns and two HinfI patterns. EcoRI showed no variation in the analysed sample. Additional patterns that did not account for known exon 2 DNA sequences were observed, suggesting the existence of novel ELA-DQB alleles. PCR-SSCP analysis exhibited seven different band patterns, and the number of bands per animal ranged from four to nine. Both methods indicated that at least two DQB genes are present. The presence of more than two alleles in each animal showed that the primers employed in this work are not specific for a unique DQB locus. The improvement of this PCR-RFLP method should provide a simple and rapid technique for an accurate definition of ELA-DQB typing in horses.

  4. Comparison the Marginal and Internal Fit of Metal Copings Cast from Wax Patterns Fabricated by CAD/CAM and Conventional Wax up Techniques.

    PubMed

    Vojdani, M; Torabi, K; Farjood, E; Khaledi, Aar

    2013-09-01

    Metal-ceramic crowns are the most commonly used complete-coverage restorations in daily clinical practice. The disadvantages of conventional hand-made wax patterns have prompted alternatives based on CAD/CAM technologies. This study compares the marginal and internal fit of copings cast from CAD/CAM and conventionally fabricated wax patterns. Twenty-four standardized brass dies were prepared and randomly divided into 2 groups according to the wax-pattern fabrication method (CAD/CAM technique and conventional method) (n=12). All the wax patterns were fabricated in a standard fashion with respect to contour, thickness and internal relief (M1-M12: CAD/CAM group; C1-C12: conventional group). A CAD/CAM milling machine (Cori TEC 340i; imes-icore GmbH, Eiterfeld, Germany) was used to fabricate the CAD/CAM group wax patterns. The copings cast from the 24 wax patterns were cemented to the corresponding dies. For all coping-die assemblies, a cross-sectional technique was used to evaluate the marginal and internal fit at 15 points. Student's t-test was used for statistical analysis (α=0.05). The overall mean (SD) absolute marginal discrepancy (AMD) was 254.46 (25.10) µm for the CAD/CAM group and 88.08 (10.67) µm for the conventional (control) group. The overall mean internal gap total (IGT) was 110.77 (5.92) µm for the CAD/CAM group and 76.90 (10.17) µm for the conventional group. Student's t-test revealed significant differences between the 2 groups: marginal and internal gaps were significantly larger at all measured areas in the CAD/CAM group than in the conventional group (p < 0.001). Within the limitations of this study, the conventional method of wax-pattern fabrication produced copings with significantly better marginal and internal fit than the CAD/CAM (machine-milled) technique.
    All factors were standardized between the 2 groups except the wax-pattern fabrication technique; therefore, only the conventional group yielded copings with clinically acceptable margins of less than 120 µm.
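    The group comparison above rests on Student's two-sample t-test; a pooled-variance version can be computed by hand, shown here on made-up gap measurements (the study's raw data are not reproduced in the abstract):

```python
# Pooled-variance two-sample t statistic, the test used to compare the
# CAD/CAM and conventional groups. Sample values are hypothetical.
import math
from statistics import mean, variance

def pooled_t(sample1, sample2):
    n1, n2 = len(sample1), len(sample2)
    sp2 = ((n1 - 1) * variance(sample1) + (n2 - 1) * variance(sample2)) / (n1 + n2 - 2)
    return (mean(sample1) - mean(sample2)) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

cadcam = [250.0, 260.0, 253.0]        # hypothetical AMD values, µm
conventional = [85.0, 90.0, 89.0]
print(round(pooled_t(cadcam, conventional), 2))  # 49.9
```

    The statistic is then compared against the t distribution with n1 + n2 - 2 degrees of freedom at the stated α = 0.05 to decide significance.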

  5. Comparison the Marginal and Internal Fit of Metal Copings Cast from Wax Patterns Fabricated by CAD/CAM and Conventional Wax up Techniques

    PubMed Central

    Vojdani, M; Torabi, K; Farjood, E; Khaledi, AAR

    2013-01-01

    Statement of Problem: Metal-ceramic crowns are the most commonly used complete-coverage restorations in daily clinical practice. The disadvantages of conventional hand-made wax patterns have prompted alternatives based on CAD/CAM technologies. Purpose: This study compares the marginal and internal fit of copings cast from CAD/CAM and conventionally fabricated wax patterns. Materials and Method: Twenty-four standardized brass dies were prepared and randomly divided into 2 groups according to the wax-pattern fabrication method (CAD/CAM technique and conventional method) (n=12). All the wax patterns were fabricated in a standard fashion with respect to contour, thickness and internal relief (M1-M12: CAD/CAM group; C1-C12: conventional group). A CAD/CAM milling machine (Cori TEC 340i; imes-icore GmbH, Eiterfeld, Germany) was used to fabricate the CAD/CAM group wax patterns. The copings cast from the 24 wax patterns were cemented to the corresponding dies. For all coping-die assemblies, a cross-sectional technique was used to evaluate the marginal and internal fit at 15 points. Student's t-test was used for statistical analysis (α=0.05). Results: The overall mean (SD) absolute marginal discrepancy (AMD) was 254.46 (25.10) µm for the CAD/CAM group and 88.08 (10.67) µm for the conventional (control) group. The overall mean internal gap total (IGT) was 110.77 (5.92) µm for the CAD/CAM group and 76.90 (10.17) µm for the conventional group. Student's t-test revealed significant differences between the 2 groups: marginal and internal gaps were significantly larger at all measured areas in the CAD/CAM group than in the conventional group (p < 0.001). Conclusion: Within the limitations of this study, the conventional method of wax-pattern fabrication produced copings with significantly better marginal and internal fit than the CAD/CAM (machine-milled) technique.
    All factors were standardized between the 2 groups except the wax-pattern fabrication technique; therefore, only the conventional group yielded copings with clinically acceptable margins of less than 120 µm. PMID:24724133

  6. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    PubMed

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets: postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
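    The two steps named above, symbolizing each series and compiling transition frequencies, can be sketched as follows; the simplified SAX here (z-normalize, then bin every point) omits the piecewise-aggregate averaging of full SAX, and the series is synthetic:

```python
# Sketch: simplified SAX symbolization followed by a normalized
# symbol-to-symbol transition-frequency matrix (the "bag of patterns").
import numpy as np

def sax_symbols(series, breakpoints=(-0.43, 0.43)):
    """Z-normalize and map each point to a symbol index; the breakpoints
    are the 3-symbol Gaussian breakpoints used by SAX."""
    z = (series - series.mean()) / series.std()
    return np.searchsorted(breakpoints, z)

def transition_frequencies(symbols, n_symbols=3):
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / counts.sum()

series = np.array([1.0, 1.2, 3.0, 3.1, 1.1, 0.9, 3.2, 1.0])
syms = sax_symbols(series)
freq = transition_frequencies(syms)
print(freq.round(2))
```

    Averaging such matrices within a patient group, and rendering each cell's frequency graphically, is what produces a transition icon for that group.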

  7. Interaction Pattern Analysis in cMOOCs Based on the Connectivist Interaction and Engagement Framework

    ERIC Educational Resources Information Center

    Wang, Zhijun; Anderson, Terry; Chen, Li; Barbera, Elena

    2017-01-01

    Connectivist learning is interaction-centered learning. A framework describing interaction and cognitive engagement in connectivist learning was constructed using logical reasoning techniques. The framework and analysis were designed to help researchers and learning designers understand and adapt the characteristics and principles of interaction in…

  8. Establishing Evidence for Internal Structure Using Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Watson, Joshua C.

    2017-01-01

    Exploratory factor analysis (EFA) is a data reduction technique used to condense data into smaller sets of summary variables by identifying underlying factors potentially accounting for patterns of collinearity among said variables. Using an illustrative example, the 5 general steps of EFA are described with best practices for decision making…

  9. The Mathematical Analysis of Style: A Correlation-Based Approach.

    ERIC Educational Resources Information Center

    Oppenheim, Rosa

    1988-01-01

    Examines mathematical models of style analysis, focusing on the pattern in which literary characteristics occur. Describes an autoregressive integrated moving average model (ARIMA) for predicting sentence length in different works by the same author and comparable works by different authors. This technique is valuable in characterizing stylistic…

  10. What School Financial Reports Reveal and Hide.

    ERIC Educational Resources Information Center

    Walters, Donald L.

    The problem of full disclosure of the financial operation and position of a school system is discussed in this paper. Techniques described for analyzing revenue and expenditure patterns include percentage changes and index numbers for horizontal analysis and proportions for vertical analysis. Also discussed are how financial reports are affected…

  11. Estimation of Errors in Force Platform Data

    ERIC Educational Resources Information Center

    Psycharakis, Stelios G.; Miller, Stuart

    2006-01-01

    Force platforms (FPs) are regularly used in the biomechanical analysis of sport and exercise techniques, often in combination with image-based motion analysis. Force time data, particularly when combined with joint positions and segmental inertia parameters, can be used to evaluate the effectiveness of a wide range of movement patterns in sport…

  12. Beginning Learners' Development of Interactional Competence: Alignment Activity

    ERIC Educational Resources Information Center

    Tecedor, Marta

    2016-01-01

    This study examined the development of interactional competence (Hall, 1993; He & Young, 1998) by beginning learners of Spanish as indexed by their use of alignment moves. Discourse analysis techniques and quantitative data analysis were used to explore how 52 learners expressed alignment and changes in participation patterns in two sets of…

  13. GEMAS: Spatial pattern analysis of Ni by using digital image processing techniques on European agricultural soil data

    NASA Astrophysics Data System (ADS)

    Jordan, Gyozo; Petrik, Attila; De Vivo, Benedetto; Albanese, Stefano; Demetriades, Alecos; Sadeghi, Martiya

    2017-04-01

    Several studies have investigated the spatial distribution of chemical elements in topsoil (0-20 cm) within the framework of the EuroGeoSurveys Geochemistry Expert Group's 'Geochemical Mapping of Agricultural and Grazing Land Soil' project. Most of these studies used geostatistical analyses, interpolated concentration maps, and Exploratory and Compositional Data Analysis to identify anomalous patterns. The objective of our investigation is to demonstrate the use of digital image processing techniques for reproducible spatial pattern recognition and quantitative spatial feature characterisation. A single element (Ni) concentration in agricultural topsoil is used to perform the detailed spatial analysis and to relate the identified features to possible underlying processes. In this study, simple univariate statistical methods were implemented first, and Tukey's inner-fence criterion was used to delineate statistical outliers. Linear and triangular irregular network (TIN) interpolation was applied to the outlier-free Ni data points, which were resampled to a 10*10 km grid. Successive moving average smoothing was applied to generalise the TIN model, suppressing small-scale features while enhancing significant large-scale features of the spatial distribution of Ni concentration in European topsoil. The TIN map smoothed with a moving average filter revealed the spatial trends and patterns without losing much detail, and it was used as the input into digital image processing steps such as local maxima and minima determination, digital cross sections, gradient magnitude and gradient direction calculation, second-derivative profile curvature calculation, edge detection, local variability assessment, lineament density and directional variogram analyses. The detailed image processing analysis revealed several NE-SW, E-W and NW-SE oriented elongated features, which coincide with different spatial parameter classes and align with local maxima and minima.
The NE-SW oriented linear pattern is the dominant feature to the south of the last glaciation limit. Some of these linear features are parallel to the suture zone of the Iapetus Ocean, while the others follow the Alpine and Carpathian Chains. The highest variability zones of Ni concentration in topsoil are located in the Alps and in the Balkans where mafic and ultramafic rocks outcrop. The predominant NE-SW oriented pattern is also captured by the strong anisotropy in the semi-variograms in this direction. A single major E-W oriented north-facing feature runs along the southern border of the last glaciation zone. This zone also coincides with a series of local maxima in Ni concentration along the glaciofluvial deposits. The NW-SE elongated spatial features are less dominant and are located in the Pyrenees and Scandinavia. This study demonstrates the efficiency of systematic image processing analysis in identifying and characterising spatial geochemical patterns that often remain uncovered by the usual visual map interpretation techniques.
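    One of the image-processing steps listed above, gradient magnitude and direction calculation, can be sketched on a toy gridded surface; the 10x10 synthetic grid stands in for the resampled 10*10 km Ni concentration grid:

```python
# Gradient magnitude and direction on a gridded concentration surface,
# using central differences via numpy.gradient. Toy data, not the GEMAS grid.
import numpy as np

yy, xx = np.mgrid[0:10, 0:10]
grid = xx + 2.0 * yy                 # toy surface: twice as steep in y

gy, gx = np.gradient(grid)           # derivatives along rows (y) and columns (x)
magnitude = np.hypot(gx, gy)         # local steepness of the concentration surface
direction = np.degrees(np.arctan2(gy, gx))  # orientation of steepest ascent

print(magnitude[5, 5], direction[5, 5])
```

    On a real grid, histograms of the direction field reveal dominant orientations such as the NE-SW trend described in the abstract, and thresholding the magnitude supports edge and lineament detection.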

  14. Fatigue damage monitoring for basalt fiber reinforced polymer composites using acoustic emission technique

    NASA Astrophysics Data System (ADS)

    Wang, Wentao; Li, Hui; Qu, Zhi

    2012-04-01

    Basalt fiber reinforced polymer (BFRP) is a structural material with superior mechanical properties. In this study, unidirectional BFRP laminates with 14 layers were made using the hand lay-up method. The acoustic emission (AE) technique, combined with scanning electron microscopy (SEM), was then employed to monitor fatigue damage evolution in the BFRP plates during fatigue loading tests. Time-frequency analysis using the wavelet transform is proposed for analyzing the received AE signals instead of the peak frequency method. A comparison between AE signals and SEM images indicates that the multiple frequency peaks picked from the time-frequency curves of the AE signals reflect the accumulated fatigue damage evolution and the fatigue damage patterns. Furthermore, seven damage patterns (matrix cracking, delamination, fiber fracture and their combinations) are identified from the time-frequency curves of the AE signals.

  15. Evaluation of the veracity of one work by the artist Di Cavalcanti through non-destructive techniques: XRF, imaging and brush stroke analysis

    NASA Astrophysics Data System (ADS)

    Kajiya, E. A. M.; Campos, P. H. O. V.; Rizzutto, M. A.; Appoloni, C. R.; Lopes, F.

    2014-02-01

    This paper presents systematic studies and analysis that contributed to the identification of the forgery of a work by the artist Emiliano Augusto Cavalcanti de Albuquerque e Melo, known as Di Cavalcanti. The use of several areas of expertise, such as brush stroke analysis ("pinacologia"), applied physics, and art history, resulted in an accurate diagnosis for ascertaining the authenticity of the work entitled "Violeiro" (1950). For this work we used non-destructive methods such as infrared, ultraviolet, visible and tangential light imaging, combined with chemical analysis of the pigments by portable X-ray fluorescence (XRF) and graphic gesture analysis. Each applied method of analysis produced specific information that made possible the identification of the materials and techniques employed, and we concluded that this work is not consistent with the characteristic patterns of the artist Di Cavalcanti.

  16. Monitoring Urban Greenness Dynamics Using Multiple Endmember Spectral Mixture Analysis

    PubMed Central

    Gan, Muye; Deng, Jinsong; Zheng, Xinyu; Hong, Yang; Wang, Ke

    2014-01-01

    Urban greenness is increasingly recognized as an essential constituent of the urban environment and can provide a range of services and enhance residents’ quality of life. Understanding the pattern of urban greenness and exploring its spatiotemporal dynamics would contribute valuable information for urban planning. In this paper, we investigated the pattern of urban greenness in Hangzhou, China, over the past two decades using time series Landsat-5 TM data obtained in 1990, 2002, and 2010. Multiple endmember spectral mixture analysis was used to derive vegetation cover fractions at the subpixel level. An RGB-vegetation fraction model, change intensity analysis and the concentric technique were integrated to reveal the detailed, spatial characteristics and the overall pattern of change in the vegetation cover fraction. Our results demonstrated the ability of multiple endmember spectral mixture analysis to accurately model the vegetation cover fraction in pixels despite the complex spectral confusion of different land cover types. The integration of multiple techniques revealed various changing patterns in urban greenness in this region. The overall vegetation cover has exhibited a drastic decrease over the past two decades, while no significant change occurred in the scenic spots that were studied. Meanwhile, a remarkable recovery of greenness was observed in the existing urban area. The increasing coverage of small green patches has played a vital role in the recovery of urban greenness. These changing patterns were more obvious during the period from 2002 to 2010 than from 1990 to 2002, and they revealed the combined effects of rapid urbanization and greening policies. This work demonstrates the usefulness of time series of vegetation cover fractions for conducting accurate and in-depth studies of the long-term trajectories of urban greenness to obtain meaningful information for sustainable urban development. PMID:25375176
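    The core computation inside (multiple endmember) spectral mixture analysis is linear unmixing: expressing a pixel spectrum as a fraction-weighted sum of endmember spectra. The sketch below solves the basic least-squares case for one fixed endmember set; the endmember spectra are synthetic, and MESMA proper additionally searches over candidate endmember combinations per pixel:

```python
# Basic linear spectral unmixing via least squares. Synthetic endmembers;
# MESMA extends this by trying multiple endmember sets per pixel.
import numpy as np

# Columns: endmember spectra (vegetation, impervious, soil) over 6 bands.
E = np.array([[0.05, 0.30, 0.40],
              [0.08, 0.32, 0.42],
              [0.45, 0.35, 0.44],
              [0.50, 0.38, 0.46],
              [0.30, 0.40, 0.48],
              [0.25, 0.42, 0.50]])

true_f = np.array([0.6, 0.3, 0.1])   # fractions used to build the pixel
pixel = E @ true_f                   # noise-free mixed pixel spectrum

f, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(f.round(3), f.sum().round(3))  # recovered fractions sum to ~1
```

    In practice the fractions are additionally constrained to be non-negative and to sum to one, and the residual of the fit is used to judge whether the chosen endmember set models the pixel adequately.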

  17. Exploration of Hand Grasp Patterns Elicitable Through Non-Invasive Proximal Nerve Stimulation.

    PubMed

    Shin, Henry; Watkins, Zach; Hu, Xiaogang

    2017-11-29

    Various neurological conditions, such as stroke or spinal cord injury, result in impaired control of the hand. One method of restoring this function is functional electrical stimulation (FES). However, traditional FES techniques often lead to rapid fatigue and unnatural ballistic movements. In this study, we sought to explore the capabilities of a non-invasive proximal nerve stimulation technique in eliciting various hand grasp patterns. The ulnar and median nerves proximal to the elbow joint were activated transcutaneously using a programmable stimulator, and the resultant finger flexion joint angles were recorded using a motion capture system. The individual finger motions, averaged across the three joints, were analyzed using a cluster analysis in order to classify the different hand grasp patterns. With low-intensity stimulation (<5 mA, 100 µs pulse width), all of our subjects demonstrated a variety of consistent hand grasp patterns, including single-finger movements and coordinated multi-finger movements. This study provides initial evidence of the feasibility of a proximal nerve stimulation technique for controlling a variety of finger movements and grasp patterns. Our approach could also be developed into a rehabilitative/assistive tool enabling flexible movements of the fingers.

  18. Survey of Analysis of Crime Detection Techniques Using Data Mining and Machine Learning

    NASA Astrophysics Data System (ADS)

    Prabakaran, S.; Mitra, Shilpa

    2018-04-01

    Data mining is the field comprising procedures for finding patterns in huge datasets; it includes strategies at the convergence of machine learning and database systems. It can be applied to various fields such as healthcare, market basket analysis, education, manufacturing engineering and crime investigation. Among these, crime investigation is an interesting application that processes crime characteristics to help society live better. This paper surveys various data mining techniques used in this domain. This study may be helpful in designing new strategies for crime prediction and analysis.

  19. Wavelet analysis of frequency chaos game signal: a time-frequency signature of the C. elegans DNA.

    PubMed

    Messaoudi, Imen; Oueslati, Afef Elloumi; Lachiri, Zied

    2014-12-01

    Challenging tasks are encountered in the field of bioinformatics, and the choice of a genomic sequence mapping technique is one of the most demanding. A judicious choice helps in examining how periodic patterns are distributed in accordance with the underlying structure of genomes. Despite this, the search for a coding technique that can highlight all the information contained in the DNA has not yet attracted the attention it deserves. In this paper, we propose a new mapping technique based on chaos game theory that we call the frequency chaos game signal (FCGS). The particularity of the FCGS coding resides in exploiting the statistical properties of the genomic sequence itself, which may reflect important structural and organizational features of DNA. To demonstrate the usefulness of the FCGS approach in the detection of different local periodic patterns, we use wavelet analysis, because it provides access to information that can be obscured by other time-frequency methods such as Fourier analysis. Thus, we apply the continuous wavelet transform (CWT) with the complex Morlet wavelet as the mother wavelet function. Scalograms for the organism Caenorhabditis elegans (C. elegans) exhibit a multitude of periodic organizations of specific DNA sequences.
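    The FCGS coding builds on the classic chaos-game representation (CGR) of DNA, in which each base pulls the current point halfway toward its corner of the unit square; only this basic CGR walk is sketched below, not the frequency-based signal the authors derive from it:

```python
# Classic chaos-game representation (CGR) walk for a DNA string.
# Corner assignment follows the common A/C/G/T unit-square convention.

CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(sequence):
    x, y = 0.5, 0.5                      # start at the centre of the square
    points = []
    for base in sequence:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2, (y + cy) / 2  # move halfway toward the base's corner
        points.append((x, y))
    return points

pts = cgr_points("ACGT")
print(pts[-1])  # (0.78125, 0.40625)
```

    Because every k-mer maps to a distinct sub-square of the CGR, counting points per sub-square yields k-mer frequencies, which is the statistical information a frequency-based coding such as FCGS exploits.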

  20. Using unmanned aerial vehicle (UAV) surveys and image analysis in the study of large surface-associated marine species: a case study on reef sharks Carcharhinus melanopterus shoaling behaviour.

    PubMed

    Rieucau, G; Kiszka, J J; Castillo, J C; Mourier, J; Boswell, K M; Heithaus, M R

    2018-06-01

    A novel image analysis-based technique applied to unmanned aerial vehicle (UAV) survey data is described to detect and locate individual free-ranging sharks within aggregations. The method allows rapid collection of data and quantification of fine-scale swimming and collective patterns of sharks. We demonstrate the usefulness of this technique in a small-scale case study exploring the shoaling tendencies of blacktip reef sharks Carcharhinus melanopterus in a large lagoon within Moorea, French Polynesia. Using our approach, we found that C. melanopterus displayed increased alignment with shoal companions when distributed over a sandflat where they are regularly fed for ecotourism purposes as compared with when they shoaled in a deeper adjacent channel. Our case study highlights the potential of a relatively low-cost method that combines UAV survey data and image analysis to detect differences in shoaling patterns of free-ranging sharks in shallow habitats. This approach offers an alternative to current techniques commonly used in controlled settings that require time-consuming post-processing effort. This article is protected by copyright. All rights reserved.

  1. Recognition of surface lithologic and topographic patterns in southwest Colorado with ADP techniques

    NASA Technical Reports Server (NTRS)

    Melhorn, W. N.; Sinnock, S.

    1973-01-01

    Analysis of ERTS-1 multispectral data by automatic pattern recognition procedures can help address current and future resource stresses by providing a means of refining existing geologic maps. The procedures used in the current analysis already yield encouraging results toward the eventual machine recognition of extensive surface lithologic and topographic patterns. Automatic mapping of a series of hogbacks, strike valleys, and alluvial surfaces along the northwest flank of the San Juan Basin in Colorado can be obtained with minimal man-machine interaction. Determining the causes of the separable spectral signatures depends on extensive correlation of micro- and macro-scale field-based ground truth observations and aircraft underflight data with the satellite data.

  2. Component pattern analysis of chemicals using multispectral THz imaging system

    NASA Astrophysics Data System (ADS)

    Kawase, Kodo; Ogawa, Yuichi; Watanabe, Yuki

    2004-04-01

    We have developed a novel basic technology for terahertz (THz) imaging, which allows detection and identification of chemicals by introducing the component spatial pattern analysis. The spatial distributions of the chemicals were obtained from terahertz multispectral transillumination images, using absorption spectra previously measured with a widely tunable THz-wave parametric oscillator. Further we have applied this technique to the detection and identification of illicit drugs concealed in envelopes. The samples we used were methamphetamine and MDMA, two of the most widely consumed illegal drugs in Japan, and aspirin as a reference.

  3. A Comparative Study of Random Patterns for Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Stoilov, G.; Kavardzhikov, V.; Pashkouleva, D.

    2012-06-01

    Digital Image Correlation (DIC) is a computer-based image analysis technique utilizing random patterns, which finds applications in the experimental mechanics of solids and structures. In this paper a comparative study of three simulated random patterns is presented. One of them is generated according to a new algorithm introduced by the authors. A criterion for the quantitative evaluation of random patterns, based on their autocorrelation functions, is introduced. The patterns' deformations are simulated numerically and realized experimentally. The displacements are measured using the DIC method. Tensile tests are performed after printing the generated random patterns on the surfaces of standard iron sheet specimens. It is found that the newly designed random pattern retains relatively good quality up to 20% deformation.
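    The autocorrelation-based evaluation can be illustrated with a brute-force normalized spatial autocorrelation: a good DIC speckle pattern has a sharp peak at zero shift and rapid decay away from it. This is a generic sketch of the idea, not the authors' specific criterion, and the synthetic noise pattern is illustrative:

```python
import random

def autocorrelation(img, dx, dy):
    """Normalized spatial autocorrelation of a 2D pattern at shift (dx, dy)."""
    h, w = len(img), len(img[0])
    mean = sum(sum(row) for row in img) / (h * w)
    var = sum((v - mean) ** 2 for row in img for v in row)
    cov = 0.0
    for y in range(h - dy):
        for x in range(w - dx):
            cov += (img[y][x] - mean) * (img[y + dy][x + dx] - mean)
    return cov / var

random.seed(1)
pattern = [[random.random() for _ in range(32)] for _ in range(32)]
r0 = autocorrelation(pattern, 0, 0)  # peak at zero shift, exactly 1.0
r1 = autocorrelation(pattern, 1, 0)  # near 0 for an uncorrelated pattern
```

    Comparing how fast the curve drops from r0 across candidate patterns gives the kind of quantitative ranking the paper describes.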

  4. Digital versus conventional techniques for pattern fabrication of implant-supported frameworks

    PubMed Central

    Alikhasi, Marzieh; Rohanian, Ahmad; Ghodsi, Safoura; Kolde, Amin Mohammadpour

    2018-01-01

    Objective: The aim of this experimental study was to compare retention of frameworks cast from wax patterns fabricated by three different methods. Materials and Methods: Thirty-six implant analogs connected to one-piece abutments were divided randomly into three groups according to the wax pattern fabrication method (n = 12). Computer-aided design/computer-aided manufacturing (CAD/CAM) milling machine, three-dimensional printer, and conventional technique were used for fabrication of wax patterns. All laboratory procedures were performed by an expert, reliable technician to eliminate intra-operator bias. The wax patterns were cast, finished, and seated on related abutment analogs. The number of adjustment times was recorded and analyzed by Kruskal–Wallis test. Frameworks were cemented on the corresponding analogs with zinc phosphate cement and tensile resistance test was used to measure retention value. Statistical Analysis Used: One-way analysis of variance (ANOVA) and post hoc Tukey tests were used for statistical analysis. Level of significance was set at P < 0.05. Results: The mean retentive values of 680.36 ± 21.93 N, 440.48 ± 85.98 N, and 407.23 ± 67.48 N were recorded for CAD/CAM, rapid prototyping, and conventional group, respectively. One-way ANOVA test revealed significant differences among the three groups (P < 0.001). The post hoc Tukey test showed significantly higher retention for CAD/CAM group (P < 0.001), while there was no significant difference between the two other groups (P = 0.54). CAD/CAM group required significantly more adjustments (P < 0.001). Conclusions: CAD/CAM-fabricated wax patterns showed significantly higher retention for implant-supported cement-retained frameworks; this could be a valuable help when there are limitations in the retention of single-unit implant restorations. PMID:29657528

  5. Neural net diagnostics for VLSI test

    NASA Technical Reports Server (NTRS)

    Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.

    1990-01-01

    This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.

  6. Fractal analysis of multiscale spatial autocorrelation among point data

    USGS Publications Warehouse

    De Cola, L.

    1991-01-01

    The analysis of spatial autocorrelation among point-data quadrats is a well-developed technique that has made limited but intriguing use of the multiscale aspects of pattern. In this paper are presented theoretical and algorithmic approaches to the analysis of aggregations of quadrats at or above a given density, in which these sets are treated as multifractal regions whose fractal dimension, D, may vary with phenomenon intensity, scale, and location. The technique is illustrated with Matui's quadrat house-count data, which yield measurements consistent with a nonautocorrelated simulated Poisson process but not with an orthogonal unit-step random walk. The paper concludes with a discussion of the implications of such analysis for multiscale geographic analysis systems. -Author
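    The fractal dimension D of a point aggregation can be estimated by box counting, a standard proxy for the multiscale analysis described above; this is a generic sketch, not the paper's exact multifractal procedure, using illustrative uniform points (for which D should approach 2):

```python
import math
import random

def box_counting_dimension(points, scales):
    """Estimate fractal dimension D as the slope of log N(s) vs log(1/s),
    where N(s) is the number of boxes of side s occupied by the points."""
    logs, logn = [], []
    for s in scales:
        boxes = {(int(x / s), int(y / s)) for x, y in points}
        logs.append(math.log(1 / s))
        logn.append(math.log(len(boxes)))
    # Least-squares slope of the log-log relation.
    n = len(scales)
    mx, my = sum(logs) / n, sum(logn) / n
    return sum((a - mx) * (b - my) for a, b in zip(logs, logn)) / \
        sum((a - mx) ** 2 for a in logs)

random.seed(0)
cloud = [(random.random(), random.random()) for _ in range(5000)]
d = box_counting_dimension(cloud, scales=[0.5, 0.25, 0.125, 0.0625])
# d is close to 2 for a dense uniform cloud filling the unit square.
```

    In a multifractal analysis, D is recomputed for the subsets of quadrats at or above each density threshold, giving a D that varies with intensity.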

  7. New Optical Transforms For Statistical Image Recognition

    NASA Astrophysics Data System (ADS)

    Lee, Sing H.

    1983-12-01

    In optical implementation of statistical image recognition, new optical transforms on large images for real-time recognition are of special interest. Several important linear transformations frequently used in statistical pattern recognition have now been optically implemented, including the Karhunen-Loeve transform (KLT), the Fukunaga-Koontz transform (FKT) and the least-squares linear mapping technique (LSLMT). The KLT performs principal components analysis on one class of patterns for feature extraction. The FKT performs feature extraction for separating two classes of patterns. The LSLMT separates multiple classes of patterns by maximizing the interclass differences and minimizing the intraclass variations.

  8. Cultural and environmental effects on the spectral development patterns of corn and soybeans: Field data analysis

    NASA Technical Reports Server (NTRS)

    Crist, E. P. (Principal Investigator)

    1982-01-01

    An overall approach to crop spectral understanding is presented which serves to maintain a strong link between actual plant responses and characteristics and spectral observations from ground based and spaceborne sensors. A specific technique for evaluating field reflectance data, as a part of the overall approach, is also described. Results of the application of this technique to corn and soybeans reflectance data collected by and at Purdue/LARS indicate that a number of common cultural and environmental factors can significantly affect the temporal spectral development patterns of these crops in tasseled cap greenness (a transformed variable of LANDSAT MSS signals).

  9. The use of Electronic Speckle Pattern Interferometry (ESPI) in the crack propagation analysis of epoxy resins

    NASA Astrophysics Data System (ADS)

    Herbert, D. P.; Al-Hassani, A. H. M.; Richardson, M. O. W.

    The ESPI (electronic speckle pattern interferometry) technique at high magnification levels is demonstrated to be of considerable value in interpreting the fracture behaviour of epoxy resins. The fracture toughness of a powder coating system at different thicknesses has been measured using a TDCB (tapered double cantilever beam) technique, and the deformation zone at the tip of the moving crack was monitored. Initial indications are that a mechanistic changeover occurs at a critical bond (coating) thickness and that this is synonymous with the occurrence of a fracture toughness maximum, which in turn is associated with a deformation zone of specific diameter.

  10. Development and evaluation of an automatic labeling technique for spring small grains

    NASA Technical Reports Server (NTRS)

    Crist, E. P.; Malila, W. A. (Principal Investigator)

    1981-01-01

    A labeling technique is described which seeks to associate a sampling entity with a particular crop or crop group based on similarity of growing season and temporal-spectral patterns of development. Human analysts provide contextual information, after which labeling decisions are made automatically. Results of a test of the technique on a large, multi-year data set are reported. Grain labeling accuracies are similar to those achieved by human analysis techniques, while non-grain accuracies are lower. Recommendations for improvements and implications of the test results are discussed.

  11. Exploring patterns enriched in a dataset with contrastive principal component analysis.

    PubMed

    Abid, Abubakar; Zhang, Martin J; Bagaria, Vivek K; Zou, James

    2018-05-30

    Visualization and exploration of high-dimensional data is a ubiquitous challenge across disciplines. Widely used techniques such as principal component analysis (PCA) aim to identify dominant trends in one dataset. However, in many settings we have datasets collected under different conditions, e.g., a treatment and a control experiment, and we are interested in visualizing and exploring patterns that are specific to one dataset. This paper proposes a method, contrastive principal component analysis (cPCA), which identifies low-dimensional structures that are enriched in a dataset relative to comparison data. In a wide variety of experiments, we demonstrate that cPCA with a background dataset enables us to visualize dataset-specific patterns missed by PCA and other standard methods. We further provide a geometric interpretation of cPCA and strong mathematical guarantees. An implementation of cPCA is publicly available, and can be used for exploratory data analysis in many applications where PCA is currently used.
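    The core of cPCA is the leading eigenvector of the contrast matrix C_target − α·C_background. A minimal two-dimensional sketch using power iteration, on illustrative synthetic data (the published implementation is more general), might be:

```python
import random

def covariance2(data):
    """2x2 covariance matrix of a list of (x, y) samples."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    cxx = sum((x - mx) ** 2 for x, _ in data) / n
    cyy = sum((y - my) ** 2 for _, y in data) / n
    cxy = sum((x - mx) * (y - my) for x, y in data) / n
    return [[cxx, cxy], [cxy, cyy]]

def leading_eigenvector(m):
    """Power iteration for the dominant eigenvector of a symmetric 2x2 matrix."""
    v = (1.0, 1.0)
    for _ in range(200):
        w = (m[0][0] * v[0] + m[0][1] * v[1],
             m[1][0] * v[0] + m[1][1] * v[1])
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = (w[0] / norm, w[1] / norm)
    return v

def cpca_direction(target, background, alpha):
    """Top contrastive direction: leading eigenvector of C_t - alpha * C_b."""
    ct, cb = covariance2(target), covariance2(background)
    contrast = [[ct[i][j] - alpha * cb[i][j] for j in range(2)] for i in range(2)]
    return leading_eigenvector(contrast)

random.seed(2)
# Target data varies strongly along y; background shares only the x variance.
target = [(random.gauss(0, 1), random.gauss(0, 2)) for _ in range(2000)]
background = [(random.gauss(0, 1), random.gauss(0, 0.3)) for _ in range(2000)]
v = cpca_direction(target, background, alpha=1.0)
# v points (up to sign) along y, the direction enriched in the target data.
```

    Plain PCA on the target alone would weight both axes; subtracting the background covariance suppresses the shared x variation, which is the contrastive idea.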

  12. Regime Behavior in Paleo-Reconstructed Streamflow: Attributions to Atmospheric Dynamics, Synoptic Circulation and Large-Scale Climate Teleconnection Patterns

    NASA Astrophysics Data System (ADS)

    Ravindranath, A.; Devineni, N.

    2017-12-01

    Studies have shown that streamflow behavior and dynamics have a significant link with climate and climate variability. Patterns of persistent regime behavior from extended streamflow records in many watersheds justify investigating large-scale climate mechanisms as potential drivers of hydrologic regime behavior and streamflow variability. Understanding such streamflow-climate relationships is crucial to forecasting/simulation systems and the planning and management of water resources. In this study, hidden Markov models are used with reconstructed streamflow to detect regime-like behaviors - the hidden states - and state transition phenomena. Individual extreme events and their spatial variability across the basin are then verified with the identified states. Wavelet analysis is performed to examine the signals over time in the streamflow records. Joint analyses of the climatic data in the 20th century and the identified states are undertaken to better understand the hydroclimatic connections within the basin as well as important teleconnections that influence water supply. Compositing techniques are used to identify atmospheric circulation patterns associated with identified states of streamflow. The grouping of such synoptic patterns and their frequency are then examined. Sliding time-window correlation analysis and cross-wavelet spectral analysis are performed to establish the synchronicity of basin flows to the identified synoptic and teleconnection patterns. The Missouri River Basin (MRB) is examined in this study, both as a means of better understanding the synoptic climate controls in this important watershed and as a case study for the techniques developed here. Initial wavelet analyses of reconstructed streamflow at major gauges in the MRB show multidecadal cycles in regime behavior.
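    Hidden Markov model regime detection of the kind applied to the reconstructed streamflow rests on decoding the most likely hidden-state sequence, e.g. with the Viterbi algorithm. A minimal sketch with a hypothetical two-regime ("wet"/"dry") flow model, not the study's actual model or parameters, is:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for a discrete HMM (log-space)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            prob, prev = max(
                (V[-2][p] + math.log(trans_p[p][s]) + math.log(emit_p[s][o]), p)
                for p in states)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Hypothetical persistent regimes emitting discretized annual flows.
states = ("wet", "dry")
start = {"wet": 0.5, "dry": 0.5}
trans = {"wet": {"wet": 0.9, "dry": 0.1}, "dry": {"wet": 0.1, "dry": 0.9}}
emit = {"wet": {"high": 0.8, "low": 0.2}, "dry": {"high": 0.2, "low": 0.8}}
flows = ["high", "high", "high", "low", "low", "low"]
regimes = viterbi(flows, states, start, trans, emit)
# The sticky transition matrix recovers one regime shift, not six.
```

    The high self-transition probabilities are what make decoded states behave like persistent regimes rather than tracking each observation.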

  13. Hierarchical Spatio-temporal Visual Analysis of Cluster Evolution in Electrocorticography Data

    DOE PAGES

    Murugesan, Sugeerth; Bouchard, Kristofer; Chang, Edward; ...

    2016-10-02

    Here, we present ECoG ClusterFlow, a novel interactive visual analysis tool for the exploration of high-resolution Electrocorticography (ECoG) data. Our system detects and visualizes dynamic high-level structures, such as communities, using the time-varying spatial connectivity network derived from the high-resolution ECoG data. ECoG ClusterFlow provides a multi-scale visualization of the spatio-temporal patterns underlying the time-varying communities using two views: 1) an overview summarizing the evolution of clusters over time and 2) a hierarchical glyph-based technique that uses data aggregation and small multiples techniques to visualize the propagation of clusters in their spatial domain. ECoG ClusterFlow makes it possible 1) to compare the spatio-temporal evolution patterns across various time intervals, 2) to compare the temporal information at varying levels of granularity, and 3) to investigate the evolution of spatial patterns without occluding the spatial context information. Lastly, we present case studies done in collaboration with neuroscientists on our team for both simulated and real epileptic seizure data aimed at evaluating the effectiveness of our approach.

  14. Finite-Difference Time-Domain Analysis of Tapered Photonic Crystal Fiber

    NASA Astrophysics Data System (ADS)

    Ali, M. I. Md; Sanusidin, S. N.; Yusof, M. H. M.

    2018-03-01

    This paper describes the simulation of a tapered photonic crystal fiber (PCF) of the LMA-8 single-mode type, based on the correlation of the scattering pattern at a wavelength of 1.55 μm, analysis of the transmission spectrum over wavelengths from 1.0 to 2.5 μm, and correlation of the transmission spectrum with the refractive index change in the photonic crystal holes with respect to taper sizes from 0.1 to 1.0, using Optiwave simulation software. The main objective is to simulate the tapered LMA-8 PCF using the Finite-Difference Time-Domain (FDTD) technique for sensing applications, improving the capabilities of the PCF without collapsing the crystal holes. The FDTD techniques used are the scattering pattern and transverse transmission, with principal component analysis (PCA) used as a mathematical tool to model the data obtained, in MathCad software. The simulation results showed no obvious correlation of the scattering pattern at a wavelength of 1.55 μm, a correlation between taper size and transverse transmission, and a parabolic relationship in the refractive index changes inside the crystal structure.

  15. Electron microprobe analysis and histochemical examination of the calcium distribution in human bone trabeculae: a methodological study using biopsy specimens from post-traumatic osteopenia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Obrant, K.J.; Odselius, R.

    1984-01-01

    Energy dispersive X-ray microanalysis (EDX) (or electron microprobe analysis) of the relative intensity for calcium in different bone trabeculae from the tibia epiphysis, and in different parts of one and the same trabecula, was performed on 3 patients who had earlier had a fracture of the ipsilateral tibia-diaphysis. The variation in intensity was compared with the histochemical patterns obtained with both the Goldner and the von Kossa staining techniques for detecting calcium in tissues. Previously reported calcium distribution features, found to be typical for posttraumatic osteopenia, such as striated mineralization patterns in individual trabeculae and large differences in mineralization level between different trabeculae, could be verified both by means of the two histochemical procedures and from the electron microprobe analysis. A pronounced difference was observed, however, between the two histochemical staining techniques as regards their sensitivity to detect calcium. To judge from the values obtained from the EDX measurements, the sensitivity of the Goldner technique should be more than ten times higher than that of von Kossa. The EDX measurements gave more detailed information than either of the two histochemical techniques: great variations in the intensity of the calcium peak were found in trabeculae stained as unmineralized as well as mineralized.

  16. Requirements analysis, domain knowledge, and design

    NASA Technical Reports Server (NTRS)

    Potts, Colin

    1988-01-01

    Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.

  17. Social Network Analysis to Evaluate an Interdisciplinary Research Center

    ERIC Educational Resources Information Center

    Aboelela, Sally W.; Merrill, Jacqueline A.; Carley, Kathleen M.; Larson, Elaine

    2007-01-01

    We sought to examine the growth of an interdisciplinary center using social network analysis techniques. Specific aims were to examine the patterns of growth and interdisciplinary connectedness of the Center and to identify the social network characteristics of its productive members. The setting for this study was The Center for Interdisciplinary…

  18. Seeking Social Capital and Expertise in a Newly-Formed Research Community: A Co-Author Analysis

    ERIC Educational Resources Information Center

    Forte, Christine E.

    2017-01-01

    This exploratory study applies social network analysis techniques to existing, publicly available data to understand collaboration patterns within the co-author network of a federally-funded, interdisciplinary research program. The central questions asked: What underlying social capital structures can be determined about a group of researchers…

  19. Principal component analysis vs. self-organizing maps combined with hierarchical clustering for pattern recognition in volcano seismic spectra

    NASA Astrophysics Data System (ADS)

    Unglert, K.; Radić, V.; Jellinek, A. M.

    2016-06-01

    Variations in the spectral content of volcano seismicity related to changes in volcanic activity are commonly identified manually in spectrograms. However, long time series of monitoring data at volcano observatories require tools to facilitate automated and rapid processing. Techniques such as self-organizing maps (SOM) and principal component analysis (PCA) can help to quickly and automatically identify important patterns related to impending eruptions. For the first time, we evaluate the performance of SOM and PCA on synthetic volcano seismic spectra constructed from observations during two well-studied eruptions at Kīlauea Volcano, Hawai'i, that include features observed in many volcanic settings. In particular, our objective is to test which of the techniques can best retrieve a set of three spectral patterns that we used to compose a synthetic spectrogram. We find that, without a priori knowledge of the given set of patterns, neither SOM nor PCA can directly recover the spectra. We thus test hierarchical clustering, a commonly used method, to investigate whether clustering in the space of the principal components and on the SOM, respectively, can retrieve the known patterns. Our clustering method applied to the SOM fails to detect the correct number and shape of the known input spectra. In contrast, clustering of the data reconstructed by the first three PCA modes reproduces these patterns and their occurrence in time more consistently. This result suggests that PCA in combination with hierarchical clustering is a powerful practical tool for automated identification of characteristic patterns in volcano seismic spectra. Our results indicate that, in contrast to PCA, common clustering algorithms may not be ideal to group patterns on the SOM and that it is crucial to evaluate the performance of these tools on a control dataset prior to their application to real data.
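    Hierarchical (agglomerative) clustering of spectra, the step applied to the PCA-reconstructed data above, can be sketched with single linkage; the toy "spectra" below are illustrative vectors, not the paper's synthetic spectrograms:

```python
def single_linkage_clusters(vectors, k):
    """Agglomerative clustering: repeatedly merge the two closest clusters
    (single linkage, Euclidean distance) until k clusters remain."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    clusters = [[i] for i in range(len(vectors))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(vectors[a], vectors[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # j > i, so index i stays valid
    return clusters

# Toy "spectra": two low-frequency-dominated, two high-frequency-dominated.
spectra = [[5, 1, 0], [4, 2, 0], [0, 1, 6], [0, 2, 5]]
groups = single_linkage_clusters(spectra, k=2)
# Recovers {0, 1} and {2, 3}.
```

    The same loop applied in principal-component space (rather than on raw spectra) is what makes the PCA-plus-clustering combination effective.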

  20. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    PubMed

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
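    The TLA step itself is simple to sketch: compute a community dissimilarity between all sample pairs separated by a given lag and average per lag; a rising curve suggests directional change, a flat one fluctuation. Bray-Curtis is used here as one common dissimilarity choice, and the directionally drifting toy community is illustrative:

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two community abundance vectors."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(x + y for x, y in zip(a, b))
    return num / den if den else 0.0

def time_lag_analysis(series):
    """Mean community dissimilarity at each time lag 1..n-1 (TLA curve)."""
    n = len(series)
    return [
        sum(bray_curtis(series[t], series[t + lag]) for t in range(n - lag))
        / (n - lag)
        for lag in range(1, n)
    ]

# Toy community drifting directionally: species 1 declines, species 2 rises.
series = [[10 - t, 1 + t, 5] for t in range(8)]
curve = time_lag_analysis(series)
# Dissimilarity grows monotonically with lag, the TLA signature of
# directional temporal change.
```

    The paper's point is that this lag-averaged curve discards which species drive the change, information an ordination approach like RDA-PCNM retains.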

  1. Spatial statistical analysis of basal stem root disease under natural field epidemic of oil palm

    NASA Astrophysics Data System (ADS)

    Kamu, Assis; Phin, Chong Khim; Seman, Idris Abu; Wan, Hoong Hak; Mun, Ho Chong

    2015-02-01

    Oil palm, scientifically known as Elaeis guineensis Jacq., is the most important commodity crop in Malaysia and has greatly contributed to the economic growth of the country. As far as disease is concerned in the industry, Basal Stem Rot (BSR), caused by Ganoderma boninense, remains the most important disease. BSR is the most widely studied oil palm disease in Malaysia, with the most information available. However, there is still limited study of the spatial and temporal pattern or distribution of the disease, especially under natural field epidemic conditions in oil palm plantations. The objective of this study is to spatially identify the pattern of BSR disease under a natural field epidemic using two geospatial analytical techniques: quadrat analysis for the first-order properties of point pattern analysis and nearest-neighbor analysis (NNA) for the second-order properties. Two study sites with trees of different ages were selected. Both sites are located in Tawau, Sabah, and managed by the same company. The results showed that at least one of the point pattern analyses used, NNA (i.e. the second-order properties of point pattern analysis), confirmed that the disease exhibits complete spatial randomness. This suggests that the spread of the disease is not from tree to tree and that the age of the palm does not play a significant role in determining the spatial pattern of the disease. Knowledge of the spatial pattern of the disease should help disease management programs and the industry in the future. The statistical modelling is expected to help in identifying the right model to estimate the yield loss of oil palm due to BSR disease in the future.
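    The second-order test used here, nearest-neighbor analysis, is commonly summarized by the Clark-Evans index R: the mean observed nearest-neighbor distance over its expectation under complete spatial randomness, with R near 1 indicating randomness, below 1 clustering, and above 1 regularity. A sketch with simulated random points (ignoring edge correction, which the full method includes) is:

```python
import math
import random

def clark_evans_index(points, area):
    """Clark-Evans nearest-neighbor index R for points in a region of given
    area: observed mean NN distance / expected mean under randomness."""
    n = len(points)
    nn = []
    for i, (x, y) in enumerate(points):
        d = min(math.hypot(x - a, y - b)
                for j, (a, b) in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / n
    expected = 0.5 / math.sqrt(n / area)  # CSR expectation, no edge correction
    return observed / expected

random.seed(3)
pts = [(random.random(), random.random()) for _ in range(300)]
r = clark_evans_index(pts, area=1.0)
# r is close to 1 for uniformly random points (slightly above, due to
# uncorrected edge effects), consistent with complete spatial randomness.
```

    Diseased-tree coordinates substituted for the random points would give the randomness verdict reported in the study.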

  2. In Situ Analysis of DNA Methylation in Plants.

    PubMed

    Kathiria, Palak; Kovalchuk, Igor

    2017-01-01

    Epigenetic regulation in the plant genome is associated with the determination of expression patterns of various genes. Methylation of DNA at cytosine residues is one of the mechanisms of epigenetic regulation and has been a subject of various studies. Various techniques have been developed to analyze DNA methylation, most of which involve isolation of chromatin from cells and further in vitro studies. Limited techniques are available for in situ study of DNA methylation in plants. Here, we present such an in situ method for DNA methylation analysis which has high sensitivity and good reproducibility.

  3. Multivariate analysis of fMRI time series: classification and regression of brain responses using machine learning.

    PubMed

    Formisano, Elia; De Martino, Federico; Valente, Giancarlo

    2008-09-01

    Machine learning and pattern recognition techniques are being increasingly employed in functional magnetic resonance imaging (fMRI) data analysis. By taking into account the full spatial pattern of brain activity measured simultaneously at many locations, these methods allow detecting subtle, non-strictly localized effects that may remain invisible to the conventional analysis with univariate statistical methods. In typical fMRI applications, pattern recognition algorithms "learn" a functional relationship between brain response patterns and a perceptual, cognitive or behavioral state of a subject expressed in terms of a label, which may assume discrete (classification) or continuous (regression) values. This learned functional relationship is then used to predict the unseen labels from a new data set ("brain reading"). In this article, we describe the mathematical foundations of machine learning applications in fMRI. We focus on two methods, support vector machines and relevance vector machines, which are respectively suited for the classification and regression of fMRI patterns. Furthermore, by means of several examples and applications, we illustrate and discuss the methodological challenges of using machine learning algorithms in the context of fMRI data analysis.

  4. Topologic analysis and comparison of brain activation in children with epilepsy versus controls: an fMRI study

    NASA Astrophysics Data System (ADS)

    Oweis, Khalid J.; Berl, Madison M.; Gaillard, William D.; Duke, Elizabeth S.; Blackstone, Kaitlin; Loew, Murray H.; Zara, Jason M.

    2010-03-01

    This paper describes the development of novel computer-aided analysis algorithms to identify the language activation patterns at a certain Region of Interest (ROI) in Functional Magnetic Resonance Imaging (fMRI). Previous analysis techniques have been used to compare typical and pathologic activation patterns in fMRI images resulting from identical tasks, but none of them analyzed activation topographically in a quantitative manner. This paper presents new analysis techniques and algorithms capable of identifying a pattern of language activation associated with localization-related epilepsy. fMRI images of 64 healthy individuals and 31 patients with localization-related epilepsy have been studied and analyzed on an ROI basis. All subjects are right handed with normal MRI scans and have been classified into three age groups (4-6, 7-9, 10-12 years). Our initial efforts have focused on investigating activation in the Left Inferior Frontal Gyrus (LIFG). A number of volumetric features have been extracted from the data. The LIFG has been cut into slices and the activation has been investigated topographically on a slice-by-slice basis. Overall, a total of 809 features were extracted, and correlation analysis was applied to eliminate highly correlated features. Principal Component Analysis was then applied to account only for the major components in the data, and One-Way Analysis of Variance (ANOVA) was applied to test for significantly different features between the normal and patient groups. Twenty-nine features were found to be significantly different (p < 0.05) between the patient and control groups.

  5. E Pluribus Analysis: Applying a Superforecasting Methodology to the Detection of Homegrown Violence

    DTIC Science & Technology

    2018-03-01

    actor violence and a set of predefined decision-making protocols. This research included running four simulations using the Monte Carlo technique…

  6. Automated classification of single airborne particles from two-dimensional angle-resolved optical scattering (TAOS) patterns by non-linear filtering

    NASA Astrophysics Data System (ADS)

    Crosta, Giovanni Franco; Pan, Yong-Le; Aptowicz, Kevin B.; Casati, Caterina; Pinnick, Ronald G.; Chang, Richard K.; Videen, Gorden W.

    2013-12-01

    Measurement of two-dimensional angle-resolved optical scattering (TAOS) patterns is an attractive technique for detecting and characterizing micron-sized airborne particles. In general, the interpretation of these patterns and the retrieval of the particle refractive index, shape or size alone are difficult problems. By reformulating the problem in statistical learning terms, a solution is proposed herewith: rather than identifying airborne particles from their scattering patterns, TAOS patterns themselves are classified through a learning machine, where feature extraction interacts with multivariate statistical analysis. Feature extraction relies on spectrum enhancement, which includes the discrete cosine transform and non-linear operations. Multivariate statistical analysis includes computation of the principal components and supervised training, based on the maximization of a suitable figure of merit. All algorithms have been combined to analyze TAOS patterns, organize feature vectors, design classification experiments, carry out supervised training, assign unknown patterns to classes, and fuse information from different training and recognition experiments. The algorithms have been tested on a data set with more than 3000 TAOS patterns. The parameters that control the algorithms at different stages have been allowed to vary within suitable bounds and are optimized to some extent. Classification has been targeted at discriminating aerosolized Bacillus subtilis particles, a simulant of anthrax, from atmospheric aerosol particles and interfering particles, such as diesel soot. By assuming that all training and recognition patterns come from the respective reference materials only, the most satisfactory classification result corresponds to 20% false negatives from B. subtilis particles and <11% false positives from all other aerosol particles.
The most effective operations have consisted of thresholding TAOS patterns in order to reject defective ones, and forming training sets from three or four pattern classes. The presented automated classification method may be adapted into a real-time operation technique, capable of detecting and characterizing micron-sized airborne particles.
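    A minimal sketch of the feature-extraction-plus-classification idea (DCT-based spectrum enhancement followed by a simple multivariate classifier); the synthetic "patterns", fringe frequencies and nearest-centroid rule are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix.
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    M = np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    M[0] *= np.sqrt(1.0 / n)
    M[1:] *= np.sqrt(2.0 / n)
    return M

def features(img, keep=8):
    # 2-D DCT, then a non-linear (log) "spectrum enhancement" of the
    # low-frequency block, flattened into a feature vector.
    D = dct_matrix(img.shape[0])
    spec = D @ img @ D.T
    return np.log1p(np.abs(spec[:keep, :keep])).ravel()

def pattern(freq):
    # Noisy 32x32 fringe pattern with a class-dependent spatial frequency.
    x = np.arange(32)
    fringe = np.sin(2 * np.pi * freq * x / 32)[None, :] * np.ones((32, 1))
    return fringe + 0.5 * rng.normal(size=(32, 32))

train_a = np.array([features(pattern(1)) for _ in range(20)])
train_b = np.array([features(pattern(3)) for _ in range(20)])
centroids = np.stack([train_a.mean(axis=0), train_b.mean(axis=0)])

tests = [features(pattern(1)) for _ in range(10)] + \
        [features(pattern(3)) for _ in range(10)]
truth = np.array([0] * 10 + [1] * 10)
pred = np.array([np.argmin(((centroids - f) ** 2).sum(axis=1)) for f in tests])
accuracy = (pred == truth).mean()
```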

  7. Vibration testing and analysis using holography

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Time average holography is useful in recording steady state vibrational mode patterns. Phase relationships under steady state conditions are measured with real time holography and special phase shifting techniques. Data from Michelson interferometer verify vibration amplitudes from holographic data.

  8. Japanese migration in contemporary Japan: economic segmentation and interprefectural migration.

    PubMed

    Fukurai, H

    1991-01-01

    This paper examines the economic segmentation model in explaining 1985-86 Japanese interregional migration. The analysis takes advantage of statistical graphic techniques to illustrate the following substantive issues of interregional migration: (1) to examine whether economic segmentation significantly influences Japanese regional migration and (2) to explain socioeconomic characteristics of prefectures for both in- and out-migration. Analytic techniques include a latent structural equation (LISREL) methodology and statistical residual mapping. The residual dispersion patterns, for instance, suggest the extent to which socioeconomic and geopolitical variables explain migration differences by showing unique clusters of unexplained residuals. The analysis further points out that extraneous factors such as high residential land values, significant commuting populations, and regional-specific cultures and traditions need to be incorporated in the economic segmentation model in order to assess the extent of the model's reliability in explaining the pattern of interprefectural migration.

  9. Use of reciprocal lattice layer spacing in electron backscatter diffraction pattern analysis

    PubMed

    Michael; Eades

    2000-03-01

    In the scanning electron microscope using electron backscattered diffraction, it is possible to measure the spacing of the layers in the reciprocal lattice. These values are of great use in confirming the identification of phases. The technique derives the layer spacing from the higher-order Laue zone rings which appear in patterns from many materials. The method adapts results from convergent-beam electron diffraction in the transmission electron microscope. For many materials the measured layer spacing compares well with the calculated layer spacing. A noted exception is for higher atomic number materials. In these cases an extrapolation procedure is described that requires layer spacing measurements at a range of accelerating voltages. This procedure is shown to improve the accuracy of the technique significantly. The application of layer spacing measurements in EBSD is shown to be of use for the analysis of two polytypes of SiC.

  10. A Combined FEM/MoM/GTD Technique To Analyze Elliptically Polarized Cavity-Backed Antennas With Finite Ground Plane

    NASA Technical Reports Server (NTRS)

    Reddy, C. J.; Deshpande, M. D.; Fralick, D. T.; Cockrell, C. R.; Beck, F. B.

    1996-01-01

    Radiation pattern prediction analysis of elliptically polarized cavity-backed aperture antennas in a finite ground plane is performed using a combined Finite Element Method/Method of Moments/Geometrical Theory of Diffraction (FEM/MoM/GTD) technique. The magnetic current on the cavity-backed aperture in an infinite ground plane is calculated using the combined FEM/MoM analysis. GTD, including the slope diffraction contribution, is used to calculate the diffracted fields caused by both soft and hard polarizations at the edges of the finite ground plane. Explicit expressions for regular diffraction coefficients and slope diffraction coefficients are presented. The slope of the incident magnetic field at the diffraction points is derived and analytical expressions are presented. Numerical results for the radiation patterns of a cavity-backed circular spiral microstrip patch antenna excited by a coaxial probe in a finite rectangular ground plane are computed and compared with experimental results.

  11. Array Simulations Platform (ASP) predicts NASA Data Link Module (NDLM) performance

    NASA Technical Reports Server (NTRS)

    Snook, Allen David

    1993-01-01

    Through a variety of embedded theoretical and actual antenna patterns, the Array Simulations Platform (ASP) enhanced analysis of array antenna pattern effects for the KTx (Ku-Band Transmit) service of the NDLM (NASA Data Link Module). The ASP utilizes internally stored models of the NDLM antennas and can develop the overall pattern of antenna arrays through common array calculation techniques. ASP assisted in diagnosing element phase shifter errors during KTx testing and was able to accurately predict the overall array pattern from combinations of the four internally held element patterns. This paper provides an overview of the use of the ASP software in solving array mis-phasing problems.
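    As a hedged illustration of the "common array calculation techniques" mentioned above (not the ASP code itself), the far-field array factor of a uniform linear array can be computed directly, and a single mis-phased element visibly degrades the pattern, which is the kind of effect such a tool helps diagnose. The element count, spacing and phases below are arbitrary assumptions.

```python
import numpy as np

def array_factor(theta, n=8, d=0.5, phase=None):
    # |Array factor| of an n-element uniform linear array; d is the
    # element spacing in wavelengths, phase the per-element excitation
    # phase in radians, theta the angle from the array axis.
    phase = np.zeros(n) if phase is None else np.asarray(phase)
    m = np.arange(n)[:, None]
    psi = 2 * np.pi * d * m * np.cos(theta)[None, :] + phase[:, None]
    return np.abs(np.exp(1j * psi).sum(axis=0))

theta = np.linspace(0.0, np.pi, 721)
nominal = array_factor(theta)
bad = array_factor(theta, phase=np.r_[np.zeros(7), np.pi])  # one element 180 deg off

# The nominal broadside peak equals the element count (8); the
# mis-phased array loses main-lobe amplitude.
```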

  12. Progress in high temperature speckle-shift strain measurement system

    NASA Technical Reports Server (NTRS)

    Lant, Christian T.; Barranger, John P.

    1990-01-01

    A fast, easy to use speckle tracking system is under development for the speckle-shift strain measurement technique. Preliminary correlation tests on wire specimens show strong correlations of well-developed speckle patterns. Stable cross-correlations were obtained from a tungsten filament at 2480 C. An analysis of the optical system determines the minimum required sampling frequency of the speckle pattern to be 2.55 pixels per speckle.

  13. Variability-aware double-patterning layout optimization for analog circuits

    NASA Astrophysics Data System (ADS)

    Li, Yongfu; Perez, Valerio; Tripathi, Vikas; Lee, Zhao Chuan; Tseng, I.-Lun; Ong, Jonathan Yoong Seang

    2018-03-01

    The semiconductor industry has adopted multi-patterning techniques to manage the delayed arrival of extreme ultraviolet lithography. During the design of double-patterning lithography layout masks, two polygons are assigned to different masks if their spacing is less than the minimum printable spacing. With these additional design constraints, it is very difficult to find experienced layout-design engineers with a good enough understanding of the circuit to manually optimize the mask layers so as to minimize color-induced circuit variations. In this work, we investigate the impact of double-patterning lithography on analog circuits and provide quantitative analysis to help designers select the optimal mask assignment that minimizes circuit mismatch. To overcome this problem and improve turnaround time, we propose a smart "anchoring" placement technique to optimize mask decomposition for analog circuits. We have developed a software prototype capable of inserting anchoring markers in the layout, allowing industry-standard tools to perform the automated color decomposition process.
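    The mask-assignment constraint described above is, at its core, 2-coloring of a conflict graph: polygons closer than the minimum printable spacing must land on different masks. A hypothetical sketch (points stand in for polygons; real decomposition works on polygon geometry plus anchoring constraints):

```python
from collections import deque

def decompose(centers, min_space):
    # Assign each "polygon" (here: a center point) to mask 0 or 1 so that
    # any pair closer than min_space gets different masks; returns None
    # when an odd conflict cycle makes two-mask decomposition impossible.
    n = len(centers)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = centers[i], centers[j]
            if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 < min_space:
                adj[i].append(j)
                adj[j].append(i)
    color = [None] * n
    for s in range(n):                 # BFS 2-coloring, one component at a time
        if color[s] is not None:
            continue
        color[s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if color[v] is None:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return None        # coloring conflict (odd cycle)
    return color

masks = decompose([(0, 0), (1, 0), (2, 0), (10, 0)], min_space=1.5)
```

    Here the first three polygons form a conflict chain and alternate masks, while the isolated fourth polygon is free to take either mask, which is exactly the freedom an anchoring marker would pin down.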

  14. Holographic Reconstruction of Photoelectron Diffraction and Its Circular Dichroism for Local Structure Probing

    NASA Astrophysics Data System (ADS)

    Matsui, Fumihiko; Matsushita, Tomohiro; Daimon, Hiroshi

    2018-06-01

    The local atomic structure around a specific element atom can be recorded as a photoelectron diffraction pattern. Forward focusing peaks and diffraction rings around them indicate the directions and distances from the photoelectron emitting atom to the surrounding atoms. The state-of-the-art holography reconstruction algorithm enables us to image the local atomic arrangement around the excited atom in a real space. By using circularly polarized light as an excitation source, the angular momentum transfer from the light to the photoelectron induces parallax shifts in these diffraction patterns. As a result, stereographic images of atomic arrangements are obtained. These diffraction patterns can be used as atomic-site-resolved probes for local electronic structure investigation in combination with spectroscopy techniques. Direct three-dimensional atomic structure visualization and site-specific electronic property analysis methods are reviewed. Furthermore, circular dichroism was also found in valence photoelectron and Auger electron diffraction patterns. The investigation of these new phenomena provides hints for the development of new techniques for local structure probing.

  15. Detection of monoclonal immunoglobulin heavy chain gene rearrangement (FR3) in Thai malignant lymphoma by High Resolution Melting curve analysis.

    PubMed

    Kummalue, Tanawan; Chuphrom, Anchalee; Sukpanichanant, Sanya; Pongpruttipan, Tawatchai; Sukpanichanant, Sathien

    2010-05-19

    Malignant lymphoma, especially non-Hodgkin lymphoma, is one of the most common hematologic malignancies in Thailand. The diagnosis of malignant lymphoma is often problematic, especially in early stages of the disease. Detection of antigen receptor gene rearrangement, including T cell receptor (TCR) and immunoglobulin heavy chain (IgH), by polymerase chain reaction followed by heteroduplex analysis has become standard, whereas fluorescent fragment analysis (GeneScan) has been used as a confirmation test. In this study, three techniques were compared: thermocycler polymerase chain reaction (PCR) followed by heteroduplex analysis and polyacrylamide gel electrophoresis, GeneScan analysis, and real-time PCR with High Resolution Melting curve analysis (HRM). The comparison was carried out with DNA extracted from paraffin-embedded tissues diagnosed as B-cell non-Hodgkin lymphoma. Specific PCR primer sequences for the IgH gene variable region 3, including fluorescently labeled IgH primers, were used and the results were compared with HRM. In conclusion, detection of IgH gene rearrangement by HRM in the LightCycler System showed potential for distinguishing monoclonality from polyclonality in B-cell non-Hodgkin lymphoma. The incidence rate as reported by the Ministry of Public Health is 3.1 per 100,000 population in females, whereas the rate in males is 4.5 per 100,000 population [1]. At Siriraj Hospital, new cases diagnosed as malignant lymphoma number 214.6 cases/year [2]. Detection of antigen receptor gene rearrangement by PCR assay has recently become a standard laboratory test for discriminating reactive from malignant clonal lymphoproliferation [3,4].
    Analyzing DNA extracted from formalin-fixed, paraffin-embedded tissues by multiplex PCR techniques is rapid, accurate and highly sensitive. Measuring the size of the amplicon from PCR analysis can be used to diagnose malignant lymphoma, with a monoclonal pattern showing specific, distinct bands on acrylamide gel electrophoresis. However, this technique has some limitations, and some patients may require a further confirmation test such as GeneScan (fragment) analysis [5,6]. The GeneScan technique, or fragment analysis, reports the size and peak of DNA fragments using capillary gel electrophoresis. It is highly sensitive and can detect 0.5-1% of clonal lymphoid cells. It measures the amplicons using fluorescently labeled forward or reverse primers and a specific size standard. Using a Genetic Analyzer machine and GeneMapper software (Applied Bioscience, USA), a monoclonal pattern shows one single, sharp, high peak at the specific size corresponding to the acrylamide gel pattern, whereas a polyclonal pattern shows multiple small peaks condensed around the same size standard. GeneScan is the most sensitive and accurate technique; however, it usually requires considerable technical experience and is also costly [7]. Therefore, more rapid and cost-effective techniques are being sought. LightCycler PCR performs diagnostic detection of the amplicon via melting curve analysis within 2 hours using a specific dye [8,9]. Such dyes are of two types: the non-specific SYBR Green I, and the dye used for High Resolution Melting analysis (HRM), which is highly sensitive, more accurate and stable. Several reports have demonstrated that this instrument, combined with DNA-intercalating dyes, can be used to discriminate sequence changes in a PCR amplicon without manual handling of the PCR product [10,11].
    Accordingly, diagnostic applications of melting curve analysis are being developed [12,13]. In this study, three different techniques were compared to evaluate the suitability of LightCycler PCR with HRM as a clonality diagnostic tool for IgH gene rearrangement in B-cell non-Hodgkin lymphoma: thermocycler PCR followed by heteroduplex analysis and PAGE, GeneScan analysis, and LightCycler PCR with HRM.

  16. Determination of morphological parameters of biological cells by analysis of scattered-light distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burger, D.E.

    1979-11-01

    The extraction of morphological parameters from biological cells by analysis of light-scatter patterns is described. A light-scattering measurement system has been designed and constructed that allows one to visually examine and photographically record biological cells or cell models and measure the light-scatter pattern of an individual cell or cell model. Using a laser or conventional illumination, the imaging system consists of a modified microscope with a 35 mm camera attached to record the cell image or light-scatter pattern. Models of biological cells were fabricated. The dynamic range and angular distributions of light scattered from these models were compared to calculated distributions. Spectrum analysis techniques applied to the light-scatter data give the sought-after morphological cell parameters. These results compared favorably to shape parameters of the fabricated cell models, confirming the mathematical model procedure. For nucleated biological material, correct nuclear and cell eccentricity as well as the nuclear and cytoplasmic diameters were determined. A method for comparing the flow equivalent of nuclear and cytoplasmic size to the actual dimensions is shown. This light-scattering experiment provides baseline information for automated cytology. In its present application, it involves correlating average size as measured in flow cytology to the actual dimensions determined from this technique.

  17. Non-random nature of spontaneous mIPSCs in mouse auditory brainstem neurons revealed by recurrence quantification analysis

    PubMed Central

    Leao, Richardson N; Leao, Fabricio N; Walmsley, Bruce

    2005-01-01

    A change in the spontaneous release of neurotransmitter is a useful indicator of processes occurring within presynaptic terminals. Linear techniques (e.g. Fourier transform) have been used to analyse spontaneous synaptic events in previous studies, but such methods are inappropriate if the timing pattern is complex. We have investigated spontaneous glycinergic miniature synaptic currents (mIPSCs) in principal cells of the medial nucleus of the trapezoid body. The random versus deterministic (or periodic) nature of mIPSCs was assessed using recurrence quantification analysis. Nonlinear methods were then used to quantify any detected determinism in spontaneous release, and to test for chaotic or fractal patterns. Modelling demonstrated that this procedure is much more sensitive in detecting periodicities than conventional techniques. mIPSCs were found to exhibit periodicities that were abolished by blockade of internal calcium stores with ryanodine, suggesting calcium oscillations in the presynaptic inhibitory terminals. Analysis indicated that mIPSC occurrences were chaotic in nature. Furthermore, periodicities were less evident in congenitally deaf mice than in normal mice, indicating that appropriate neural activity during development is necessary for the expression of deterministic chaos in mIPSC patterns. We suggest that chaotic oscillations of mIPSC occurrences play a physiological role in signal processing in the auditory brainstem. PMID:16271982
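    The core of recurrence quantification analysis can be sketched as follows: build a recurrence matrix from the inter-event intervals and measure the recurrence rate. A periodic mIPSC-like train recurs far more often than a memoryless (Poisson) one. The data and threshold here are synthetic illustrations, not the study's recordings.

```python
import numpy as np

rng = np.random.default_rng(3)

def recurrence_rate(intervals, eps):
    # Fraction of off-diagonal pairs (i, j) whose intervals lie within eps
    # of each other: the density of the recurrence plot.
    x = np.asarray(intervals, dtype=float)
    R = np.abs(x[:, None] - x[None, :]) < eps   # recurrence matrix
    np.fill_diagonal(R, False)
    return R.mean()

periodic = 1.0 + 0.05 * rng.normal(size=200)    # nearly constant intervals
poisson = rng.exponential(1.0, size=200)        # memoryless intervals

rr_per = recurrence_rate(periodic, eps=0.1)
rr_poi = recurrence_rate(poisson, eps=0.1)
# Periodic event timing produces a far denser recurrence plot than
# random timing, which is the signature the RQA measures quantify.
```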

  18. Quantification of differences between nailfold capillaroscopy images with a scleroderma pattern and normal pattern using measures of geometric and algorithmic complexity.

    PubMed

    Urwin, Samuel George; Griffiths, Bridget; Allen, John

    2017-02-01

    This study aimed to quantify and investigate differences in the geometric and algorithmic complexity of the microvasculature in nailfold capillaroscopy (NFC) images displaying a scleroderma pattern and those displaying a 'normal' pattern. 11 NFC images were qualitatively classified by a capillary specialist as indicative of 'clear microangiopathy' (CM), i.e. a scleroderma pattern, and 11 as 'not clear microangiopathy' (NCM), i.e. a 'normal' pattern. Pre-processing was performed, and fractal dimension (FD) and Kolmogorov complexity (KC) were calculated following image binarisation. FD and KC were compared between groups, and a k-means cluster analysis (n = 2) was performed on all images, without prior knowledge of the group assigned to them (i.e. CM or NCM), using FD and KC as inputs. CM images had significantly reduced FD and KC compared to NCM images, and the cluster analysis suggested that quantitative classification of images into CM and NCM groups is possible using the mathematical measures of FD and KC. The analysis techniques used show promise for quantitative microvascular investigation in patients with systemic sclerosis.
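    Both measures can be sketched in a few lines: fractal dimension by box counting on a binarised image, and Kolmogorov complexity approximated (as is standard practice, since KC itself is uncomputable) by compressed size. The test images below are synthetic; the sizes and thresholds are illustrative assumptions.

```python
import numpy as np
import zlib

rng = np.random.default_rng(4)

def box_counting_dimension(img):
    # Slope of log(occupied box count) vs log(1/box size) over dyadic sizes.
    n = img.shape[0]
    sizes, counts = [], []
    size = n // 2
    while size >= 1:
        k = n // size
        boxes = img[:k * size, :k * size].reshape(k, size, k, size)
        counts.append(boxes.any(axis=(1, 3)).sum())
        sizes.append(size)
        size //= 2
    return np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]

def compression_complexity(img):
    # Crude, standard stand-in for Kolmogorov complexity:
    # size of the losslessly compressed bitmap.
    return len(zlib.compress(np.packbits(img).tobytes(), 9))

line = np.zeros((64, 64), dtype=bool)
line[32, :] = True                     # a curve-like structure: FD near 1
blob = rng.random((64, 64)) < 0.5      # space-filling noise: FD near 2

fd_line = box_counting_dimension(line)
fd_blob = box_counting_dimension(blob)
cc_line = compression_complexity(line)
cc_blob = compression_complexity(blob)
```

    A sparser, more regular vessel skeleton thus yields both a lower FD and a lower compression-based complexity, matching the direction of the reported CM vs NCM difference.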

  19. Self-aligned quadruple patterning using spacer on spacer integration optimization for N5

    NASA Astrophysics Data System (ADS)

    Thibaut, Sophie; Raley, Angélique; Mohanty, Nihar; Kal, Subhadeep; Liu, Eric; Ko, Akiteru; O'Meara, David; Tapily, Kandabara; Biolsi, Peter

    2017-04-01

    To meet scaling requirements, the semiconductor industry has extended 193nm immersion lithography beyond its minimum pitch limitation using multiple patterning schemes such as self-aligned double patterning, self-aligned quadruple patterning and litho-etch/litho-etch iterations. These techniques have spawned numerous variants in the last few years. Spacer-on-spacer pitch splitting integration has been shown to have multiple advantages over the conventional pitch splitting approach. Reducing the number of pattern transfer steps associated with sacrificial layers results in a significant decrease in cost and an overall simplification of the double pitch split technique. While attractive, the SAQP spacer-on-spacer flow brings challenges of its own: material set selection and etch chemistry development for adequate selectivities, and mandrel shape and spacer shape engineering to improve edge placement error (EPE). In this paper we build on our previous work and examine in more detail the robustness of the integration with regard to final pattern transfer and full-wafer critical dimension uniformity. Furthermore, since the number of intermediate steps is reduced, one would expect improved uniformity and pitch walking control. This assertion is verified through a thorough pitch walking analysis.

  20. Discriminant analysis of resting-state functional connectivity patterns on the Grassmann manifold

    NASA Astrophysics Data System (ADS)

    Fan, Yong; Liu, Yong; Jiang, Tianzi; Liu, Zhening; Hao, Yihui; Liu, Haihong

    2010-03-01

    The functional networks, extracted from fMRI images using independent component analysis, have been demonstrated informative for distinguishing brain states of cognitive functions and neurological diseases. In this paper, we propose a novel algorithm for discriminant analysis of functional networks encoded by spatial independent components. The functional networks of each individual are used as bases for a linear subspace, referred to as a functional connectivity pattern, which facilitates a comprehensive characterization of temporal signals of fMRI data. The functional connectivity patterns of different individuals are analyzed on the Grassmann manifold by adopting a principal angle based subspace distance. In conjunction with a support vector machine classifier, a forward component selection technique is proposed to select independent components for constructing the most discriminative functional connectivity pattern. The discriminant analysis method has been applied to an fMRI based schizophrenia study with 31 schizophrenia patients and 31 healthy individuals. The experimental results demonstrate that the proposed method not only achieves a promising classification performance for distinguishing schizophrenia patients from healthy controls, but also identifies discriminative functional networks that are informative for schizophrenia diagnosis.
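    The principal-angle subspace distance used above can be sketched directly: orthonormalize each set of component maps, take the SVD of the cross-product, and the arccosines of the singular values are the principal angles between the two "functional connectivity patterns". The dimensions and data below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def subspace_distance(A, B):
    # Grassmann geodesic distance: norm of the principal angles between
    # the column spans of A and B (angles from the SVD of Q1^T Q2).
    Q1, _ = np.linalg.qr(A)
    Q2, _ = np.linalg.qr(B)
    s = np.clip(np.linalg.svd(Q1.T @ Q2, compute_uv=False), -1.0, 1.0)
    return np.linalg.norm(np.arccos(s))

d, k = 50, 4                            # 4 component maps in 50 dimensions
A = rng.normal(size=(d, k))
same = A @ rng.normal(size=(k, k))      # same span, different basis
other = rng.normal(size=(d, k))         # unrelated subspace

dist_same = subspace_distance(A, same)
dist_other = subspace_distance(A, other)
```

    The distance is zero (up to rounding) for two bases of the same subspace, which is why the comparison is insensitive to the arbitrary ordering and mixing of independent components.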

  1. Advanced study of global oceanographic requirements for EOS A/B: Appendix volume

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Tables and graphs are presented for a review of oceanographic studies using satellite-borne instruments. The topics considered include sensor requirements, error analysis for wind determination from glitter pattern measurements, coverage frequency plots, ground station rise and set times, a technique for reduction and analysis of ocean spectral data, rationale for the selection of a 2 PM descending orbit, and a priority analysis.

  2. First GIS Analysis of Modern Stone Tools Used by Wild Chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa

    PubMed Central

    Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio

    2015-01-01

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested against GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing discrimination between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. PMID:25793642

  3. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    PubMed

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

    The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. Automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, that categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, that splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, that determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracy of fluorescent intensity and pattern classification respectively around 85% and above 90%. We assessed all results by comparison with some of the most representative state-of-the-art works. Unlike most other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis.
    Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  4. Stability and Hopf bifurcation in a simplified BAM neural network with two time delays.

    PubMed

    Cao, Jinde; Xiao, Min

    2007-03-01

    Various local periodic solutions may represent different classes of storage patterns or memory patterns, and arise from the different equilibrium points of neural networks (NNs) by applying Hopf bifurcation technique. In this paper, a bidirectional associative memory NN with four neurons and multiple delays is considered. By applying the normal form theory and the center manifold theorem, analysis of its linear stability and Hopf bifurcation is performed. An algorithm is worked out for determining the direction and stability of the bifurcated periodic solutions. Numerical simulation results supporting the theoretical analysis are also given.

  5. Prelaunch optical characterization of the Laser Geodynamic Satellite (LAGEOS 2)

    NASA Technical Reports Server (NTRS)

    Minott, Peter O.; Zagwodzki, Thomas W.; Varghese, Thomas; Seldon, Michael

    1993-01-01

    The optical range correction (the distance between the apparent retroreflective skin of the satellite and the center of mass) of the LAGEOS 2 was determined using computer analysis of theoretical and experimentally measured far field diffraction patterns, and with short pulse lasers using both streak camera-based range receivers and more conventional PMT-based range receivers. The three measurement techniques yielded range correction values from 248 to 253 millimeters dependent on laser wavelength, pulsewidth, and polarization, location of the receiver in the far field diffraction pattern and detection technique (peak, half maximum, centroid, or constant fraction). The Lidar cross section of LAGEOS 2 was measured at 4 to 10 million square meters, comparable to the LAGEOS 1.

  6. Localization of spontaneous magnetoencephalographic activity of neonates and fetuses using independent component and Hilbert phase analysis.

    PubMed

    Vairavan, Srinivasan; Eswaran, Hari; Preissl, Hubert; Wilson, James D; Haddad, Naim; Lowery, Curtis L; Govindan, Rathinaswamy B

    2010-01-01

    The fetal magnetoencephalogram (fMEG) is measured in the presence of large interference from maternal and fetal magnetocardiograms (mMCG and fMCG). These cardiac interferences can be attenuated by orthogonal projection (OP) technique of the corresponding spatial vectors. However, the OP technique redistributes the fMEG signal among the channels and also leaves some cardiac residuals (partially attenuated mMCG and fMCG) due to loss of stationarity in the signal. In this paper, we propose a novel way to extract and localize the neonatal and fetal spontaneous brain activity by using independent component analysis (ICA) technique. In this approach, we perform ICA on a small subset of sensors for 1-min duration. The independent components obtained are further investigated for the presence of discontinuous patterns as identified by the Hilbert phase analysis and are used as decision criteria for localizing the spontaneous brain activity. In order to locate the region of highest spontaneous brain activity content, this analysis is performed on the sensor subsets, which are traversed across the entire sensor space. The region of the spontaneous brain activity as identified by the proposed approach correlated well with the neonatal and fetal head location. In addition, the burst duration and the inter-burst interval computed for the identified discontinuous brain patterns are in agreement with the reported values.
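    A minimal sketch of detecting discontinuous (burst/inter-burst) structure from the analytic signal: compute the Hilbert envelope via the FFT construction and threshold it. The trace, burst timing and threshold below are synthetic assumptions, not fMEG data.

```python
import numpy as np

def analytic_signal(x):
    # Analytic signal via the FFT construction of the Hilbert transform.
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

# Toy trace sampled at 1 kHz: 10 Hz bursts of 400 ms once per second,
# separated by quiet inter-burst intervals, plus weak noise.
t = np.arange(4000) / 1000.0
burst_mask = (t % 1.0) < 0.4
x = np.where(burst_mask, np.sin(2 * np.pi * 10 * t), 0.0)
x = x + 0.01 * np.random.default_rng(6).normal(size=t.size)

z = analytic_signal(x)
envelope = np.abs(z)                   # instantaneous amplitude
phase = np.unwrap(np.angle(z))         # instantaneous (Hilbert) phase
detected = envelope > 0.5              # burst vs. inter-burst samples
```

    Burst duration and inter-burst interval statistics then follow from run lengths of the thresholded envelope; the phase itself advances near-linearly only within bursts, which is one way discontinuity can be quantified.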

  7. Evaluation of Sex-Specific Movement Patterns in Judo Using Probabilistic Neural Networks.

    PubMed

    Miarka, Bianca; Sterkowicz-Przybycien, Katarzyna; Fukuda, David H

    2017-10-01

    The purpose of the present study was to create a probabilistic neural network to clarify the understanding of movement patterns in international judo competitions by gender. Analysis of 773 male and 638 female bouts was utilized to identify movements during the approach, gripping, attack (including biomechanical designations), groundwork, defense, and pause phases. Probabilistic neural network and chi-square (χ²) tests modeled and compared frequencies (p ≤ .05). Women (mean [interquartile range]: 9.9 [4; 14]) attacked more than men (7.0 [3; 10]) while attempting a greater number of arm/leg lever (women: 2.7 [1; 6]; men: 4.0 [0; 4]) and trunk/leg lever (women: 0.8 [0; 1]; men: 2.4 [0; 4]) techniques but fewer maximal length-moment arm techniques (women: 0.7 [0; 1]; men: 1.0 [0; 2]). Male athletes displayed one-handed gripping of the back and sleeve, whereas female athletes executed a greater number of groundwork techniques. An optimized probabilistic neural network model, using patterns from the gripping, attack, groundwork, and pause phases, produced an overall prediction accuracy of 76% for discrimination between men and women.
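
    A probabilistic neural network in the sense of Specht is essentially a Parzen-window classifier, which can be sketched compactly. The two-feature toy data below only mimic the reported attack-frequency means and are not the study's data:

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=1.0):
    """Probabilistic neural network (Specht): Parzen-window estimates of the
    class-conditional densities with an isotropic Gaussian kernel; each test
    point gets the class with the highest mean kernel response."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        scores = []
        for c in classes:
            Xc = X_train[y_train == c]
            d2 = np.sum((Xc - x) ** 2, axis=1)
            scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
        preds.append(classes[np.argmax(scores)])
    return np.array(preds)

# Toy stand-in for the judo data: two groups whose mean attack counts mimic
# the reported values (illustrative numbers, not the study's data).
rng = np.random.default_rng(1)
men = rng.normal([7.0, 4.0], 1.0, size=(100, 2))
women = rng.normal([9.9, 2.7], 1.0, size=(100, 2))
X = np.vstack([men, women])
y = np.array([0] * 100 + [1] * 100)
acc = np.mean(pnn_predict(X, y, X, sigma=1.0) == y)
```

    The smoothing parameter sigma plays the same role as the spread parameter optimized in PNN software packages.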

  8. A Guide to Analyzing Message-Response Sequences and Group Interaction Patterns in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Jeong, Allan

    2005-01-01

    This paper proposes a set of methods and a framework for evaluating, modeling, and predicting group interactions in computer-mediated communication. The method of sequential analysis is described along with specific software tools and techniques to facilitate the analysis of message-response sequences. In addition, the Dialogic Theory and its…

  9. Classification Techniques for Multivariate Data Analysis.

    DTIC Science & Technology

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may refer to "typology". Other frequently encountered terms are pattern...the determinantal equation: |B − λW| = 0 (42). The solutions λᵢ are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non...Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor

  10. Synthesis of Patterned Vertically Aligned Carbon Nanotubes by PECVD Using Different Growth Techniques: A Review.

    PubMed

    Gangele, Aparna; Sharma, Chandra Shekhar; Pandey, Ashok Kumar

    2017-04-01

    Immense development has taken place over the last 25 years, not only to increase the bulk production, repeatability, and yield of carbon nanotubes (CNTs), but also to clarify the basic concepts of nucleation and growth. Vertically aligned carbon nanotubes (VACNTs) are forests of CNTs oriented perpendicular to a substrate. Their exceptional chemical and physical properties, together with their ordered arrangement and dense structure, make them suitable for various fields. The effects of the chosen substrate, carbon precursor, catalyst (including its physical and chemical state), reaction conditions, and many other key parameters have been thoroughly studied and analysed. The aim of this paper is to identify trends and summarize the effect of key parameters rather than simply cataloguing all experiments reported to date. The identified trends are compared with recent observations on the growth of different types of patterned VACNTs. In this review article, we present a comprehensive analysis of different techniques to precisely determine the role of the parameters responsible for the growth of patterned vertically aligned carbon nanotubes. We cover the techniques proposed over more than two decades to fabricate different structures and configurations of carbon nanotubes on different types of substrates. Apart from a detailed discussion of each technique along with its specific process and implementation, we also provide a critical analysis of the associated constraints, benefits, and shortcomings. For easy reference, we tabulate all the techniques according to the main key factors. This review thus offers an exhaustive discussion and a handy reference for researchers who are new to the synthesis of CNTs or who want to get up to speed with the techniques for controlling the growth of VACNT arrays.

  11. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity assessment, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions other than damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear-T² damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously subjected to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS, and detecting six different damages induced in a cumulative way. The proposed methodology achieved a true positive rate of 100% and a false positive rate of 1.28% at a 99% confidence level.
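
    The Q (squared-prediction-error) damage index over a PCA baseline can be sketched as follows. Linear PCA stands in for the paper's hierarchical nonlinear PCA, and the simulated 32-sensor strain data and damage offset are invented for illustration:

```python
import numpy as np

def fit_pca(X, n_comp):
    """PCA via SVD of the mean-centred data; returns the mean and loadings P."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_comp].T

def q_index(X, mu, P):
    """Q (squared prediction error) index: squared residual norm after
    projecting each strain snapshot onto the baseline PCA subspace."""
    Xc = X - mu
    resid = Xc - Xc @ P @ P.T
    return np.sum(resid ** 2, axis=1)

rng = np.random.default_rng(2)
# Baseline "strain field": 32 sensors driven by 2 latent load factors.
L = rng.standard_normal((2, 32))
baseline = rng.standard_normal((500, 2)) @ L + 0.01 * rng.standard_normal((500, 32))
mu, P = fit_pca(baseline, n_comp=2)
threshold = np.percentile(q_index(baseline, mu, P), 99)  # 99% confidence limit

# "Damage" perturbs the strain pattern of a few sensors.
damaged = rng.standard_normal((100, 2)) @ L + 0.01 * rng.standard_normal((100, 32))
damaged[:, :4] += 0.5
flags = q_index(damaged, mu, P) > threshold
```

    By construction, the 99th-percentile threshold yields roughly a 1% false positive rate on baseline data, in the same spirit as the rates the paper reports.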

  12. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of stock market price series. In this paper, we describe a workflow and tool that allow a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps for frequent quantitative and text-oriented data using the well-known Apriori method. First, based on heuristics, we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. The Apriori method supports the discovery of such sequential temporal patterns. Then, various text features, such as the degree of sentence nesting, noun phrase complexity, and vocabulary richness, are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features that significantly differs from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster-, and sequence-visualization and analysis functionality. We provide two case studies showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other application domains, such as data analysis of smart grids, cyber-physical systems, or the security of critical infrastructure, where the data consist of a combination of quantitative and textual time series data.

  13. Pattern Recognition for a Flight Dynamics Monte Carlo Simulation

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; Hurtado, John E.

    2011-01-01

    The design, analysis, and verification and validation of a spacecraft rely heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data, but flight dynamics engineers lack the time and resources to analyze it all. The growing amount of data combined with the diminished available time of engineers motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters and, most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
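
    A minimal stand-in for the classification step can illustrate the idea: label each Monte Carlo run pass/fail and let a k-nearest-neighbour classifier learn the failure region of the dispersed parameters. The three-parameter toy data and failure rule below are invented for illustration (the paper's kernel density estimation and sequential feature selection stages are omitted):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """k-nearest-neighbour majority vote using Euclidean distance."""
    preds = []
    for x in X_test:
        idx = np.argsort(np.sum((X_train - x) ** 2, axis=1))[:k]
        preds.append(np.bincount(y_train[idx]).argmax())
    return np.array(preds)

# Toy Monte Carlo runs: 3 dispersed parameters; "failure" occurs only when
# two of them are jointly large (an interaction a per-parameter scan misses).
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(800, 3))
y = ((X[:, 0] > 0.4) & (X[:, 1] > 0.4)).astype(int)
X_tr, y_tr, X_te, y_te = X[:600], y[:600], X[600:], y[600:]
acc = np.mean(knn_predict(X_tr, y_tr, X_te, k=5) == y_te)
```

    Because the failure depends on a combination of parameters, the classifier localizes the failing corner of the design space that single-parameter histograms would blur out.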

  14. Extracting semantics from audio-visual content: the final frontier in multimedia retrieval.

    PubMed

    Naphade, M R; Huang, T S

    2002-01-01

    Multimedia understanding is a fast emerging interdisciplinary research area. There is tremendous potential for effective use of multimedia content through intelligent analysis. Diverse application areas are increasingly relying on multimedia understanding systems. Advances in multimedia understanding are related directly to advances in signal processing, computer vision, pattern recognition, multimedia databases, and smart sensors. We review the state-of-the-art techniques in multimedia retrieval. In particular, we discuss how multimedia retrieval can be viewed as a pattern recognition problem. We discuss how reliance on powerful pattern recognition and machine learning techniques is increasing in the field of multimedia retrieval. We review the state-of-the-art multimedia understanding systems with particular emphasis on a system for semantic video indexing centered around multijects and multinets. We discuss how semantic retrieval is centered around concepts and context and the various mechanisms for modeling concepts and context.

  15. A Hydrodynamic Instability Is Used to Create Aesthetically Appealing Patterns in Painting

    PubMed Central

    Zetina, Sandra; Godínez, Francisco A.; Zenit, Roberto

    2015-01-01

    Painters often acquire a deep empirical knowledge of the way in which paints and inks behave. Through experimentation and practice, they can control the way in which fluids move and deform to create textures and images. David Alfaro Siqueiros, a recognized Mexican muralist, invented an accidental painting technique to create new and unexpected textures. By pouring layers of paint of different colors on a horizontal surface, the paints infiltrate into each other creating patterns of aesthetic value. In this investigation, we reproduce the technique in a controlled manner. We found that for the correct color combination, the dual viscous layer becomes Rayleigh-Taylor unstable: the density mismatch of the two color paints drives the formation of a spotted pattern. Experiments and a linear instability analysis were conducted to understand the properties of the process. We also argue that this flow configuration can be used to study the linear properties of this instability. PMID:25942586

  16. Processing and statistical analysis of soil-root images

    NASA Astrophysics Data System (ADS)

    Razavi, Bahar S.; Hoang, Duyen; Kuzyakov, Yakov

    2016-04-01

    The importance of hotspots such as the rhizosphere, the small soil volume that surrounds and is influenced by plant roots, calls for spatially explicit methods to visualize the distribution of microbial activities in this active site (Kuzyakov and Blagodatskaya, 2015). The zymography technique has previously been adapted to visualize the spatial dynamics of enzyme activities in the rhizosphere (Spohn and Kuzyakov, 2014). After further developing soil zymography to obtain a higher resolution of enzyme activities, we aimed to 1) quantify the images and 2) determine whether the pattern (e.g. distribution of hotspots in space) is clumped (aggregated) or regular (dispersed). To this end, we incubated soil-filled rhizoboxes with maize (Zea mays L.) and without maize (control box) for two weeks. In situ soil zymography was applied to visualize the enzymatic activity of β-glucosidase and phosphatase at the soil-root interface. The spatial resolution of the fluorescent images was improved by direct application of a substrate-saturated membrane to the soil-root system. Furthermore, we applied spatial point pattern analysis to classify the hotspot distributions as clumped or regular. Our results demonstrated that the distribution of hotspots in the rhizosphere is clumped (aggregated), whereas the control box without a plant showed a regular (dispersed) pattern. These patterns were similar in all three replicates and for both enzymes. We conclude that improved zymography is a promising in situ technique to identify, analyze, visualize and quantify the spatial distribution of enzyme activities in the rhizosphere. Moreover, such different patterns should be considered in assessments and modeling of rhizosphere extension and the corresponding effects on soil properties and functions. Key words: rhizosphere, spatial point pattern, enzyme activity, zymography, maize.
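
    The clumped-versus-dispersed classification can be illustrated with the Clark-Evans nearest-neighbour index, one common spatial point pattern statistic (the abstract does not specify which statistic was used; the point sets below are synthetic):

```python
import numpy as np

def clark_evans(points, area):
    """Clark-Evans index R = mean observed nearest-neighbour distance divided
    by its expectation under complete spatial randomness, 0.5 / sqrt(density).
    R < 1 indicates clumping, R ~ 1 randomness, R > 1 dispersion."""
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)
    observed = d.min(axis=1).mean()
    expected = 0.5 / np.sqrt(n / area)
    return observed / expected

rng = np.random.default_rng(4)
random_pts = rng.uniform(0, 1, size=(200, 2))       # complete spatial randomness
centres = rng.uniform(0.2, 0.8, size=(10, 2))       # synthetic hotspot centres
clumped_pts = (centres[rng.integers(0, 10, 200)]
               + 0.02 * rng.standard_normal((200, 2)))
R_random = clark_evans(random_pts, 1.0)
R_clumped = clark_evans(clumped_pts, 1.0)
```

    Applied to hotspot coordinates extracted from zymograms, such an index distinguishes the aggregated rhizosphere pattern from the dispersed control pattern.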

  17. Agricultural Aircraft Aid

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Farmers are increasingly turning to aerial application of pesticides, fertilizers and other materials. Sometimes uneven distribution of the chemicals is caused by worn nozzles, improper alignment of spray nozzles, or system leaks. If this happens, the job must be redone, with added expense to both the pilot and the customer. Traditional pattern analysis techniques take days or weeks. Utilizing NASA's wind tunnel and computer validation technology, Dr. Roth of Oklahoma State University (OSU) developed a system for providing answers within minutes. Called the Rapid Distribution Pattern Evaluation System, the OSU system consists of a 100-foot measurement frame tied in to computerized analysis and readout equipment. The system is mobile, delivered by trailer to airfields in agricultural areas where OSU conducts educational "fly-ins." A fly-in typically draws 50 to 100 aerial applicators, researchers, chemical suppliers and regulatory officials. An applicator can have his spray pattern checked. A computerized readout, available in five to 12 minutes, provides information for correcting shortcomings in the distribution pattern.

  18. Segmentation of touching handwritten Japanese characters using the graph theory method

    NASA Astrophysics Data System (ADS)

    Suwa, Misako

    2000-12-01

    Projection analysis methods have been widely used to segment Japanese character strings. However, if adjacent characters have overhanging strokes, or a touching point does not correspond to a histogram minimum, these methods are prone to errors. In contrast, the non-projection analysis methods proposed for numerals or alphabetic characters cannot simply be applied to Japanese characters because of differences in character structure. Based on the oversegmenting strategy, a new pre-segmentation method is presented in this paper: touching patterns are represented as graphs, and touching strokes are regarded as the elements of proper edge cutsets. Using graph-theoretical techniques, the cutset matrix is calculated. Then, by applying pruning rules, potential touching strokes are determined and the patterns are oversegmented. Moreover, this algorithm was confirmed in simulations to be valid for touching patterns with overhanging strokes and for doubly connected patterns.

  19. Development of test methods for scale model simulation of aerial applications in the NASA Langley Vortex Research Facility. [agricultural aircraft

    NASA Technical Reports Server (NTRS)

    Jordan, F. L., Jr.

    1980-01-01

    As part of basic research to improve aerial applications technology, methods were developed at the Langley Vortex Research Facility to simulate and measure deposition patterns of aerially-applied sprays and granular materials by means of tests with small-scale models of agricultural aircraft and dynamically-scaled test particles. Interactions between the aircraft wake and the dispersed particles are being studied with the objective of modifying wake characteristics and dispersal techniques to increase swath width, improve deposition pattern uniformity, and minimize drift. The particle scaling analysis, test methods for particle dispersal from the model aircraft, visualization of particle trajectories, and measurement and computer analysis of test deposition patterns are described. An experimental validation of the scaling analysis and test results that indicate improved control of chemical drift by use of winglets are presented to demonstrate test methods.

  20. A primer to frequent itemset mining for bioinformatics

    PubMed Central

    Naulaerts, Stefan; Meysman, Pieter; Bittremieux, Wout; Vu, Trung Nghia; Vanden Berghe, Wim; Goethals, Bart

    2015-01-01

    Over the past two decades, pattern mining techniques have become an integral part of many bioinformatics solutions. Frequent itemset mining is a popular group of pattern mining techniques designed to identify elements that frequently co-occur. An archetypical example is the identification of products that often end up together in the same shopping basket in supermarket transactions. A number of algorithms have been developed to address variations of this computationally non-trivial problem. Frequent itemset mining techniques are able to efficiently capture the characteristics of (complex) data and succinctly summarize it. Owing to these and other interesting properties, these techniques have proven their value in biological data analysis. Nevertheless, information about the bioinformatics applications of these techniques remains scattered. In this primer, we introduce frequent itemset mining and their derived association rules for life scientists. We give an overview of various algorithms, and illustrate how they can be used in several real-life bioinformatics application domains. We end with a discussion of the future potential and open challenges for frequent itemset mining in the life sciences. PMID:24162173
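
    A minimal version of the Apriori-style level-wise search described above can be written in a few lines. The "gene basket" transactions are invented for illustration:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Level-wise Apriori: keep itemsets whose support (fraction of
    transactions containing them) meets min_support, then join frequent
    (k-1)-itemsets into candidate k-itemsets."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    items = sorted({i for t in transactions for i in t})
    current = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
    frequent = {s: support(s) for s in current}
    k = 2
    while current:
        # Join step: unions of frequent (k-1)-itemsets that form k-itemsets.
        candidates = {a | b for a, b in combinations(current, 2) if len(a | b) == k}
        current = [c for c in candidates if support(c) >= min_support]
        frequent.update({c: support(c) for c in current})
        k += 1
    return frequent

baskets = [{"geneA", "geneB", "geneC"},
           {"geneA", "geneB"},
           {"geneA", "geneC"},
           {"geneB", "geneC"},
           {"geneA", "geneB", "geneC"}]
freq = apriori(baskets, min_support=0.6)
```

    The anti-monotonicity of support (a superset can never be more frequent than its subsets) is what lets the level-wise search prune candidates early, which is the key to the efficiency the primer discusses.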

  1. Nonlinear gamma correction via normed bicoherence minimization in optical fringe projection metrology

    NASA Astrophysics Data System (ADS)

    Kamagara, Abel; Wang, Xiangzhao; Li, Sikun

    2018-03-01

    We propose a method to compensate for the projector intensity nonlinearity induced by the gamma effect in three-dimensional (3-D) fringe projection metrology, extending high-order spectral analysis and bispectral norm minimization to digital sinusoidal fringe pattern analysis. The bispectrum estimate allows extraction of vital signal information features such as spectral component correlation relationships in fringe pattern images. Our approach exploits the fact that gamma introduces high-order harmonic correlations in the affected fringe pattern image. Estimation and compensation of projector nonlinearity is realized by detecting and minimizing the normed bispectral coherence of these correlations. The proposed technique does not require calibration information, technical knowledge, or the specification of the fringe projection unit. This is promising for developing a modular, calibration-invariant model for nonlinear intensity (gamma) compensation in digital fringe pattern projection profilometry. Experimental and numerical simulation results demonstrate this method to be efficient and effective in improving phase measurement accuracy with phase-shifting fringe pattern projection profilometry.
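
    The core observation, that a projector gamma nonlinearity injects correlated higher-order harmonics into a sinusoidal fringe, can be checked numerically. The sketch below (the fringe frequency and γ = 2.2 are illustrative values, not from the paper) compares the second-harmonic content of an ideal and a gamma-distorted fringe; the bispectral method exploits exactly this harmonic correlation:

```python
import numpy as np

N = 1024
x = np.arange(N)
fringe = 0.5 + 0.5 * np.cos(2 * np.pi * 8 * x / N)   # ideal fringe, 8 cycles
gamma = 2.2
distorted = fringe ** gamma                          # gamma-affected fringe

def harmonic_mag(signal, k):
    """Magnitude of the k-th DFT bin, normalized by the signal length."""
    return np.abs(np.fft.rfft(signal))[k] / len(signal)

h2_ideal = harmonic_mag(fringe, 16)        # 2nd harmonic of the ideal fringe
h2_distorted = harmonic_mag(distorted, 16) # 2nd harmonic after gamma
```

    The ideal fringe has essentially no energy at the second harmonic, while the gamma-distorted fringe does; these gamma-induced harmonics are phase-coupled to the fundamental, which is why a bispectral (phase-coupling) statistic can detect and minimize them.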

  2. Geometric Analyses of Rotational Faults.

    ERIC Educational Resources Information Center

    Schwert, Donald Peters; Peck, Wesley David

    1986-01-01

    Describes the use of analysis of rotational faults in undergraduate structural geology laboratories to provide students with applications of both orthographic and stereographic techniques. A demonstration problem is described, and an orthographic/stereographic solution and a reproducible black model demonstration pattern are provided. (TW)

  3. Holographic analysis as an inspection method for welded thin-wall tubing

    NASA Technical Reports Server (NTRS)

    Brooks, Lawrence; Mulholland, John; Genin, Joseph; Matthews, Larryl

    1990-01-01

    The feasibility of using holographic interferometry for locating flaws in welded tubing is explored. Two holographic techniques are considered: traditional holographic interferometry and electronic speckle pattern interferometry. Several flaws including cold laps, discontinuities, and tube misalignments are detected.

  4. Reflected scatterometry for noninvasive interrogation of bacterial colonies

    USDA-ARS?s Scientific Manuscript database

    A phenotyping of bacterial colonies on agar plates using forward-scattering diffraction-pattern analysis provided promising classification of several different bacteria such as Salmonella, Vibrio, Listeria, and E. coli. Since the technique is based on forward-scattering phenomena, light transmittanc...

  5. Machine learning methods reveal the temporal pattern of dengue incidence using meteorological factors in metropolitan Manila, Philippines.

    PubMed

    Carvajal, Thaddeus M; Viacrusis, Katherine M; Hernandez, Lara Fides T; Ho, Howell T; Amalin, Divina M; Watanabe, Kozo

    2018-04-17

    Several studies have applied ecological factors such as meteorological variables to develop models that accurately predict the temporal pattern of dengue incidence or occurrence. Across the many studies that have investigated this premise, the modeling approaches differ, and each typically uses only a single statistical technique. This raises the question of which technique is robust and reliable. Hence, our study aims to compare the predictive accuracy, for the temporal pattern of dengue incidence in Metropolitan Manila as influenced by meteorological factors, of four modeling techniques: (a) generalized additive modeling, (b) seasonal autoregressive integrated moving average with exogenous variables, (c) random forest, and (d) gradient boosting. Dengue incidence and meteorological data (flood, precipitation, temperature, southern oscillation index, relative humidity, wind speed and direction) of Metropolitan Manila from January 1, 2009 to December 31, 2013 were obtained from the respective government agencies. Two types of datasets were used in the analysis: observed meteorological factors (MF) and their corresponding delayed or lagged effects (LG). These datasets were then subjected to the four modeling techniques. The predictive accuracy and variable importance of each modeling technique were calculated and evaluated. Among the statistical modeling techniques, random forest showed the best predictive accuracy. Moreover, the delayed or lagged effects of the meteorological variables yielded the best dataset for this purpose. Thus, the random forest model with delayed meteorological effects (RF-LG) was deemed the best among all assessed models. Relative humidity was the most important meteorological factor in the best model. The study showed that the statistical modeling techniques indeed generate different predictive outcomes, and it further revealed the random forest model with delayed meteorological effects to be the best at predicting the temporal pattern of dengue incidence in Metropolitan Manila. It is also noteworthy that the study identified relative humidity, along with rainfall and temperature, as meteorological factors that can influence this temporal pattern.
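
    The lagged-dataset construction (the LG dataset) can be sketched as follows. The synthetic incidence series, variable set, and lag of three steps are invented for illustration, and ordinary least squares stands in for the paper's random forest so the example stays dependency-free:

```python
import numpy as np

def make_lagged(X, y, lags):
    """Design matrix of meteorological variables delayed by 1..lags steps, so
    incidence at time t is predicted from the preceding `lags` observations."""
    rows = [X[t - lags:t].ravel() for t in range(lags, len(y))]
    return np.array(rows), y[lags:]

rng = np.random.default_rng(5)
T = 300
humidity = rng.uniform(60, 100, T)
rain = rng.uniform(0, 50, T)
# Synthetic incidence driven by humidity and rain two steps earlier.
incidence = np.zeros(T)
incidence[2:] = 0.5 * humidity[:-2] + 0.1 * rain[:-2] + rng.normal(0, 1, T - 2)

X = np.column_stack([humidity, rain])
Xl, yl = make_lagged(X, incidence, lags=3)
# Simplified stand-in for the paper's random forest: ordinary least squares.
A = np.column_stack([np.ones(len(Xl)), Xl])
coef, *_ = np.linalg.lstsq(A, yl, rcond=None)
r2 = 1 - np.sum((yl - A @ coef) ** 2) / np.sum((yl - yl.mean()) ** 2)
```

    Because the true dependence is lagged, a model fed only same-time variables would miss it; including the lag window recovers the relationship, which mirrors why the LG dataset outperformed the MF dataset.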

  6. Combined point and distributed techniques for multidimensional estimation of spatial groundwater-stream water exchange in a heterogeneous sand bed-stream.

    NASA Astrophysics Data System (ADS)

    Gaona Garcia, J.; Lewandowski, J.; Bellin, A.

    2017-12-01

    Groundwater-stream water interactions in rivers determine water balances, as well as chemical and biological processes in the streambed, at different spatial and temporal scales. Because gaining, neutral, and losing conditions are difficult to identify and quantify, it is necessary to combine techniques with complementary capabilities and scale ranges. We applied this concept to a study site at the River Schlaube, East Brandenburg, Germany, a sand-bed stream with intense sediment heterogeneity and complex environmental conditions. In our approach, point techniques such as temperature profiles of the streambed, together with vertical hydraulic gradients, provide data for the estimation of fluxes between groundwater and surface water with the numerical model 1DTempPro. Among the distributed techniques, fiber-optic distributed temperature sensing identifies the spatial patterns of neutral, downwelling, and upwelling areas through analysis of changes in the thermal patterns at the streambed interface under given flow conditions. The study finally links point and surface temperatures to provide a method for upscaling of fluxes. Point techniques provide point flux estimates with the depth detail essential to infer streambed structures, while their results hardly represent the spatial distribution of fluxes caused by the heterogeneity of streambed properties. Fiber optics proved capable of providing spatial thermal patterns with enough resolution to observe distinct hyporheic thermal footprints at multiple scales. Relating the thermal footprint patterns and their temporal behavior to flux results from point techniques enabled the use of methods for spatial flux estimates. The lack of detailed information on the spatial distribution of the physical drivers restricts the spatial flux estimation to the application of the T-proxy method, whose highly uncertain results mainly provide coarse spatial flux estimates. The study concludes that the upscaling of groundwater-stream water interactions using thermal measurements with combined point and distributed techniques requires the integration of physical drivers because of the heterogeneity of the flux patterns. Combined experimental and modeling approaches may help to obtain a more reliable understanding of groundwater-surface water interactions at multiple scales.

  7. 3D capillary stop valves for versatile patterning inside microfluidic chips.

    PubMed

    Papadimitriou, V A; Segerink, L I; van den Berg, A; Eijkel, J C T

    2018-02-13

    The patterning of antibodies in microfluidic chips is a delicate process that is usually done in an open chip before bonding. Typical bonding techniques such as plasma treatment can harm the antibodies, with the result that these techniques are removed from the fabrication toolbox. Here we propose a method, based on capillary phenomena using 3D capillary valves, that autonomously and conveniently allows us to pattern liquids inside closed chips. We theoretically analyse the system and demonstrate how our analysis can be used as a design tool for various applications. Chips patterned with the method were used for simple immunodetection of a cardiac biomarker, which demonstrates its suitability for antibody patterning. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Pattern Activity Clustering and Evaluation (PACE)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Banas, Christopher; Paul, Michael; Bussjager, Becky; Seetharaman, Guna

    2012-06-01

    With the vast amount of network information available on the activities of people (i.e. motions, transportation routes, and site visits), there is a need to explore the salient properties of data that detect and discriminate the behavior of individuals. Recent machine learning approaches include methods of data mining, statistical analysis, clustering, and estimation that support activity-based intelligence. We seek to explore contemporary methods in activity analysis using machine learning techniques that discover and characterize behaviors, enabling grouping, anomaly detection, and adversarial intent prediction. To evaluate these methods, we describe the mathematics and potential information theory metrics used to characterize behavior. A scenario is presented to demonstrate the concept and metrics that could be useful for layered sensing behavior pattern learning and analysis. We leverage work on group tracking, learning, and clustering approaches, and utilize information-theoretic metrics for classification, behavioral and event pattern recognition, and activity and entity analysis. The performance evaluation of activity analysis supports high-level information fusion of user alerts, data queries and sensor management for data extraction, relations discovery, and situation analysis of existing data.

  9. Statistical Exploration of Electronic Structure of Molecules from Quantum Monte-Carlo Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabhat, Mr; Zubarev, Dmitry; Lester, Jr., William A.

    In this report, we present results from analysis of Quantum Monte Carlo (QMC) simulation data with the goal of determining internal structure of a 3N-dimensional phase space of an N-electron molecule. We are interested in mining the simulation data for patterns that might be indicative of the bond rearrangement as molecules change electronic states. We examined simulation output that tracks the positions of two coupled electrons in the singlet and triplet states of an H2 molecule. The electrons trace out a trajectory, which was analyzed with a number of statistical techniques. This project was intended to address the following scientificmore » questions: (1) Do high-dimensional phase spaces characterizing electronic structure of molecules tend to cluster in any natural way? Do we see a change in clustering patterns as we explore different electronic states of the same molecule? (2) Since it is hard to understand the high-dimensional space of trajectories, can we project these trajectories to a lower dimensional subspace to gain a better understanding of patterns? (3) Do trajectories inherently lie in a lower-dimensional manifold? Can we recover that manifold? After extensive statistical analysis, we are now in a better position to respond to these questions. (1) We definitely see clustering patterns, and differences between the H2 and H2tri datasets. These are revealed by the pamk method in a fairly reliable manner and can potentially be used to distinguish bonded and non-bonded systems and get insight into the nature of bonding. (2) Projecting to a lower dimensional subspace ({approx}4-5) using PCA or Kernel PCA reveals interesting patterns in the distribution of scalar values, which can be related to the existing descriptors of electronic structure of molecules. 
Also, these results can be immediately used to develop robust tools for analysis of noisy data obtained during QMC simulations. (3) All dimensionality reduction and estimation techniques that we tried seem to indicate that one needs 4 or 5 components to account for most of the variance in the data, hence this 5D dataset does not necessarily lie on a well-defined, low-dimensional manifold. In terms of specific clustering techniques, K-means was generally useful in exploring the dataset. The partition around medoids (pam) technique produced the most definitive results for our data, showing distinctive patterns for both a sample of the complete data and time-series. The gap statistic with the Tibshirani criterion did not provide any distinction across the two datasets. The gap statistic with the DandF criteria, model-based clustering, and hierarchical modeling simply failed to run on our datasets. Thankfully, the vanilla PCA technique was successful in handling our entire dataset. PCA revealed some interesting patterns for the scalar value distribution. Kernel PCA techniques (vanilladot, RBF, polynomial) and MDS failed to run on the entire dataset, or even a significant fraction of the dataset, and we resorted to creating an explicit feature map followed by conventional PCA. Clustering using K-means and PAM in the new basis set seems to produce promising results. Understanding the new basis set in the scientific context of the problem is challenging, and we are currently working to further examine and interpret the results.
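The pipeline described above, project with PCA and then cluster in the reduced space, can be illustrated with a small self-contained example. The data here are synthetic stand-ins for QMC walker coordinates, and the plain k-means below is a deliberate simplification of the pam/k-medoids methods the report actually used:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for QMC walker snapshots: two clusters in 6-D
# (the real data would be 3N-dimensional electron coordinates).
a = rng.normal(0.0, 0.3, size=(200, 6)) + np.array([1, 0, 0, 1, 0, 0])
b = rng.normal(0.0, 0.3, size=(200, 6)) - np.array([1, 0, 0, 1, 0, 0])
X = np.vstack([a, b])

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance ratio per component
scores = Xc @ Vt[:2].T                   # project onto the first 2 PCs

# Minimal k-means (k=2) on the projected scores, seeded with one point
# from each end of the dataset.
centers = scores[[0, -1]].copy()
for _ in range(20):
    d = np.linalg.norm(scores[:, None, :] - centers[None], axis=2)
    labels = d.argmin(axis=1)
    centers = np.array([scores[labels == k].mean(axis=0) for k in (0, 1)])
```

On data this well separated the first principal component carries most of the variance and the two synthetic clusters are recovered almost perfectly.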

  10. Automatic two- and three-dimensional mesh generation based on fuzzy knowledge processing

    NASA Astrophysics Data System (ADS)

    Yagawa, G.; Yoshimura, S.; Soneda, N.; Nakao, K.

    1992-09-01

This paper describes the development of a novel automatic FEM mesh generation algorithm based on the fuzzy knowledge processing technique. A number of local nodal patterns are stored in a nodal pattern database of the mesh generation system. These nodal patterns are determined a priori based on certain theories or the past experience of experts in FEM analyses. For example, such human experts can determine certain nodal patterns suitable for stress concentration analyses of cracks, corners, holes and so on. Each nodal pattern possesses a membership function and a procedure of node placement according to this function. In the case of the nodal patterns for stress concentration regions, the membership function utilized in the fuzzy knowledge processing has two meanings, i.e. the “closeness” of nodal location to each stress concentration field as well as “nodal density”. This is attributed to the fact that a denser nodal pattern is required near a stress concentration field. All a user has to do in a practical mesh generation process is to choose several local nodal patterns and to designate the maximum nodal density of each pattern. After those simple operations by the user, the system places the chosen nodal patterns automatically in an analysis domain and on its boundary, and connects them smoothly by the fuzzy knowledge processing technique. Then triangular or tetrahedral elements are generated by means of the advancing front method. The key feature of the present algorithm is easy control of complex two- or three-dimensional nodal density distributions by means of the fuzzy knowledge processing technique. To demonstrate the fundamental performance of the present algorithm, a prototype system was constructed in an object-oriented language, Smalltalk-80, on a 32-bit microcomputer, the Macintosh II. Mesh generation for several two- and three-dimensional domains with cracks, holes and junctions was presented as examples.
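The dual role of the membership function can be sketched as follows. The paper's actual membership functions come from expert knowledge; the Gaussian decay, the influence radius, and the spacing bounds below are purely illustrative assumptions:

```python
import math

def membership(dist_to_crack, influence_radius=1.0):
    """Hypothetical fuzzy membership: 1 at the crack tip, decaying with
    distance. Read both as 'closeness' to the stress concentration and
    as target nodal density, so nodes pack densely near the singularity."""
    return math.exp(-(dist_to_crack / influence_radius) ** 2)

def local_spacing(dist_to_crack, h_min=0.05, h_max=0.5):
    """Convert membership (nodal density) into a target element size:
    full membership gives the finest spacing h_min, zero membership
    the coarsest spacing h_max."""
    m = membership(dist_to_crack)
    return h_max - (h_max - h_min) * m
```

A node placement routine would then walk the domain and emit nodes at the spacing returned by `local_spacing`, which shrinks smoothly toward the crack tip.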

  11. Analysis of Vertebral Bone Strength, Fracture Pattern, and Fracture Location: A Validation Study Using a Computed Tomography-Based Nonlinear Finite Element Analysis

    PubMed Central

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is an advanced computer technique of structural stress analysis developed in engineering mechanics. Because the compressive behavior of vertebral bone shows nonlinear behavior, a nonlinear FEA should be utilized to analyze the clinical vertebral fracture. In this article, a computed tomography-based nonlinear FEA (CT/FEA) to analyze the vertebral bone strength, fracture pattern, and fracture location is introduced. The accuracy of the CT/FEA was validated by performing experimental mechanical testing with human cadaveric specimens. Vertebral bone strength and the minimum principal strain at the vertebral surface were accurately analyzed using the CT/FEA. The experimental fracture pattern and fracture location were also accurately simulated. Optimization of the element size was performed by assessing the accuracy of the CT/FEA, and the optimum element size was assumed to be 2 mm. It is expected that the CT/FEA will be valuable in analyzing vertebral fracture risk and assessing therapeutic effects on osteoporosis. PMID:26029476

  12. Measurement of residual stresses by the moire method

    NASA Astrophysics Data System (ADS)

    Sciammarella, C. A.; Albertazzi, A., Jr.

Three different applications of the moire method to the determination of residual stresses and strains are presented. The three applications take advantage of the property of gratings to record the changes of the surface on which they are printed. One of the applications deals with thermal residual stresses, another with contact residual stress, and the third is a generalization of the blind-hole technique. This last application is based on a computer-assisted moire technique and on a generalization of the quasi-heterodyne techniques of fringe pattern analysis.
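Quasi-heterodyne fringe analysis typically recovers phase from several phase-shifted intensity frames. A minimal sketch of the standard four-step formula for a single point follows; it is the textbook variant, not necessarily the exact algorithm the authors used:

```python
import math

def four_step_phase(I1, I2, I3, I4):
    """Standard four-step phase-shifting formula. With frames shifted by
    0, pi/2, pi and 3*pi/2, the wrapped phase is
    phi = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(I4 - I2, I1 - I3)

def frame(phi, shift, bias=1.0, mod=0.5):
    """Simulated fringe intensity I = a + b*cos(phi + shift)."""
    return bias + mod * math.cos(phi + shift)
```

Because the formula uses differences of frames, the unknown bias and modulation cancel, which is what makes the technique robust for fringe pattern analysis.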

  13. Robust reconstruction of time-resolved diffraction from ultrafast streak cameras

    PubMed Central

    Badali, Daniel S.; Dwayne Miller, R. J.

    2017-01-01

    In conjunction with ultrafast diffraction, streak cameras offer an unprecedented opportunity for recording an entire molecular movie with a single probe pulse. This is an attractive alternative to conventional pump-probe experiments and opens the door to studying irreversible dynamics. However, due to the “smearing” of the diffraction pattern across the detector, the streaking technique has thus far been limited to simple mono-crystalline samples and extreme care has been taken to avoid overlapping diffraction spots. In this article, this limitation is addressed by developing a general theory of streaking of time-dependent diffraction patterns. Understanding the underlying physics of this process leads to the development of an algorithm based on Bayesian analysis to reconstruct the time evolution of the two-dimensional diffraction pattern from a single streaked image. It is demonstrated that this approach works on diffraction peaks that overlap when streaked, which not only removes the necessity of carefully choosing the streaking direction but also extends the streaking technique to be able to study polycrystalline samples and materials with complex crystalline structures. Furthermore, it is shown that the conventional analysis of streaked diffraction can lead to erroneous interpretations of the data. PMID:28653022

  14. An Optimization of Inventory Demand Forecasting in University Healthcare Centre

    NASA Astrophysics Data System (ADS)

    Bon, A. T.; Ng, T. K.

    2017-01-01

The healthcare industry is an important field today because it concerns people's health, and forecasting demand for health services is an important step in managerial decision making for all healthcare organizations. A case study was therefore conducted in a University Health Centre to collect historical demand data for Panadol 650 mg over 68 months, from January 2009 until August 2014. The aim of the research is to optimize overall inventory demand through forecasting techniques. Quantitative, or time series, forecasting models were used in the case study to forecast future data as a function of past data. The data pattern must be identified before applying the forecasting techniques; here the data exhibit a trend, and ten forecasting techniques were then applied using the Risk Simulator software: single moving average, single exponential smoothing, double moving average, double exponential smoothing, regression, Holt-Winters additive, seasonal additive, Holt-Winters multiplicative, seasonal multiplicative, and the autoregressive integrated moving average (ARIMA). The best technique is the one with the least forecasting error, and according to the forecasting accuracy measurement the best forecasting technique is regression analysis.
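On a trending series a simple moving average lags behind while a regression on time does not, which is consistent with regression winning on the accuracy measure. A stdlib sketch comparing two of the one-step-ahead forecasters (the study used Risk Simulator and ten methods; these two are illustrative only):

```python
def one_step_errors(actuals, forecasts):
    """Mean absolute error of one-step-ahead forecasts."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(forecasts)

def sma_forecasts(series, window=3):
    # Forecast for time t is the mean of the previous `window` points.
    return [sum(series[t - window:t]) / window
            for t in range(window, len(series))]

def regression_forecasts(series, window=3):
    # Ordinary least squares on all (t, y) pairs seen so far,
    # extrapolated one step ahead.
    out = []
    for t in range(window, len(series)):
        xs, ys = list(range(t)), series[:t]
        xbar, ybar = sum(xs) / t, sum(ys) / t
        slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
                sum((x - xbar) ** 2 for x in xs)
        out.append(ybar + slope * (t - xbar))
    return out
```

On a perfectly linear (trending) series the moving average trails the data by a constant offset, while the regression forecast tracks it exactly.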

  15. Classification of Antibiotic Resistance Patterns of Indicator Bacteria by Discriminant Analysis: Use in Predicting the Source of Fecal Contamination in Subtropical Waters

    PubMed Central

    Harwood, Valerie J.; Whitlock, John; Withington, Victoria

    2000-01-01

    The antibiotic resistance patterns of fecal streptococci and fecal coliforms isolated from domestic wastewater and animal feces were determined using a battery of antibiotics (amoxicillin, ampicillin, cephalothin, chlortetracycline, oxytetracycline, tetracycline, erythromycin, streptomycin, and vancomycin) at four concentrations each. The sources of animal feces included wild birds, cattle, chickens, dogs, pigs, and raccoons. Antibiotic resistance patterns of fecal streptococci and fecal coliforms from known sources were grouped into two separate databases, and discriminant analysis of these patterns was used to establish the relationship between the antibiotic resistance patterns and the bacterial source. The fecal streptococcus and fecal coliform databases classified isolates from known sources with similar accuracies. The average rate of correct classification for the fecal streptococcus database was 62.3%, and that for the fecal coliform database was 63.9%. The sources of fecal streptococci and fecal coliforms isolated from surface waters were identified by discriminant analysis of their antibiotic resistance patterns. Both databases identified the source of indicator bacteria isolated from surface waters directly impacted by septic tank discharges as human. At sample sites selected for relatively low anthropogenic impact, the dominant sources of indicator bacteria were identified as various animals. The antibiotic resistance analysis technique promises to be a useful tool in assessing sources of fecal contamination in subtropical waters, such as those in Florida. PMID:10966379

  16. Data handling and analysis for the 1971 corn blight watch experiment.

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.; Phillips, T. L.; Landgrebe, D. A.

    1972-01-01

    Review of the data handling and analysis methods used in the near-operational test of remote sensing systems provided by the 1971 corn blight watch experiment. The general data analysis techniques and, particularly, the statistical multispectral pattern recognition methods for automatic computer analysis of aircraft scanner data are described. Some of the results obtained are examined, and the implications of the experiment for future data communication requirements of earth resource survey systems are discussed.

  17. Information analysis of a spatial database for ecological land classification

    NASA Technical Reports Server (NTRS)

    Davis, Frank W.; Dozier, Jeff

    1990-01-01

    An ecological land classification was developed for a complex region in southern California using geographic information system techniques of map overlay and contingency table analysis. Land classes were identified by mutual information analysis of vegetation pattern in relation to other mapped environmental variables. The analysis was weakened by map errors, especially errors in the digital elevation data. Nevertheless, the resulting land classification was ecologically reasonable and performed well when tested with higher quality data from the region.
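Mutual information between a mapped vegetation pattern and an environmental variable can be computed directly from their contingency table. A stdlib sketch (the actual analysis involved GIS map overlay; here the table is simply joint cell counts):

```python
import math

def mutual_information(table):
    """Mutual information (in bits) between the row variable (e.g. a
    vegetation class) and the column variable (e.g. an environmental
    stratum), computed from a contingency table of joint counts."""
    total = sum(sum(row) for row in table)
    rows = [sum(row) / total for row in table]
    cols = [sum(col) / total for col in zip(*table)]
    mi = 0.0
    for i, row in enumerate(table):
        for j, n in enumerate(row):
            if n:
                p = n / total
                mi += p * math.log2(p / (rows[i] * cols[j]))
    return mi
```

Perfect association between the two variables yields 1 bit for a balanced 2x2 table, and statistical independence yields 0, so the statistic ranks candidate environmental variables by how much vegetation pattern they explain.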

  18. Microhabitat and Environmental Relationships of Bryophytes in Blue Oak (Quercus douglasii H. & A.) Woodlands and Forests of Central Coastal California

    Treesearch

    Mark Borchert; Daniel Norris

    1991-01-01

    Microhabitat preferences and species-environment patterns were quantified for bryophytes in blue oak woodlands and forests of central coastal California. Presence data for mosses collected from 149 400 m2 plots were analyzed using canonical correspondence analysis (CCA), a multivariate direct gradient analysis technique. Separate ordinations were performed for...

  19. Atmospheric solids analysis probe mass spectrometry for the rapid identification of pollens and semi-quantification of flavonoid fingerprints.

    PubMed

    Xiao, Xiaoyin; Miller, Lance L; Parchert, Kylea J; Hayes, Dulce; Hochrein, James M

    2016-07-15

From allergies to plant reproduction, pollens have important impacts on the health of human and plant populations, yet identification of pollen grains remains difficult and time-consuming. Low-volatility flavonoids generated from pollens cannot be easily characterized and quantified with current analytical techniques. Here we present the novel use of atmospheric solids analysis probe mass spectrometry (ASAP-MS) for the characterization of flavonoids in pollens. Flavonoid patterns were generated for pollens collected from different plant types (trees and bushes) in addition to bee pollens from distinct geographic regions. Standard flavonoids (kaempferol and rhamnazin) and those produced from pollens were compared and assessed with ASAP-MS using low-energy collision MS/MS. Results for a semi-quantitative method for assessing the amount of a flavonoid in pollens are also presented. Flavonoid patterns for pollen samples were distinct with variability in the number and relative abundance of flavonoids in each sample. Pollens contained 2-5 flavonoids, and all but Kochia scoparia contained kaempferol or kaempferol isomers. We establish this method as a reliable and applicable technique for analyzing low-volatility compounds with minimal sample preparation. Standard curves were generated using 0.2-5 μg of kaempferol; from these experiments, it was estimated that there is approximately 2 mg of kaempferol present in 1 g of P. nigra italica pollen. Pollens can be characterized with a simple flavonoid pattern rather than analyzing the whole product pattern or the products-temperature profiles. ASAP-MS is a rapid analytical technique that can be used to distinguish between plant pollens and between bee pollens originating from different regions. Copyright © 2016 John Wiley & Sons, Ltd.
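Semi-quantification from a standard curve amounts to fitting a calibration line to the standards and inverting it for an unknown. The kaempferol amounts below match the stated 0.2-5 μg range, but the detector responses are invented, perfectly linear numbers for illustration:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for a standard curve."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return slope, ybar - slope * xbar

def quantify(signal, slope, intercept):
    """Invert the calibration line: amount = (signal - intercept) / slope."""
    return (signal - intercept) / slope

# Hypothetical kaempferol standards (amount in micrograms -> detector
# response); the responses are illustrative, not measured values.
amounts = [0.2, 1.0, 2.0, 5.0]
signals = [410.0, 2050.0, 4100.0, 10250.0]
m, c = fit_line(amounts, signals)
```

An unknown sample's response is then converted to an amount with `quantify`, exactly as the pollen kaempferol content was estimated from the standard curve.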

  20. Atlas-guided prostate intensity modulated radiation therapy (IMRT) planning.

    PubMed

    Sheng, Yang; Li, Taoran; Zhang, You; Lee, W Robert; Yin, Fang-Fang; Ge, Yaorong; Wu, Q Jackie

    2015-09-21

An atlas-based IMRT planning technique for prostate cancer was developed and evaluated. A multi-dose atlas was built based on the anatomy patterns of the patients, more specifically, the percent distance to the prostate and the concaveness angle formed by the seminal vesicles relative to the anterior-posterior axis. A 70-case dataset was classified using a k-medoids clustering analysis to recognize anatomy pattern variations in the dataset. The best classification, defined by the number of classes or medoids, was determined by the largest value of the average silhouette width. Reference plans from each class formed a multi-dose atlas. The atlas-guided planning (AGP) technique started with matching the new case anatomy pattern to one of the reference cases in the atlas; then a deformable registration between the atlas and new case anatomies transferred the dose from the atlas to the new case to guide inverse planning with full automation. 20 additional clinical cases were re-planned to evaluate the AGP technique, and dosimetric properties of the AGP and clinical plans were compared. The classification analysis determined that a 5-case atlas would best represent anatomy patterns for the patient cohort. AGP took approximately 1 min on average (corresponding to 70 iterations of optimization) for all cases. When dosimetric parameters were compared, the differences between AGP and clinical plans were less than 3.5%, albeit with some statistically significant differences observed: homogeneity index (p > 0.05), conformity index (p < 0.01), bladder gEUD (p < 0.01), and rectum gEUD (p = 0.02). Atlas-guided treatment planning is feasible and efficient. Atlas-predicted dose can effectively guide the optimizer to achieve plan quality comparable to that of clinical plans.
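The "largest average silhouette width" criterion used to pick the number of classes can be sketched as follows, stdlib only; the real analysis ran k-medoids on anatomical pattern features, whereas here the points and labels are supplied directly:

```python
def silhouette(points, labels):
    """Average silhouette width: for each point, s = (b - a) / max(a, b),
    where a is its mean distance to points in the same cluster and b the
    smallest mean distance to any other cluster. Larger average values
    indicate a better classification. Assumes every cluster has at least
    two members."""
    def dist(p, q):
        return sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5

    widths = []
    for i, p in enumerate(points):
        same = [dist(p, q) for j, q in enumerate(points)
                if j != i and labels[j] == labels[i]]
        a = sum(same) / len(same)
        b = min(sum(dist(p, q) for j, q in enumerate(points)
                    if labels[j] == lab) /
                sum(1 for j in range(len(points)) if labels[j] == lab)
                for lab in set(labels) if lab != labels[i])
        widths.append((b - a) / max(a, b))
    return sum(widths) / len(widths)
```

Running this for each candidate number of medoids and keeping the labeling with the largest average width reproduces the model-selection step described above.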

  1. A tool for classifying individuals with chronic back pain: using multivariate pattern analysis with functional magnetic resonance imaging data.

    PubMed

    Callan, Daniel; Mills, Lloyd; Nott, Connie; England, Robert; England, Shaun

    2014-01-01

    Chronic pain is one of the most prevalent health problems in the world today, yet neurological markers, critical to diagnosis of chronic pain, are still largely unknown. The ability to objectively identify individuals with chronic pain using functional magnetic resonance imaging (fMRI) data is important for the advancement of diagnosis, treatment, and theoretical knowledge of brain processes associated with chronic pain. The purpose of our research is to investigate specific neurological markers that could be used to diagnose individuals experiencing chronic pain by using multivariate pattern analysis with fMRI data. We hypothesize that individuals with chronic pain have different patterns of brain activity in response to induced pain. This pattern can be used to classify the presence or absence of chronic pain. The fMRI experiment consisted of alternating 14 seconds of painful electric stimulation (applied to the lower back) with 14 seconds of rest. We analyzed contrast fMRI images in stimulation versus rest in pain-related brain regions to distinguish between the groups of participants: 1) chronic pain and 2) normal controls. We employed supervised machine learning techniques, specifically sparse logistic regression, to train a classifier based on these contrast images using a leave-one-out cross-validation procedure. We correctly classified 92.3% of the chronic pain group (N = 13) and 92.3% of the normal control group (N = 13) by recognizing multivariate patterns of activity in the somatosensory and inferior parietal cortex. This technique demonstrates that differences in the pattern of brain activity to induced pain can be used as a neurological marker to distinguish between individuals with and without chronic pain. Medical, legal and business professionals have recognized the importance of this research topic and of developing objective measures of chronic pain. 
This method of data analysis was very successful in correctly classifying each of the two groups.
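The leave-one-out cross-validation loop is generic and can be sketched independently of the classifier. Below, a toy threshold classifier stands in for the sparse logistic regression used in the study, and the four two-feature samples are invented:

```python
def loocv_accuracy(X, y, fit, predict):
    """Leave-one-out cross-validation: train on all samples but one,
    test on the held-out sample, and average over every fold."""
    hits = 0
    for i in range(len(X)):
        Xtr = X[:i] + X[i + 1:]
        ytr = y[:i] + y[i + 1:]
        model = fit(Xtr, ytr)
        hits += predict(model, X[i]) == y[i]
    return hits / len(X)

# Toy stand-in classifier: threshold halfway between the class means
# of the per-sample mean feature value.
def fit_mean(X, y):
    m0 = [sum(x) / len(x) for x, lab in zip(X, y) if lab == 0]
    m1 = [sum(x) / len(x) for x, lab in zip(X, y) if lab == 1]
    return (sum(m0) / len(m0) + sum(m1) / len(m1)) / 2

def predict_mean(threshold, x):
    return int(sum(x) / len(x) > threshold)
```

Because every fold tests on a sample the model never saw, the averaged accuracy is an honest estimate, which is what gives the 92.3% figure its weight.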

  3. Graphic design of pinhole cameras

    NASA Technical Reports Server (NTRS)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.

  4. Use of Amplified Fragment Length Polymorphisms for Typing Corynebacterium diphtheriae

    PubMed Central

    De Zoysa, Aruni; Efstratiou, Androulla

    2000-01-01

    Amplified fragment length polymorphism (AFLP) was investigated for the differentiation of Corynebacterium diphtheriae isolates. Analysis using Taxotron revealed 10 distinct AFLP profiles among 57 isolates. Strains with ribotype patterns D1, D4, and D12 could not be distinguished; however, the technique discriminated isolates of ribotype patterns D3, D6, and D7 further. AFLP was rapid, fairly inexpensive, and reproducible and could be used as an alternative to ribotyping. PMID:11015416

  5. Identification of superficial defects in reconstructed 3D objects using phase-shifting fringe projection

    NASA Astrophysics Data System (ADS)

    Madrigal, Carlos A.; Restrepo, Alejandro; Branch, John W.

    2016-09-01

3D reconstruction of small objects is used in applications of surface analysis, forensic analysis and tissue reconstruction in medicine. In this paper, we propose a strategy for the 3D reconstruction of small objects and the identification of some superficial defects. We applied a technique of projection of structured light patterns, specifically sinusoidal fringes, together with a phase unwrapping algorithm. A CMOS camera was used to capture images and a DLP digital light projector for synchronous projection of the sinusoidal pattern onto the objects. We implemented a calibration process based on a 2D flat pattern, from which the intrinsic and extrinsic parameters of the camera and the DLP were determined. Experimental tests were performed on samples of artificial teeth, coal particles, welding defects and surfaces marked by Vickers indentation. Areas of less than 5 cm were studied. The objects were reconstructed in 3D with densities of about one million points per sample. In addition, the steps of 3D description, identification of primitives, training and classification were implemented to recognize defects such as holes, cracks, rough textures and bumps. We found that pattern recognition strategies are useful for quality supervision of surfaces when enough points are available to evaluate the defective region, because identifying defects in small objects is a demanding visual inspection task.

  6. Inhomogeneity Based Characterization of Distribution Patterns on the Plasma Membrane

    PubMed Central

    Paparelli, Laura; Corthout, Nikky; Wakefield, Devin L.; Sannerud, Ragna; Jovanovic-Talisman, Tijana; Annaert, Wim; Munck, Sebastian

    2016-01-01

    Cell surface protein and lipid molecules are organized in various patterns: randomly, along gradients, or clustered when segregated into discrete micro- and nano-domains. Their distribution is tightly coupled to events such as polarization, endocytosis, and intracellular signaling, but challenging to quantify using traditional techniques. Here we present a novel approach to quantify the distribution of plasma membrane proteins and lipids. This approach describes spatial patterns in degrees of inhomogeneity and incorporates an intensity-based correction to analyze images with a wide range of resolutions; we have termed it Quantitative Analysis of the Spatial distributions in Images using Mosaic segmentation and Dual parameter Optimization in Histograms (QuASIMoDOH). We tested its applicability using simulated microscopy images and images acquired by widefield microscopy, total internal reflection microscopy, structured illumination microscopy, and photoactivated localization microscopy. We validated QuASIMoDOH, successfully quantifying the distribution of protein and lipid molecules detected with several labeling techniques, in different cell model systems. We also used this method to characterize the reorganization of cell surface lipids in response to disrupted endosomal trafficking and to detect dynamic changes in the global and local organization of epidermal growth factor receptors across the cell surface. Our findings demonstrate that QuASIMoDOH can be used to assess protein and lipid patterns, quantifying distribution changes and spatial reorganization at the cell surface. An ImageJ/Fiji plugin of this analysis tool is provided. PMID:27603951

  7. Knowledge Discovery and Data Mining in Iran's Climatic Researches

    NASA Astrophysics Data System (ADS)

    Karimi, Mostafa

    2013-04-01

Advances in measurement technology and data collection have made databases ever larger, and large databases require powerful analysis tools. The iterative process of extracting knowledge from processed data takes various forms across all scientific fields, but when data volumes grow large, traditional methods cannot cope with many of the resulting problems. In recent years the use of databases has expanded in many scientific fields, and in climatology the use of atmospheric databases in particular. In addition, the growing volume of data generated by climate models poses a challenge for extracting hidden patterns and knowledge. The approach taken to this problem in recent years uses the process of knowledge discovery and data mining, drawing on concepts from machine learning, artificial intelligence and expert systems. Data mining is an analytical process for mining massive volumes of data; its ultimate goal is access to information and, finally, knowledge. Climatology is a science that uses varied and massive data, and the goal of climate data mining is to derive information from diverse and massive atmospheric and non-atmospheric data; knowledge discovery performs these activities in a logical, predetermined and almost automatic process. The goal of this research is to survey the use of knowledge discovery and data mining techniques in Iranian climate research. To achieve this goal, a content (descriptive) analysis was conducted, classified by method and topic. The results show that in Iranian climate research clustering, k-means and Ward's method are most often applied, and that precipitation and atmospheric circulation patterns are the most commonly treated topics. Although several studies of geographic and climatic problems have used statistical techniques such as clustering and pattern extraction, given the distinct natures of statistics and data mining one cannot yet say that data mining and knowledge discovery techniques are established in Iranian climate studies. It is therefore necessary to apply the KDD approach and DM techniques in climatic studies, in particular to the interpretation of climate modeling results.

  8. Spatio-temporal patterns of Barmah Forest virus disease in Queensland, Australia.

    PubMed

    Naish, Suchithra; Hu, Wenbiao; Mengersen, Kerrie; Tong, Shilu

    2011-01-01

Barmah Forest virus (BFV) disease is a common and wide-spread mosquito-borne disease in Australia. This study investigated the spatio-temporal patterns of BFV disease in Queensland, Australia using geographical information system (GIS) tools and geostatistical analysis. We calculated the incidence rates and standardised incidence rates of BFV disease. Moran's I statistic was used to assess the spatial autocorrelation of BFV incidences. Spatial dynamics of BFV disease were examined using semi-variogram analysis. Interpolation techniques were applied to visualise and display the spatial distribution of BFV disease in statistical local areas (SLAs) throughout Queensland. Mapping of BFV disease by SLAs reveals the presence of substantial spatio-temporal variation over time. Statistically significant differences in BFV incidence rates were identified among age groups (χ² = 7587, df = 7327, p < 0.01). There was a significant positive spatial autocorrelation of BFV incidence for all four periods, with the Moran's I statistic ranging from 0.1506 to 0.2901 (p < 0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across the state. This is the first study to examine spatial and temporal variation in the incidence rates of BFV disease across Queensland using GIS and geostatistics. The BFV transmission varied with age and gender, which may be due to exposure rates or behavioural risk factors. There are differences in the spatio-temporal patterns of BFV disease which may be related to local socio-ecological and environmental factors. These research findings may have implications for BFV disease control and prevention programs in Queensland.
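Moran's I itself is straightforward to compute from regional values and a spatial weights matrix. A stdlib sketch with a four-region toy example follows; the binary adjacency weights are an illustrative assumption, whereas the study worked with SLA-level incidence rates:

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation: `values` holds one number per
    region and weights[i][j] is the spatial weight between regions i and
    j (here, 1 for neighbours and 0 otherwise)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(sum(row) for row in weights)
    return (n / w_sum) * (num / den)
```

Smoothly varying values over neighbouring regions give a positive I, as reported above, while alternating high/low values give a negative one.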

  9. Characterisation of Ductile Prepregs

    NASA Astrophysics Data System (ADS)

    Pinto, F.; White, A.; Meo, M.

    2013-04-01

This study is focused on the analysis of micro-perforated prepregs created from standard, off-the-shelf prepregs modified by a particular laser process to enhance ductility for better formability and drapability. Fibres are shortened through the use of laser cutting in a predetermined pattern intended to maintain alignment, and therefore mechanical properties, yet increase ductility at the working temperature. The increase in ductility allows the product to be more effectively optimised for specific forming techniques. Tensile tests were conducted on several specimens in order to understand the ductility enhancement offered by this process with different micro-perforation patterns over standard prepregs. Furthermore, the effects of forming temperature were also analysed to assess the applicability of this material to hot draping techniques and other heated processes.

  10. Optimization technique for problems with an inequality constraint

    NASA Technical Reports Server (NTRS)

    Russell, K. J.

    1972-01-01

The general technique uses a modified version of an existing method termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used with problems involving a constraint.
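The abstract does not spell out the parallel move strategy itself, but the underlying idea, pattern search plus a device for handling an inequality constraint, can be sketched with a simple penalty variant; this is an assumption for illustration, not the report's actual method:

```python
def pattern_search(f, constraint, x0, step=1.0, tol=1e-6, penalty=1e6):
    """Coordinate pattern search minimizing f(x) subject to g(x) <= 0,
    with violations handled by a large quadratic penalty (a simplified
    stand-in; the report's parallel move strategy is not reproduced)."""
    def cost(x):
        g = constraint(x)
        return f(x) + (penalty * g * g if g > 0 else 0.0)

    x = list(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                if cost(trial) < cost(x):
                    x = trial
                    improved = True
        if not improved:
            step *= 0.5   # no exploratory move helped: refine the mesh
    return x
```

Minimizing (x0 - 3)^2 + (x1 - 2)^2 subject to x0 <= 2.5, for example, drives the search to the boundary point (2.5, 2.0) rather than the unconstrained optimum (3, 2).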

  11. Development and application of the maximum entropy method and other spectral estimation techniques

    NASA Astrophysics Data System (ADS)

    King, W. R.

    1980-09-01

    This summary report is a collection of four separate progress reports prepared under three contracts, all sponsored by the Office of Naval Research in Arlington, Virginia. This report contains the results of investigations into the application of the maximum entropy method (MEM), a high resolution frequency and wavenumber estimation technique. The report also contains a description, provided in the final report section, of two new, stable, high resolution spectral estimation techniques. Many examples of wavenumber spectral patterns for all investigated techniques are included throughout the report. The maximum entropy method is also known as the maximum entropy spectral analysis (MESA) technique, and both names are used in the report. Many MEM wavenumber spectral patterns are demonstrated using both simulated and measured radar signal and noise data. Methods for obtaining stable MEM wavenumber spectra are discussed, broadband signal detection using the MEM prediction error transform (PET) is discussed, and Doppler radar narrowband signal detection is demonstrated using the MEM technique. It is also shown that MEM cannot be applied to randomly sampled data. The two new, stable, high resolution spectral estimation techniques discussed in the final report section are named the Wiener-King and the Fourier spectral estimation techniques. The two new techniques have a similar derivation based upon the Wiener prediction filter, but are otherwise quite different. Further development of the techniques and measurement of their spectral characteristics are recommended for subsequent investigation.
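
    Burg's recursion is the standard route to MEM spectra. A minimal sketch, assuming the classical algorithm rather than any report-specific variant:

```python
import cmath, math

def burg_ar(x, order):
    """Burg's maximum entropy method: fit AR coefficients a[0..p]
    (a[0] = 1) by minimising forward plus backward prediction error."""
    n = len(x)
    f = list(x)            # forward prediction errors
    b = list(x)            # backward prediction errors
    a = [1.0]
    e = sum(v * v for v in x) / n
    for m in range(1, order + 1):
        num = sum(f[i] * b[i - 1] for i in range(m, n))
        den = sum(f[i] ** 2 + b[i - 1] ** 2 for i in range(m, n))
        k = -2.0 * num / den                       # reflection coefficient
        a = [1.0] + [a[i] + k * a[m - i] for i in range(1, m)] + [k]
        for i in range(n - 1, m - 1, -1):          # update error sequences
            fi = f[i]
            f[i] = fi + k * b[i - 1]
            b[i] = b[i - 1] + k * fi
        e *= 1.0 - k * k                           # residual power
    return a, e

def mem_psd(a, e, freq):
    """MEM spectrum P(f) = e / |sum_k a_k exp(-2*pi*i*f*k)|^2."""
    h = sum(a[k] * cmath.exp(-2j * math.pi * freq * k) for k in range(len(a)))
    return e / abs(h) ** 2

signal = [math.cos(2 * math.pi * 0.2 * t) for t in range(128)]
a, e = burg_ar(signal, 2)
freqs = [i / 1000 for i in range(1, 500)]
peak = max(freqs, key=lambda fr: mem_psd(a, e, fr))
print(peak)  # sharp spectral line near the sinusoid frequency 0.2
```

For a pure sinusoid the MEM spectrum shows a single sharp line at the true frequency, which illustrates the high resolution claimed for the technique.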

  12. Spatial patterns of heavy metals in soil under different geological structures and land uses for assessing metal enrichments.

    PubMed

    Krami, Loghman Khoda; Amiri, Fazel; Sefiyanian, Alireza; Shariff, Abdul Rashid B Mohamed; Tabatabaie, Tayebeh; Pradhan, Biswajeet

    2013-12-01

    One hundred and thirty composite soil samples were collected from Hamedan county, Iran to characterize the spatial distribution and trace the sources of heavy metals including As, Cd, Co, Cr, Cu, Ni, Pb, V, Zn, and Fe. Multivariate and gap statistical analyses were used; to interrelate the spatial patterns of pollution, the disjunctive kriging and geoenrichment factor (EF(G)) techniques were applied. Heavy metals and soil properties were grouped using agglomerative hierarchical clustering and the gap statistic. Principal component analysis was used to identify the sources of the metals in the data set. Geostatistics was used for the geospatial data processing. Based on the comparison between the original data and background values of the ten metals, the disjunctive kriging and EF(G) techniques were used to quantify their geospatial patterns and assess the contamination levels of the heavy metals. The spatial distribution map combined with the statistical analysis showed that the main source of Cr, Co, Ni, Zn, Pb, and V in group A land use (agriculture, rocky, and urban) was geogenic; the origin of As, Cd, and Cu was industrial and agricultural activities (anthropogenic sources). In group B land use (rangeland and orchards), the origin of the metals Cr, Co, Ni, Zn, and V was mainly controlled by natural factors, while As, Cd, Cu, and Pb had been added by organic factors. In group C land use (water), the origin of most heavy metals is natural, without anthropogenic sources. Cd and As pollution was relatively more serious across the different land uses. The EF(G) technique confirmed the anthropogenic influence on heavy metal pollution. All metals showed concentrations substantially higher than their background values, suggesting anthropogenic pollution.
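
    An enrichment factor is conventionally a double ratio against a conservative reference element such as Fe; the paper's exact EF(G) formulation may differ. A minimal sketch with invented concentrations:

```python
def enrichment_factor(c_metal, c_ref, bg_metal, bg_ref):
    """Classical enrichment factor: the metal/reference ratio in the
    sample divided by the same ratio in background material,
    EF = (C_metal / C_ref)_sample / (C_metal / C_ref)_background.
    EF near 1 suggests a geogenic origin; EF >> 1 suggests
    anthropogenic input."""
    return (c_metal / c_ref) / (bg_metal / bg_ref)

# Hypothetical numbers: Cd at 1.2 mg/kg vs 0.3 mg/kg background,
# normalised to Fe (28000 mg/kg sample, 35000 mg/kg background).
print(enrichment_factor(1.2, 28000.0, 0.3, 35000.0))  # ~5: strong enrichment
```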

  13. Non-Inferential Multi-Subject Study of Functional Connectivity during Visual Stimulation.

    PubMed

    Esposito, F; Cirillo, M; Aragri, A; Caranci, F; Cirillo, L; Di Salle, F; Cirillo, S

    2007-01-31

    Independent component analysis (ICA) is a powerful technique for the multivariate, non-inferential, data-driven analysis of functional magnetic resonance imaging (fMRI) data-sets. The non-inferential nature of ICA makes this a suitable technique for the study of complex mental states whose temporal evolution would be difficult to describe analytically in terms of classical statistical regressors. Taking advantage of this feature, ICA can extract a number of functional connectivity patterns regardless of the task executed by the subject. The technique is so powerful that functional connectivity patterns can be derived even when the subject is just resting in the scanner, opening the opportunity for functional investigation of the human mind at its basal "default" state, which has been proposed to be altered in several brain disorders. However, one major drawback of ICA consists in the difficulty of managing its results, which are not represented by a single functional image as in inferential studies. This produces the need for a classification of ICA results and exacerbates the difficulty of obtaining group "averaged" functional connectivity patterns, while preserving the interpretation of individual differences. Addressing the subject-level variability in the very same framework of "grouping" appears to be a favourable approach towards the clinical evaluation and application of ICA-based methodologies. Here we present a novel strategy for group-level ICA analyses, namely the self-organizing group-level ICA (sog-ICA), which is used on visual activation fMRI data from a block-design experiment repeated on six subjects. We propose the sog-ICA as a multi-subject analysis tool for grouping ICA data while assessing the similarity and variability of the fMRI results of individual subject decompositions.

  14. Qualitative and quantitative differentiation of gases using ZnO thin film gas sensors and pattern recognition analysis.

    PubMed

    Pati, Sumati; Maity, A; Banerji, P; Majumder, S B

    2014-04-07

    In the present work we have grown highly textured, ultra-thin, nano-crystalline zinc oxide thin films using a metal organic chemical vapor deposition technique and addressed their selectivity towards hydrogen, carbon dioxide and methane gas sensing. Structural and microstructural characteristics of the synthesized films were investigated utilizing X-ray diffraction and electron microscopy techniques respectively. Using a dynamic flow gas sensing measurement set up, the sensing characteristics of these films were investigated as a function of gas concentration (10-1660 ppm) and operating temperature (250-380 °C). ZnO thin film sensing elements were found to be sensitive to all of these gases. Thus at a sensor operating temperature of ~300 °C, the responses of the ZnO thin films were ~68, 59, and 52% for hydrogen, carbon monoxide and methane gases respectively. The data matrices extracted from fast Fourier transform (FFT) analyses of the conductance transients were used as input parameters in a linear unsupervised principal component analysis (PCA) pattern recognition technique. We have demonstrated that FFT combined with PCA is an excellent tool for the differentiation of these reducing gases.
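
    The PCA step can be illustrated independently of the FFT feature extraction. A minimal power-iteration sketch of the first principal component (toy data standing in for the FFT coefficients of conductance transients):

```python
import math

def pca_first_component(rows):
    """First principal component via power iteration on the sample
    covariance matrix; returns the component and the per-row scores."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[i] for r in rows) / n for i in range(d)]
    x = [[r[i] - means[i] for i in range(d)] for r in rows]   # centre data
    cov = [[sum(x[k][i] * x[k][j] for k in range(n)) / (n - 1)
            for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(200):                                      # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    scores = [sum(x[k][i] * v[i] for i in range(d)) for k in range(n)]
    return v, scores

# Four toy two-dimensional feature vectors lying on a line: the first
# component captures essentially all of the variance.
v, scores = pca_first_component([[0, 0], [1, 1], [2, 2], [3, 3]])
print(v, scores)
```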

  15. Real-time continuous visual biofeedback in the treatment of speech breathing disorders following childhood traumatic brain injury: report of one case.

    PubMed

    Murdoch, B E; Pitt, G; Theodoros, D G; Ward, E C

    1999-01-01

    The efficacy of traditional and physiological biofeedback methods for modifying abnormal speech breathing patterns was investigated in a child with persistent dysarthria following severe traumatic brain injury (TBI). An A-B-A-B single-subject experimental research design was utilized to provide the subject with two exclusive periods of therapy for speech breathing, based on traditional therapy techniques and physiological biofeedback methods, respectively. Traditional therapy techniques included establishing optimal posture for speech breathing, explanation of the movement of the respiratory muscles, and a hierarchy of non-speech and speech tasks focusing on establishing an appropriate level of sub-glottal air pressure, and improving the subject's control of inhalation and exhalation. The biofeedback phase of therapy utilized variable inductance plethysmography (or Respitrace) to provide real-time, continuous visual biofeedback of ribcage circumference during breathing. As in traditional therapy, a hierarchy of non-speech and speech tasks was devised to improve the subject's control of his respiratory pattern. Throughout the project, the subject's respiratory support for speech was assessed both instrumentally and perceptually. Instrumental assessment included kinematic and spirometric measures, and perceptual assessment included the Frenchay Dysarthria Assessment, Assessment of Intelligibility of Dysarthric Speech, and analysis of a speech sample. The results of the study demonstrated that real-time continuous visual biofeedback techniques for modifying speech breathing patterns were not only effective, but superior to the traditional therapy techniques for modifying abnormal speech breathing patterns in a child with persistent dysarthria following severe TBI. These results show that physiological biofeedback techniques are potentially useful clinical tools for the remediation of speech breathing impairment in the paediatric dysarthric population.

  16. A Parallel Genetic Algorithm to Discover Patterns in Genetic Markers that Indicate Predisposition to Multifactorial Disease

    PubMed Central

    Rausch, Tobias; Thomas, Alun; Camp, Nicola J.; Cannon-Albright, Lisa A.; Facelli, Julio C.

    2008-01-01

    This paper describes a novel algorithm to analyze genetic linkage data using pattern recognition techniques and genetic algorithms (GA). The method allows a search for regions of the chromosome that may contain genetic variations that jointly predispose individuals for a particular disease. The method uses correlation analysis, filtering theory and genetic algorithms (GA) to achieve this goal. Because current genome scans use from hundreds to hundreds of thousands of markers, two versions of the method have been implemented. The first is an exhaustive analysis version that can be used to visualize, explore, and analyze small genetic data sets for two marker correlations; the second is a GA version, which uses a parallel implementation allowing searches of higher-order correlations in large data sets. Results on simulated data sets indicate that the method can be informative in the identification of major disease loci and gene-gene interactions in genome-wide linkage data and that further exploration of these techniques is justified. The results presented for both variants of the method show that it can help genetic epidemiologists to identify promising combinations of genetic factors that might predispose to complex disorders. In particular, the correlation analysis of IBD expression patterns might hint to possible gene-gene interactions and the filtering might be a fruitful approach to distinguish true correlation signals from noise. PMID:18547558
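
    The abstract does not specify the GA operators, so the sketch below is a generic elitist GA skeleton (tournament selection, one-point crossover, bit-flip mutation) applied to a toy fitness function, not the authors' parallel implementation:

```python
import random

def genetic_search(fitness, n_bits, pop_size=30, gens=60, p_mut=0.05, seed=0):
    """Minimal elitist genetic algorithm over bit strings: tournament
    selection, one-point crossover, and per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        nxt = [best[:]]                                  # elitism: keep the best
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)    # tournament selection
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]                  # one-point crossover
            child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
            nxt.append(child)
        pop = nxt
        best = max(pop, key=fitness)
    return best

# Toy stand-in for marker-pattern fitness: count of "informative" bits.
best = genetic_search(sum, 20)
print(sum(best))  # near the 20-bit optimum
```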

  17. A Bio Medical Waste Identification and Classification Algorithm Using Mltrp and Rvm.

    PubMed

    Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi

    2016-10-01

    We aimed to extract the histogram features for texture analysis and to classify the types of Bio Medical Waste (BMW) for garbage disposal and management. The given BMW image was preprocessed by using the median filtering technique that efficiently reduced the noise in the image. After that, the histogram features of the filtered image were extracted with the help of the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, the Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton and liquids. The BMW image was collected from the garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate and accuracy with the help of MATLAB. When compared to the existing techniques, the proposed techniques provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal. It can be used in many real time applications such as hospital and healthcare management systems for proper BMW disposal.
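
    The median filtering preprocessing step is standard. A minimal pure-Python sketch with a 3x3 window and edge replication (the MLTrP and RVM stages are not reproduced here):

```python
def median_filter(img, size=3):
    """Median filter with edge replication: each output pixel is the
    median of its neighbourhood, which suppresses salt-and-pepper noise
    while preserving edges better than mean filtering."""
    h, w = len(img), len(img[0])
    r = size // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                      for dy in range(-r, r + 1) for dx in range(-r, r + 1)]
            window.sort()
            out[y][x] = window[len(window) // 2]
    return out

# A flat region with one salt-noise pixel: the 255 outlier is removed.
noisy = [[10, 10, 10, 10],
         [10, 255, 10, 10],
         [10, 10, 10, 10]]
print(median_filter(noisy))  # all pixels become 10
```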

  18. The application of phase grating to CLM technology for the sub-65nm node optical lithography

    NASA Astrophysics Data System (ADS)

    Yoon, Gi-Sung; Kim, Sung-Hyuck; Park, Ji-Soong; Choi, Sun-Young; Jeon, Chan-Uk; Shin, In-Kyun; Choi, Sung-Woon; Han, Woo-Sung

    2005-06-01

    As a promising technology for sub-65nm node optical lithography, CLM (Chrome-Less Mask) technology has been researched worldwide in recent years among the RETs (Resolution Enhancement Techniques) for low k1. CLM has several advantages, such as a relatively simple manufacturing process and competitive performance compared to phase-edge PSMs. For low-k1 lithography, we have researched the CLM technique as a good solution especially for the sub-65nm node. As a step towards developing sub-65nm node optical lithography, we have applied CLM technology in 80nm-node lithography with the mesa and trench methods. From the analysis of the CLM technology in the 80nm lithography, we found that there is an optimal shutter size for best performance in the technique, that the increment of wafer ADI CD varies with pattern pitch, and that there is a limitation in patterning various shapes and sizes due to the OPC dead-zone - the OPC dead-zone in the CLM technique is the specific range of shutter sizes that does not increase the wafer CD beyond a specific size. Also, small patterns are easily broken while fabricating the CLM mask with the mesa method. Generally, the trench method has better optical performance than mesa. These issues have so far restricted the application of CLM technology to a small field. We approached these issues with a 3-D topographic simulation tool and found that they could be overcome by applying phase grating in trench-type CLM. With the simulation data, we made test masks which had many kinds of patterns under many different conditions and analyzed their performance through AIMS fab 193 and exposure on wafer. Finally, we have developed a CLM technology which is free of the OPC dead-zone and of pattern breakage in the fabrication process. Therefore, we can apply the CLM technique to sub-65nm node optical lithography, including logic devices.

  19. Supervised analysis of drug prescription sequences.

    PubMed

    Ficheur, Grégoire; Chazard, Emmanuel; Merlin, Béatrice; Ferret, Laurie; Luyckx, Michel; Beuscart, Régis

    2013-01-01

    Hospitals have at their disposal large databases that may be considered for reuse. The objective of this work is to evaluate the impact of a drug on a specific laboratory result by analyzing these data. This analysis first involves building a record of temporal patterns, including medical context, of drug prescriptions. Changes in outcome due to these patterns of drug prescription are assessed using short phases of the inpatient stay compared to monotonous changes in the laboratory result. To illustrate this technique, we investigated potassium chloride supplementation and its impact on kalemia. This method enables us to assess the impact of a drug (in its frequent context of prescription) on a laboratory result. This kind of analysis could play a role in post-marketing studies.

  20. Pattern recognition tool based on complex network-based approach

    NASA Astrophysics Data System (ADS)

    Casanova, Dalcimar; Backes, André Ricardo; Martinez Bruno, Odemir

    2013-02-01

    This work proposes a generalization of the method proposed by the authors in 'A complex network-based approach for boundary shape analysis'. Instead of modelling a contour as a graph and using complex network rules to characterize it, here we generalize the technique. The work proposes a mathematical tool for characterizing signals, curves and sets of points. To evaluate the pattern description power of the proposal, an experiment on plant identification based on leaf vein images is conducted. Leaf venation is a taxonomic characteristic used for plant identification purposes, and one of its characteristics is that these structures are complex and difficult to represent as signals or curves, and hence to analyze with a classical pattern recognition approach. Here, we model the veins as a set of points and represent them as graphs. As features, we use the degree and joint degree measurements in a dynamic evolution. The results demonstrate that the technique has good discrimination power and can be used for plant identification, as well as other complex pattern recognition tasks.
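
    Modelling a point set as a graph and reading off degree measurements can be sketched as follows (a minimal illustration with an invented point set; the paper's dynamic evolution sweeps the connection radius and derives richer joint-degree statistics):

```python
import math

def degree_sequence(points, radius):
    """Model a point set as a graph: connect every pair of points closer
    than `radius` and return each node's degree."""
    n = len(points)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < radius:
                deg[i] += 1
                deg[j] += 1
    return deg

# Four points on a unit square: at r = 1.1 only the four sides connect,
# so every vertex has degree 2; at r = 1.5 the diagonals join too.
pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
print(degree_sequence(pts, 1.1))  # [2, 2, 2, 2]
```

Sweeping the radius and tracking how such degree statistics evolve yields the feature vectors used for classification.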

  1. Data handling and analysis for the 1971 corn blight watch experiment

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.; Phillips, T. L.

    1973-01-01

    The overall corn blight watch experiment data flow is described and the organization of the LARS/Purdue data center is discussed. Data analysis techniques are discussed in general and the use of statistical multispectral pattern recognition methods for automatic computer analysis of aircraft scanner data is described. Some of the results obtained are discussed and the implications of the experiment for future data communication requirements of earth resource survey systems are considered.

  2. Developing a Complex Independent Component Analysis (CICA) Technique to Extract Non-stationary Patterns from Geophysical Time Series

    NASA Astrophysics Data System (ADS)

    Forootan, Ehsan; Kusche, Jürgen; Talpe, Matthieu; Shum, C. K.; Schmidt, Michael

    2017-12-01

    In recent decades, decomposition techniques have enabled increasingly more applications for dimension reduction, as well as extraction of additional information from geophysical time series. Traditionally, the principal component analysis (PCA)/empirical orthogonal function (EOF) method and more recently the independent component analysis (ICA) have been applied to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of time series, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the autocovariance matrix and diagonalizing higher (than two) order statistical tensors from centered time series, respectively. However, the stationarity assumption in these techniques is not justified for many geophysical and climate variables even after removing cyclic components, e.g., the commonly removed dominant seasonal cycles. In this paper, we present a novel decomposition method, the complex independent component analysis (CICA), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA, where (a) we first define a new complex dataset that contains the observed time series in its real part, and their Hilbert transformed series as its imaginary part, (b) an ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex dataset in (a), and finally, (c) the dominant independent complex modes are extracted and used to represent the dominant space and time amplitudes and associated phase propagation patterns. The performance of CICA is examined by analyzing synthetic data constructed from multiple physically meaningful modes in a simulation framework, with known truth. 
Next, global terrestrial water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) gravimetry mission (2003-2016), and satellite radiometric sea surface temperature (SST) data (1982-2016) over the Atlantic and Pacific Oceans are used with the aim of demonstrating signal separations of the North Atlantic Oscillation (NAO) from the Atlantic Multi-decadal Oscillation (AMO), and the El Niño Southern Oscillation (ENSO) from the Pacific Decadal Oscillation (PDO). CICA results indicate that ENSO-related patterns can be extracted from the Gravity Recovery And Climate Experiment Terrestrial Water Storage (GRACE TWS) with an accuracy of 0.5-1 cm in terms of equivalent water height (EWH). The magnitude of errors in extracting NAO or AMO from SST data using the complex EOF (CEOF) approach reaches up to 50% of the signal itself, while it is reduced to 16% when applying CICA. Larger errors with magnitudes of 100% and 30% of the signal itself are found while separating ENSO from PDO using CEOF and CICA, respectively. We thus conclude that the CICA is more effective than CEOF in separating non-stationary patterns.
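
    Step (a) of the CICA construction, pairing each real series with its Hilbert transform, can be sketched with a direct DFT (a minimal illustration on a toy cosine; real applications would use an FFT and then feed the complex series to the ICA stage):

```python
import cmath, math

def analytic_signal(x):
    """Build the complex (analytic) series: the input in the real part
    and its Hilbert transform in the imaginary part, obtained by zeroing
    the negative DFT frequencies and doubling the positive ones."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n / 2:
            X[k] *= 2          # positive frequencies
        elif k > n / 2:
            X[k] = 0           # negative frequencies
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

# For cos(wt) the imaginary part comes out as sin(wt): a 90-degree lag,
# which is what lets CICA represent amplitude and phase propagation.
z = analytic_signal([math.cos(2 * math.pi * 4 * t / 64) for t in range(64)])
print(z[0])  # ~ (1+0j)
```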

  3. Technical support for creating an artificial intelligence system for feature extraction and experimental design

    NASA Technical Reports Server (NTRS)

    Glick, B. J.

    1985-01-01

    Techniques for classifying objects into groups or classes go under many different names including, most commonly, cluster analysis. Mathematically, the general problem is to find a best mapping of objects into an index set consisting of class identifiers. When an a priori grouping of objects exists, the process of deriving the classification rules from samples of classified objects is known as discrimination. When such rules are applied to objects of unknown class, the process is denoted classification. The specific problem addressed involves the group classification of a set of objects that are each associated with a series of measurements (ratio, interval, ordinal, or nominal levels of measurement). Each measurement produces one variable in a multidimensional variable space. Cluster analysis techniques are reviewed and methods for including geographic location, distance measures, and spatial pattern (distribution) as parameters in clustering are examined. For the case of patterning, measures of spatial autocorrelation are discussed in terms of the kind of data (nominal, ordinal, or interval scaled) to which they may be applied.

  4. Characterizing multivariate decoding models based on correlated EEG spectral features

    PubMed Central

    McFarland, Dennis J.

    2013-01-01

    Objective Multivariate decoding methods are popular techniques for analysis of neurophysiological data. The present study explored potential interpretative problems with these techniques when predictors are correlated. Methods Data from sensorimotor rhythm-based cursor control experiments was analyzed offline with linear univariate and multivariate models. Features were derived from autoregressive (AR) spectral analysis of varying model order which produced predictors that varied in their degree of correlation (i.e., multicollinearity). Results The use of multivariate regression models resulted in much better prediction of target position as compared to univariate regression models. However, with lower order AR features interpretation of the spectral patterns of the weights was difficult. This is likely to be due to the high degree of multicollinearity present with lower order AR features. Conclusions Care should be exercised when interpreting the pattern of weights of multivariate models with correlated predictors. Comparison with univariate statistics is advisable. Significance While multivariate decoding algorithms are very useful for prediction their utility for interpretation may be limited when predictors are correlated. PMID:23466267
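
    The interpretability problem described in the Results can be reproduced with two-predictor OLS in closed form (toy numbers; the study's AR spectral features are not used here). With nearly collinear predictors the multivariate weights take large, opposite signs even though each predictor correlates positively with the target:

```python
def ols2(x1, x2, y):
    """Closed-form OLS weights for the no-intercept model
    y ~ w1*x1 + w2*x2, via the 2x2 normal equations."""
    def dot(a, b):
        return sum(u * v for u, v in zip(a, b))
    s11, s22, s12 = dot(x1, x1), dot(x2, x2), dot(x1, x2)
    g1, g2 = dot(x1, y), dot(x2, y)
    det = s11 * s22 - s12 * s12          # near zero when x1 ~ x2
    return ((s22 * g1 - s12 * g2) / det,
            (s11 * g2 - s12 * g1) / det)

# Two nearly identical "spectral features" predicting the same target.
x1 = [1.0, 2.0, 3.0, 4.0]
x2 = [1.0, 2.1, 2.9, 4.0]   # x2 is x1 plus a tiny perturbation
y = [1.1, 1.9, 3.1, 3.9]
w1, w2 = ols2(x1, x2, y)
print(w1, w2)  # opposite signs despite both predictors tracking y
```

This is the multicollinearity effect the paper warns about: the weight pattern is not a safe guide to which spectral features matter, so comparison with univariate statistics is advisable.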

  5. Comparison among different retrofitting strategies for the vulnerability reduction of masonry bell towers

    NASA Astrophysics Data System (ADS)

    Milani, Gabriele; Shehu, Rafael; Valente, Marco

    2017-11-01

    This paper investigates the effectiveness of reducing the seismic vulnerability of masonry towers by means of innovative and traditional strengthening techniques. The strategy followed to provide optimal retrofitting for masonry towers subject to seismic risk relies on preventing active failure mechanisms. These vulnerable mechanisms are pre-assigned failure patterns based on the crack patterns experienced during past seismic events. An upper bound limit analysis strategy is found suitable to be applied to simplified tower models in their present state and the proposed retrofitted ones. Taking into consideration the variability of geometrical features and the uncertainty of the strengthening techniques, Monte Carlo simulations are implemented into the limit analysis. In this framework a wide range of idealized cases is covered by the conducted analyses. The retrofitting strategies aim to increase the shear strength and the overturning load carrying capacity in order to reduce vulnerability. This methodology makes it possible to use different materials which can fulfill the structural implementability requirements.

  6. Analysis of cylindrical wrap-around and doubly conformal patch antennas by way of the finite element-artificial absorber method

    NASA Technical Reports Server (NTRS)

    Volakis, J. L.; Kempel, L. C.; Sliva, R.; Wang, H. T. G.; Woo, A. G.

    1994-01-01

    The goal of this project was to develop analysis codes for computing the scattering and radiation of antennas on cylindrically and doubly conformal platforms. The finite element-boundary integral (FE-BI) method has been shown to accurately model the scattering and radiation of cavity-backed patch antennas. Unfortunately extension of this rigorous technique to coated or doubly curved platforms is cumbersome and inefficient. An alternative approximate approach is to employ an absorbing boundary condition (ABC) for terminating the finite element mesh thus avoiding use of a Green's function. A FE-ABC method is used to calculate the radar cross section (RCS) and radiation pattern of a cavity-backed patch antenna which is recessed within a metallic surface. It is shown that this approach is accurate for RCS and antenna pattern calculations with an ABC surface displaced as little as 0.3 lambda from the cavity aperture. These patch antennas may have a dielectric overlay which may also be modeled with this technique.

  7. A Moire Fringing Spectrometer for Extra-Solar Planet Searches

    NASA Astrophysics Data System (ADS)

    van Eyken, J. C.; Ge, J.; Mahadevan, S.; De Witt, C.; Ramsey, L. W.; Berger, D.; Shaklan, S.; Pan, X.

    2001-12-01

    We have developed a prototype moire fringing spectrometer for high precision radial velocity measurements for the detection of extra-solar planets. This combination of Michelson interferometer and spectrograph overlays an interferometer comb on a medium resolution stellar spectrum, producing Moire patterns. Small changes in the doppler shift of the spectrum lead to corresponding large shifts in the Moire pattern (Moire magnification). The sinusoidal shape of the Moire fringes enables much simpler measurement of these shifts than in standard echelle spectrograph techniques, facilitating high precision measurements with a low cost instrument. The data analysis software we have developed has produced short-term repeatability (over a few hours) of 5-10 m/s, and future planned improvements based on previous experiments should reduce this significantly. We plan eventually to carry out large scale surveys for low mass companions around other stars. This poster will present new results obtained in the lab and at the HET and Palomar 5m telescopes, the theory of the instrument, and data analysis techniques.

  8. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE.

    PubMed

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.
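
    The counting stage for repeating spatio-temporal patterns can be caricatured in a few lines (a minimal sketch with invented spike data; SPADE's frequent-itemset mining and significance evaluation are far more involved):

```python
from collections import Counter

def repeated_patterns(spikes, window, min_count=2):
    """Collect the relative (neuron, lag) configuration inside a time
    window anchored at each spike and count identical configurations;
    patterns occurring at least min_count times are returned."""
    spikes = sorted(spikes, key=lambda s: s[1])       # (neuron, time) pairs
    counts = Counter()
    for i, (n0, t0) in enumerate(spikes):
        pat = tuple((n, t - t0) for n, t in spikes[i:] if t - t0 <= window)
        if len(pat) > 1:                              # ignore lone spikes
            counts[pat] += 1
    return {p: c for p, c in counts.items() if c >= min_count}

# Neuron 1 fires, neuron 2 one tick later, neuron 3 two ticks later,
# and the whole sequence repeats at t = 0, 10 and 20.
trains = [(1, 0), (2, 1), (3, 2), (1, 10), (2, 11), (3, 12),
          (1, 20), (2, 21), (3, 22)]
print(repeated_patterns(trains, window=2))
```

Distinguishing such repeats from chance coincidences is exactly the statistical-evaluation problem SPADE addresses.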

  10. Digital versus conventional techniques for pattern fabrication of implant-supported frameworks.

    PubMed

    Alikhasi, Marzieh; Rohanian, Ahmad; Ghodsi, Safoura; Kolde, Amin Mohammadpour

    2018-01-01

    The aim of this experimental study was to compare retention of frameworks cast from wax patterns fabricated by three different methods. Thirty-six implant analogs connected to one-piece abutments were divided randomly into three groups according to the wax pattern fabrication method (n = 12). A computer-aided design/computer-aided manufacturing (CAD/CAM) milling machine, a three-dimensional printer, and the conventional technique were used for fabrication of the wax patterns. All laboratory procedures were performed by a single experienced technician to eliminate operator bias. The wax patterns were cast, finished, and seated on the related abutment analogs. The number of adjustments was recorded and analyzed by the Kruskal-Wallis test. Frameworks were cemented on the corresponding analogs with zinc phosphate cement, and a tensile resistance test was used to measure retention. One-way analysis of variance (ANOVA) and post hoc Tukey tests were used for statistical analysis. The level of significance was set at P < 0.05. Mean retentive values of 680.36 ± 21.93 N, 440.48 ± 85.98 N, and 407.23 ± 67.48 N were recorded for the CAD/CAM, rapid prototyping, and conventional groups, respectively. The one-way ANOVA test revealed significant differences among the three groups (P < 0.001). The post hoc Tukey test showed significantly higher retention for the CAD/CAM group (P < 0.001), while there was no significant difference between the two other groups (P = 0.54). The CAD/CAM group required significantly more adjustments (P < 0.001). CAD/CAM-fabricated wax patterns showed significantly higher retention for implant-supported cement-retained frameworks; this could be valuable when there are limitations in the retention of single-unit implant restorations.
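
    The group comparison above rests on a one-way ANOVA F statistic; a small sketch of that computation with toy numbers (not the study's data, and without the post hoc Tukey step):

```python
def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square divided
    by within-group mean square, as used to compare group mean retention."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    # Sum of squares between groups (weighted by group size).
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Sum of squares within groups (residual about each group mean).
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

    Identical groups give F = 0, while well-separated groups give a large F; the p-value would then come from the F distribution with (k - 1, n - k) degrees of freedom.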

  11. Recent growth of conifer species of western North America: Assessing spatial patterns of radial growth trends

    USGS Publications Warehouse

    McKenzie, D.; Hessl, Amy E.; Peterson, D.L.

    2001-01-01

    We explored spatial patterns of low-frequency variability in radial tree growth among western North American conifer species and identified predictors of the variability in these patterns. Using 185 sites from the International Tree-Ring Data Bank, each of which contained 10-60 raw ring-width series, we rebuilt two chronologies for each site, using two conservative methods designed to retain any low-frequency variability associated with recent environmental change. We used factor analysis to identify regional low-frequency patterns in site chronologies and estimated the slope of the growth trend since 1850 at each site from a combination of linear regression and time-series techniques. This slope was the response variable in a regression-tree model to predict the effects of environmental gradients and species-level differences on growth trends. Growth patterns at 27 sites from the American Southwest were consistent with quasi-periodic patterns of drought. Either 12 or 32 of the 185 sites demonstrated patterns of increasing growth between 1850 and 1980 A.D., depending on the standardization technique used. Pronounced growth increases were associated with high-elevation sites (above 3000 m) and high-latitude sites in maritime climates. Future research focused on these high-elevation and high-latitude sites should address the precise mechanisms responsible for increased 20th century growth.
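
    The per-site growth trend can be illustrated with an ordinary least-squares slope of ring width against year (the study combined regression with time-series techniques; only the OLS part is sketched here, with invented data):

```python
def trend_slope(years, widths):
    """Least-squares slope of ring width against year. A positive slope
    indicates increasing growth over the period, a negative one decline."""
    n = len(years)
    my, mw = sum(years) / n, sum(widths) / n
    num = sum((y - my) * (w - mw) for y, w in zip(years, widths))
    den = sum((y - my) ** 2 for y in years)
    return num / den
```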

  12. Pattern recognition and image processing for environmental monitoring

    NASA Astrophysics Data System (ADS)

    Siddiqui, Khalid J.; Eastwood, DeLyle

    1999-12-01

    Pattern recognition (PR) and signal/image processing methods are among the most powerful tools currently available for noninvasively examining spectroscopic and other chemical data for environmental monitoring. Using spectral data, these systems have found a variety of applications employing analytical techniques for chemometrics such as gas chromatography, fluorescence spectroscopy, etc. An advantage of PR approaches is that they make no a priori assumptions regarding the structure of the patterns. However, a majority of these systems rely on human judgment for parameter selection and classification. A PR problem is considered as a composite of four subproblems: pattern acquisition, feature extraction, feature selection, and pattern classification. One of the basic issues in PR approaches is to determine and measure the features useful for successful classification. Selection of features that contain the most discriminatory information is important because the cost of pattern classification is directly related to the number of features used in the decision rules. The state of the spectral techniques as applied to environmental monitoring is reviewed. A spectral pattern classification system combining the above components and automatic decision-theoretic approaches for classification is developed. It is shown how such a system can be used for analysis of large data sets, warehousing, and interpretation. In a preliminary test, the classifier was used to classify synchronous UV-vis fluorescence spectra of relatively similar petroleum oils with reasonable success.

  13. Seasonal differences in the subjective assessment of outdoor thermal conditions and the impact of analysis techniques on the obtained results

    NASA Astrophysics Data System (ADS)

    Kántor, Noémi; Kovács, Attila; Takács, Ágnes

    2016-11-01

    Wide research attention has been paid in the last two decades to the thermal comfort conditions of different outdoor and semi-outdoor urban spaces. Field studies were conducted in a wide range of geographical regions in order to investigate the relationship between the thermal sensation of people and thermal comfort indices. Researchers found that the original threshold values of these indices did not describe precisely the actual thermal sensation patterns of subjects, and they reported neutral temperatures that vary among nations and with time of the year. For that reason, thresholds of some objective indices were rescaled and new thermal comfort categories were defined. This research investigates the outdoor thermal perception patterns of Hungarians regarding the Physiologically Equivalent Temperature (PET) index, based on more than 5800 questionnaires. The surveys were conducted in the city of Szeged on 78 days in spring, summer, and autumn. Various frequently applied analysis approaches (simple descriptive technique, regression analysis, and probit models) were adopted to reveal seasonal differences in the thermal assessment of people. Thermal sensitivity and neutral temperatures were found to be significantly different, especially between summer and the two transient seasons. Challenges of international comparison are also emphasized, since the results prove that neutral temperatures obtained through different analysis techniques may be considerably different. The outcomes of this study underline the importance of the development of standard measurement and analysis methodologies in order to make future studies comprehensible, thereby facilitating the broadening of the common scientific knowledge about outdoor thermal comfort.
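
    One common way such studies derive a neutral temperature is to regress mean thermal sensation votes on PET and solve for a zero vote; a simplified sketch of that regression approach (toy data, not the survey's; the paper also compares probit models, which are not shown):

```python
def neutral_temperature(pet_values, sensation_votes):
    """Fit vote = a + b * PET by least squares and return the PET at which
    the fitted sensation vote is zero (the 'neutral temperature')."""
    n = len(pet_values)
    mp = sum(pet_values) / n
    mv = sum(sensation_votes) / n
    num = sum((p - mp) * (v - mv) for p, v in zip(pet_values, sensation_votes))
    den = sum((p - mp) ** 2 for p in pet_values)
    b = num / den          # thermal sensitivity (votes per degree PET)
    a = mv - b * mp
    return -a / b
```

    The slope b is the thermal sensitivity the abstract refers to; seasonal differences show up as different fitted slopes and zero crossings.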

  14. Techniques for computer-aided analysis of ERTS-1 data, useful in geologic, forest and water resource surveys. [Colorado Rocky Mountains

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1974-01-01

    Forestry, geology, and water resource applications were the focus of this study, which involved the use of computer-implemented pattern-recognition techniques to analyze ERTS-1 data. The results have proven the value of computer-aided analysis techniques, even in areas of mountainous terrain. Several analysis capabilities have been developed during these ERTS-1 investigations. A procedure to rotate, deskew, and geometrically scale the MSS data results in 1:24,000 scale printouts that can be directly overlaid on 7 1/2-minute U.S.G.S. topographic maps. Several scales of computer-enhanced "false color-infrared" composites of MSS data can be obtained from a digital display unit, and emphasize the tremendous detail present in the ERTS-1 data. A grid can also be superimposed on the displayed data to aid in specifying areas of interest.

  15. Automated quantification of the synchrogram by recurrence plot analysis.

    PubMed

    Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart

    2012-04-01

    Recently, the concept of phase synchronization of two weakly coupled oscillators has attracted great research interest and has been applied to characterize synchronization phenomena in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by synchrogram analysis, a graphical tool investigating the relationship between the instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of the phase distribution in the synchrogram, which is vulnerable to noise. This study aims to introduce a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, which is a well-established tool for characterizing recurring patterns and nonstationarities in experimental data. We applied our technique to detect synchronization in simulated and measured infants' cardiorespiratory data. Our results suggest that the proposed technique is able to systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding phase information of the synchrogram into phase space, the phase-locking ratio is automatically unveiled as the number of attractors.
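
    The recurrence plot at the heart of the technique can be sketched as a thresholded distance matrix (a minimal one-dimensional version; the paper embeds synchrogram phases rather than raw values):

```python
def recurrence_matrix(series, radius):
    """Binary recurrence plot: entry (i, j) is 1 when states i and j of the
    series lie within `radius` of each other. Repeated dynamics show up as
    diagonal structure, the property exploited to quantify the synchrogram."""
    return [[1 if abs(a - b) <= radius else 0 for b in series] for a in series]
```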

  16. Preliminary GIS analysis of the agricultural landscape of Cuyo Cuyo, Department of Puno, Peru

    NASA Technical Reports Server (NTRS)

    Winterhalder, Bruce; Evans, Tom

    1991-01-01

    Computerized analysis of a geographic information system (GIS) database for Cuyo Cuyo (Dept. Puno, Peru) is used to correlate the agricultural production zones of two adjacent communities with altitude, slope, aspect, and other geomorphological features of the high-altitude eastern escarpment landscape. The techniques exemplified will allow ecological anthropologists to analyze spatial patterns at regional scales with much greater control over the data.

  17. Aural analysis of image texture via cepstral filtering and sonification

    NASA Astrophysics Data System (ADS)

    Rangayyan, Rangaraj M.; Martins, Antonio C. G.; Ruschioni, Ruggero A.

    1996-03-01

    Texture plays an important role in image analysis and understanding, with many applications in medical imaging and computer vision. However, analysis of texture by image processing is a rather difficult issue, with most techniques being oriented towards statistical analysis which may not have readily comprehensible perceptual correlates. We propose new methods for auditory display (AD) and sonification of (quasi-) periodic texture (where a basic texture element or `texton' is repeated over the image field) and random texture (which could be modeled as filtered or `spot' noise). Although the AD designed is not intended to be speech-like or musical, we draw analogies between the two types of texture mentioned above and voiced/unvoiced speech, and design a sonification algorithm which incorporates physical and perceptual concepts of texture and speech. More specifically, we present a method for AD of texture where the projections of the image at various angles (Radon transforms or integrals) are mapped to audible signals and played in sequence. In the case of random texture, the spectral envelopes of the projections are related to the filter spot characteristics, and convey the essential information for texture discrimination. In the case of periodic texture, the AD provides timbre and pitch related to the texton and periodicity. In another procedure for sonification of periodic texture, we propose to first deconvolve the image using cepstral analysis to extract information about the texton and horizontal and vertical periodicities. The projections of individual textons at various angles are used to create a voiced-speech-like signal with each projection mapped to a basic wavelet, the horizontal period to pitch, and the vertical period to rhythm on a longer time scale. The sound pattern then consists of a serial, melody-like sonification of the patterns for each projection.
We believe that our approaches provide the much-desired `natural' connection between the image data and the sounds generated. We have evaluated the sonification techniques with a number of synthetic textures. The sound patterns created have demonstrated the potential of the methods in distinguishing between different types of texture. We are investigating the application of these techniques to auditory analysis of texture in medical images such as magnetic resonance images.
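
    The projections that drive the sonification are, at 0 and 90 degrees, just row and column sums; a minimal sketch (the paper uses many angles and maps each projection to an audible signal, which is not shown):

```python
def axis_projections(image):
    """Row and column sums of a 2D image: the 0-degree and 90-degree Radon
    projections that the sonification maps to audible signals."""
    rows = [sum(r) for r in image]
    cols = [sum(c) for c in zip(*image)]
    return rows, cols
```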

  18. Analysis of x-ray diffraction pattern and complex plane impedance plot of polypyrrole/titanium dioxide nanocomposite: A simulation study

    NASA Astrophysics Data System (ADS)

    Ravikiran, Y. T.; Vijaya Kumari, S. C.

    2013-06-01

    To further explore the properties of the polypyrrole/titanium dioxide (PPy/TiO2) nanocomposite, it was synthesized by a chemical polymerization technique. The nanostructure and monoclinic phase of the prepared composite were confirmed by simulating the X-ray diffraction (XRD) pattern. The complex plane impedance plot of the composite was also simulated to find the equivalent resistance-capacitance (RC) circuit, and numerical values of R and C were predicted.

  19. On the use of the hole-drilling technique for residual stress measurements in thin plates

    NASA Technical Reports Server (NTRS)

    Hampton, R. W.; Nelson, D. V.

    1992-01-01

    The strain gage blind hole-drilling technique may be used to determine residual stresses at and below the surface of components. In this paper, the hole-drilling analysis methodology for thick plates is reviewed, and experimental data are used to evaluate the methodology and to assess its applicability to thin plates. Data on the effects of gage pattern, surface preparation, hole spacing, hole eccentricity, and stress level are also presented.
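
    For reference, the classical uniform-stress hole-drilling relations recover the principal residual stresses from the three strains of a rosette and two calibration constants; a sketch (the constants A and B depend on gauge geometry and material, and the values used below are placeholders, not calibrated data):

```python
import math

def hole_drilling_principal_stresses(e1, e2, e3, a_const, b_const):
    """Principal residual stresses from three hole-drilling rosette strains
    using the classical two-constant relations:
        sigma = (e1 + e3) / (4A) -/+ sqrt((e3 - e1)^2 + (e1 + e3 - 2*e2)^2) / (4B)
    Returns (sigma_min_term, sigma_max_term) under this sign convention."""
    p = (e1 + e3) / (4 * a_const)
    q = math.sqrt((e3 - e1) ** 2 + (e1 + e3 - 2 * e2) ** 2) / (4 * b_const)
    return p - q, p + q
```

    With equal strains on all three gauges the shear term vanishes and the two principal stresses coincide, as expected for an equibiaxial state.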

  20. A manual for inexpensive methods of analyzing and utilizing remote sensor data

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barr, D. J.

    1978-01-01

    Instructions are provided for inexpensive methods of using remote sensor data to observe the earth's surface. When possible, relative costs are included. Equipment needed for analysis of remote sensor data is described, along with methods of using these equipment items and the advantages and disadvantages of each. Interpretation and analysis of stereo photos and the interpretation of typical patterns such as tone and texture, landcover, drainage, and erosional form are described. Similar treatment is given to monoscopic image interpretation, including LANDSAT MSS data. Enhancement techniques are detailed with respect to their application and simple techniques of creating an enhanced data item. Techniques described include additive and subtractive (Diazo processes) color techniques and enlargement of photos or images. Applications of these processes, including mappings of land resources, engineering soils, geology, water resources, environmental conditions, and crops and/or vegetation, are outlined.

  1. Topological image texture analysis for quality assessment

    NASA Astrophysics Data System (ADS)

    Asaad, Aras T.; Rashid, Rasber Dh.; Jassim, Sabah A.

    2017-05-01

    Image quality is a major factor influencing pattern recognition accuracy and helps detect image tampering in forensics. We are concerned with investigating topological image texture analysis techniques to assess different types of degradation. We use the Local Binary Pattern (LBP) as a texture feature descriptor. For any image, we construct simplicial complexes for selected groups of uniform LBP bins and calculate persistent homology invariants (e.g., the number of connected components). We investigated the image quality discriminating characteristics of these simplicial complexes by computing these models for a large dataset of face images that are affected by the presence of shadows as a result of variation in illumination conditions. Our tests demonstrate that for specific uniform LBP patterns, the number of connected components not only distinguishes between different levels of shadow effects but also helps detect the affected regions.
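
    The LBP descriptor the pipeline starts from can be sketched for a single pixel as an 8-bit neighbour comparison (the bit ordering below is one arbitrary convention; the paper's uniform-bin grouping and simplicial-complex construction are not shown):

```python
def lbp_code(image, i, j):
    """8-neighbour Local Binary Pattern code of pixel (i, j): each neighbour
    at least as bright as the centre contributes one bit to an 8-bit code."""
    c = image[i][j]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (di, dj) in enumerate(offsets):
        if image[i + di][j + dj] >= c:
            code |= 1 << bit
    return code
```

    On a flat patch every neighbour ties with the centre and the code is 255; around an isolated bright pixel the code is 0.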

  2. Three-dimensional reconstruction for coherent diffraction patterns obtained by XFEL.

    PubMed

    Nakano, Miki; Miyashita, Osamu; Jonic, Slavica; Song, Changyong; Nam, Daewoong; Joti, Yasumasa; Tama, Florence

    2017-07-01

    The three-dimensional (3D) structural analysis of single particles using an X-ray free-electron laser (XFEL) is a new structural biology technique that enables observations of molecules that are difficult to crystallize, such as flexible biomolecular complexes and living tissue in the state close to physiological conditions. In order to restore the 3D structure from the diffraction patterns obtained by the XFEL, computational algorithms are necessary as the orientation of the incident beam with respect to the sample needs to be estimated. A program package for XFEL single-particle analysis based on the Xmipp software package, that is commonly used for image processing in 3D cryo-electron microscopy, has been developed. The reconstruction program has been tested using diffraction patterns of an aerosol nanoparticle obtained by tomographic coherent X-ray diffraction microscopy.

  3. Using Dual Regression to Investigate Network Shape and Amplitude in Functional Connectivity Analyses

    PubMed Central

    Nickerson, Lisa D.; Smith, Stephen M.; Öngür, Döst; Beckmann, Christian F.

    2017-01-01

    Independent Component Analysis (ICA) is one of the most popular techniques for the analysis of resting state FMRI data because it has several advantageous properties when compared with other techniques. Most notably, in contrast to a conventional seed-based correlation analysis, it is model-free and multivariate, thus switching the focus from evaluating the functional connectivity of single brain regions identified a priori to evaluating brain connectivity in terms of all brain resting state networks (RSNs) that simultaneously engage in oscillatory activity. Furthermore, typical seed-based analysis characterizes RSNs in terms of spatially distributed patterns of correlation (typically by means of simple Pearson's coefficients) and thereby confounds together amplitude information of oscillatory activity and noise. ICA and other regression techniques, on the other hand, retain magnitude information and therefore can be sensitive to both changes in the spatially distributed nature of correlations (differences in the spatial pattern or “shape”) as well as the amplitude of the network activity. Furthermore, motion can mimic amplitude effects so it is crucial to use a technique that retains such information to ensure that connectivity differences are accurately localized. In this work, we investigate the dual regression approach that is frequently applied with group ICA to assess group differences in resting state functional connectivity of brain networks. We show how ignoring amplitude effects and how excessive motion corrupts connectivity maps and results in spurious connectivity differences. We also show how to implement the dual regression to retain amplitude information and how to use dual regression outputs to identify potential motion effects. 
Two key findings are that using a technique that retains magnitude information, e.g., dual regression, and using strict motion criteria are crucial for controlling both network amplitude and motion-related amplitude effects, respectively, in resting state connectivity analyses. We illustrate these concepts using realistic simulated resting state FMRI data and in vivo data acquired in healthy subjects and patients with bipolar disorder and schizophrenia. PMID:28348512
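
    The two regression stages can be sketched for the simplest case of a single group map (a toy version: real dual regression uses many group maps and multiple regression at each stage, on 4-D FMRI data):

```python
def dual_regression_single(map_weights, data):
    """Dual regression with one group spatial map. Stage 1 regresses each
    time point (volume) of the subject data on the group map to obtain a
    subject timecourse; stage 2 regresses each voxel's time series on that
    timecourse to obtain the subject-specific map, whose scale retains the
    network amplitude information discussed above."""
    denom = sum(w * w for w in map_weights)
    # Stage 1: per-volume spatial regression -> subject timecourse.
    timecourse = [sum(w * v for w, v in zip(map_weights, volume)) / denom
                  for volume in data]
    # Stage 2: per-voxel temporal regression -> subject spatial map.
    tdenom = sum(t * t for t in timecourse)
    nvox = len(map_weights)
    subject_map = [sum(data[k][v] * timecourse[k]
                       for k in range(len(data))) / tdenom
                   for v in range(nvox)]
    return timecourse, subject_map
```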

  4. Using foreground/background analysis to determine leaf and canopy chemistry

    NASA Technical Reports Server (NTRS)

    Pinzon, J. E.; Ustin, S. L.; Hart, Q. J.; Jacquemoud, S.; Smith, M. O.

    1995-01-01

    Spectral Mixture Analysis (SMA) has become a well established procedure for analyzing imaging spectrometry data; however, the technique is relatively insensitive to minor sources of spectral variation (e.g., discriminating stressed from unstressed vegetation and variations in canopy chemistry). Other statistical approaches have been tried, e.g., stepwise multiple linear regression (SMLR) analysis to predict canopy chemistry. Grossman et al. reported that SMLR is sensitive to measurement error and that predictions of minor chemical components are not independent of patterns observed in more dominant spectral components like water. Further, they observed that the relationships were strongly dependent on the mode of expressing reflectance (R, -log R) and whether chemistry was expressed on a weight (g/g) or area basis (g/sq m). Thus, alternative multivariate techniques need to be examined. Smith et al. reported a revised SMA that they termed Foreground/Background Analysis (FBA) that permits directing the analysis along any axis of variance by identifying vectors through the n-dimensional spectral volume orthonormal to each other. Here, we report an application of the FBA technique for the detection of canopy chemistry using a modified form of the analysis.
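
    One way to read the FBA idea is as projection onto a directed axis of variance; a hypothetical sketch (the actual FBA derives orthonormal vectors through the n-dimensional spectral volume, which this simplification does not reproduce; all names are illustrative):

```python
import math

def fba_scores(spectra, foreground_mean, background_mean):
    """Score each spectrum by its projection onto the unit axis pointing
    from the background mean spectrum toward the foreground mean, so the
    analysis is directed along a chosen axis of variance."""
    axis = [f - b for f, b in zip(foreground_mean, background_mean)]
    norm = math.sqrt(sum(a * a for a in axis))
    unit = [a / norm for a in axis]
    return [sum(s * u for s, u in zip(spec, unit)) for spec in spectra]
```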

  5. Identifying spatially similar gene expression patterns in early stage fruit fly embryo images: binary feature versus invariant moment digital representations

    PubMed Central

    Gurunathan, Rajalakshmi; Van Emden, Bernard; Panchanathan, Sethuraman; Kumar, Sudhir

    2004-01-01

    Background Modern developmental biology relies heavily on the analysis of embryonic gene expression patterns. Investigators manually inspect hundreds or thousands of expression patterns to identify those that are spatially similar and to ultimately infer potential gene interactions. However, the rapid accumulation of gene expression pattern data over the last two decades, facilitated by high-throughput techniques, has produced a need for the development of efficient approaches for direct comparison of images, rather than their textual descriptions, to identify spatially similar expression patterns. Results The effectiveness of the Binary Feature Vector (BFV) and Invariant Moment Vector (IMV) based digital representations of the gene expression patterns in finding biologically meaningful patterns was compared for a small (226 images) and a large (1819 images) dataset. For each dataset, an ordered list of images, with respect to a query image, was generated to identify overlapping and similar gene expression patterns, in a manner comparable to what a developmental biologist might do. The results showed that the BFV representation consistently outperforms the IMV representation in finding biologically meaningful matches when spatial overlap of the gene expression pattern and the genes involved are considered. Furthermore, we explored the value of conducting image-content based searches in a dataset where individual expression components (or domains) of multi-domain expression patterns were also included separately. We found that this technique improves performance of both IMV and BFV based searches. Conclusions We conclude that the BFV representation consistently produces a more extensive and better list of biologically useful patterns than the IMV representation. The high quality of results obtained scales well as the search database becomes larger, which encourages efforts to build automated image query and retrieval systems for spatial gene expression patterns. 
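
    Ranking images against a query BFV requires a similarity score; a sketch using a Jaccard-style overlap (an illustrative choice, since the abstract does not specify the metric used):

```python
def bfv_similarity(a, b):
    """Overlap between two Binary Feature Vectors: shared 'on' positions
    divided by total 'on' positions (Jaccard index). Higher values mean
    more spatially overlapping expression patterns."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return inter / union if union else 1.0
```

    Sorting a dataset by this score against a query vector yields the kind of ordered image list described above.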
PMID:15603586

  6. Forecasting of hourly load by pattern recognition in a small area power system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehdashti-Shahrokh, A.

    1982-01-01

    An intuitive, logical, simple and efficient method of forecasting hourly load in a small area power system is presented. A pattern recognition approach is used in developing the forecasting model. Pattern recognition techniques are powerful tools in the field of artificial intelligence (cybernetics) and simulate the way the human brain operates to make decisions. Pattern recognition is generally used in the analysis of processes where the total physical nature behind the process variation is unknown but specific kinds of measurements explain their behavior. In this research, basic multivariate analyses, in conjunction with pattern recognition techniques, are used to develop a linear deterministic model to forecast hourly load. This method assumes that load patterns in the same geographical area are direct results of climatological changes (weather sensitive load), and have occurred in the past as a result of similar climatic conditions. The algorithm described here searches for the best possible pattern from a seasonal library of load and weather data in forecasting hourly load. To accommodate the unpredictability of weather and the resulting load, the basic twenty-four-hour load pattern was divided into eight three-hour intervals. This division was made to make the model adaptive to sudden climatic changes. The proposed method offers flexible lead times of one to twenty-four hours. Testing on actual data indicated that the proposed method is computationally efficient and highly adaptive, with acceptable data storage requirements and accuracy comparable to many other existing methods.
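
    The library search can be sketched as a nearest-neighbour match on weather vectors (a toy version; the actual model is a linear deterministic forecast built with multivariate analysis, and the library would be seasonal):

```python
def forecast_from_library(weather_today, library):
    """Nearest-pattern forecast: find the historical record whose weather
    vector is closest (squared Euclidean distance) to today's conditions
    and return its recorded load profile."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(library, key=lambda rec: dist(rec["weather"], weather_today))
    return best["load"]
```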

  7. Acoustic tweezers: patterning cells and microparticles using standing surface acoustic waves (SSAW).

    PubMed

    Shi, Jinjie; Ahmed, Daniel; Mao, Xiaole; Lin, Sz-Chin Steven; Lawit, Aitan; Huang, Tony Jun

    2009-10-21

    Here we present an active patterning technique named "acoustic tweezers" that utilizes standing surface acoustic wave (SSAW) to manipulate and pattern cells and microparticles. This technique is capable of patterning cells and microparticles regardless of shape, size, charge or polarity. Its power intensity, approximately 5 x 10^5 times lower than that of optical tweezers, compares favorably with those of other active patterning methods. Flow cytometry studies have revealed it to be non-invasive. The aforementioned advantages, along with this technique's simple design and ability to be miniaturized, render the "acoustic tweezers" technique a promising tool for various applications in biology, chemistry, engineering, and materials science.

  8. Analysis of Spatial Point Patterns in Nuclear Biology

    PubMed Central

    Weston, David J.; Adams, Niall M.; Russell, Richard A.; Stephens, David A.; Freemont, Paul S.

    2012-01-01

    There is considerable interest in cell biology in determining whether, and to what extent, the spatial arrangement of nuclear objects affects nuclear function. A common approach to address this issue involves analyzing a collection of images produced using some form of fluorescence microscopy. We assume that these images have been successfully pre-processed and a spatial point pattern representation of the objects of interest within the nuclear boundary is available. Typically in these scenarios, the number of objects per nucleus is low, which has consequences on the ability of standard analysis procedures to demonstrate the existence of spatial preference in the pattern. There are broadly two common approaches to look for structure in these spatial point patterns. First a spatial point pattern for each image is analyzed individually, or second a simple normalization is performed and the patterns are aggregated. In this paper we demonstrate using synthetic spatial point patterns drawn from predefined point processes how difficult it is to distinguish a pattern from complete spatial randomness using these techniques and hence how easy it is to miss interesting spatial preferences in the arrangement of nuclear objects. The impact of this problem is also illustrated on data related to the configuration of PML nuclear bodies in mammalian fibroblast cells. PMID:22615822
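
    A standard first test against complete spatial randomness (CSR) is the Clark-Evans nearest-neighbour ratio; a sketch that ignores the edge corrections needed for real nuclear boundaries (which, together with the low object counts discussed above, is part of why such patterns are hard to classify):

```python
import math

def clark_evans_ratio(points, area):
    """Clark-Evans index: mean observed nearest-neighbour distance divided
    by its expectation 0.5 / sqrt(density) under CSR. Values near 1 suggest
    CSR, below 1 clustering, above 1 regularity. No edge correction."""
    def nnd(p):
        return min(math.dist(p, q) for q in points if q is not p)
    observed = sum(nnd(p) for p in points) / len(points)
    expected = 0.5 / math.sqrt(len(points) / area)
    return observed / expected
```

    With only a handful of points per nucleus this index is very noisy, which is exactly the regime the paper studies.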

  9. In Vivo Measurement of Glenohumeral Joint Contact Patterns

    NASA Astrophysics Data System (ADS)

    Bey, Michael J.; Kline, Stephanie K.; Zauel, Roger; Kolowich, Patricia A.; Lock, Terrence R.

    2009-12-01

    The objectives of this study were to describe a technique for measuring in-vivo glenohumeral joint contact patterns during dynamic activities and to demonstrate application of this technique. The experimental technique calculated joint contact patterns by combining CT-based 3D bone models with joint motion data that were accurately measured from biplane x-ray images. Joint contact patterns were calculated for the repaired and contralateral shoulders of 20 patients who had undergone rotator cuff repair. Significant differences in joint contact patterns were detected due to abduction angle and shoulder condition (i.e., repaired versus contralateral). Abduction angle had a significant effect on the superior/inferior contact center position, with the average joint contact center of the repaired shoulder 12.1% higher on the glenoid than the contralateral shoulder. This technique provides clinically relevant information by calculating in-vivo joint contact patterns during dynamic conditions and overcomes many limitations associated with conventional techniques for quantifying joint mechanics.

  10. A Computational Approach to Qualitative Analysis in Large Textual Datasets

    PubMed Central

    Evans, Michael S.

    2014-01-01

    In this paper I introduce computational techniques to extend qualitative analysis into the study of large textual datasets. I demonstrate these techniques by using probabilistic topic modeling to analyze a broad sample of 14,952 documents published in major American newspapers from 1980 through 2012. I show how computational data mining techniques can identify and evaluate the significance of qualitatively distinct subjects of discussion across a wide range of public discourse. I also show how examining large textual datasets with computational methods can overcome methodological limitations of conventional qualitative methods, such as how to measure the impact of particular cases on broader discourse, how to validate substantive inferences from small samples of textual data, and how to determine if identified cases are part of a consistent temporal pattern. PMID:24498398

  11. Analysis of Variance in Statistical Image Processing

    NASA Astrophysics Data System (ADS)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.

  12. Physics Notes

    ERIC Educational Resources Information Center

    School Science Review, 1972

    1972-01-01

    Short articles describe the production, photography, and analysis of diffraction patterns using a small laser, a technique for measuring electrical resistance without a standard resistor, a demonstration of a thermocouple effect in a galvanometer with a built-in light source, and a common error in deriving the expression for centripetal force. (AL)

  13. Ecological Effects of Weather Modification: A Problem Analysis.

    ERIC Educational Resources Information Center

    Cooper, Charles F.; Jolly, William C.

    This publication reviews the potential hazards to the environment of weather modification techniques as they eventually become capable of producing large scale weather pattern modifications. Such weather modifications could result in ecological changes which would generally require several years to be fully evident, including the alteration of…

  14. Comparison of sequencing-based methods to profile DNA methylation and identification of monoallelic epigenetic modifications

    USDA-ARS?s Scientific Manuscript database

    Analysis of DNA methylation patterns relies increasingly on sequencing-based profiling methods. The four most frequently used sequencing-based technologies are the bisulfite-based methods MethylC-seq and reduced representation bisulfite sequencing (RRBS), and the enrichment-based techniques methylat...

  15. LANDSAT-4 and LANDSAT-5 Multispectral Scanner Coherent Noise Characterization and Removal

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Alford, William L.

    1988-01-01

A technique is described for characterizing the coherent noise found in LANDSAT-4 and LANDSAT-5 MSS data and a companion technique for filtering out the coherent noise. The techniques are demonstrated on LANDSAT-4 and LANDSAT-5 MSS data sets, and explanations of the noise pattern are suggested in Appendix C. A cookbook procedure for characterizing and filtering the coherent noise using special NASA/Goddard IDIMS functions is included. Also presented are analysis results from the retrofitted LANDSAT-5 MSS sensor, which show that the coherent noise has been substantially reduced.

  16. Planning representation for automated exploratory data analysis

    NASA Astrophysics Data System (ADS)

    St. Amant, Robert; Cohen, Paul R.

    1994-03-01

    Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.

  17. PATTERNS IN BIOMEDICAL DATA-HOW DO WE FIND THEM?

    PubMed

    Basile, Anna O; Verma, Anurag; Byrska-Bishop, Marta; Pendergrass, Sarah A; Darabos, Christian; Lester Kirchner, H

    2017-01-01

    Given the exponential growth of biomedical data, researchers are faced with numerous challenges in extracting and interpreting information from these large, high-dimensional, incomplete, and often noisy data. To facilitate addressing this growing concern, the "Patterns in Biomedical Data-How do we find them?" session of the 2017 Pacific Symposium on Biocomputing (PSB) is devoted to exploring pattern recognition using data-driven approaches for biomedical and precision medicine applications. The papers selected for this session focus on novel machine learning techniques as well as applications of established methods to heterogeneous data. We also feature manuscripts aimed at addressing the current challenges associated with the analysis of biomedical data.

  18. Food purchase patterns: empirical identification and analysis of their association with diet quality, socio-economic factors, and attitudes.

    PubMed

    Thiele, Silke; Peltner, Jonas; Richter, Almut; Mensink, Gert B M

    2017-10-12

Empirically derived food purchase patterns provide information about which combinations of foods were purchased by households. The objective of this study was to identify what kinds of patterns exist, what level of diet quality they represent, and which factors are associated with the patterns. The study made use of representative German consumption data in which approximately 12 million food purchases from 13,125 households are recorded. In accordance with healthy diet criteria, the food purchases were assigned to 18 food groups of the German Food Pyramid. Based on these groups, a factor analysis with a principal component technique was applied to identify food patterns. For these patterns, nutrient and energy densities were examined. Using regression analysis, associations between pattern scores and socio-economic as well as attitude variables, reflecting personal statements about healthy eating, were analyzed. In total, three food purchase patterns could be identified: a natural, a processed, and a traditional one. The first was characterized by a higher purchasing of natural foods, the second by an increased purchasing of processed foods, and the third by a meat-oriented diet. In each pattern there were specific diet quality criteria that could be improved, whereas others were in line with current dietary guidelines. In addition to socio-demographic factors, attitudes were significantly associated with the purchase patterns. The findings of this study are interesting from a public health perspective, as it can be assumed that measures focusing on specific aspects of diet quality are more promising than general ones. However, it is a major challenge to identify the population groups with their specific needs of improvement. As the patterns were associated with both socio-economic and attitude variables, these grouping criteria could be used to define target groups.
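
    The principal component step can be sketched in pure Python via power iteration on the sample covariance matrix; the row-observations below are hypothetical purchase quantities, purely to illustrate how a dominant pattern (here, two co-varying food groups) is extracted.

```python
def first_principal_component(data, iters=200):
    """First principal component of row-observations via power iteration
    on the sample covariance matrix (pure-Python sketch)."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    centered = [[row[j] - means[j] for j in range(p)] for row in data]
    # sample covariance matrix (divisor n - 1)
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

    The returned unit vector loads most heavily on the food groups that vary together — the empirical "pattern" a factor analysis would report.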

  19. Evaluation of crystallographic strain, rotation and defects in functional oxides by the moiré effect in scanning transmission electron microscopy.

    PubMed

    Naden, A B; O'Shea, K J; MacLaren, D A

    2018-04-20

Moiré patterns in scanning transmission electron microscopy (STEM) images of epitaxial perovskite oxides are used to assess strain and defect densities over fields of view extending over several hundred nanometers. The patterns arise from the geometric overlap of the rastered STEM electron beam and the samples' crystal periodicities and we explore the emergence and application of these moiré fringes for rapid strain analysis. Using the epitaxial functional oxide perovskites BiFeO3 and Pr1-xCaxMnO3, we discuss the impact of large degrees of strain on the quantification of STEM moiré patterns, identify defects in the fringe patterns and quantify strain and lattice rotation. Such a wide-area analysis of crystallographic strain and defects is crucial for developing structure-function relations of functional oxides and we find the STEM moiré technique to be an attractive means of structural assessment that can be readily applied to low dose studies of damage sensitive crystalline materials.
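
    The fringe geometry behind STEM moiré analysis follows the standard beat relation between the raster pitch p and the lattice spacing d: the fringe spacing is D = p·d/|p − d|. A small sketch with illustrative numbers (not values from the paper):

```python
def moire_spacing(p, d):
    """Moiré fringe spacing from scan raster pitch p and lattice spacing d
    (same units); fringes diverge as p approaches d."""
    return p * d / abs(p - d)

def lattice_from_moire(p, D):
    """Invert the relation to recover the lattice spacing from a measured
    fringe spacing D, assuming d < p (one of the two possible branches)."""
    return p * D / (D + p)
```

    Because D diverges as p approaches d, a tiny strain-induced change in lattice spacing produces a large, easily measured change in fringe spacing — the magnification effect that makes wide-area strain mapping practical.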

  20. Evaluation of crystallographic strain, rotation and defects in functional oxides by the moiré effect in scanning transmission electron microscopy

    NASA Astrophysics Data System (ADS)

    Naden, A. B.; O'Shea, K. J.; MacLaren, D. A.

    2018-04-01

Moiré patterns in scanning transmission electron microscopy (STEM) images of epitaxial perovskite oxides are used to assess strain and defect densities over fields of view extending over several hundred nanometers. The patterns arise from the geometric overlap of the rastered STEM electron beam and the samples’ crystal periodicities and we explore the emergence and application of these moiré fringes for rapid strain analysis. Using the epitaxial functional oxide perovskites BiFeO3 and Pr1-xCaxMnO3, we discuss the impact of large degrees of strain on the quantification of STEM moiré patterns, identify defects in the fringe patterns and quantify strain and lattice rotation. Such a wide-area analysis of crystallographic strain and defects is crucial for developing structure-function relations of functional oxides and we find the STEM moiré technique to be an attractive means of structural assessment that can be readily applied to low dose studies of damage sensitive crystalline materials.

  1. Emergent 1d Ising Behavior in AN Elementary Cellular Automaton Model

    NASA Astrophysics Data System (ADS)

    Kassebaum, Paul G.; Iannacchione, Germano S.

The fundamental nature of an evolving one-dimensional (1D) Ising model is investigated with an elementary cellular automaton (CA) simulation. The emergent CA simulation employs an ensemble of cells in one spatial dimension, each cell capable of two microstates interacting with simple nearest-neighbor rules and incorporating an external field. The behavior of the CA model provides insight into the dynamics of coupled two-state systems not expressible by exact analytical solutions. For instance, state progression graphs show the causal dynamics of a system through time in relation to the system's entropy. Unique graphical analysis techniques are introduced through difference patterns, diffusion patterns, and state progression graphs of the 1D ensemble, visualizing the evolution. All analyses are consistent with the known behavior of the 1D Ising system. The CA simulation and new pattern recognition techniques are scalable (in dimension, complexity, and size) and have many potential applications such as complex design of materials, control of agent systems, and evolutionary mechanism design.
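
    An elementary CA of the kind described can be simulated in a few lines; the rule-number encoding below is Wolfram's standard scheme, and rule 90 (used in the usage example) is an illustrative choice rather than the paper's Ising-coupled rule.

```python
def eca_step(row, rule):
    """One update of an elementary cellular automaton with periodic
    boundaries; `rule` is the Wolfram rule number, 0-255."""
    n = len(row)
    out = []
    for i in range(n):
        # 3-cell neighborhood encoded as a 3-bit index into the rule table
        neighborhood = (row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n]
        out.append((rule >> neighborhood) & 1)
    return out

def evolve(row, rule, steps):
    """Full time history of the ensemble, for difference/diffusion-pattern
    style analyses of the evolution."""
    history = [row]
    for _ in range(steps):
        row = eca_step(row, rule)
        history.append(row)
    return history
```

    For example, `evolve([0, 0, 0, 0, 1, 0, 0, 0, 0], 90, 2)` spreads a single seed into the familiar Sierpinski-style difference pattern.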

  2. [Analysis of genomic DNA methylation level in radish under cadmium stress by methylation-sensitive amplified polymorphism technique].

    PubMed

    Yang, Jin-Lan; Liu, Li-Wang; Gong, Yi-Qin; Huang, Dan-Qiong; Wang, Feng; He, Ling-Li

    2007-06-01

The level of cytosine methylation induced by cadmium in the radish (Raphanus sativus L.) genome was analysed using the technique of methylation-sensitive amplified polymorphism (MSAP). The MSAP ratios in radish seedlings exposed to cadmium chloride at concentrations of 50, 250 and 500 mg/L were 37%, 43% and 51%, respectively, versus 34% for the control; the full methylation levels (C(m)CGG in double strands) were 23%, 25% and 27%, respectively, versus 22% for the control. The increase in MSAP and full methylation levels indicated that de novo methylation occurred at some 5'-CCGG sites under Cd stress. There was a significant positive correlation between the increase of total DNA methylation level and CdCl2 concentration. Four types of MSAP patterns were identified among the CdCl2 treatments and the control: de novo methylation, de-methylation, an atypical pattern, and no change in methylation pattern. DNA methylation alteration in plants treated with CdCl2 occurred mainly through de novo methylation.

  3. Complexity analysis of human physiological signals based on case studies

    NASA Astrophysics Data System (ADS)

    Angelova, Maia; Holloway, Philip; Ellis, Jason

    2015-04-01

    This work focuses on methods for investigation of physiological time series based on complexity analysis. It is a part of a wider programme to determine non-invasive markers for healthy ageing. We consider two case studies investigated with actigraphy: (a) sleep and alternations with insomnia, and (b) ageing effects on mobility patterns. We illustrate, using these case studies, the application of fractal analysis to the investigation of regulation patterns and control, and change of physiological function. In the first case study, fractal analysis techniques were implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia in comparison with healthy controls. The aim was to investigate if complexity analysis can detect the onset of adverse health-related events. The subjects with acute insomnia displayed significantly higher levels of complexity, possibly a result of too much activity in the underlying regulatory systems. The second case study considered mobility patterns during night time and their variations with age. It showed that complexity metrics can identify change in physiological function with ageing. Both studies demonstrated that complexity analysis can be used to investigate markers of health, disease and healthy ageing.
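
    One common fractal technique for actigraphy series of this kind is detrended fluctuation analysis (DFA). The sketch below is a minimal pure-Python version, assuming non-overlapping windows and linear detrending; the scaling exponent is the slope of log F(n) versus log n.

```python
def dfa(x, scales):
    """Detrended fluctuation analysis sketch: RMS fluctuation F(n) of the
    integrated series around a per-window linear trend, for each scale n."""
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:                      # integrate the mean-subtracted series
        s += v - mean
        y.append(s)
    out = []
    for n in scales:
        sq, m = 0.0, 0
        for start in range(0, len(y) - n + 1, n):
            w = y[start:start + n]
            t = list(range(n))
            # least-squares line fit within the window
            tb, wb = (n - 1) / 2, sum(w) / n
            denom = sum((ti - tb) ** 2 for ti in t)
            slope = sum((ti - tb) * (wi - wb) for ti, wi in zip(t, w)) / denom
            sq += sum((wi - (wb + slope * (ti - tb))) ** 2 for ti, wi in zip(t, w))
            m += n
        out.append((sq / m) ** 0.5)
    return out
```

    How fast F(n) grows with n separates uncorrelated, long-range-correlated, and trend-dominated activity records — the kind of complexity marker the study uses to distinguish groups.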

  4. Infrastructure stability surveillance with high resolution InSAR

    NASA Astrophysics Data System (ADS)

    Balz, Timo; Düring, Ralf

    2017-02-01

The construction of new infrastructure in largely unknown and difficult environments, as is necessary for the New Silk Road, can decrease stability along the construction site, increasing the risk of landslides and of deformation caused by surface motion. This generally requires a thorough pre-analysis and consecutive surveillance of the deformation patterns to ensure the stability and safety of the infrastructure projects. Interferometric SAR (InSAR) and the derived techniques of multi-baseline InSAR are very powerful tools for large-area observation of surface deformation patterns. With InSAR and derived techniques, the topographic height and the surface motion can be estimated over large areas, making it an ideal tool for supporting the planning, construction, and safety surveillance of new infrastructure elements in remote areas.

  5. Identification of Legionella Species by Random Amplified Polymorphic DNA Profiles

    PubMed Central

    Lo Presti, François; Riffard, Serge; Vandenesch, François; Etienne, Jerome

    1998-01-01

    Random amplified polymorphic DNA (RAPD) was used for the identification of Legionella species. Primer SK2 (5′-CGGCGGCGGCGG-3′) and standardized RAPD conditions gave the technique a reproducibility of 93 to 100%, depending on the species tested. Species-specific patterns corresponding to the 42 Legionella species were consequently defined by this method; the patterns were dependent on the recognition of a core of common bands for each species. This specificity was demonstrated by testing 65 type strains and 265 environmental and clinical isolates. No serogroup-specific profiles were obtained. A number of unidentified Legionella isolates potentially corresponding to new species were clustered in four groups. RAPD analysis appears to be a rapid and reproducible technique for identification of Legionella isolates to the species level without further restriction or hybridization. PMID:9774564
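
    Species-specific band patterns of this kind can be compared with a simple band-sharing (Dice) coefficient. The reference profiles below are hypothetical fragment-size sets, purely to illustrate the matching idea, not actual Legionella RAPD data.

```python
def dice_similarity(bands_a, bands_b):
    """Dice coefficient between two band profiles, each a set of band
    positions (e.g., fragment sizes in base pairs)."""
    shared = len(bands_a & bands_b)
    return 2 * shared / (len(bands_a) + len(bands_b))

def identify(profile, references):
    """Assign an isolate to the species whose reference profile shares
    the most bands with it."""
    return max(references, key=lambda sp: dice_similarity(profile, references[sp]))
```

    In practice the "core of common bands for each species" mentioned in the abstract plays the role of the reference profile here.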

  6. Applying Jlint to Space Exploration Software

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus

    2004-01-01

Java is a very successful programming language which is also becoming widespread in embedded systems, where software correctness is critical. Jlint is a simple but highly efficient static analyzer that checks a Java program for several common errors, such as null pointer exceptions and overflow errors. It also includes checks for multi-threading problems, such as deadlocks and data races. The case study described here shows the effectiveness of Jlint in finding such errors; an analysis of the false positives in the multi-threading warnings gives an insight into design patterns commonly used in multi-threaded code. The results show that a few analysis techniques are sufficient to avoid almost all false positives. These techniques include investigating all possible callers and a few code idioms. Verifying the correct application of these patterns is still crucial, because their correct usage is not trivial.

  7. An evaluation of object-oriented image analysis techniques to identify motorized vehicle effects in semi-arid to arid ecosystems of the American West

    USGS Publications Warehouse

    Mladinich, C.

    2010-01-01

Human disturbance is a leading ecosystem stressor. Human-induced modifications include transportation networks, areal disturbances due to resource extraction, and recreation activities. High-resolution imagery and object-oriented classification rather than pixel-based techniques have successfully identified roads, buildings, and other anthropogenic features. Three commercial, automated feature-extraction software packages (Visual Learning Systems' Feature Analyst, ENVI Feature Extraction, and Definiens Developer) were evaluated by comparing their ability to effectively detect the disturbed surface patterns from motorized vehicle traffic. Each package achieved overall accuracies in the 70% range, demonstrating the potential to map the surface patterns. The Definiens classification was more consistent and statistically valid. Copyright © 2010 by Bellwether Publishing, Ltd. All rights reserved.

  8. Chronic obstructive pulmonary disease: quantitative and visual ventilation pattern analysis at xenon ventilation CT performed by using a dual-energy technique.

    PubMed

    Park, Eun-Ah; Goo, Jin Mo; Park, Sang Joon; Lee, Hyun Ju; Lee, Chang Hyun; Park, Chang Min; Yoo, Chul-Gyu; Kim, Jong Hyo

    2010-09-01

    To evaluate the potential of xenon ventilation computed tomography (CT) in the quantitative and visual analysis of chronic obstructive pulmonary disease (COPD). This study was approved by the institutional review board. After informed consent was obtained, 32 patients with COPD underwent CT performed before the administration of xenon, two-phase xenon ventilation CT with wash-in (WI) and wash-out (WO) periods, and pulmonary function testing (PFT). For quantitative analysis, results of PFT were compared with attenuation parameters from prexenon images and xenon parameters from xenon-enhanced images in the following three areas at each phase: whole lung, lung with normal attenuation, and low-attenuating lung (LAL). For visual analysis, ventilation patterns were categorized according to the pattern of xenon attenuation in the area of structural abnormalities compared with that in the normal-looking background on a per-lobe basis: pattern A consisted of isoattenuation or high attenuation in the WI period and isoattenuation in the WO period; pattern B, isoattenuation or high attenuation in the WI period and high attenuation in the WO period; pattern C, low attenuation in both the WI and WO periods; and pattern D, low attenuation in the WI period and isoattenuation or high attenuation in the WO period. Among various attenuation and xenon parameters, xenon parameters of the LAL in the WO period showed the best inverse correlation with results of PFT (P < .0001). At visual analysis, while emphysema (which affected 99 lobes) commonly showed pattern A or B, airway diseases such as obstructive bronchiolitis (n = 5) and bronchiectasis (n = 2) and areas with a mucus plug (n = 1) or centrilobular nodules (n = 5) showed pattern D or C. WI and WO xenon ventilation CT is feasible for the simultaneous regional evaluation of structural and ventilation abnormalities both quantitatively and qualitatively in patients with COPD. (c) RSNA, 2010.
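
    The per-lobe visual categories described above follow a simple decision rule, which can be sketched as follows. The 'low'/'iso'/'high' labels are the abstract's attenuation categories relative to the normal-looking background; the function itself is a hypothetical illustration, not the authors' software.

```python
def ventilation_pattern(wi, wo):
    """Classify a lobe's xenon ventilation pattern from its attenuation
    ('low', 'iso', or 'high') in the wash-in (wi) and wash-out (wo) periods,
    following the four categories defined in the abstract."""
    if wi in ("iso", "high"):
        if wo == "iso":
            return "A"           # normal wash-in and wash-out
        if wo == "high":
            return "B"           # xenon retained during wash-out
    if wi == "low":
        return "C" if wo == "low" else "D"   # D: delayed filling
    return None                  # combination not defined in the abstract
```

    As the abstract notes, emphysema typically yielded pattern A or B, while airway disease, mucus plugs, and centrilobular nodules yielded pattern C or D.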

  9. Exploratory wavelet analysis of dengue seasonal patterns in Colombia.

    PubMed

    Fernández-Niño, Julián Alfredo; Cárdenas-Cárdenas, Luz Mery; Hernández-Ávila, Juan Eugenio; Palacio-Mejía, Lina Sofía; Castañeda-Orjuela, Carlos Andrés

    2015-12-04

Dengue has a seasonal behavior associated with climatic changes, vector cycles, circulating serotypes, and population dynamics. The wavelet analysis makes it possible to separate a very long time series into calendar time and periods. This is the first time this technique is used in an exploratory manner to model the behavior of dengue in Colombia. Objective: To explore the annual seasonal dengue patterns in Colombia and in its five most endemic municipalities for the period 2007 to 2012, and multiannual cycles between 1978 and 2013 at the national level. Methods: We made an exploratory wavelet analysis using data from all incident cases of dengue per epidemiological week for the period 2007 to 2012, and per year for 1978 to 2013. We used a first-order autoregressive model as the null hypothesis. Results: The effect of the 2010 epidemic was evident in both the national time series and the series for the five municipalities. Differences in interannual seasonal patterns were observed among municipalities. In addition, we identified multiannual cycles of 2 to 5 years since 2004 at a national level. Conclusions: Wavelet analysis is useful to study a long time series containing changing seasonal patterns, as is the case of dengue in Colombia, and to identify differences among regions. These patterns need to be explored at smaller aggregate levels, and their relationships with different predictive variables need to be investigated.
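
    Morlet wavelets are the usual choice for this kind of time-frequency seasonal analysis. As a self-contained stand-in, a one-dimensional Haar transform illustrates the core idea of separating a series into scales; the input series is illustrative, not dengue data.

```python
def haar_transform(x):
    """One full discrete Haar wavelet transform (length must be a power
    of two): returns the overall mean followed by detail coefficients
    ordered coarse to fine."""
    coeffs = []
    while len(x) > 1:
        approx = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]   # smooth
        detail = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]   # fluctuation
        coeffs = detail + coeffs
        x = approx
    return x + coeffs
```

    The coarse coefficients capture long-period behavior (such as multiannual epidemic cycles) while the fine coefficients capture short-period fluctuation — the same separation a continuous wavelet spectrum provides per calendar date.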

  10. Optically Remote Noncontact Heart Rates Sensing Technique

    NASA Astrophysics Data System (ADS)

    Thongkongoum, W.; Boonduang, S.; Limsuwan, P.

    2017-09-01

Heart rate monitoring via an optically remote, noncontact technique is reported in this research. A green laser (5 mW, 532±10 nm) was projected onto the left carotid artery. The laser light reflected onto a screen carried the deviation of the interference patterns. The interference patterns were recorded by a digital camera, and the recorded videos were analysed frame by frame with two standard digital image processing (DIP) techniques: block matching (BM) and optical flow (OF). The region of interest (ROI) pixels within the interference patterns were analysed for periodic changes due to the heart's pumping action. The results of both the BM and OF techniques were compared with a reference medical heart rate monitor, a contact device based on the pulse transit technique. The heart rate obtained from the BM technique was 74.67 bpm (beats per minute) and from the OF technique 75.95 bpm. Compared with the reference value of 75.43±1 bpm, the errors were found to be 1.01% and 0.69%, respectively.
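
    The block matching (BM) step can be sketched as an exhaustive sum-of-absolute-differences search; the frame data and search window below are illustrative, not the authors' implementation.

```python
def best_match(prev, curr, top, left, size, search):
    """Exhaustive block matching: find the (dy, dx) displacement of the
    size x size block at (top, left) in `prev` that minimises the sum of
    absolute differences (SAD) in `curr`."""
    h, w = len(curr), len(curr[0])
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > h or x + size > w:
                continue  # candidate block would leave the frame
            sad = sum(abs(prev[top + i][left + j] - curr[y + i][x + j])
                      for i in range(size) for j in range(size))
            if best is None or sad < best[0]:
                best = (sad, dy, dx)
    return best[1], best[2]
```

    Tracking the frame-to-frame displacement of a speckle region in this way yields a periodic signal whose dominant frequency is the heart rate.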

  11. RS- and GIS-based study on landscape pattern change in the Poyang Lake wetland area, China

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoling; Li, Hui; Bao, Shuming; Wu, Zhongyi; Fu, Weijuan; Cai, Xiaobin; Zhao, Hongmei; Guo, Peng

    2006-10-01

As wetland has been recognized as an important component of the ecosystem, it has received ever-increasing attention worldwide. The Poyang Lake wetlands, internationally recognized wetlands and the largest bird habitat in Asia, play an important role in biodiversity and ecological protection. However, with rapid economic growth and urbanization, landscape patterns in the wetlands have changed dramatically in the past three decades. To better understand the wetland landscape dynamics, remote sensing, geographic information system technologies, and the FRAGSTATS landscape analysis program were used to measure landscape patterns, and a statistical approach was employed to illustrate the driving forces. In this study, Landsat images (TM and ETM+) from 1989 and 2000 were acquired for the wetland area. The landscapes in the wetland area were classified as agricultural land, urban, wetland, forest, grassland, unused land, and water body using a combination of supervised and unsupervised classification techniques integrated with a Digital Elevation Model (DEM). Landscape indices, which are popular for the quantitative analysis of landscape pattern, were then employed to analyze the landscape pattern changes between the two dates in a GIS. From this analysis an understanding of the spatial-temporal patterns of landscape evolution was generated. The results show that wetland area was reduced while fragmentation increased over the study period. Further investigation examined the relationship between landscape metrics and other parameters, such as urbanization, to address the driving forces for those changes. Buffer analysis centered on urban areas was conducted in a GIS to study the impact of human-induced activities on landscape pattern dynamics. It was found that the selected parameters were significantly correlated with the landscape metrics, which may well indicate the impact of human-induced activities on the wetland landscape pattern dynamics and account for the driving forces.
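
    A basic fragmentation metric of the kind FRAGSTATS computes — the number of patches per class in a classified raster — can be sketched as follows (the toy raster and class labels are illustrative):

```python
def count_patches(grid, cls):
    """Number of contiguous patches of class `cls` in a classified raster
    (4-neighbour connectivity) -- a basic fragmentation metric."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    patches = 0
    for i in range(h):
        for j in range(w):
            if grid[i][j] == cls and not seen[i][j]:
                patches += 1
                stack = [(i, j)]  # flood-fill this patch
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and grid[y][x] == cls and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return patches
```

    Rising patch counts for a fixed total class area are one signal of the increased fragmentation the study reports for the wetland class.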

  12. Assessing Footwear Effects from Principal Features of Plantar Loading during Running.

    PubMed

    Trudeau, Matthieu B; von Tscharner, Vinzenz; Vienneau, Jordyn; Hoerzer, Stefan; Nigg, Benno M

    2015-09-01

The effects of footwear on the musculoskeletal system are commonly assessed by interpreting the resultant force at the foot during the stance phase of running. However, this approach overlooks loading patterns across the entire foot. An alternative technique for assessing foot loading across different footwear conditions is possible using comprehensive analysis tools that extract different foot loading features, thus enhancing the functional interpretation of the differences across interventions. The purpose of this article was to use pattern recognition techniques to develop and use a novel comprehensive method for assessing the effects of different footwear interventions on plantar loading. A principal component analysis was used to extract different loading features from the stance phase of running, and a support vector machine (SVM) was used to determine whether and how these loading features were different across three shoe conditions. The results revealed distinct loading features at the foot during the stance phase of running. The loading features determined from the principal component analysis allowed successful classification of all three shoe conditions using the SVM. Several differences were found in the location and timing of the loading across each pairwise shoe comparison using the output from the SVM. The analysis approach proposed can successfully be used to compare different loading patterns with a much greater resolution than has been reported previously. This study has several important practical implications; for example, had the classification across shoe conditions not been significant, there would be little basis for a user to select one shoe over another, or for a manufacturer to alter a shoe's construction.

  13. Direct nano-patterning of graphene with helium ion beams

    NASA Astrophysics Data System (ADS)

    Naitou, Y.; Iijima, T.; Ogawa, S.

    2015-01-01

Helium ion microscopy (HIM) was used for direct nano-patterning of single-layer graphene (SLG) on SiO2/Si substrates. This technique involves irradiation of the sample with accelerated helium ions (He+). Doses of 2.0 × 10^16 He+ cm^-2 from a 30 kV beam induced a metal-insulator transition in the SLG. The resolution of HIM patterning on SLG was investigated by fabricating nanoribbons and nanostructures. Analysis of scanning capacitance microscopy measurements revealed that the spatial resolution of HIM patterning depended on the dosage of He+ in a non-monotonic fashion. Increasing the dose from 2.0 × 10^16 to 5.0 × 10^16 He+ cm^-2 improved the spatial resolution to several tens of nanometers. However, doses greater than 1.0 × 10^17 He+ cm^-2 degraded the patterning characteristics. Direct patterning using HIM is a versatile approach to graphene fabrication and can be applied to graphene-based devices.

  14. Health management and pattern analysis of daily living activities of people with dementia using in-home sensors and machine learning techniques

    PubMed Central

    Markides, Andreas; Skillman, Severin; Acton, Sahr Thomas; Elsaleh, Tarek; Hassanpour, Masoud; Ahrabian, Alireza; Kenny, Mark; Klein, Stuart; Rostill, Helen; Nilforooshan, Ramin; Barnaghi, Payam

    2018-01-01

    The number of people diagnosed with dementia is expected to rise in the coming years. Given that there is currently no definite cure for dementia and the cost of care for this condition soars dramatically, slowing the decline and maintaining independent living are important goals for supporting people with dementia. This paper discusses a study that is called Technology Integrated Health Management (TIHM). TIHM is a technology assisted monitoring system that uses Internet of Things (IoT) enabled solutions for continuous monitoring of people with dementia in their own homes. We have developed machine learning algorithms to analyse the correlation between environmental data collected by IoT technologies in TIHM in order to monitor and facilitate the physical well-being of people with dementia. The algorithms are developed with different temporal granularity to process the data for long-term and short-term analysis. We extract higher-level activity patterns which are then used to detect any change in patients’ routines. We have also developed a hierarchical information fusion approach for detecting agitation, irritability and aggression. We have conducted evaluations using sensory data collected from homes of people with dementia. The proposed techniques are able to recognise agitation and unusual patterns with an accuracy of up to 80%. PMID:29723236

  15. Health management and pattern analysis of daily living activities of people with dementia using in-home sensors and machine learning techniques.

    PubMed

    Enshaeifar, Shirin; Zoha, Ahmed; Markides, Andreas; Skillman, Severin; Acton, Sahr Thomas; Elsaleh, Tarek; Hassanpour, Masoud; Ahrabian, Alireza; Kenny, Mark; Klein, Stuart; Rostill, Helen; Nilforooshan, Ramin; Barnaghi, Payam

    2018-01-01

    The number of people diagnosed with dementia is expected to rise in the coming years. Given that there is currently no definite cure for dementia and the cost of care for this condition soars dramatically, slowing the decline and maintaining independent living are important goals for supporting people with dementia. This paper discusses a study that is called Technology Integrated Health Management (TIHM). TIHM is a technology assisted monitoring system that uses Internet of Things (IoT) enabled solutions for continuous monitoring of people with dementia in their own homes. We have developed machine learning algorithms to analyse the correlation between environmental data collected by IoT technologies in TIHM in order to monitor and facilitate the physical well-being of people with dementia. The algorithms are developed with different temporal granularity to process the data for long-term and short-term analysis. We extract higher-level activity patterns which are then used to detect any change in patients' routines. We have also developed a hierarchical information fusion approach for detecting agitation, irritability and aggression. We have conducted evaluations using sensory data collected from homes of people with dementia. The proposed techniques are able to recognise agitation and unusual patterns with an accuracy of up to 80%.
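
    A minimal sketch of one ingredient of such routine monitoring: comparing a day's activity profile against a baseline with a total-variation distance. The hourly-count profiles and any alert threshold are illustrative; the paper's actual algorithms (multi-granularity learning, hierarchical fusion) are more elaborate.

```python
def routine_change(baseline, today):
    """Total-variation distance between two daily activity profiles
    (e.g., sensor event counts per hour), as a simple change score in [0, 1]."""
    def normalise(counts):
        total = sum(counts)
        return [c / total for c in counts]
    p, q = normalise(baseline), normalise(today)
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))
```

    A score of 0 means today's routine matches the baseline distribution exactly; a score near 1 means activity has shifted to entirely different hours, the kind of unusual pattern the system flags for review.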

  16. Spatial patterns of giant sequoia (Sequoiadendron giganteum) in two sequoia groves in Sequoia National Park, California

    USGS Publications Warehouse

    Stohlgren, Thomas J.

    1993-01-01

    Although Muir Grove and Castle Creek Grove are similar in area, elevation, and number of giant sequoias, various spatial pattern analysis techniques showed that they had dissimilar spatial patterns for similar-sized trees. Two-dimensional and transect two-term local quadrat variance techniques detected general trends in the spatial patterns of different-sized trees, detected multiple-scale patterns within individual size classes, and provided information on the scale and intensity of patches of individual size classes of trees in Muir and Castle Creek groves. In Muir Grove, midsized sequoias (1.5 to 2.4 m DBH classes) had major pattern scales 350–450 m in diameter, whereas the same-sized trees in Castle Creek Grove had pattern scales >1000 m in diameter. Many size classes of trees had minor patches superimposed on larger scale patterns in both groves. There may be different recruitment patterns in core (i.e., central) areas compared with peripheral areas of sequoia groves; core areas of both groves had more small live sequoias and dead sequoias than peripheral areas of the groves. Higher densities of sequoias and, perhaps, more rapid turnover of individuals in core areas may indicate (i) differences in disturbance histories and favorability of microsites in the core and peripheral areas of groves; (ii) different responses to disturbance due to shifts in the species composition of the stand and thus, the relative influences of intra- to inter-specific competition; or (iii) slower growth or lower survivorship rates in marginal habitat (i.e., peripheral areas).
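
    The two-term local quadrat variance (TTLQV) used above can be sketched for a one-dimensional transect of quadrat counts as follows (the toy transect is illustrative, not grove data):

```python
def ttlqv(x, b):
    """Two-term local quadrat variance at block size b for a transect of
    quadrat counts x; the variance peaks near the dominant patch scale."""
    n = len(x)
    terms = [
        (sum(x[i:i + b]) - sum(x[i + b:i + 2 * b])) ** 2
        for i in range(n - 2 * b + 1)
    ]
    return sum(terms) / (2 * b * len(terms))
```

    Plotting ttlqv(x, b) against b and reading off the peaks gives the scale and intensity of patchiness, which is how pattern scales such as the 350-450 m patches in Muir Grove are detected.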

  17. Analysis and synthesis of abstract data types through generalization from examples

    NASA Technical Reports Server (NTRS)

    Wild, Christian

    1987-01-01

    The discovery of general patterns of behavior from a set of input/output examples can be a useful technique in the automated analysis and synthesis of software systems. These generalized descriptions of the behavior form a set of assertions which can be used for validation, program synthesis, program testing and run-time monitoring. Describing the behavior is characterized as a learning process in which general patterns can be easily characterized. The learning algorithm must choose a transform function and define a subset of the transform space which is related to equivalence classes of behavior in the original domain. An algorithm for analyzing the behavior of abstract data types is presented and several examples are given. The use of the analysis for purposes of program synthesis is also discussed.

  18. Introduction to Social Network Analysis

    NASA Astrophysics Data System (ADS)

    Zaphiris, Panayiotis; Ang, Chee Siang

    Social network analysis focuses on patterns of relations between and among people, organizations, states, etc. It aims to describe networks of relations as fully as possible, identify prominent patterns in such networks, trace the flow of information through them, and discover what effects these relations and networks have on people and organizations. Social network analysis offers very promising potential for analyzing human-human interactions in online communities (discussion boards, newsgroups, virtual organizations). This tutorial provides an overview of this analytic technique and demonstrates how it can be used in Human Computer Interaction (HCI) research and practice, focusing especially on Computer Mediated Communication (CMC). The topic acquires particular importance these days, with the increasing popularity of social networking websites (e.g., YouTube, MySpace, MMORPGs) and the research interest in studying them.
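As a minimal illustration of the kind of measure social network analysis provides, degree centrality scores each actor by the fraction of the other actors it is directly tied to. This sketch (names and data hypothetical) computes it for a small reply network from a discussion board:

```python
from collections import Counter

def degree_centrality(edges, n):
    """Normalised degree centrality for an undirected network of n actors:
    the fraction of the other n-1 actors each actor is directly tied to."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return {i: deg[i] / (n - 1) for i in range(n)}
```

In a four-member "star" where member 0 has exchanged replies with everyone, member 0 scores 1.0 and each peripheral member scores 1/3, identifying member 0 as the prominent actor.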

  19. Spatio-Temporal Patterns of Barmah Forest Virus Disease in Queensland, Australia

    PubMed Central

    Naish, Suchithra; Hu, Wenbiao; Mengersen, Kerrie; Tong, Shilu

    2011-01-01

    Background Barmah Forest virus (BFV) disease is a common and widespread mosquito-borne disease in Australia. This study investigated the spatio-temporal patterns of BFV disease in Queensland, Australia using geographical information system (GIS) tools and geostatistical analysis. Methods/Principal Findings We calculated the incidence rates and standardised incidence rates of BFV disease. Moran's I statistic was used to assess the spatial autocorrelation of BFV incidence. The spatial dynamics of BFV disease were examined using semi-variogram analysis. Interpolation techniques were applied to visualise and display the spatial distribution of BFV disease in statistical local areas (SLAs) throughout Queensland. Mapping of BFV disease by SLAs reveals the presence of substantial spatio-temporal variation over time. Statistically significant differences in BFV incidence rates were identified among age groups (χ2 = 7587, df = 7327, p < 0.01). There was a significant positive spatial autocorrelation of BFV incidence for all four periods, with the Moran's I statistic ranging from 0.1506 to 0.2901 (p < 0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across the state. Conclusions/Significance This is the first study to examine spatial and temporal variation in the incidence rates of BFV disease across Queensland using GIS and geostatistics. BFV transmission varied with age and gender, which may be due to exposure rates or behavioural risk factors. There are differences in the spatio-temporal patterns of BFV disease which may be related to local socio-ecological and environmental factors. These research findings may have implications for BFV disease control and prevention programs in Queensland. PMID:22022430
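Moran's I, used above to assess spatial autocorrelation of BFV incidence, can be computed directly from an attribute vector and a spatial weights matrix. This is a generic sketch with invented data, not the study's GIS pipeline:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: > 0 for spatially clustered values, < 0 for
    dispersed values, ~0 under spatial randomness.
    weights[i, j] > 0 indicates that areas i and j are neighbours."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()
    return x.size * np.sum(w * np.outer(z, z)) / (w.sum() * np.sum(z ** 2))
```

For four areas in a chain, clustered incidence `[10, 9, 1, 2]` yields a positive I, while perfectly alternating incidence `[10, 1, 10, 1]` yields I = -1.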

  20. Persistent Scatterer Interferometry analysis of ground deformation in the Po Plain (Piacenza-Reggio Emilia sector, Northern Italy): seismo-tectonic implications

    NASA Astrophysics Data System (ADS)

    Antonielli, Benedetta; Monserrat, Oriol; Bonini, Marco; Cenni, Nicola; Devanthéry, Núria; Righini, Gaia; Sani, Federico

    2016-08-01

    This work aims to explore the ongoing tectonic activity of structures in the outermost sector of the Northern Apennines, which represents the active leading edge of the thrust belt and is dominated by compressive deformation. We have applied the Persistent Scatterer Interferometry (PSI) technique to obtain new insights into the present-day deformation pattern of the frontal area of the Northern Apennines. PSI has proved to be effective in detecting surface deformation over wide regions affected by slow tectonic movements. We used 34 Envisat images in descending geometry over the period between 2004 and 2010, generating about 300 interferometric pairs. The analysis of the velocity maps and of the PSI time-series has allowed us to observe ground deformation over the sector of the Po Plain between Piacenza and Reggio Emilia. The time-series of permanent GPS stations located in the study area validated the results of the PSI technique, showing a good correlation with the PS time-series. The PS analysis reveals the occurrence of a well-known subsidence area on the rear of the Ferrara arc, mostly connected to the exploitation of water resources. In some instances, the PS velocity pattern reveals ground uplift (with mean velocities ranging from 1 to 2.8 mm yr-1) above active thrust-related anticlines of the Emilia and Ferrara folds, and part of the Pede-Apennine margin. We hypothesize a correlation between the observed uplift pattern and the growth of the thrust-related anticlines. As the uplift pattern corresponds to known geological features, it can be used to constrain the seismo-tectonic setting; a working hypothesis is that the active Emilia and Ferrara thrust folds are characterized by interseismic periods possibly dominated by aseismic creep.
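The mean PS velocities reported above come from displacement time-series. A minimal sketch of extracting a mean line-of-sight velocity from one such series is a least-squares linear fit; the sampling and the 2.8 mm/yr rate below are illustrative, not the study's actual data:

```python
import numpy as np

def mean_los_velocity(t_years, disp_mm):
    """Slope of the least-squares line through a displacement time-series,
    i.e. the mean line-of-sight velocity in mm/yr."""
    t = np.asarray(t_years, dtype=float)
    A = np.column_stack([t, np.ones_like(t)])
    (v, _), *_ = np.linalg.lstsq(A, np.asarray(disp_mm, dtype=float), rcond=None)
    return v
```

With 34 synthetic acquisitions over a six-year span and a noise-free 2.8 mm/yr trend, the fit recovers the rate exactly.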

  1. Mining Longitudinal Web Queries: Trends and Patterns.

    ERIC Educational Resources Information Center

    Wang, Peiling; Berry, Michael W.; Yang, Yiheng

    2003-01-01

    Analyzed user queries submitted to an academic Web site during a four-year period, using a relational database, to examine users' query behavior, to identify problems they encounter, and to develop techniques for optimizing query analysis and mining. Linguistic analyses focus on query structures, lexicon, and word associations using statistical…

  2. Microwave Diffraction Techniques from Macroscopic Crystal Models

    ERIC Educational Resources Information Center

    Murray, William Henry

    1974-01-01

    Discusses the construction of a diffractometer table and four microwave models which are built of styrofoam balls with implanted metallic reflecting spheres and designed to simulate the structures of carbon (graphite structure), sodium chloride, tin oxide, and palladium oxide. Included are samples of Bragg patterns and computer-analysis results.…

  3. Dissatisfaction Theory in the 21st Century

    ERIC Educational Resources Information Center

    Adler, Louise

    2010-01-01

    This case study uses two theoretical lenses to analyze political events in a Southern California school district: dissatisfaction theory and groupthink. The case study technique of pattern matching was used to frame the analysis (Yin, 2009). Data for 1992-2008 was gathered from interviews, the Orange County Registrar of Voters, newspapers,…

  4. Economic and Demographic Factors Impacting Placement of Students with Autism

    ERIC Educational Resources Information Center

    Kurth, Jennifer A.; Mastergeorge, Ann M.; Paschall, Katherine

    2016-01-01

    Educational placement of students with autism is often associated with child factors, such as IQ and communication skills. However, variability in placement patterns across states suggests that other factors are at play. This study used hierarchical cluster analysis techniques to identify demographic, economic, and educational covariates…

  5. Testing Evolutionary Hypotheses in the Classroom with MacClade Software.

    ERIC Educational Resources Information Center

    Codella, Sylvio G.

    2002-01-01

    Introduces MacClade which is a Macintosh-based software package that uses the techniques of cladistic analysis to explore evolutionary patterns. Describes a novel and effective exercise that allows undergraduate biology majors to test a hypothesis about behavioral evolution in insects. (Contains 13 references.) (Author/YDS)

  6. Improvement of sub-20nm pattern quality with dose modulation technique for NIL template production

    NASA Astrophysics Data System (ADS)

    Yagawa, Keisuke; Ugajin, Kunihiro; Suenaga, Machiko; Kanamitsu, Shingo; Motokawa, Takeharu; Hagihara, Kazuki; Arisawa, Yukiyasu; Kobayashi, Sachiko; Saito, Masato; Ito, Masamitsu

    2016-04-01

    Nanoimprint lithography (NIL) technology is in the spotlight as a next-generation semiconductor manufacturing technique for integrated circuits at 22 nm and beyond. NIL is the unmagnified lithography technique using template which is replicated from master templates. On the other hand, master templates are currently fabricated by electron-beam (EB) lithography[1]. In near future, finer patterns less than 15nm will be required on master template and EB data volume increases exponentially. So, we confront with a difficult challenge. A higher resolution EB mask writer and a high performance fabrication process will be required. In our previous study, we investigated a potential of photomask fabrication process for finer patterning and achieved 15.5nm line and space (L/S) pattern on template by using VSB (Variable Shaped Beam) type EB mask writer and chemically amplified resist. In contrast, we found that a contrast loss by backscattering decreases the performance of finer patterning. For semiconductor devices manufacturing, we must fabricate complicated patterns which includes high and low density simultaneously except for consecutive L/S pattern. Then it's quite important to develop a technique to make various size or coverage patterns all at once. In this study, a small feature pattern was experimentally formed on master template with dose modulation technique. This technique makes it possible to apply the appropriate exposure dose for each pattern size. As a result, we succeed to improve the performance of finer patterning in bright field area. These results show that the performance of current EB lithography process have a potential to fabricate NIL template.

  7. Event Networks and the Identification of Crime Pattern Motifs

    PubMed Central

    2015-01-01

    In this paper we demonstrate the use of network analysis to characterise patterns of clustering in spatio-temporal events. Such clustering is of both theoretical and practical importance in the study of crime, and forms the basis for a number of preventative strategies. However, existing analytical methods show only that clustering is present in data, while offering little insight into the nature of the patterns present. Here, we show how the classification of pairs of events as close in space and time can be used to define a network, thereby generalising previous approaches. The application of graph-theoretic techniques to these networks can then offer significantly deeper insight into the structure of the data than previously possible. In particular, we focus on the identification of network motifs, which have clear interpretation in terms of spatio-temporal behaviour. Statistical analysis is complicated by the nature of the underlying data, and we provide a method by which appropriate randomised graphs can be generated. Two datasets are used as case studies: maritime piracy at the global scale, and residential burglary in an urban area. In both cases, the same significant 3-vertex motif is found; this result suggests that incidents tend to occur not just in pairs, but in fact in larger groups within a restricted spatio-temporal domain. In the 4-vertex case, different motifs are found to be significant in each case, suggesting that this technique is capable of discriminating between clustering patterns at a finer granularity than previously possible. PMID:26605544
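The construction described above, linking events that are close in both space and time and then counting motifs, can be sketched as follows; the distance and time thresholds and the event data are invented for illustration:

```python
from itertools import combinations

def event_network(events, d_max, t_max):
    """Build an undirected network over (x, y, t) events: an edge joins two
    events that are close in both space (<= d_max) and time (<= t_max)."""
    n = len(events)
    adj = [set() for _ in range(n)]
    for i, j in combinations(range(n), 2):
        dx = events[i][0] - events[j][0]
        dy = events[i][1] - events[j][1]
        if (dx * dx + dy * dy) ** 0.5 <= d_max and abs(events[i][2] - events[j][2]) <= t_max:
            adj[i].add(j)
            adj[j].add(i)
    return adj

def count_triangles(adj):
    """Count 3-vertex clique motifs, each triangle counted once."""
    total = 0
    for i in range(len(adj)):
        for j in adj[i]:
            if j > i:
                total += sum(1 for k in adj[i] & adj[j] if k > j)
    return total
```

Three burglaries within about 0.7 km and four days of one another form a single triangle motif, while a distant fourth event remains isolated.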

  8. Percutaneous adhesiolysis procedures in the medicare population: analysis of utilization and growth patterns from 2000 to 2011.

    PubMed

    Manchikanti, Laxmaiah; Helm Ii, Standiford; Pampati, Vidyasagar; Racz, Gabor B

    2014-01-01

    Multiple reviews have shown that the use of interventional techniques for chronic pain has increased dramatically over the years. Of these interventional techniques, both sacroiliac joint injections and facet joint interventions showed explosive growth, followed by epidural procedures. Percutaneous adhesiolysis procedures have not been assessed for their utilization patterns separately from epidural injections. This study analyzed the utilization and growth patterns of percutaneous adhesiolysis in managing chronic low back pain in the Medicare population from 2000 to 2011. The study was performed utilizing the Centers for Medicare and Medicaid Services (CMS) Physician Supplier Procedure Summary Master of Fee-For-Service (FFS) Data from 2000 to 2011. Percutaneous adhesiolysis procedures increased 47%, with an annual growth rate of 3.6%, in the FFS Medicare population from 2000 to 2011. These growth rates are significantly lower than those for sacroiliac joint injections (331%), facet joint interventions (308%), and epidural injections (130%), and substantially lower than those for lumbar transforaminal injections (665%) and lumbar facet joint neurolysis (544%). Study limitations include the lack of inclusion of Medicare Advantage patients. In addition, the statewide data are based on claims, which may include contiguous or other states. Percutaneous adhesiolysis utilization increased moderately in Medicare beneficiaries from 2000 to 2011. Overall, there was an increase of 47% in the utilization of adhesiolysis procedures per 100,000 Medicare beneficiaries, with an annual geometric average increase of 3.6%.
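The annual figure quoted above follows from the cumulative one as a geometric (compound) growth rate, which can be checked directly:

```python
# Compound annual growth rate implied by a 47% cumulative increase
# over the 11 years from 2000 to 2011.
total_growth = 0.47
years = 11
annual_rate = (1 + total_growth) ** (1 / years) - 1
print(round(annual_rate * 100, 1))  # prints 3.6 (percent per year)
```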

  9. Volumetric pattern analysis of fuselage-mounted airborne antennas. Ph.D. Thesis; [prediction analysis techniques for antenna radiation patterns of microwave antennas on commercial aircraft

    NASA Technical Reports Server (NTRS)

    Yu, C. L.

    1976-01-01

    A volumetric pattern analysis of fuselage-mounted airborne antennas at high frequencies was investigated. The primary goal of the investigation was to develop a numerical solution for predicting radiation patterns of airborne antennas in an accurate and efficient manner. An analytical study of airborne antenna pattern problems is presented in which the antenna is mounted on the fuselage near the top or bottom. Since this is a study of general-type commercial aircraft, the aircraft was modeled in its most basic form. The fuselage was assumed to be an infinitely long perfectly conducting elliptic cylinder in its cross-section and a composite elliptic cylinder in its elevation profile. The wing, cockpit, stabilizers (horizontal and vertical) and landing gear are modeled by "N" sided bent or flat plates which can be arbitrarily attached to the fuselage. The volumetric solution developed utilizes two elliptic cylinders, namely, the roll plane and elevation plane models to approximate the principal surface profile (longitudinal and transverse) at the antenna location. With the belt concept and the aid of appropriate coordinate system transformations the solution can be used to predict the volumetric patterns of airborne antennas in an accurate and efficient manner. Applications of this solution to various airborne antenna problems show good agreement with scale model measurements. Extensive data are presented for a microwave landing antenna system.

  10. Functional neuroanatomy of meditation: A review and meta-analysis of 78 functional neuroimaging investigations.

    PubMed

    Fox, Kieran C R; Dixon, Matthew L; Nijeboer, Savannah; Girn, Manesh; Floman, James L; Lifshitz, Michael; Ellamil, Melissa; Sedlmeier, Peter; Christoff, Kalina

    2016-06-01

    Meditation is a family of mental practices that encompasses a wide array of techniques employing distinctive mental strategies. We systematically reviewed 78 functional neuroimaging (fMRI and PET) studies of meditation, and used activation likelihood estimation to meta-analyze 257 peak foci from 31 experiments involving 527 participants. We found reliably dissociable patterns of brain activation and deactivation for four common styles of meditation (focused attention, mantra recitation, open monitoring, and compassion/loving-kindness), and suggestive differences for three others (visualization, sense-withdrawal, and non-dual awareness practices). Overall, dissociable activation patterns are congruent with the psychological and behavioral aims of each practice. Some brain areas are recruited consistently across multiple techniques (including insula, pre/supplementary motor cortices, dorsal anterior cingulate cortex, and frontopolar cortex), but convergence is the exception rather than the rule. A preliminary effect-size meta-analysis found medium effects for both activations (d=0.59) and deactivations (d=-0.74), suggesting potential practical significance. Our meta-analysis supports the neurophysiological dissociability of meditation practices, but also raises many methodological concerns and suggests avenues for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. A Wavelet Analysis Approach for Categorizing Air Traffic Behavior

    NASA Technical Reports Server (NTRS)

    Drew, Michael; Sheth, Kapil

    2015-01-01

    In this paper two frequency domain techniques are applied to air traffic analysis. The Continuous Wavelet Transform (CWT), like the Fourier Transform, is shown to identify changes in historical traffic patterns caused by Traffic Management Initiatives (TMIs) and weather, with the added benefit of detecting when in time those changes take place. Next, with the expectation that it could detect anomalies in the network and indicate the extent to which they affect traffic flows, the Spectral Graph Wavelet Transform (SGWT) is applied to a center-based graph model of air traffic. When applied to simulations based on historical flight plans, it identified the traffic flows between centers that have the greatest impact on either neighboring flows or on flows many centers away. As with the CWT, however, it can be difficult to interpret SGWT results and relate them to simulations where major TMIs are implemented, and more research may be warranted in this area. These frequency analysis techniques can detect off-nominal air traffic behavior, but due to the nature of air traffic time series data, they have so far proved difficult to apply in a way that provides significant insight or specific identification of traffic patterns.
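A bare-bones CWT of the kind applied above can be built from a Ricker wavelet and convolution. This sketch is not the paper's implementation; it simply localizes a step change in a synthetic traffic-count series:

```python
import numpy as np

def ricker(points, a):
    """Ricker ("Mexican hat") wavelet of width a, sampled at `points` samples."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt(signal, widths):
    """Continuous wavelet transform: one row of coefficients per width."""
    out = np.empty((len(widths), len(signal)))
    for i, a in enumerate(widths):
        out[i] = np.convolve(signal, ricker(min(10 * a, len(signal)), a), mode="same")
    return out
```

For a demeaned count series that jumps at index 200 (standing in for a TMI taking effect), the largest coefficient magnitude at a moderate width lands near index 200, showing the "when in time" localization the abstract refers to.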

  12. Runoff potentiality of a watershed through SCS and functional data analysis technique.

    PubMed

    Adham, M I; Shirazi, S M; Othman, F; Rahman, S; Yusop, Z; Ismail, Z

    2014-01-01

    Runoff potentiality of a watershed was assessed using the soil conservation service (SCS) curve number (CN) and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and analyzed through the lowess method for curve smoothing. As runoff data represent a periodic pattern in each watershed, a Fourier series was introduced to fit the smooth curve of each of the eight watersheds. Seven terms of the Fourier series were used for watersheds 5 and 8, while 8 terms were used for the rest of the watersheds for the best fit of the data. Bootstrapped smooth curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8, with monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling.
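The Fourier-series fitting step can be sketched as an ordinary least-squares problem over sine and cosine basis columns. The data below are synthetic and the function is an assumption, not the study's FDA code:

```python
import numpy as np

def fit_fourier(t, y, n_terms, period):
    """Least-squares fit of a truncated Fourier series
    y(t) ~ a0 + sum_k [a_k cos(2*pi*k*t/period) + b_k sin(2*pi*k*t/period)]."""
    t = np.asarray(t, dtype=float)
    cols = [np.ones_like(t)]
    for k in range(1, n_terms + 1):
        cols.append(np.cos(2 * np.pi * k * t / period))
        cols.append(np.sin(2 * np.pi * k * t / period))
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return X @ coef, coef
```

Four years of monthly values with an annual cycle (period 12) are reproduced essentially exactly by an 8-term series, the number used for most of the watersheds.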

  13. Runoff Potentiality of a Watershed through SCS and Functional Data Analysis Technique

    PubMed Central

    Adham, M. I.; Shirazi, S. M.; Othman, F.; Rahman, S.; Yusop, Z.; Ismail, Z.

    2014-01-01

    Runoff potentiality of a watershed was assessed using the soil conservation service (SCS) curve number (CN) and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and analyzed through the lowess method for curve smoothing. As runoff data represent a periodic pattern in each watershed, a Fourier series was introduced to fit the smooth curve of each of the eight watersheds. Seven terms of the Fourier series were used for watersheds 5 and 8, while 8 terms were used for the rest of the watersheds for the best fit of the data. Bootstrapped smooth curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8, with monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling. PMID:25152911

  14. Analyzing psychotherapy process as intersubjective sensemaking: an approach based on discourse analysis and neural networks.

    PubMed

    Nitti, Mariangela; Ciavolino, Enrico; Salvatore, Sergio; Gennaro, Alessandro

    2010-09-01

    The authors propose a method for analyzing the psychotherapy process: discourse flow analysis (DFA). DFA is a technique that represents the verbal interaction between therapist and patient as a discourse network, aimed at measuring the ability of the therapist-patient discourse to generate new meanings through time. DFA assumes that the main function of psychotherapy is to produce semiotic novelty. DFA is applied to the verbatim transcript of the psychotherapy. It defines the main meanings active within the therapeutic discourse by means of the combined use of text analysis and statistical techniques. Subsequently, it represents the dynamic interconnections among these meanings in terms of a "discursive network." The dynamic and structural indexes of the discursive network have been shown to provide a valid representation of the patient-therapist communicative flow as well as an estimation of its clinical quality. Finally, a neural network is designed specifically to identify patterns of functioning of the discursive network and to verify the clinical validity of these patterns in terms of their association with specific phases of the psychotherapy process. An application of DFA to a case of psychotherapy is provided to illustrate the method and the kinds of results it produces.

  15. Wains: a pattern-seeking artificial life species.

    PubMed

    de Buitléir, Amy; Russell, Michael; Daly, Mark

    2012-01-01

    We describe the initial phase of a research project to develop an artificial life framework designed to extract knowledge from large data sets with minimal preparation or ramp-up time. In this phase, we evolved an artificial life population with a new brain architecture. The agents have sufficient intelligence to discover patterns in data and to make survival decisions based on those patterns. The species uses diploid reproduction, Hebbian learning, and Kohonen self-organizing maps, in combination with novel techniques such as using pattern-rich data as the environment and framing the data analysis as a survival problem for artificial life. The first generation of agents mastered the pattern discovery task well enough to thrive. Evolution further adapted the agents to their environment by making them a little more pessimistic, and also by making their brains more efficient.

  16. Parallel computing in experimental mechanics and optical measurement: A review (II)

    NASA Astrophysics Data System (ADS)

    Wang, Tianyi; Kemao, Qian

    2018-05-01

    With advantages such as non-destructiveness, high sensitivity and high accuracy, optical techniques have been successfully applied to measure various important physical quantities in experimental mechanics (EM) and optical measurement (OM). However, in pursuit of higher image resolution for higher accuracy, the computational burden of optical techniques has become much heavier. Therefore, in recent years, heterogeneous platforms composed of hardware such as CPUs and GPUs have been widely employed to accelerate these techniques due to their cost-effectiveness, short development cycle, easy portability, and high scalability. In this paper, we analyze various works by first illustrating their different architectures and then introducing the parallel patterns they use for high-speed computation. Next, we review the effects of CPU and GPU parallel computing specifically in EM and OM applications in a broad scope, including digital image/volume correlation, fringe pattern analysis, tomography, hyperspectral imaging, computer-generated holograms, and integral imaging. In our survey, we have found that high parallelism can always be exploited in such applications for the development of high-performance systems.
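The core data-parallel pattern surveyed here, splitting an image into blocks, processing the blocks concurrently and reassembling the result, can be sketched in a few lines. The phase-wrapping kernel below is a toy stand-in for a real fringe-analysis step:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def wrap_phase(block):
    """Toy per-pixel kernel standing in for a fringe-analysis step:
    wrap phase values into (-pi, pi]."""
    return np.angle(np.exp(1j * block))

def parallel_apply(image, kernel, n_workers=4):
    """Data-parallel pattern: split the image into row blocks, process the
    blocks concurrently, then reassemble them in order."""
    blocks = np.array_split(image, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return np.vstack(list(pool.map(kernel, blocks)))
```

NumPy releases the GIL inside its ufuncs, so threads give real speedup on large blocks; the same split/map/merge structure is what maps onto GPU kernels in the works surveyed.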

  17. Regression equations for sex and population detection using the lip print pattern among Egyptian and Malaysian adult.

    PubMed

    Abdel Aziz, Manal H; Badr El Dine, Fatma M M; Saeed, Nourhan M M

    2016-11-01

    Identification of sex and ethnicity has always been a challenge in the fields of forensic medicine and criminal investigation. Fingerprinting and DNA comparison are probably the most common techniques used in this context. However, since they cannot always be used, it is necessary to apply different and less well-known techniques such as lip prints. The aim of this work is to study the lip print patterns of Egyptian and Malaysian populations and their relation to sex and population differences, and to develop equations for sex and population detection using the lip print pattern. The sample comprised 120 adult volunteers divided into two ethnic groups: sixty adult Egyptians (30 males and 30 females) and sixty adult Malaysians (30 males and 30 females). The lip prints were collected on white paper. Each lip print was divided into four quadrants, which were classified and scored according to the Suzuki and Tsuchihashi classification. Data were statistically analyzed. The results showed that the type III lip print pattern (intersected grooves) was the predominant type in both the Egyptian and Malaysian populations. Types II and III were the most frequent in Egyptian males (28.3% each), while in Egyptian females the type III pattern was predominant (46.7%). In Malaysian males, the type III lip print pattern was the predominant one (41.7%), while in Malaysian females the type II pattern was predominant (30.8%). Statistical analysis of the individual quadrants showed significant differences between males and females in the Egyptian population in the third and fourth quadrants. On the other hand, significant differences were detected only in the second quadrant between Malaysian males and females. Also, a statistically significant difference was present in the second quadrant between Egyptian and Malaysian males. Using regression analysis, four regression equations were obtained. 
Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
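The regression equations themselves are not reproduced in the abstract. As a hedged sketch, an equation of the same general form can be fitted by least squares to quadrant scores coded against sex; the scores below are invented, not the study's data:

```python
import numpy as np

def fit_sex_equation(scores, sex):
    """Least-squares regression of coded sex (0 = female, 1 = male) on the
    four quadrant scores; returns [intercept, w1, w2, w3, w4]."""
    X = np.column_stack([np.ones(len(scores)), scores])
    coef, *_ = np.linalg.lstsq(X, np.asarray(sex, dtype=float), rcond=None)
    return coef

def predict_sex(coef, scores):
    """Classify as male when the fitted equation's output is >= 0.5."""
    X = np.column_stack([np.ones(len(scores)), scores])
    return (X @ coef >= 0.5).astype(int)
```

On a toy sample where one quadrant score cleanly separates the sexes, the fitted equation reclassifies the sample correctly; real data would of course be noisier.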

  18. Process Mining for Individualized Behavior Modeling Using Wireless Tracking in Nursing Homes

    PubMed Central

    Fernández-Llatas, Carlos; Benedi, José-Miguel; García-Gómez, Juan M.; Traver, Vicente

    2013-01-01

    The analysis of human behavior patterns is increasingly used for several research fields. The individualized modeling of behavior using classical techniques requires too much time and resources to be effective. A possible solution would be the use of pattern recognition techniques to automatically infer models to allow experts to understand individual behavior. However, traditional pattern recognition algorithms infer models that are not readily understood by human experts. This limits the capacity to benefit from the inferred models. Process mining technologies can infer models as workflows, specifically designed to be understood by experts, enabling them to detect specific behavior patterns in users. In this paper, the eMotiva process mining algorithms are presented. These algorithms filter, infer and visualize workflows. The workflows are inferred from the samples produced by an indoor location system that stores the location of a resident in a nursing home. The visualization tool is able to compare and highlight behavior patterns in order to facilitate expert understanding of human behavior. This tool was tested with nine real users that were monitored for a 25-week period. The results achieved suggest that the behavior of users is continuously evolving and changing and that this change can be measured, allowing for behavioral change detection. PMID:24225907
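A first step in inferring such workflows from indoor-location samples is counting direct transitions between locations; frequent transitions become the arcs of the inferred workflow. This minimal counting sketch is an illustration, not the eMotiva algorithms:

```python
from collections import Counter

def transition_counts(trace):
    """Count direct location-to-location transitions in a movement trace.
    Each (from, to) pair is a candidate arc in the inferred workflow."""
    return Counter(zip(trace, trace[1:]))
```

For the trace `["bedroom", "hall", "kitchen", "hall", "bedroom", "hall", "kitchen"]`, the hall-to-kitchen transition is counted twice, marking it as a recurrent step in the resident's routine.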

  19. Investigation of computer-aided colonic crypt pattern analysis

    NASA Astrophysics Data System (ADS)

    Qi, Xin; Pan, Yinsheng; Sivak, Michael V., Jr.; Olowe, Kayode; Rollins, Andrew M.

    2007-02-01

    Colorectal cancer is the second leading cause of cancer-related death in the United States. Approximately 50% of these deaths could be prevented by earlier detection through screening. Magnification chromoendoscopy is a technique which utilizes tissue stains applied to the gastrointestinal mucosa and high-magnification endoscopy to better visualize and characterize lesions. Prior studies have shown that shapes of colonic crypts change with disease and show characteristic patterns. Current methods for assessing colonic crypt patterns are somewhat subjective and not standardized. Computerized algorithms could be used to standardize colonic crypt pattern assessment. We have imaged resected colonic mucosa in vitro (N = 70) using methylene blue dye and a surgical microscope to approximately simulate in vivo imaging with magnification chromoendoscopy. We have developed a method of computerized processing to analyze the crypt patterns in the images. The quantitative image analysis consists of three steps. First, the crypts within the region of interest of colonic tissue are semi-automatically segmented using watershed morphological processing. Second, crypt size and shape parameters are extracted from the segmented crypts. Third, each sample is assigned to a category according to the Kudo criteria. The computerized classification is validated by comparison with human classification using the Kudo classification criteria. The computerized colonic crypt pattern analysis algorithm will enable a study of in vivo magnification chromoendoscopy of colonic crypt pattern correlated with risk of colorectal cancer. This study will assess the feasibility of screening and surveillance of the colon using magnification chromoendoscopy.
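The watershed segmentation step can be sketched on a synthetic image. This uses SciPy's image-foresting-transform watershed with hand-placed seed markers, which is an assumption; the paper's semi-automatic marker selection is not detailed in the abstract:

```python
import numpy as np
from scipy import ndimage

# Synthetic grey-scale field: two dark circular "crypt lumens" (value 50)
# on a brighter mucosa background (value 200).
img = np.full((40, 40), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:40, 0:40]
img[(yy - 12) ** 2 + (xx - 12) ** 2 < 36] = 50
img[(yy - 28) ** 2 + (xx - 28) ** 2 < 36] = 50

# Seed markers: one per crypt plus one for the background.
markers = np.zeros((40, 40), dtype=np.int16)
markers[12, 12] = 1
markers[28, 28] = 2
markers[0, 0] = 3

# The watershed grows each marker over the darker basin containing it,
# segmenting the two crypts so size/shape parameters can be extracted.
labels = ndimage.watershed_ift(img, markers)
areas = [(labels == k).sum() for k in (1, 2)]
```

From `labels`, per-crypt size and shape parameters (area, eccentricity, etc.) can then be computed for classification against the Kudo criteria.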

  20. Biometric analysis of the palm vein distribution by means two different techniques of feature extraction

    NASA Astrophysics Data System (ADS)

    Castro-Ortega, R.; Toxqui-Quitl, C.; Solís-Villarreal, J.; Padilla-Vivanco, A.; Castro-Ramos, J.

    2014-09-01

    Vein patterns can be used for access, identification, and authentication purposes, and are more reliable than classical identification methods. Furthermore, these patterns can be used for venipuncture in health care to locate the veins of patients when they cannot be seen with the naked eye. In this paper, an image acquisition system is implemented in order to acquire digital images of people's hands in the near infrared. The image acquisition system consists of a CCD camera and a light source with peak emission at 880 nm. This radiation can penetrate the skin and is strongly absorbed by the deoxyhemoglobin present in the blood of the veins. Our analysis method is composed of several steps, the first of which is the enhancement of the acquired images using spatial filters. After that, adaptive thresholding and mathematical morphology operations are used in order to obtain the distribution of vein patterns. This process is aimed at recognizing people from images of their palm-dorsal vein distributions obtained under near-infrared light. This work compares two different feature extraction techniques, moments and veincode. The classification task is achieved using Artificial Neural Networks. Two databases are used to analyze the performance of the algorithms: the first is owned by the Hong Kong Polytechnic University, and the second is our own database.
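
    The threshold-and-morphology step of such a pipeline can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes a normalized grayscale NIR image in which veins appear darker than the surrounding tissue, and uses a local-mean adaptive threshold with a hypothetical `offset` margin.

```python
import numpy as np
from scipy import ndimage

def extract_vein_mask(image, window=15, offset=0.02):
    """Adaptive (local-mean) thresholding followed by morphological
    opening, mirroring the threshold -> morphology steps."""
    local_mean = ndimage.uniform_filter(image.astype(float), size=window)
    # Veins absorb NIR light, so they appear darker than their surroundings.
    mask = image < (local_mean - offset)
    # Opening removes isolated noise pixels while keeping line structures.
    return ndimage.binary_opening(mask, structure=np.ones((3, 3)))

# Synthetic NIR-like image: bright background with a darker vertical "vein".
rng = np.random.default_rng(0)
img = 0.8 + 0.01 * rng.standard_normal((64, 64))
img[:, 30:34] -= 0.2
mask = extract_vein_mask(img)
print(mask[:, 30:34].mean(), mask[:, :20].mean())
```

    The adaptive threshold picks out pixels that are dark relative to their neighborhood, so it tolerates the uneven illumination typical of NIR hand images better than a single global threshold would.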

  1. Beyond the ridge pattern: multi-informative analysis of latent fingermarks by MALDI mass spectrometry.

    PubMed

    Francese, S; Bradshaw, R; Ferguson, L S; Wolstenholme, R; Clench, M R; Bleay, S

    2013-08-07

    After over a century, fingerprints are still one of the most powerful means of biometric identification. The conventional forensic workflow for suspect identification consists of (i) recovering latent marks from crime scenes using an appropriate enhancement technique and (ii) obtaining an image of the mark to compare against known suspect prints and/or to search in a fingerprint database. The suspect is identified by matching the ridge pattern and the local characteristics of the ridge pattern (minutiae). However successful, there are a number of scenarios in which this process may fail; these include the recovery of partial, distorted or smudged marks, poor image quality resulting from an inadequate enhancement technique, extensive scarring/abrasion of the fingertips, or the absence of the suspect's fingerprint records in the database. In all of these instances it would be very desirable to have a technology able to provide additional information from a fingermark by exploiting its endogenous and exogenous chemical content. This could potentially provide new investigative leads, especially when the fingermark comparison and matching process fails. We have demonstrated that Matrix Assisted Laser Desorption Ionisation Mass Spectrometry and Mass Spectrometry Imaging (MALDI MSI) can provide multiple images of the same fingermark in one analysis, simultaneously yielding additional intelligence. Here, a review of the pioneering use and development of MALDI MSI for the analysis of latent fingermarks is presented, along with the latest achievements in the forensic intelligence retrievable.

  2. Motor Oil Classification using Color Histograms and Pattern Recognition Techniques.

    PubMed

    Ahmadi, Shiva; Mani-Varnosfaderani, Ahmad; Habibi, Biuck

    2018-04-20

    Motor oil classification is important for quality control and the identification of oil adulteration. In this work, we propose a simple, rapid, inexpensive and nondestructive approach based on image analysis and pattern recognition techniques for the classification of nine different types of motor oils according to their corresponding color histograms. For this, we computed color histograms in different color spaces such as red green blue (RGB), grayscale, and hue saturation intensity (HSI) in order to extract features that can help with the classification procedure. These color histograms and their combinations were used as input for model development and were then statistically evaluated by using linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and support vector machine (SVM) techniques. Here, two common solutions for solving a multiclass classification problem were applied: (1) transformation to a binary classification problem using a one-against-all (OAA) approach and (2) extension from binary classifiers to a single globally optimized multilabel classification model. In the OAA strategy, LDA, QDA, and SVM reached up to 97% in terms of accuracy, sensitivity, and specificity for both the training and test sets. In the extension from the binary case, despite good performance by the SVM classification model, QDA and LDA provided better results, up to 92% for RGB-grayscale-HSI color histograms and up to 93% for the HSI color map, respectively. In order to reduce the number of independent variables for modeling, a principal component analysis algorithm was used. Our results suggest that the proposed method is promising for the identification and classification of different types of motor oils.
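
    A minimal sketch of the histogram-plus-LDA idea, using synthetic data rather than real oil images; the `rgb_histogram` feature extractor and the two-class construction below are illustrative assumptions, not the authors' protocol.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def rgb_histogram(image, bins=8):
    """Concatenate per-channel histograms into one feature vector."""
    feats = [np.histogram(image[..., c], bins=bins, range=(0, 1),
                          density=True)[0] for c in range(3)]
    return np.concatenate(feats)

# Synthetic "oil images": two classes distinguished by red-channel level.
rng = np.random.default_rng(1)
def make_image(red_level):
    img = rng.uniform(0, 1, size=(32, 32, 3))
    img[..., 0] = np.clip(red_level + 0.05 * rng.standard_normal((32, 32)), 0, 1)
    return img

X = np.array([rgb_histogram(make_image(0.3)) for _ in range(20)]
             + [rgb_histogram(make_image(0.7)) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)
clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.score(X, y))
```

    The histogram discards spatial layout and keeps only the color distribution, which is exactly the property that differentiates oils of different composition in this kind of approach.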

  3. Hybrid vehicle assessment. Phase 1: Petroleum savings analysis

    NASA Technical Reports Server (NTRS)

    Levin, R.; Liddle, S.; Deshpande, G.; Trummel, M.; Vivian, H. C.

    1984-01-01

    The results of a comprehensive analysis of near term electric hybrid vehicles are presented, with emphasis on their potential to save significant amounts of petroleum on a national scale in the 1990s. Performance requirements and expected annual usage patterns of these vehicles are first modeled. The projected U.S. fleet composition is estimated, and conceptual hybrid vehicle designs are conceived and analyzed for petroleum use when driven in the expected annual patterns. These petroleum consumption estimates are then compared to similar estimates for projected 1990 conventional vehicles having the same performance and driven in the same patterns. Results are presented in the form of three utility functions, and comparisons of several conceptual designs are made. The Hybrid Vehicle (HV) design and assessment techniques are discussed, and a general method is explained for selecting the optimum energy management strategy for any vehicle-mission-battery combination. Conclusions and recommendations are presented, and development recommendations are identified.

  4. Detecting dominant motion patterns in crowds of pedestrians

    NASA Astrophysics Data System (ADS)

    Saqib, Muhammad; Khan, Sultan Daud; Blumenstein, Michael

    2017-02-01

    As the population of the world increases, urbanization generates crowding situations which pose challenges to public safety and security. Manual analysis of crowded situations is tedious and prone to errors. In this paper, we propose a novel crowd analysis technique, the aim of which is to detect the different dominant motion patterns in real-time videos. A motion field is generated by computing the dense optical flow. The motion field is then divided into blocks. For each block, we adopt an intra-clustering algorithm for detecting the different flows within the block. We then employ inter-clustering for clustering the flow vectors among different blocks. We evaluate the performance of our approach on different real-time videos. The experimental results show that our proposed method is capable of detecting distinct motion patterns in crowded videos. Moreover, our algorithm outperforms state-of-the-art methods.
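
    Assuming a dense flow field is already available (e.g., from an optical-flow routine), the block-wise clustering idea can be sketched as below. This uses a single global k-means over per-block mean flow vectors as a simplified stand-in for the paper's intra-/inter-clustering stages.

```python
import numpy as np
from sklearn.cluster import KMeans

def dominant_motions(flow, block=16, k=2):
    """Cluster per-block mean flow vectors to find dominant motion patterns.
    `flow` has shape (H, W, 2) holding (dx, dy) per pixel."""
    h, w, _ = flow.shape
    blocks = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            blocks.append(flow[r:r + block, c:c + block].reshape(-1, 2).mean(axis=0))
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(np.array(blocks))
    return km.cluster_centers_

# Synthetic flow: left half moves right (+x), right half moves left (-x).
flow = np.zeros((64, 64, 2))
flow[:, :32, 0] = 1.0
flow[:, 32:, 0] = -1.0
centers = dominant_motions(flow)
print(sorted(centers[:, 0]))
```

    Averaging the flow within each block suppresses per-pixel noise before clustering, so the recovered cluster centers correspond to the dominant motions rather than to individual pedestrians.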

  5. Classification of damage in structural systems using time series analysis and supervised and unsupervised pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; de Lautour, Oliver R.

    2010-04-01

    Developed for studying long, periodic records of various measured quantities, time series analysis methods are inherently suited to, and offer interesting possibilities for, Structural Health Monitoring (SHM) applications. However, their use in SHM can still be regarded as an emerging application and deserves more study. In this research, Autoregressive (AR) models were used to fit experimental acceleration time histories from two experimental structural systems, a 3-storey bookshelf-type laboratory structure and the ASCE Phase II SHM Benchmark Structure, in healthy and several damaged states. The coefficients of the AR models were chosen as damage sensitive features. Preliminary visual inspection of the large, multidimensional sets of AR coefficients to check for the presence of clusters corresponding to different damage severities was achieved using Sammon mapping, an efficient nonlinear data compression technique. Systematic classification of damage into states based on the analysis of the AR coefficients was achieved using two supervised classification techniques, Nearest Neighbor Classification (NNC) and Learning Vector Quantization (LVQ), and one unsupervised technique, Self-Organizing Maps (SOM). This paper discusses the performance of AR coefficients as damage sensitive features and compares the efficiency of the three classification techniques using experimental data.
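
    The core feature-extraction step, fitting an AR model to a response time history and taking its coefficients as damage-sensitive features, can be sketched with an ordinary least-squares fit. The synthetic "healthy" and "damaged" responses below are illustrative assumptions, not the benchmark data.

```python
import numpy as np

def ar_coefficients(x, order=4):
    """Least-squares fit of an AR(order) model: x[t] ~ sum_i a_i * x[t-i].
    The coefficient vector serves as a damage-sensitive feature."""
    X = np.column_stack([x[order - i - 1:len(x) - i - 1] for i in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Two synthetic "structural responses" with different dynamics,
# standing in for acceleration records in healthy and damaged states.
rng = np.random.default_rng(0)
def simulate(a1, n=2000):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a1 * x[t - 1] + rng.standard_normal()
    return x

healthy = ar_coefficients(simulate(0.9))
damaged = ar_coefficients(simulate(0.5))
print(healthy[0], damaged[0])
```

    A change in structural stiffness or damping changes the dynamics of the response, which shows up as a shift in the fitted AR coefficients; clustering or classifying those coefficient vectors is then a standard pattern recognition problem.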

  6. Functional assessment of the fundus autofluorescence pattern in Best vitelliform macular dystrophy.

    PubMed

    Parodi, Maurizio Battaglia; Iacono, Pierluigi; Del Turco, Claudia; Triolo, Giacinto; Bandello, Francesco

    2016-07-01

    To identify the fundus autofluorescence (FAF) patterns in Best vitelliform macular dystrophy (VMD). Patients affected by VMD in the vitelliform, pseudohypopyon, and vitelliruptive stages underwent a complete ophthalmological examination, including best-corrected visual acuity (BCVA), short-wavelength FAF (SW-FAF), near-infrared FAF (NIR-FAF) and microperimetry. The primary outcome was the identification of the correlation of the SW-FAF and NIR-FAF patterns of the foveal region with BCVA and central retinal sensitivity in eyes affected by VMD. The secondary outcomes included the definition of the frequency of foveal patterns on SW-FAF and NIR-FAF. Thirty-seven of 64 (58 %), 8 of 64 (12.5 %) and 19 of 64 (29.5 %) eyes showed the vitelliform, pseudohypopyon, and vitelliruptive stages, respectively. Three main FAF patterns were identified with both techniques: a hyper-autofluorescent pattern, a hypo-autofluorescent pattern, and a patchy pattern. BCVA was significantly different in eyes with hypo-autofluorescent and patchy patterns with respect to eyes showing a hyper-autofluorescent pattern. Similar differences were registered in the FS according to the SW-FAF classification. However, the FS differed in each subgroup in the NIR-FAF analysis. Subgroup analyses were performed on the patchy pattern, combining FAF and fundus abnormalities. Considering both FAF techniques, the BCVA differed between the vitelliform and pseudohypopyon stages, and between the vitelliform and vitelliruptive stages. In the NIR-FAF classification, there was a statistically significant difference in the FS between each subgroup; in the SW-FAF, there was a significant difference between the vitelliform and pseudohypopyon stages and between the vitelliform and vitelliruptive stages. Three main FAF patterns can be identified in VMD. The patchy pattern is the most frequent, accounting for 70 % of eyes on SW-FAF and 80 % of eyes on NIR-FAF. A tighter correlation links the classification of NIR-FAF patterns and the FS.
Longitudinal investigations are warranted to evaluate the course of FAF patterns and their role in disease monitoring.

  7. High-speed peak matching algorithm for retention time alignment of gas chromatographic data for chemometric analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Kevin J.; Wright, Bob W.; Jarman, Kristin H.

    2003-05-09

    A rapid retention time alignment algorithm was developed as a preprocessing utility to be used prior to chemometric analysis of large datasets of diesel fuel gas chromatographic profiles. Retention time variation from chromatogram to chromatogram has been a significant impediment to the use of chemometric techniques in the analysis of chromatographic data, due to the inability of current multivariate techniques to correctly model information that shifts from variable to variable within a dataset. The algorithm developed is shown to increase the efficacy of pattern recognition methods applied to a set of diesel fuel chromatograms by retaining chemical selectivity while reducing chromatogram-to-chromatogram retention time variations, and to do so on a time scale that makes analysis of large sets of chromatographic data practical.
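
    A crude illustration of retention time alignment: a single global shift chosen by cross-correlation against a reference chromatogram. The actual algorithm described is a peak matching method, so this is only a hypothetical stand-in for the general idea of correcting retention time drift before multivariate modeling.

```python
import numpy as np

def align_by_shift(reference, signal, max_shift=20):
    """Align one chromatogram to a reference by the integer shift that
    maximizes their cross-correlation."""
    best = max(range(-max_shift, max_shift + 1),
               key=lambda s: np.dot(reference, np.roll(signal, s)))
    return np.roll(signal, best), best

# Reference chromatogram with two Gaussian peaks, plus a drifted copy.
t = np.linspace(0, 10, 500)
ref = np.exp(-(t - 3) ** 2 / 0.02) + np.exp(-(t - 7) ** 2 / 0.02)
shifted = np.roll(ref, 13)          # simulate retention-time drift
aligned, shift = align_by_shift(ref, shifted)
print(shift)
```

    After alignment, the same chemical peak occupies the same variable (column) in every chromatogram, which is the precondition the abstract identifies for multivariate techniques to model the data correctly.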

  8. Implementation and analysis of relief patterns of the surface of benign and malignant lesions of the skin by microtopography

    NASA Astrophysics Data System (ADS)

    López Pacheco, María del Carmen; Filipe Pereira da Cunha Martins-Costa, Manuel; Pérez Zapata, Aura Judith; Domínguez Cherit, Judith; Ramón Gallegos, Eva

    2005-12-01

    The objective of this study was to distinguish between healthy and malignant skin tissue, and furthermore to determine a unique roughness pattern for each skin lesion, by microtopographic analysis of the skin surface of Mexican patients during the period from April to October 2002. The standard technique used in this study for the diagnosis of skin cancer and the comparison of the results was the haematoxylin-eosin histopathological technique. Latex impressions were taken from skin lesions as well as from the healthy skin of each patient to serve as control samples. These impressions were analysed with the MICROTOP.03.MFC microtopographic inspection system. It was observed that the rougher the tumour, the more malignant the lesion. On average, melanomas presented a roughness increase of 67% compared to healthy skin, giving a roughness ratio of 1:2.54. The percentage decreases to 49% (49%, 1:60) in the case of basal cell carcinoma and to 40% in pre-malignant lesions such as melanocytic nevus (40%, 1:150). In benign lesions such as seborrhoeic keratosis, only a small increase in roughness was noted (4%, 1:0.72). Microtopographic inspection of the skin surface can be considered as a complementary diagnostic technique for skin cancer.

  9. Gait Analysis Methods for Rodent Models of Arthritic Disorders: Reviews and Recommendations

    PubMed Central

    Lakes, Emily H.; Allen, Kyle D.

    2016-01-01

    Gait analysis is a useful tool to understand behavioral changes in preclinical arthritis models. While observational scoring and spatiotemporal gait parameters are the most widely performed gait analyses in rodents, commercially available systems can now provide quantitative assessments of spatiotemporal patterns. However, inconsistencies remain between testing platforms, and laboratories often select different gait pattern descriptors to report in the literature. Rodent gait can also be described through kinetic and kinematic analyses, but systems to analyze rodent kinetics and kinematics are typically custom made and often require sensitive, custom equipment. While the use of rodent gait analysis rapidly expands, it is important to remember that, while rodent gait analysis is a relatively modern behavioral assay, the study of quadrupedal gait is not new. Nearly all gait parameters are correlated, and a collection of gait parameters is needed to understand a compensatory gait pattern used by the animal. As such, a change in a single gait parameter is unlikely to tell the full biomechanical story; and to effectively use gait analysis, one must consider how multiple different parameters contribute to an altered gait pattern. The goal of this article is to review rodent gait analysis techniques and provide recommendations on how to use these technologies in rodent arthritis models, including discussions on the strengths and limitations of observational scoring, spatiotemporal, kinetic, and kinematic measures. Recognizing rodent gait analysis is an evolving tool, we also provide technical recommendations we hope will improve the utility of these analyses in the future. PMID:26995111

  10. Analysis of short tandem repeat polymorphisms using infrared fluorescence with M18 tailed primers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oetting, W.S.; Wiesner, G.; Laken, S.

    The use of short tandem repeat polymorphisms (STRPs) is becoming increasingly important as markers for linkage analysis due to their large numbers in the human genome and their high degree of polymorphism. Fluorescence-based detection of the STRP pattern using the LI-COR model 4000S automated DNA sequencer eliminates the need for radioactivity and produces a digitized image that can be used for the analysis of the polymorphisms. In an effort to reduce the cost of STRP analysis, we have synthesized primers with a 19 bp extension complementary to the sequence of the M13 primer on the 5′ end of one of the two primers used in the amplification of the STRP, instead of using primers with direct conjugation of the infrared fluorescent dye. Up to 5 primer pairs can be multiplexed together, with the M13 primer-dye conjugate as the sole primer conjugated to the fluorescent dye. Comparisons between primers that have been directly conjugated to the fluor and those having the M13 sequence extension show no difference in the ability to determine the STRP pattern. At present, the entire Weber 4A set of STRP markers is available with the M13 5′ extension. We are currently using this technique for linkage analysis of familial breast cancer and asthma. The combination of STRP analysis with fluorescence detection will allow this technique to be fully automated for allele scoring and linkage analysis.

  11. Pattern recognition technique

    NASA Technical Reports Server (NTRS)

    Hong, J. P.

    1971-01-01

    Technique operates regardless of pattern rotation, translation or magnification and successfully detects out-of-register patterns. It improves accuracy and reduces cost of various optical character recognition devices and page readers and provides data input to computer.

  12. Applications of liquid-based separation in conjunction with mass spectrometry to the analysis of forensic evidence.

    PubMed

    Moini, Mehdi

    2018-05-01

    In the past few years, there has been a significant effort by the forensic science community to develop new scientific techniques for the analysis of forensic evidence. Forensic chemists have been spearheading the development of information-rich confirmatory technologies and techniques and applying them to a broad array of forensic challenges. The purpose of these confirmatory techniques is to provide alternatives to presumptive techniques that rely on data such as color changes, pattern matching, or retention time alone, which are prone to more false positives. To this end, the application of separation techniques in conjunction with mass spectrometry has played an important role in the analysis of forensic evidence. Moreover, in the past few years the role of liquid separation techniques, such as liquid chromatography and capillary electrophoresis in conjunction with mass spectrometry, has gained significant traction, and these techniques have been applied to a wide range of chemicals, from small molecules such as drugs and explosives to large molecules such as proteins. For example, proteomics and peptidomics have been used for the identification of humans, organs, and bodily fluids. A wide range of HPLC techniques including reversed phase, hydrophilic interaction, mixed-mode, supercritical fluid, multidimensional chromatography, and nanoLC, as well as several modes of capillary electrophoresis mass spectrometry, including capillary zone electrophoresis, partial filling, full filling, and micellar electrokinetic chromatography, have been applied to the analysis of drugs, explosives, and questioned documents. In this article, we review recent (2015-2017) applications of liquid separation in conjunction with mass spectrometry to the analysis of forensic evidence. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, thereby reducing launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert a tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow minimal storage requirements while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB data base with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison.
This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system of nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbo pump families.

  14. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1993-01-01

    The SSME has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, thereby reducing launch turn-around time. The basic objectives of this contract are threefold: (1) Develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system. (2) Develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert tremendous amounts of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow minimal storage requirements while providing fast signature retrieval, pattern comparison, and identification capabilities. (3) Integrate the nonlinear correlation techniques into the CSTDB data base with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for a quick signature comparison.
This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system of nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbo pump families.

  15. Development of wide-angle 2D light scattering static cytometry

    NASA Astrophysics Data System (ADS)

    Xie, Linyan; Liu, Qiao; Shao, Changshun; Su, Xuantao

    2016-10-01

    We have recently developed a 2D light scattering static cytometer for cellular analysis in a label-free manner, which measures side scatter (SSC) light in the polar angular range from 79 to 101 degrees. Compared with conventional flow cytometry, our cytometric technique requires no fluorescent labeling of the cells, and static cytometry measurements can be performed without flow control. In this paper we present an improved label-free static cytometer that can obtain 2D light scattering patterns in a wider angular range. By illuminating static microspheres on a chip with a scanning optical fiber, wide-angle 2D light scattering patterns of single standard microspheres with a mean diameter of 3.87 μm are obtained. The 2D patterns of the 3.87 μm microspheres contain both large-angle forward scatter (FSC) and SSC light in the polar angular range from approximately 40 to 100 degrees. Experimental 2D patterns of the 3.87 μm microspheres are in good agreement with those simulated using Mie theory. The wide-angle light scattering measurements may provide a better resolution for particle analysis compared with SSC measurements alone. Two-dimensional light scattering patterns of HL-60 human acute leukemia cells are obtained using our static cytometer. Compared with SSC 2D light scattering patterns, the wide-angle 2D patterns contain richer information about the HL-60 cells. Obtaining 2D light scattering patterns over a wide angular range could help to enhance the capabilities of our label-free static cytometry for cell analysis.

  16. Image processing and 3D visualization in the interpretation of patterned injury of the skin

    NASA Astrophysics Data System (ADS)

    Oliver, William R.; Altschuler, Bruce R.

    1995-09-01

    The use of image processing is becoming increasingly important in the evaluation of violent crime. While much work has been done in the use of these techniques for forensic purposes outside of forensic pathology, its use in the pathologic examination of wounding has been limited. We are investigating the use of image processing in the analysis of patterned injuries and tissue damage. Our interests are currently concentrated on 1) the use of image processing techniques to aid the investigator in observing and evaluating patterned injuries in photographs, 2) measurement of the 3D shape characteristics of surface lesions, and 3) correlation of patterned injuries with deep tissue injury as a problem in 3D visualization. We are beginning investigations in data-acquisition problems for performing 3D scene reconstructions from the pathology perspective of correlating tissue injury to scene features and trace evidence localization. Our primary tool for correlation of surface injuries with deep tissue injuries has been the comparison of processed surface injury photographs with 3D reconstructions from antemortem CT and MRI data. We have developed a prototype robot for the acquisition of 3D wound and scene data.

  17. A diagnostic technique used to obtain cross range radiation centers from antenna patterns

    NASA Technical Reports Server (NTRS)

    Lee, T. H.; Burnside, W. D.

    1988-01-01

    A diagnostic technique to obtain cross range radiation centers based on antenna radiation patterns is presented. This method is similar to the synthetic aperture processing of scattered fields in radar applications. Coherent processing of the radiated fields is used to determine the various radiation centers associated with the far-zone pattern of an antenna for a given radiation direction. This technique can be used to identify an unexpected radiation center that creates an undesired effect in a pattern; on the other hand, it can improve a numerical simulation of the pattern by identifying other significant mechanisms. Cross range results for two 8' reflector antennas are presented to illustrate and validate the technique.
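
    The coherent processing described can be illustrated with a toy example: for a far-zone pattern sampled uniformly in sin(θ), an FFT maps the pattern into cross range, producing peaks at the radiation-center positions. The two point radiation centers below are hypothetical, not data from the 8' reflector antennas.

```python
import numpy as np

# Far-field pattern of two hypothetical radiation centers at cross-range
# positions x1, x2 (in wavelengths), sampled uniformly in u = sin(theta).
n = 256
u = np.linspace(-0.5, 0.5, n, endpoint=False)
x1, x2 = -8.0, 5.0
pattern = np.exp(2j * np.pi * x1 * u) + 0.5 * np.exp(2j * np.pi * x2 * u)

# Coherent processing: an FFT over u maps the pattern into cross range.
image = np.abs(np.fft.fftshift(np.fft.fft(pattern)))
x = np.fft.fftshift(np.fft.fftfreq(n, d=u[1] - u[0]))  # cross range (wavelengths)
peaks = x[np.argsort(image)[-2:]]
print(sorted(peaks))
```

    The two largest peaks in the cross-range image fall at the assumed radiation-center positions, which is how an unexpected radiation center would reveal itself in a measured pattern.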

  18. Development of indirect EFBEM for radiating noise analysis including underwater problems

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Wung; Hong, Suk-Yoon; Song, Jee-Hun

    2013-09-01

    For the analysis of radiating noise problems in medium-to-high frequency ranges, the Energy Flow Boundary Element Method (EFBEM) was developed. EFBEM is an analysis technique that applies the Boundary Element Method (BEM) to Energy Flow Analysis (EFA). Fundamental solutions representing the spherical wave properties of radiating noise problems in the open field, and accounting for the free-surface effect underwater, are developed. A directivity factor is also developed to express the directivity patterns of waves in medium-to-high frequency ranges. Indirect EFBEM using the fundamental solutions and fictitious sources was successfully applied to open-field and underwater noise problems. In numerical applications, the acoustic energy density distributions due to the vibration of a simple plate model and a sphere model were compared with those of a commercial code, and the comparison showed good agreement in both the level and the pattern of the energy density distributions.

  19. Evaluating the Effect of Three Water Management Techniques on Tomato Crop.

    PubMed

    Elnesr, Mohammad Nabil; Alazba, Abdurrahman Ali; Zein El-Abedein, Assem Ibrahim; El-Adl, Mahmoud Maher

    2015-01-01

    The effects of three water management techniques were evaluated on subsurface drip-irrigated tomatoes. The three techniques were intermittent flow (3 pulses), a dual-lateral drip system (two lateral lines per row, at 15 and 25 cm below the soil surface), and a physical barrier (buried 30 cm below the soil surface). Field experiments were established for two successive seasons. Water movement in the soil was monitored up to a depth of 60 cm using continuously logging capacitance probes. The results showed that the dual-lateral technique increased yield by up to 50% and water use efficiency by up to 54%, while the intermittent application improved some of the quality measures (fruit size, TSS, and vitamin C) but not the quantity of the yield, which decreased in one season and was unaffected in the other. The physical barrier had no significant effect on any of the important growth measures. The soil water patterns showed that the dual-lateral method led to a uniform wetting pattern down to a depth of 45 cm, the physical barrier appeared to increase lateral and upward water movement, and the intermittent application kept the wetting pattern at a higher moisture level for a longer time. The cost analysis also showed that the economical treatments were the dual-lateral system followed by the intermittent technique, while the physical barrier is not economical. The study recommends researching the effect of the dual-lateral method on root growth and performance. Intermittent application may be recommended to improve tomato quality but not quantity. The physical barrier is not recommended except in highly permeable soils.

  20. Flow Pattern Phenomena in Two-Phase Flow in Microchannels

    NASA Astrophysics Data System (ADS)

    Keska, Jerry K.; Simon, William E.

    2004-02-01

    Space transportation systems require high-performance thermal protection and fluid management techniques for systems ranging from cryogenic fluid management devices to primary structures and propulsion systems exposed to extremely high temperatures, as well as for other space systems such as cooling or environment control for advanced space suits and integrated circuits. Although considerable developmental effort is being expended to bring potentially applicable technologies to a readiness level for practical use, new and innovative methods are still needed. One such method is the concept of Advanced Micro Cooling Modules (AMCMs), which are essentially compact two-phase heat exchangers constructed of microchannels and designed to remove large amounts of heat rapidly from critical systems by incorporating phase transition. The development of AMCMs requires fundamental technological advancement in many areas, including: (1) development of measurement methods/systems for flow-pattern measurement/identification for two-phase mixtures in microchannels; (2) development of a phenomenological model for two-phase flow which includes the quantitative measure of flow patterns; and (3) database development for multiphase heat transfer/fluid dynamics flows in microchannels. This paper focuses on the results of experimental research in the phenomena of two-phase flow in microchannels. The work encompasses both an experimental and an analytical approach to incorporating flow patterns for air-water mixtures flowing in a microchannel, which are necessary tools for the optimal design of AMCMs. 
Specifically, the following topics are addressed: (1) design and construction of a sensitive test system for two-phase flow in microchannels, one which measures ac and dc components of in-situ physical mixture parameters including spatial concentration using concomitant methods; (2) data acquisition and analysis in the amplitude, time, and frequency domains; and (3) analysis of results including evaluation of data acquisition techniques and their validity for application in flow pattern determination.

  1. One-Dimensional Scanning Approach to Shock Sensing

    NASA Technical Reports Server (NTRS)

    Tokars, Roger; Adamovsky, Girgory; Floyd, Bertram

    2009-01-01

    Measurement tools for high speed air flow are sought both in industry and academia. Particular interest is shown in air flows that exhibit aerodynamic shocks. Shocks are accompanied by sudden changes in density, pressure, and temperature. Optical detection and characterization of such shocks can be difficult because the medium is normally transparent air. A variety of techniques to analyze these flows are available, but they often require large windows and optical components, as in the case of Schlieren measurements, and/or large operating powers, which precludes their use for in-flight monitoring and applications. The one-dimensional scanning approach in this work is a compact, low power technique that can be used to non-intrusively detect shocks. The shock is detected by analyzing the optical pattern generated by a small diameter laser beam as it passes through the shock. The optical properties of a shock result in diffraction and spreading of the beam as well as interference fringes. To investigate the feasibility of this technique, a shock is simulated by a 426 µm diameter optical fiber. Analysis of the results revealed a direct correlation between the optical fiber (or shock) location and the beam's diffraction pattern. A plot of the width of the diffraction pattern vs. optical fiber location reveals that the width of the diffraction pattern is maximized when the laser beam is directed at the center of the optical fiber. This work indicates that the one-dimensional scanning approach may be able to determine the location of an actual shock. Near and far field effects associated with a small diameter laser beam striking an optical fiber used as a simulated shock are investigated, allowing a proper one-dimensional scanning beam technique to be established.

  2. An Automatic Phase-Change Detection Technique for Colloidal Hard Sphere Suspensions

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth; Rogers, Richard B.

    2005-01-01

    Colloidal suspensions of monodisperse spheres are used as physical models of thermodynamic phase transitions and as precursors to photonic band gap materials. However, current image analysis techniques are not able to distinguish between densely packed phases within conventional microscope images, which are mainly characterized by degrees of randomness or order with similar grayscale value properties. Current techniques for identifying the phase boundaries involve manually identifying the phase transitions, which is very tedious and time consuming. We have developed an intelligent machine vision technique that automatically identifies colloidal phase boundaries. The algorithm utilizes intelligent image processing techniques that accurately identify and track phase changes vertically or horizontally for a sequence of colloidal hard sphere suspension images. This technique is readily adaptable to any imaging application where regions of interest are distinguished from the background by differing patterns of motion over time.

  3. Characterization of edible seaweed harvested on the Galician coast (northwestern Spain) using pattern recognition techniques and major and trace element data.

    PubMed

    Romarís-Hortas, Vanessa; García-Sartal, Cristina; Barciela-Alonso, María Carmen; Moreda-Piñeiro, Antonio; Bermejo-Barrera, Pilar

    2010-02-10

    Major and trace elements in North Atlantic seaweed originating from Galicia (northwestern Spain) were determined by using inductively coupled plasma-optical emission spectrometry (ICP-OES) (Ba, Ca, Cu, K, Mg, Mn, Na, Sr, and Zn), inductively coupled plasma-mass spectrometry (ICP-MS) (Br and I), and hydride generation-atomic fluorescence spectrometry (HG-AFS) (As). Pattern recognition techniques were then used to classify the edible seaweed according to type (red, brown, and green seaweed) and also variety (Wakame, Fucus, Sea Spaghetti, Kombu, Dulse, Nori, and Sea Lettuce). Principal component analysis (PCA) and cluster analysis (CA) were used as exploratory techniques, and linear discriminant analysis (LDA) and soft independent modeling of class analogy (SIMCA) were used as classification procedures. In total, 12 elements were determined in 35 edible seaweed samples (20 brown seaweed, 10 red seaweed, 4 green seaweed, and 1 canned seaweed). Natural groupings of the samples (brown, red, and green types) were observed using PCA and CA (squared Euclidean distance between objects and the Ward method as the clustering procedure). The application of LDA gave correct assignation percentages of 100% for brown, red, and green types at a significance level of 5%. However, a satisfactory classification (recognition and prediction) using SIMCA was obtained only for red seaweed (100% of cases correctly classified), whereas percentages of 89 and 80% were obtained for brown seaweed for recognition (training set) and prediction (testing set), respectively.
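
    The exploratory and classification steps described above (PCA followed by LDA) can be sketched as follows. This is a minimal illustration using synthetic element-concentration data in place of the study's seaweed measurements; the class sizes, concentration ranges, and random seed are assumptions.

```python
# Hedged sketch of the PCA + LDA workflow, on synthetic data standing in for
# the 12-element concentration profiles of three seaweed types.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_per_class, n_elements = 10, 12          # 12 elements, as in the study
# three seaweed types with different mean element profiles (invented)
means = rng.uniform(1, 10, size=(3, n_elements))
X = np.vstack([rng.normal(m, 0.5, size=(n_per_class, n_elements)) for m in means])
y = np.repeat(["brown", "red", "green"], n_per_class)

# exploratory step: project samples onto the first two principal components
scores = PCA(n_components=2).fit_transform(X)

# classification step: LDA with a simple resubstitution accuracy check
lda = LinearDiscriminantAnalysis().fit(X, y)
print(scores.shape)          # (30, 2)
print(lda.score(X, y))       # fraction of correct assignments
```

On well-separated synthetic classes the resubstitution accuracy is near 1.0; the study's reported 100% LDA assignment is of course on real, cross-checked data.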

  4. Speckle patterns produced by an optical vortex and its application to surface roughness measurements.

    PubMed

    Passos, M H M; Lemos, M R; Almeida, S R; Balthazar, W F; da Silva, L; Huguenin, J A O

    2017-01-10

    In this work, we report on the analysis of speckle patterns produced by illuminating different rough surfaces with an optical vortex, a first-order (l=1) Laguerre-Gaussian beam. The generated speckle patterns were observed in the normal direction in four different planes: the diffraction plane, image plane, focal plane, and exact Fourier transform plane. The digital speckle patterns were analyzed using the Hurst exponent of the digital images, an interesting tool for studying surface roughness. We show, as a proof of principle, that the Hurst exponent of a digital speckle pattern is more sensitive to surface roughness when the speckle pattern is produced by an optical vortex and observed at the focal plane. We also show that the Hurst exponent is not very sensitive to the topological charge l. These results open new possibilities for investigation in speckle metrology, given that several techniques use speckle patterns for different applications.
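
    The Hurst exponent analysis mentioned above can be approximated with a simple rescaled-range (R/S) estimator. The sketch below runs on a synthetic 1-D intensity profile rather than a 2-D speckle image; the window sizes and white-noise test signal are illustrative assumptions (white noise should yield H near 0.5).

```python
# Minimal rescaled-range (R/S) estimate of the Hurst exponent for a 1-D
# profile, as a stand-in for the image-based roughness analysis.
import numpy as np

def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
    x = np.asarray(x, dtype=float)
    rs = []
    for w in window_sizes:
        n = len(x) // w
        chunks = x[:n * w].reshape(n, w)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
        s = chunks.std(axis=1)                  # per-window standard deviation
        rs.append((r / s).mean())
    # slope of log(R/S) versus log(window size) estimates H
    h, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return h

rng = np.random.default_rng(1)
profile = rng.normal(size=4096)   # white noise: H should come out near 0.5
print(round(hurst_rs(profile), 2))
```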

  5. Unravelling associations between unassigned mass spectrometry peaks with frequent itemset mining techniques.

    PubMed

    Vu, Trung Nghia; Mrzic, Aida; Valkenborg, Dirk; Maes, Evelyne; Lemière, Filip; Goethals, Bart; Laukens, Kris

    2014-01-01

    Mass spectrometry-based proteomics experiments generate spectra that are rich in information. Often only a fraction of this information is used for peptide/protein identification, whereas a significant proportion of the peaks in a spectrum remain unexplained. In this paper we explore how a specific class of data mining techniques termed "frequent itemset mining" can be employed to discover patterns in the unassigned data, and how such patterns can help us interpret the origin of the unexpected/unexplained peaks. First, a model is proposed that describes the origin of the observed peaks in a mass spectrum. For this purpose we use the classical correlative database search algorithm. Peaks that support a positive identification of the spectrum are termed explained peaks. Next, frequent itemset mining techniques are introduced to infer which unexplained peaks are associated in a spectrum. The method is validated on two types of experimental proteomic data. First, peptide mass fingerprint data are analyzed to explain the unassigned peaks in a full scan mass spectrum. Interestingly, a large number of experimental spectra reveal several highly frequent unexplained masses, and pattern mining on these frequent masses demonstrates that subsets of these peaks frequently co-occur. Further evaluation shows that several of these co-occurring peaks indeed have a known common origin, while other patterns are promising hypothesis generators for further analysis. Second, the proposed methodology is validated on tandem mass spectrometry data using a public spectral library, where associations between the mass differences of unassigned peaks and peptide modifications are explored. The investigation of the found patterns illustrates that meaningful patterns can be discovered that can be explained by features of the employed technology and the modifications found.
This simple approach offers opportunities to monitor accumulating unexplained mass spectrometry data for emerging new patterns, with possible applications for the development of mass exclusion lists, for the refinement of quality control strategies and for a further interpretation of unexplained spectral peaks in mass spectrometry and tandem mass spectrometry.
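
    The frequent itemset idea can be illustrated with a toy co-occurrence count over binned peak masses. The masses and support threshold below are invented for illustration; a production analysis would use a dedicated miner (e.g. Apriori or FP-growth) rather than this pairwise count.

```python
# Toy frequent-pair mining over unassigned peaks: each spectrum is reduced to
# a set of binned peak masses, and pairs co-occurring in at least
# `min_support` spectra are reported. All masses here are hypothetical.
from itertools import combinations
from collections import Counter

spectra = [
    {147.1, 175.1, 229.1},
    {147.1, 175.1, 391.2},
    {147.1, 175.1, 229.1, 503.3},
    {229.1, 503.3},
]

min_support = 2
pair_counts = Counter(
    pair for peaks in spectra for pair in combinations(sorted(peaks), 2)
)
frequent_pairs = {p: c for p, c in pair_counts.items() if c >= min_support}
print(frequent_pairs)   # e.g. the pair (147.1, 175.1) co-occurs in 3 spectra
```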

  6. Microbead-regulated surface wrinkling patterns in a film-substrate system

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Wang, Jiawen; Cao, Yan-Ping; Lu, Conghua; Li, Bo; Feng, Xi-Qiao

    2017-10-01

    The control of surface wrinkling patterns at the microscale is a concern in many applications. In this letter, we regulate surface wrinkling patterns on a film-substrate system by introducing microbeads atop the film. Both experiments and theoretical analysis reveal the changes in surface wrinkles induced by microbeads. Under equibiaxial compression, the film-substrate system without microbeads bonded on its upper surface often buckles into global, uniform labyrinths, whereas the labyrinthine pattern locally gives way to radial stripes emanating from the microbeads. This regulation of surface wrinkles depends on the sizes and spacing of microbeads. We combine the finite element method and the Fourier spectral method to explore the physical mechanisms underlying the phenomena. This study offers a viable technique for engineering surfaces with tunable functions.

  7. The analysis of crystallographic symmetry types in finite groups

    NASA Astrophysics Data System (ADS)

    Sani, Atikah Mohd; Sarmin, Nor Haniza; Adam, Nooraishikin; Zamri, Siti Norziahidayu Amzee

    2014-06-01

    Undeniably, it is human nature to prefer objects which are considered beautiful. Most consider beauty as perfection; hence, people try to create objects which are perfectly balanced in shape and pattern. This creates a whole different kind of art, the kind that requires an object to be symmetrical, and it leads to the study of symmetrical objects and patterns. Even mathematicians and ethnomathematicians are very interested in the essence of symmetry. One such study was conducted on the Malay traditional triaxial weaving culture. The patterns derived from this technique are symmetrical, and this allows for further research. In this paper, the 17 symmetry types in a plane, known as the wallpaper groups, are studied and discussed. The wallpaper groups are then applied to the triaxial patterns of food covers in Malaysia.

  8. Yearly report, Yucca Mountain project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brune, J.N.

    1992-09-30

    We proposed to (1) develop our data logging and analysis equipment and techniques for analyzing seismic data from the Southern Great Basin Seismic Network (SGBSN); (2) investigate the SGBSN data for evidence of seismicity patterns, depth distribution patterns, and correlations with geologic features; (3) repair and maintain our three broadband downhole digital seismograph stations at Nelson, Nevada, Troy Canyon, Nevada, and Deep Springs, California; (4) install, operate, and log data from a supersensitive microearthquake array at Yucca Mountain; and (5) analyze data from microearthquakes relative to seismic hazard at Yucca Mountain.

  9. Patterns of healthcare service utilisation following severe traumatic brain injury: an idiographic analysis of injury compensation claims data.

    PubMed

    Collie, A; Prang, K-H

    2013-11-01

    The rate and extent of recovery after severe traumatic brain injury (TBI) is heterogeneous, making prediction of likely healthcare service utilisation (HSU) difficult. Patterns of HSU derived from nomothetic samples do not represent the diverse range of outcomes possible within this patient group. Group-based trajectory modelling is a semi-parametric statistical technique that seeks to identify clusters of individuals whose outcome (however measured) follows a similar pattern of change over time. We sought to identify and characterise patterns of HSU in the 5-year period following severe TBI. Detailed healthcare treatment payments data for 316 adults with severe TBI (Glasgow Coma Scale score 3-8) from the transport accident compensation system in the state of Victoria, Australia were accessed for this analysis. A semi-parametric group-based trajectory analytical technique for longitudinal data was applied to monthly counts of HSU to identify distinct clusters of participants' trajectories. Comparisons between trajectory groups on demographic, injury, disability and compensation-relevant outcomes were undertaken. Four distinct patterns (trajectories) of HSU were identified in the sample. The first trajectory group comprised 27% of participants and displayed a rapid decrease in HSU in the first year post-injury. The second group comprised 24% of participants and showed a sharp peak in HSU during the first 12 months post-injury followed by a decline over time. The third group comprised 32% of participants and showed a slight peak in HSU in the first few months post-injury and then a slow decline over time. The fourth group comprised 17% of participants and displayed a steady rise in HSU up to 30 months post-injury, followed by a gradual decline to a level consistent with that received in the first months post-injury. Significant differences were observed between groups on factors such as age, injury severity, and use of disability services. 
There is substantial variation in patterns of HSU following severe TBI. Idiographic analysis can provide rich information for describing and understanding the resources required to help people with TBI. Copyright © 2013 Elsevier Ltd. All rights reserved.
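
    Group-based trajectory modelling itself is a semi-parametric mixture technique; as a rough, hedged substitute, the sketch below clusters synthetic monthly service-count trajectories with k-means. The four trajectory shapes, cohort size, and noise level are invented to mirror the groups described, and are not the study's data or method.

```python
# Simplified stand-in for trajectory-group identification: k-means clustering
# of synthetic 60-month service-count trajectories (4 invented shape classes).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
months = np.arange(60)
shapes = [
    np.exp(-months / 6),                                 # rapid early decline
    np.exp(-((months - 6) / 8) ** 2),                    # early peak, then decline
    0.4 * np.exp(-months / 24),                          # slight peak, slow decline
    np.clip(months / 30, 0, 1) * np.exp(-months / 40),   # delayed rise
]
X = np.vstack([s * 20 + rng.normal(0, 0.5, size=(80, 60)) for s in shapes])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
print(np.bincount(km.labels_))   # participants assigned to each trajectory group
```

Unlike k-means, the study's group-based trajectory model also estimates group membership probabilities and polynomial trajectory shapes; this sketch only conveys the clustering intuition.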

  10. Analysis of artifacts suggests DGGE should not be used for quantitative diversity analysis.

    PubMed

    Neilson, Julia W; Jordan, Fiona L; Maier, Raina M

    2013-03-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Image distortion analysis using polynomial series expansion.

    PubMed

    Baggenstoss, Paul M

    2004-11-01

    In this paper, we derive a technique for the analysis of local distortions which affect data in real-world applications, focusing on image data, specifically handwritten characters. Given a reference image and a distorted copy of it, the method is able to efficiently determine the rotations, translations, scaling, and any other distortions that have been applied. Because the method is robust, it is also able to estimate distortions for two unrelated images, thus determining the distortions that would be required to cause the two images to resemble each other. The approach is based on a polynomial series expansion using matrix powers of linear transformation matrices. The technique has applications in pattern recognition in the presence of distortions.
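
    The core idea, expressing a small distortion through a truncated polynomial series in matrix powers, can be illustrated for a pure rotation, where the series approximates the matrix exponential of the rotation generator. The angle and truncation order below are illustrative choices, not the paper's parameters.

```python
# A small rotation written as expm(A) for a generator A, approximated by the
# truncated series I + A + A^2/2! + ... (matrix powers of a linear transform).
import numpy as np

theta = 0.1                                       # small rotation angle (rad)
A = theta * np.array([[0.0, -1.0], [1.0, 0.0]])   # rotation generator

approx = np.eye(2)
term = np.eye(2)
for k in range(1, 6):                             # 5th-order truncation
    term = term @ A / k
    approx += term

exact = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
print(np.abs(approx - exact).max())               # tiny truncation error
```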

  12. MANOVA vs nonlinear mixed effects modeling: The comparison of growth patterns of female and male quail

    NASA Astrophysics Data System (ADS)

    Gürcan, Eser Kemal

    2017-04-01

    The most commonly used methods for analyzing time-dependent data are multivariate analysis of variance (MANOVA) and nonlinear regression models. The aim of this study was to compare several MANOVA techniques and a nonlinear mixed modeling approach for investigating growth differentiation in female and male Japanese quail. Weekly individual body weight data from 352 male and 335 female quail, from hatch to 8 weeks of age, were used in the analyses. When all the analyses are evaluated together, nonlinear mixed modeling is superior to the other techniques because it also reveals the individual variation. In addition, profile analysis provides important complementary information.
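
    A nonlinear growth-curve fit of the kind compared in the study can be sketched with a Gompertz function (one common choice for quail growth; the abstract does not name the exact model). The weekly weights below are synthetic and only loosely quail-like.

```python
# Hedged sketch: fit a Gompertz growth curve to synthetic weekly body weights.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    # a: asymptotic weight, b: shape parameter, c: growth rate
    return a * np.exp(-b * np.exp(-c * t))

weeks = np.arange(9)                        # hatch to 8 weeks of age
rng = np.random.default_rng(3)
true = gompertz(weeks, 180.0, 3.0, 0.5)     # grams; illustrative parameters
weights = true + rng.normal(0, 2.0, size=weeks.size)

params, _ = curve_fit(gompertz, weeks, weights, p0=(150.0, 2.0, 0.3))
print(np.round(params, 1))                  # recovered (a, b, c)
```

A nonlinear *mixed* model would additionally let the parameters (e.g. `a` and `c`) vary by individual bird, which is what reveals the individual variation the abstract highlights.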

  13. Spectral mapping of soil organic matter

    NASA Technical Reports Server (NTRS)

    Kristof, S. J.; Baumgardner, M. F.; Johannsen, C. J.

    1974-01-01

    Multispectral remote sensing data were examined for use in the mapping of soil organic matter content. Computer-implemented pattern recognition techniques were used to analyze data collected in May 1969 and May 1970 by an airborne multispectral scanner over a 40-km flightline. Two fields within the flightline were selected for intensive study. Approximately 400 surface soil samples from these fields were obtained for organic matter analysis. The analytical data were used as training sets for computer-implemented analysis of the spectral data. It was found that within the geographical limitations included in this study, multispectral data and automatic data processing techniques could be used very effectively to delineate and map surface soils areas containing different levels of soil organic matter.

  14. Information spreading by a combination of MEG source estimation and multivariate pattern classification.

    PubMed

    Sato, Masashi; Yamashita, Okito; Sato, Masa-Aki; Miyawaki, Yoichi

    2018-01-01

    To understand information representation in human brain activity, it is important to investigate its fine spatial patterns at high temporal resolution. One possible approach is to use source estimation of magnetoencephalography (MEG) signals. Previous studies have mainly quantified accuracy of this technique according to positional deviations and dispersion of estimated sources, but it remains unclear how accurately MEG source estimation restores information content represented by spatial patterns of brain activity. In this study, using simulated MEG signals representing artificial experimental conditions, we performed MEG source estimation and multivariate pattern analysis to examine whether MEG source estimation can restore information content represented by patterns of cortical current in source brain areas. Classification analysis revealed that the corresponding artificial experimental conditions were predicted accurately from patterns of cortical current estimated in the source brain areas. However, accurate predictions were also possible from brain areas whose original sources were not defined. Searchlight decoding further revealed that this unexpected prediction was possible across wide brain areas beyond the original source locations, indicating that information contained in the original sources can spread through MEG source estimation. This phenomenon of "information spreading" may easily lead to false-positive interpretations when MEG source estimation and classification analysis are combined to identify brain areas that represent target information. Real MEG data analyses also showed that presented stimuli were able to be predicted in the higher visual cortex at the same latency as in the primary visual cortex, also suggesting that information spreading took place. These results indicate that careful inspection is necessary to avoid false-positive interpretations when MEG source estimation and multivariate pattern analysis are combined.

  15. Information spreading by a combination of MEG source estimation and multivariate pattern classification

    PubMed Central

    Sato, Masashi; Yamashita, Okito; Sato, Masa-aki

    2018-01-01

    To understand information representation in human brain activity, it is important to investigate its fine spatial patterns at high temporal resolution. One possible approach is to use source estimation of magnetoencephalography (MEG) signals. Previous studies have mainly quantified accuracy of this technique according to positional deviations and dispersion of estimated sources, but it remains unclear how accurately MEG source estimation restores information content represented by spatial patterns of brain activity. In this study, using simulated MEG signals representing artificial experimental conditions, we performed MEG source estimation and multivariate pattern analysis to examine whether MEG source estimation can restore information content represented by patterns of cortical current in source brain areas. Classification analysis revealed that the corresponding artificial experimental conditions were predicted accurately from patterns of cortical current estimated in the source brain areas. However, accurate predictions were also possible from brain areas whose original sources were not defined. Searchlight decoding further revealed that this unexpected prediction was possible across wide brain areas beyond the original source locations, indicating that information contained in the original sources can spread through MEG source estimation. This phenomenon of “information spreading” may easily lead to false-positive interpretations when MEG source estimation and classification analysis are combined to identify brain areas that represent target information. Real MEG data analyses also showed that presented stimuli were able to be predicted in the higher visual cortex at the same latency as in the primary visual cortex, also suggesting that information spreading took place. These results indicate that careful inspection is necessary to avoid false-positive interpretations when MEG source estimation and multivariate pattern analysis are combined. PMID:29912968

  16. Study to determine cloud motion from meteorological satellite data

    NASA Technical Reports Server (NTRS)

    Clark, B. B.

    1972-01-01

    Processing techniques were tested for deducing cloud motion vectors from overlapped portions of pairs of pictures made from meteorological satellites. This was accomplished by programming and testing techniques for estimating pattern motion by means of cross correlation analysis with emphasis placed upon identifying and reducing errors resulting from various factors. Techniques were then selected and incorporated into a cloud motion determination program which included a routine which would select and prepare sample array pairs from the preprocessed test data. The program was then subjected to limited testing with data samples selected from the Nimbus 4 THIR data provided by the 11.5 micron channel.
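
    The cross-correlation step for estimating pattern motion can be sketched as follows: the displacement between two frames is recovered from the peak of their FFT-based cross-correlation. The "cloud" field here is synthetic white noise with an imposed shift, purely to illustrate the principle.

```python
# Estimate the shift between two image frames from the peak of their
# cross-correlation, computed via FFT (the basis of cloud-motion vectors).
import numpy as np

rng = np.random.default_rng(4)
frame1 = rng.normal(size=(64, 64))
shift = (5, -3)                               # imposed displacement (rows, cols)
frame2 = np.roll(frame1, shift, axis=(0, 1))  # second frame: shifted copy

# cross-correlation via the correlation theorem; the peak gives the shift
xcorr = np.fft.ifft2(np.fft.fft2(frame1).conj() * np.fft.fft2(frame2)).real
peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
est = tuple((p + s // 2) % s - s // 2 for p, s in zip(peak, xcorr.shape))
print(est)                                    # → (5, -3)
```

Real satellite pairs need the additional error handling the abstract mentions (cloud evolution, brightness changes, sub-array selection), which this toy circular-shift case sidesteps.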

  17. Spectroscopic study of PbS nano-structured layer prepared by PLD utilized as a Hall-effect magnetic sensor

    NASA Astrophysics Data System (ADS)

    Atwa, D. M.; Aboulfotoh, N.; El-magd, A. Abo; Badr, Y.

    2013-10-01

    Lead sulfide (PbS) nano-structured films have been grown on quartz substrates using the PLD technique. The deposited films were characterized by several structural techniques, including scanning electron microscopy (SEM), transmission electron microscopy (TEM), and selected-area electron diffraction (SAED) patterns. The results confirm the formation of the cubic phase of PbS nanocrystals. Elemental analysis of the deposited films, compared to the bulk target, was obtained via laser induced fluorescence of the produced plasma particles and the energy dispersive X-ray (EDX) technique. The Hall coefficient measurements indicate efficient performance of the deposited films as a magnetic sensor.

  18. Coordination analysis of players' distribution in football using cross-correlation and vector coding techniques.

    PubMed

    Moura, Felipe Arruda; van Emmerik, Richard E A; Santana, Juliana Exel; Martins, Luiz Eduardo Barreto; Barros, Ricardo Machado Leite de; Cunha, Sergio Augusto

    2016-12-01

    The purpose of this study was to investigate the coordination between teams' spread during football matches using cross-correlation and vector coding techniques. Using a video-based tracking system, we obtained the trajectories of 257 players during 10 matches. Team spread was calculated as a function of time. For a general description of coordination, we calculated the cross-correlation between the signals. Vector coding was used to identify the coordination patterns between teams during offensive sequences that ended in shots on goal or defensive tackles. Cross-correlation showed that opposing teams tend to present in-phase coordination, with a short time lag. During offensive sequences, vector coding results showed that, although in-phase coordination dominated, other patterns were observed. We verified that during the early stages, offensive sequences ending in shots on goal present greater anti-phase and attacking-team-phase periods compared to sequences ending in tackles. The results suggest that the attacking team may seek behaviour contrary to that of its opponent (or may lead the adversary's behaviour) at the beginning of the attacking play, with regard to distribution strategy, to increase the chances of a shot on goal. The techniques allowed the coordination patterns between teams to be detected, providing additional information about football dynamics and players' interaction.
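
    The vector coding technique can be sketched as computing a coupling angle from the frame-to-frame changes of the two team-spread series; angles near 45° or 225° indicate in-phase coordination. The series below are invented, and the angle bins follow one common convention, not necessarily the authors' exact implementation.

```python
# Minimal vector-coding sketch: coupling angle between two team-spread series,
# computed from consecutive differences; series are synthetic and in-phase.
import numpy as np

t = np.linspace(0, 10, 200)
spread_a = 30 + 5 * np.sin(t)   # attacking team spread (m), invented
spread_b = 28 + 5 * np.sin(t)   # defending team spread, in-phase here

# coupling angle in degrees, mapped to [0, 360)
gamma = np.degrees(np.arctan2(np.diff(spread_b), np.diff(spread_a))) % 360

# one common convention: in-phase bins are centred on 45 and 225 degrees
in_phase = np.mean((np.abs(gamma - 45) < 22.5) | (np.abs(gamma - 225) < 22.5))
print(round(in_phase, 2))       # fraction of frames classified as in-phase
```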

  19. A peak position comparison method for high-speed quantitative Laue microdiffraction data processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kou, Jiawei; Chen, Kai; Tamura, Nobumichi

    Indexing the Laue patterns of a synchrotron microdiffraction scan can take as much as ten times longer than collecting the data, impeding efficient structural analysis using this technique. Here, a novel strategy is developed. By comparing the peak positions of adjacent Laue patterns and checking the intensity sequence, grain and phase boundaries are identified, requiring only a limited number of indexing steps for each individual grain. Using this protocol, the Laue patterns can be indexed on the fly as they are taken. The validity of this method is demonstrated by analyzing the microstructure of a laser 3D printed multi-phase/multi-grain Ni-based superalloy.
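
    The peak-position comparison step can be illustrated with a toy function that counts peaks in one pattern having a counterpart in the adjacent pattern within a pixel tolerance; a low match fraction would flag a grain or phase boundary. The peak lists, tolerance, and threshold are illustrative assumptions, not the authors' values.

```python
# Toy boundary detection by peak-position comparison between adjacent Laue
# patterns: a high match fraction means "same grain", a low one a boundary.
import numpy as np

def match_fraction(peaks_a, peaks_b, tol=2.0):
    # fraction of peaks in A with a counterpart in B closer than tol (pixels)
    d = np.linalg.norm(peaks_a[:, None, :] - peaks_b[None, :, :], axis=2)
    return np.mean(d.min(axis=1) < tol)

rng = np.random.default_rng(5)
grain = rng.uniform(0, 1000, size=(40, 2))               # peak (x, y) positions
same_grain = grain + rng.normal(0, 0.3, size=grain.shape)  # small jitter only
new_grain = rng.uniform(0, 1000, size=(40, 2))           # unrelated pattern

print(match_fraction(grain, same_grain))   # ~1.0: same grain, no boundary
print(match_fraction(grain, new_grain))    # low: grain/phase boundary flagged
```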

  20. Root System Water Consumption Pattern Identification on Time Series Data

    PubMed Central

    Figueroa, Manuel; Pope, Christopher

    2017-01-01

    In agriculture, soil and meteorological sensors are used along low power networks to capture data, which allows for optimal resource usage while minimizing environmental impact. This study uses time series analysis methods for outlier detection and pattern recognition on soil moisture sensor data to identify irrigation and consumption patterns and to improve a soil moisture prediction and irrigation system. This study compares three new algorithms with the current detection technique in the project; the results greatly decrease the number of false positives detected. The best result is obtained by the Series Strings Comparison (SSC) algorithm, averaging a precision of 0.872 on the testing sets, vastly improving on the current system's 0.348 precision. PMID:28621739

  1. Root System Water Consumption Pattern Identification on Time Series Data.

    PubMed

    Figueroa, Manuel; Pope, Christopher

    2017-06-16

    In agriculture, soil and meteorological sensors are used along low power networks to capture data, which allows for optimal resource usage while minimizing environmental impact. This study uses time series analysis methods for outlier detection and pattern recognition on soil moisture sensor data to identify irrigation and consumption patterns and to improve a soil moisture prediction and irrigation system. This study compares three new algorithms with the current detection technique in the project; the results greatly decrease the number of false positives detected. The best result is obtained by the Series Strings Comparison (SSC) algorithm, averaging a precision of 0.872 on the testing sets, vastly improving on the current system's 0.348 precision.
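
    The abstract does not specify the SSC algorithm's internals, so the sketch below shows a generic rolling median/MAD outlier detector for a soil-moisture series, as one plausible baseline of the kind such a system might compare against. The thresholds and the synthetic series are illustrative only.

```python
# Generic outlier detection on a soil-moisture series: a rolling median/MAD
# z-score flags sudden jumps such as irrigation events. Not the SSC algorithm.
import numpy as np

def rolling_outliers(x, window=12, z_thresh=5.0):
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        ref = x[i - window:i]                  # trailing reference window
        med = np.median(ref)
        mad = np.median(np.abs(ref - med)) + 1e-9
        flags[i] = abs(x[i] - med) / (1.4826 * mad) > z_thresh
    return flags

rng = np.random.default_rng(6)
moisture = 25 + 0.5 * rng.normal(size=200)     # % volumetric water content
moisture[120] += 8.0                           # sudden spike (e.g. irrigation)

flags = rolling_outliers(moisture)
print(np.flatnonzero(flags))                   # indices flagged as outliers
```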

  2. A peak position comparison method for high-speed quantitative Laue microdiffraction data processing

    DOE PAGES

    Kou, Jiawei; Chen, Kai; Tamura, Nobumichi

    2018-09-12

    Indexing Laue patterns of a synchrotron microdiffraction scan can take as much as ten times longer than collecting the data, impeding efficient structural analysis with this technique. Here, a novel strategy is developed: by comparing the peak positions of adjacent Laue patterns and checking the intensity sequence, grain and phase boundaries are identified, so that only a limited number of indexing steps is required for each individual grain. Using this protocol, the Laue patterns can be indexed on the fly as they are taken. The method is validated by analyzing the microstructure of a laser 3D printed multi-phase/multi-grain Ni-based superalloy.
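
    The boundary-detection idea can be sketched as follows; the peak lists, pixel tolerance, and the `index_full` routine are illustrative placeholders, not the authors' implementation:

```python
import numpy as np

def same_grain(peaks_a, peaks_b, tol=2.0):
    """Decide whether two adjacent Laue patterns come from the same grain by
    checking that their brightest peaks (listed in intensity order as (x, y)
    detector coordinates) sit at matching positions within `tol` pixels."""
    if len(peaks_a) != len(peaks_b):
        return False
    d = np.linalg.norm(np.asarray(peaks_a) - np.asarray(peaks_b), axis=1)
    return bool(np.all(d < tol))

def index_scan(patterns, index_full):
    """Run the expensive full indexing only once per grain; patterns whose
    peaks match their predecessor inherit the previous solution."""
    results = []
    prev_peaks, prev_solution = None, None
    for peaks in patterns:
        if prev_peaks is None or not same_grain(peaks, prev_peaks):
            prev_solution = index_full(peaks)   # expensive step
        prev_peaks = peaks
        results.append(prev_solution)
    return results
```

    With most scan points falling inside grains rather than on boundaries, the number of full indexing calls drops to roughly the number of grains, which is the source of the claimed on-the-fly speedup.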

  3. EMGAN: A computer program for time and frequency domain reduction of electromyographic data

    NASA Technical Reports Server (NTRS)

    Hursta, W. N.

    1975-01-01

    An experiment in electromyography utilizing surface electrode techniques was developed for the Apollo-Soyuz test project. This report describes the computer program, EMGAN, which was written to provide first order data reduction for the experiment. EMG signals are produced by the membrane depolarization of muscle fibers during a muscle contraction. Surface electrodes detect a spatially summated signal from a large number of muscle fibers commonly called an interference pattern. An interference pattern is usually so complex that analysis through signal morphology is extremely difficult if not impossible. It has become common to process EMG interference patterns in the frequency domain. Muscle fatigue and certain myopathic conditions are recognized through changes in muscle frequency spectra.

  4. Persistence of polyomavirus in mice infected as adults differs from that observed in mice infected as newborns.

    PubMed Central

    Berke, Z; Dalianis, T

    1993-01-01

    Using the polymerase chain reaction (PCR), a technique more sensitive than Southern analysis (which allows detection of polyomavirus DNA only in newborn and nude adult mice), it has now been possible to monitor for the first time the persistence pattern of polyomavirus DNA after infection of normal adult CBA mice. Viral signs appeared gradually, showing variations in time course and organ distribution between mice, and reached peak activity after 2 to 3 weeks, when they could be found in bone, heart, gonads, lymph node, and skin, but disappeared by 2 to 5 months. No viral DNA was detected in the kidneys or lungs, in contrast to what is observed after infection of newborn mice. This finding suggests that the persistence pattern of polyomavirus is age dependent. PMID:8389934

  5. Advanced image based methods for structural integrity monitoring: Review and prospects

    NASA Astrophysics Data System (ADS)

    Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.

    2018-02-01

    There is a growing trend in engineering to develop methods for structural integrity monitoring and characterization of the in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics brought about a paradigm change in phenomena sensing. Hence, several widely applicable optical approaches are playing a significant role in support of experiments. The current review manuscript describes advanced image-based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.

  6. Optical holographic structural analysis of Kevlar rocket motor cases

    NASA Astrophysics Data System (ADS)

    Harris, W. J.

    1981-05-01

    The methodology of applying optical holography to evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.

  7. Pattern Discovery in Biomolecular Data – Tools, Techniques, and Applications | Center for Cancer Research

    Cancer.gov

    Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph

  8. Cooperative Interactions in Peer Tutoring: Patterns and Sequences in Paired Writing

    ERIC Educational Resources Information Center

    Duran, David

    2010-01-01

    The research analyzes the interaction of 24 secondary students (12 pairs) when using peer tutoring techniques to learn Catalan. Students worked together in a program to produce an authentic writing experience. Significant increases were observed in pre- and posttest Catalan attainment scores of students. An analysis of the…

  9. Correlation Functions Aid Analyses Of Spectra

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Norton, Robert H., Jr.

    1989-01-01

    New uses found for correlation functions in analyses of spectra. In an approach combining elements of both pattern-recognition and traditional spectral-analysis techniques, spectral lines are identified in data that appear useless at first glance because they are dominated by noise. New approach particularly useful in measurements of concentrations of rare species of molecules in atmosphere.
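
    A minimal sketch of the underlying idea: matched-filter correlation pulls a noise-dominated spectral line out of synthetic data (the line position, width, and noise level below are invented for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 2001)                       # wavenumber axis (arbitrary units)
template = np.exp(-0.5 * (x / 0.02) ** 2)              # assumed Gaussian line shape
line_pos = 0.3
spectrum = 0.3 * np.exp(-0.5 * ((x - line_pos) / 0.02) ** 2)  # weak line
noisy = spectrum + rng.normal(0.0, 0.2, x.size)        # line buried in noise

# Cross-correlating with the expected line shape concentrates the line's
# energy into one lag, so the correlation peak recovers the position even
# though the line is invisible in the raw trace.
corr = np.correlate(noisy - noisy.mean(), template, mode="same")
est = x[np.argmax(corr)]
```

    The peak of `corr` lands at the true line position to within a few samples, which is the sense in which correlation "aids analyses" of noise-dominated spectra.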

  10. Improving the Scalability of an Exact Approach for Frequent Item Set Hiding

    ERIC Educational Resources Information Center

    LaMacchia, Carolyn

    2013-01-01

    Technological advances have led to the generation of large databases of organizational data recognized as an information-rich, strategic asset for internal analysis and sharing with trading partners. Data mining techniques can discover patterns in large databases including relationships considered strategically relevant to the owner of the data.…

  11. Multi-perspective analysis and spatiotemporal mapping of air pollution monitoring data.

    PubMed

    Kolovos, Alexander; Skupin, André; Jerrett, Michael; Christakos, George

    2010-09-01

    Space-time data analysis and assimilation techniques in atmospheric sciences typically consider input from monitoring measurements. The input is often processed in a manner that acknowledges characteristics of the measurements (e.g., underlying patterns, fluctuation features) under conditions of uncertainty; it also leads to the derivation of secondary information that serves study-oriented goals, and provides input to space-time prediction techniques. We present a novel approach that blends a rigorous space-time prediction model (Bayesian maximum entropy, BME) with a cognitively informed visualization of high-dimensional data (spatialization). The combined BME and spatialization approach (BME-S) is used to study monthly averaged NO2 and mean annual SO4 measurements in California over the 15-year period 1988-2002. Using the original scattered measurements of these two pollutants BME generates spatiotemporal predictions on a regular grid across the state. Subsequently, the prediction network undergoes the spatialization transformation into a lower-dimensional geometric representation, aimed at revealing patterns and relationships that exist within the input data. The proposed BME-S provides a powerful spatiotemporal framework to study a variety of air pollution data sources.

  12. Shotgun metagenomic data streams: surfing without fear

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berendzen, Joel R

    2010-12-06

    Timely information about bio-threat prevalence, consequence, propagation, attribution, and mitigation is needed to support decision-making, both routinely and in a crisis. One DNA sequencer can stream 25 Gbp of information per day, but sampling strategies and analysis techniques are needed to turn raw sequencing power into actionable knowledge. Shotgun metagenomics can enable biosurveillance at the level of a single city, hospital, or airplane. Metagenomics characterizes viruses and bacteria from complex environments such as soil, air filters, or sewage. Unlike targeted-primer-based sequencing, shotgun methods are not blind to sequences that are truly novel, and they can measure absolute prevalence. Shotgun metagenomic sampling can be non-invasive, efficient, and inexpensive while being informative. We have developed analysis techniques for shotgun metagenomic sequencing that rely upon phylogenetic signature patterns. They work by indexing local sequence patterns in a manner similar to web search engines. Our methods are laptop-fast and favorable scaling properties ensure they will be sustainable as sequencing methods grow. We show examples of application to soil metagenomic samples.

  13. Characterizing multivariate decoding models based on correlated EEG spectral features.

    PubMed

    McFarland, Dennis J

    2013-07-01

    Multivariate decoding methods are popular techniques for analysis of neurophysiological data. The present study explored potential interpretative problems with these techniques when predictors are correlated. Data from sensorimotor rhythm-based cursor control experiments was analyzed offline with linear univariate and multivariate models. Features were derived from autoregressive (AR) spectral analysis of varying model order which produced predictors that varied in their degree of correlation (i.e., multicollinearity). The use of multivariate regression models resulted in much better prediction of target position as compared to univariate regression models. However, with lower order AR features interpretation of the spectral patterns of the weights was difficult. This is likely to be due to the high degree of multicollinearity present with lower order AR features. Care should be exercised when interpreting the pattern of weights of multivariate models with correlated predictors. Comparison with univariate statistics is advisable. While multivariate decoding algorithms are very useful for prediction their utility for interpretation may be limited when predictors are correlated. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
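
    The multicollinearity caveat can be reproduced with a toy example (synthetic data, not the study's EEG features): two nearly identical predictors give excellent prediction, yet only the sum of their weights is stable, so the individual weights carry little interpretable meaning.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
f1 = rng.normal(size=n)
f2 = f1 + 0.01 * rng.normal(size=n)   # nearly collinear with f1
y = f1 + 0.1 * rng.normal(size=n)     # target driven by the shared signal

X = np.column_stack([f1, f2])
w, *_ = np.linalg.lstsq(X, y, rcond=None)   # multivariate linear model
pred = X @ w
r = np.corrcoef(pred, y)[0, 1]

# Prediction is excellent (r near 1), and w[0] + w[1] is pinned near 1,
# but the split between w[0] and w[1] is essentially arbitrary: the
# poorly determined direction f1 - f2 contributes almost nothing to y.
```

    This is the abstract's point in miniature: decoding accuracy can be high while the pattern of weights over correlated predictors resists interpretation.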

  14. Variational optimization analysis of temperature and moisture advection in a severe storm environment

    NASA Technical Reports Server (NTRS)

    Mcfarland, M. J.

    1975-01-01

    Horizontal wind components, potential temperature, and mixing ratio fields associated with a severe storm environment in the south central U.S. were analyzed from synoptic upper air observations with a nonhomogeneous, anisotropic weighting function. Each data field was filtered with variational optimization analysis techniques. Variational optimization analysis was also performed on the vertical motion field and was used to produce advective forecasts of the potential temperature and mixing ratio fields. Results show that the dry intrusion is characterized by warm air, the advection of which produces a well-defined upward motion pattern. A corresponding downward motion pattern comprising a deep vertical circulation in the warm air sector of the low pressure system was detected. The axes alignment of maximum dry and warm advection with the axis of the tornado-producing squall line also resulted.

  15. Technological innovations for a sustainable business model in the semiconductor industry

    NASA Astrophysics Data System (ADS)

    Levinson, Harry J.

    2014-09-01

    Increasing costs of wafer processing, particularly for lithographic processes, have made it increasingly difficult to achieve simultaneous reductions in cost-per-function and area per device. Multiple patterning techniques have made possible the fabrication of circuit layouts below the resolution limit of single optical exposures but have led to significant increases in the costs of patterning. Innovative techniques, such as self-aligned double patterning (SADP) have enabled good device performance when using less expensive patterning equipment. Other innovations have directly reduced the cost of manufacturing. A number of technical challenges must be overcome to enable a return to single-exposure patterning using short wavelength optical techniques, such as EUV patterning.

  16. Associative Memory Synthesis, Performance, Storage Capacity And Updating: New Heteroassociative Memory Results

    NASA Astrophysics Data System (ADS)

    Casasent, David; Telfer, Brian

    1988-02-01

    The storage capacity, noise performance, and synthesis of associative memories for image analysis are considered. Associative memory synthesis is shown to be very similar to that of linear discriminant functions used in pattern recognition. These lead to new associative memories and new associative memory synthesis and recollection vector encodings. Heteroassociative memories are emphasized in this paper, rather than autoassociative memories, since heteroassociative memories provide scene analysis decisions, rather than merely enhanced output images. The analysis of heteroassociative memories has been given little attention. Heteroassociative memory performance and storage capacity are shown to be quite different from those of autoassociative memories, with much more dependence on the recollection vectors used and less dependence on M/N. This allows several different and preferable synthesis techniques to be considered for associative memories. These new associative memory synthesis techniques and new techniques to update associative memories are included. We also introduce a new SNR performance measure that is preferable to conventional noise standard deviation ratios.

  17. Measurement of elastic and thermal properties of composite materials using digital speckle pattern interferometry

    NASA Astrophysics Data System (ADS)

    Kumar, Manoj; Khan, Gufran S.; Shakher, Chandra

    2015-08-01

    In the present work, digital speckle pattern interferometry (DSPI) was applied to the measurement of the mechanical/elastic and thermal properties of fibre reinforced plastics (FRP). The technique was used to characterize the material constants (Poisson's ratio and Young's modulus) of the composite material: Poisson's ratio was measured from plate bending and Young's modulus from plate vibration. In addition, the coefficient of thermal expansion of the composite material was measured. For the thermal strain analysis, a single DSPI fringe pattern is used to extract the phase information by means of the Riesz transform and the monogenic signal; phase extraction from a single DSPI fringe pattern using the Riesz transform requires neither a phase-shifting system nor a spatial carrier. The elastic and thermal parameters obtained from DSPI are in close agreement with theoretical predictions available in the literature.

  18. Describing spatial pattern in stream networks: A practical approach

    USGS Publications Warehouse

    Ganio, L.M.; Torgersen, C.E.; Gresswell, R.E.

    2005-01-01

    The shape and configuration of branched networks influence ecological patterns and processes. Recent investigations of network influences in riverine ecology stress the need to quantify spatial structure not only in a two-dimensional plane, but also in networks. An initial step in understanding data from stream networks is discerning non-random patterns along the network. On the other hand, data collected in the network may be spatially autocorrelated and thus not suitable for traditional statistical analyses. Here we provide a method that uses commercially available software to construct an empirical variogram to describe spatial pattern in the relative abundance of coastal cutthroat trout in headwater stream networks. We describe the mathematical and practical considerations involved in calculating a variogram using a non-Euclidean distance metric to incorporate the network pathway structure in the analysis of spatial variability, and use a non-parametric technique to ascertain if the pattern in the empirical variogram is non-random.
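
    The core computation, an empirical semivariogram binned by a non-Euclidean distance, can be sketched in a few lines (the network path distances are assumed to be precomputed; the bin edges are arbitrary):

```python
import numpy as np

def empirical_variogram(d, z, bins):
    """Empirical semivariogram from a pairwise distance matrix `d` (here:
    path distances along the stream network rather than Euclidean distances)
    and site values `z`. For each distance bin it returns the mean lag and
    gamma(h) = mean of 0.5 * (z_i - z_j)**2 over pairs in the bin."""
    d = np.asarray(d, dtype=float)
    z = np.asarray(z, dtype=float)
    i, j = np.triu_indices(len(z), k=1)      # each unordered pair once
    h = d[i, j]
    g = 0.5 * (z[i] - z[j]) ** 2
    centres, gamma = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        m = (h >= lo) & (h < hi)
        if m.any():
            centres.append(h[m].mean())
            gamma.append(g[m].mean())
    return np.array(centres), np.array(gamma)
```

    Substituting network distance for Euclidean distance is the only change relative to a textbook variogram; the non-random-pattern test the authors describe would then compare this curve against variograms of permuted data.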

  19. A geostatistical approach for describing spatial pattern in stream networks

    USGS Publications Warehouse

    Ganio, L.M.; Torgersen, C.E.; Gresswell, R.E.

    2005-01-01

    The shape and configuration of branched networks influence ecological patterns and processes. Recent investigations of network influences in riverine ecology stress the need to quantify spatial structure not only in a two-dimensional plane, but also in networks. An initial step in understanding data from stream networks is discerning non-random patterns along the network. On the other hand, data collected in the network may be spatially autocorrelated and thus not suitable for traditional statistical analyses. Here we provide a method that uses commercially available software to construct an empirical variogram to describe spatial pattern in the relative abundance of coastal cutthroat trout in headwater stream networks. We describe the mathematical and practical considerations involved in calculating a variogram using a non-Euclidean distance metric to incorporate the network pathway structure in the analysis of spatial variability, and use a non-parametric technique to ascertain if the pattern in the empirical variogram is non-random.

  20. An effective noise-suppression technique for surface microseismic data

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Willis, Mark; Haines, Seth S.; Batzle, Mike; Behura, Jyoti; Davidson, Michael

    2013-01-01

    The presence of strong surface-wave noise in surface microseismic data may decrease the utility of these data. We implement a technique, based on the distinct characteristics that microseismic signal and noise show in the τ‐p domain, to suppress surface-wave noise in microseismic data. Because most microseismic source mechanisms are deviatoric, preprocessing is necessary to correct for the nonuniform radiation pattern prior to transforming the data to the τ‐p domain. We employ a scanning approach, similar to semblance analysis, to test all possible double-couple orientations to determine an estimated orientation that best accounts for the polarity pattern of any microseismic events. We then correct the polarity of the data traces according to this pattern, prior to conducting signal-noise separation in the τ‐p domain. We apply our noise-suppression technique to two surface passive-seismic data sets from different acquisition surveys. The first data set includes a synthetic microseismic event added to field passive noise recorded by an areal receiver array distributed over a Barnett Formation reservoir undergoing hydraulic fracturing. The second data set is field microseismic data recorded by receivers arranged in a star-shaped array, over a Bakken Shale reservoir during a hydraulic-fracturing process. Our technique significantly improves the signal-to-noise ratios of the microseismic events and preserves the waveforms at the individual traces. We illustrate that the enhancement in signal-to-noise ratio also results in improved imaging of the microseismic hypocenter.
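
    The τ-p transform at the heart of the signal-noise separation can be illustrated with a minimal slant stack (synthetic geometry; the polarity-correction scan described above is omitted):

```python
import numpy as np

def slant_stack(data, t, x, slopes):
    """Minimal tau-p (slant-stack) transform: for each trial slowness p,
    shift the trace at offset x[i] by p * x[i] and stack. An event with
    moveout t = tau + p0 * x stacks coherently at slowness p0."""
    out = np.zeros((len(slopes), len(t)))
    for k, p in enumerate(slopes):
        for i, xi in enumerate(x):
            out[k] += np.interp(t + p * xi, t, data[i], left=0.0, right=0.0)
    return out
```

    In the τ-p domain a linear event collapses toward a point, while incoherent energy spreads out, which is what makes muting and inverse-transforming an effective separation step.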

  1. Insomnia and sleep misperception.

    PubMed

    Bastien, C H; Ceklic, T; St-Hilaire, P; Desmarais, F; Pérusse, A D; Lefrançois, J; Pedneault-Drolet, M

    2014-10-01

    Sleep misperception is often observed in individuals with insomnia (INS), and the extent of misperception varies between different types of INS. The present paper comprises sections aimed at studying the sleep EEG and comparing it to subjective reports of sleep in individuals suffering from either psychophysiological insomnia or paradoxical insomnia and in good sleeper controls. The EEG can be studied without any intervention (thus using the raw data) via either PSG or fine quantitative EEG analyses (power spectral analysis [PSA]), by identifying EEG patterns as in the case of cyclic alternating patterns (CAPs), or by decorticating the EEG while scoring the different transient or phasic events (K-complexes or sleep spindles). One can also act on the ongoing EEG by delivering stimuli to study their impact on cortical measures, as in event-related potential (ERP) studies. From the paucity of studies available using these different techniques, a general conclusion can be reached: sleep misperception is not an easy phenomenon to quantify, and its clinical value is not well recognized. Still, while none of the techniques or EEG measures defined in the paper is available and/or recommended to diagnose insomnia, ERPs might be the most indicated technique to study hyperarousal and sleep quality in different types of INS. More research shall also be dedicated to EEG patterns and transient phasic events, as these EEG scoring techniques can offer a unique insight into sleep misperception. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  2. Geometric pre-patterning based tuning of the period doubling onset strain during thin film wrinkling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saha, Sourabh K.

    Wrinkling of supported thin films is an easy-to-implement and low-cost fabrication technique for generation of stretch-tunable periodic micro and nano-scale structures. However, the tunability of such structures is often limited by the emergence of an undesirable period doubled mode at high strains. Predictively tuning the onset strain for period doubling via existing techniques requires one to have extensive knowledge about the nonlinear pattern formation behavior. Herein, a geometric pre-patterning based technique is introduced to delay the onset of period doubling that can be implemented to predictively tune the onset strain even with limited system knowledge. The technique comprises pre-patterning the film/base bilayer with a sinusoidal pattern that has the same period as the natural wrinkle period of the system. The effectiveness of this technique has been verified via physical and computational experiments on the polydimethylsiloxane/glass bilayer system. It is observed that the period doubling onset strain can be increased from the typical value of 20% for flat films to greater than 30% with a modest pre-pattern aspect ratio (2∙amplitude/period) of 0.15. In addition, finite element simulations reveal that (i) the onset strain can be increased up to a limit by increasing the amplitude of the pre-patterns and (ii) the delaying effect can be captured entirely by the pre-pattern geometry. As a result, one can implement this technique even with limited system knowledge, such as material properties or film thickness, by simply replicating pre-existing wrinkled patterns to generate pre-patterned bilayers. Thus, geometric pre-patterning is a practical scheme to suppress period doubling that can increase the operating range of stretch-tunable wrinkle-based devices by at least 50%.

  3. Gradient pattern analysis applied to galaxy morphology

    NASA Astrophysics Data System (ADS)

    Rosa, R. R.; de Carvalho, R. R.; Sautter, R. A.; Barchi, P. H.; Stalder, D. H.; Moura, T. C.; Rembold, S. B.; Morell, D. R. F.; Ferreira, N. C.

    2018-06-01

    Gradient pattern analysis (GPA) is a well-established technique for measuring gradient bilateral asymmetries of a square numerical lattice. This paper introduces an improved version of GPA designed for galaxy morphometry. We show the performance of the new method on a selected sample of 54 896 objects from the SDSS-DR7 in common with Galaxy Zoo 1 catalogue. The results suggest that the second gradient moment, G2, has the potential to dramatically improve over more conventional morphometric parameters. It separates early- from late-type galaxies better (˜ 90 per cent) than the CAS system (C˜ 79 per cent, A˜ 50 per cent, S˜ 43 per cent) and a benchmark test shows that it is applicable to hundreds of thousands of galaxies using typical processing systems.

  4. Subreflector extension for improved efficiencies in Cassegrain antennas - GTD/PO analysis. [Geometrical Theory of Diffraction/Physical Optics

    NASA Technical Reports Server (NTRS)

    Rahmat-Samii, Yahya

    1986-01-01

    Both offset and symmetric Cassegrain reflector antennas are used in satellite and ground communication systems. It is known that the subreflector diffraction can degrade the performance of these reflectors. A geometrical theory of diffraction/physical optics analysis technique is used to investigate the effects of the extended subreflector, beyond its optical rim, on the reflector efficiency and far-field patterns. Representative numerical results are shown for an offset Cassegrain reflector antenna with different feed illumination tapers and subreflector extensions. It is observed that for subreflector extensions as small as one wavelength, noticeable improvements in the overall efficiencies can be expected. Useful design data are generated for the efficiency curves and far-field patterns.

  5. Gradient Pattern Analysis Applied to Galaxy Morphology

    NASA Astrophysics Data System (ADS)

    Rosa, R. R.; de Carvalho, R. R.; Sautter, R. A.; Barchi, P. H.; Stalder, D. H.; Moura, T. C.; Rembold, S. B.; Morell, D. R. F.; Ferreira, N. C.

    2018-04-01

    Gradient pattern analysis (GPA) is a well-established technique for measuring gradient bilateral asymmetries of a square numerical lattice. This paper introduces an improved version of GPA designed for galaxy morphometry. We show the performance of the new method on a selected sample of 54,896 objects from the SDSS-DR7 in common with Galaxy Zoo 1 catalog. The results suggest that the second gradient moment, G2, has the potential to dramatically improve over more conventional morphometric parameters. It separates early from late type galaxies better (˜90%) than the CAS system (C ˜ 79%, A ˜ 50%, S ˜ 43%) and a benchmark test shows that it is applicable to hundreds of thousands of galaxies using typical processing systems.

  6. FT-IR spectroscopic, thermal analysis of human urinary stones and their characterization

    NASA Astrophysics Data System (ADS)

    Selvaraju, R.; Raja, A.; Thiruppathi, G.

    2015-02-01

    In the present study, FT-IR, XRD, and TGA-DTA spectral methods have been used to investigate the chemical compositions of urinary calculi. Multiple components of urinary calculi, such as calcium oxalate, hydroxyapatite, struvite and uric acid, have been studied. The chemical compounds were identified by the FT-IR spectroscopic technique, and the mineral identification was confirmed by powder X-ray diffraction patterns compared against JCPDS reported values. Thermal analysis techniques are considered the best techniques for characterizing and detecting the endothermic and exothermic behaviour of urinary stones. Both hydrates of calcium oxalate (COM and COD) were found to be present together, in the presence of MAPH or UA. Finally, the present study suggests that urolithiasis is a significant health problem in children and is very common in some parts of the world, especially in India. The present study is therefore useful to the scientific community for the identification of human health problems and their remedies using spectroscopic techniques.

  7. High-resolution neutron diffraction study of microstructural changes in nanocrystalline ball-milled niobium carbide NbC{sub 0.93}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balagurov, Anatoly M.; Bobrikov, Ivan A.; Bokuchava, Gizo D.

    2015-11-15

    High-resolution neutron diffraction was applied to elucidate the microstructural evolution of nanocrystalline niobium carbide NbC{sub 0.93} powders subjected to high-energy ball milling. The diffraction patterns were collected with the high-resolution Fourier diffractometer HRFD using the reverse time-of-flight (RTOF) mode of data acquisition. Traditional single diffraction line analysis, the Rietveld method and the more advanced Whole Powder Pattern Modeling technique were applied to the data analysis, and these techniques were compared. It is established that short-time milling produces a non-uniform powder in which two distinct fractions with differing microstructure can be identified. Part of the material is in fact milled efficiently, with a reduction in grain size, an increase in the quantity of defects, and a corresponding tendency to decarburize, reaching a composition NbC{sub 0.80} after 15 h of milling. The rest of the powder is less efficiently processed and preserves its composition and lower defect content. Larger milling times should have homogenized the system by increasing the efficiently milled fraction, but the material is unable to reach a uniform and homogeneous state. It is definitely shown that RTOF neutron diffraction patterns can provide very accurate data for microstructure analysis of nanocrystalline powders. - Highlights: • The NbC{sub 0.93} powder was processed by high-energy ball milling. • The microstrain and dislocation density increase with milling time. • A corresponding decrease in crystallite size with milling time was observed. • The material exhibits the presence of two fractions after ball milling. • The RTOF neutron diffraction data are suitable for accurate microstructure analysis.

  8. Identification of complex metabolic states in critically injured patients using bioinformatic cluster analysis.

    PubMed

    Cohen, Mitchell J; Grossman, Adam D; Morabito, Diane; Knudson, M Margaret; Butte, Atul J; Manley, Geoffrey T

    2010-01-01

    Advances in technology have made extensive monitoring of patient physiology the standard of care in intensive care units (ICUs). While many systems exist to compile these data, there has been no systematic multivariate analysis and categorization across patient physiological data. The sheer volume and complexity of these data make pattern recognition or identification of patient state difficult. Hierarchical cluster analysis allows visualization of high dimensional data and enables pattern recognition and identification of physiologic patient states. We hypothesized that processing of multivariate data using hierarchical clustering techniques would allow identification of otherwise hidden patient physiologic patterns that would be predictive of outcome. Multivariate physiologic and ventilator data were collected continuously using a multimodal bioinformatics system in the surgical ICU at San Francisco General Hospital. These data were incorporated with non-continuous data and stored on a server in the ICU. A hierarchical clustering algorithm grouped each minute of data into 1 of 10 clusters. Clusters were correlated with outcome measures including incidence of infection, multiple organ failure (MOF), and mortality. We identified 10 clusters, which we defined as distinct patient states. While patients transitioned between states, they spent significant amounts of time in each. Clusters were enriched for our outcome measures: 2 of the 10 states were enriched for infection, 6 of 10 were enriched for MOF, and 3 of 10 were enriched for death. Further analysis of correlations between pairs of variables within each cluster reveals significant differences in physiology between clusters. Here we show for the first time the feasibility of clustering physiological measurements to identify clinically relevant patient states after trauma. 
These results demonstrate that hierarchical clustering techniques can be useful for visualizing complex multivariate data and may provide new insights for the care of critically injured patients.
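    The clustering step the abstract describes can be sketched in a few lines using SciPy's hierarchical clustering. This is an illustrative reconstruction, not the authors' pipeline: the vital-sign names, the synthetic values, and the choice of three rather than ten clusters are all assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Synthetic stand-in for minute-by-minute physiology: 300 "minutes" x 4 vitals
# (e.g. heart rate, MAP, SpO2, respiratory rate -- illustrative names only),
# drawn from three distinct regimes.
centers = np.array([[80, 90, 98, 14],
                    [120, 60, 92, 24],
                    [100, 75, 95, 18]], dtype=float)
X = np.vstack([c + rng.normal(0, 2, size=(100, 4)) for c in centers])

# Standardize each variable, then agglomerate with Ward linkage and cut the
# dendrogram into a fixed number of "patient states" (10 in the paper; 3 here).
Z = linkage((X - X.mean(0)) / X.std(0), method="ward")
states = fcluster(Z, t=3, criterion="maxclust")

print(sorted(set(states)))  # three state labels
```

    Each state label could then be cross-tabulated against outcome measures, as the authors do for infection, MOF, and mortality.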

  9. Phase analysis for three-dimensional surface reconstruction of apples using structured-illumination reflectance imaging

    NASA Astrophysics Data System (ADS)

    Lu, Yuzhen; Lu, Renfu

    2017-05-01

    Three-dimensional (3-D) shape information is valuable for fruit quality evaluation. This study was aimed at developing phase analysis techniques for reconstruction of the 3-D surface of fruit from the pattern images acquired by a structured-illumination reflectance imaging (SIRI) system. Phase-shifted sinusoidal patterns, distorted by the fruit geometry, were acquired and processed through phase demodulation, phase unwrapping and other post-processing procedures to obtain phase difference maps relative to the phase of a reference plane. The phase maps were then transformed into height profiles and 3-D shapes in a world coordinate system based on phase-to-height and in-plane calibrations. A reference plane-based approach, coupled with a curve fitting technique using polynomials of order 3 or higher, was utilized for phase-to-height calibrations, which achieved superior accuracies with root-mean-squared errors (RMSEs) of 0.027-0.033 mm for a height measurement range of 0-91 mm. The 3rd-order polynomial curve fitting technique was further tested on two reference blocks with known heights, resulting in relative errors of 3.75% and 4.16%. In-plane calibrations were performed by solving a linear system formed by a number of control points in a calibration object, which yielded an RMSE of 0.311 mm. Tests of the calibrated system for reconstructing the surface of apple samples showed that surface concavities (i.e., stem/calyx regions) could be easily discriminated from bruises in the phase difference maps, reconstructed height profiles and the 3-D shape of apples. This study has laid a foundation for using SIRI for 3-D shape measurement, and thus expanded the capability of the technique for quality evaluation of horticultural products. Further research is needed to utilize the phase analysis techniques for stem/calyx detection of apples, and to optimize the phase demodulation and unwrapping algorithms for faster and more reliable detection.
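    The phase-demodulation step can be illustrated with a standard three-step phase-shifting formula. The abstract does not specify the shift scheme used, so the shifts of ±2π/3 and all signal parameters below are assumptions for a 1-D toy example.

```python
import numpy as np

def wrap(p):
    # Wrap a phase array to (-pi, pi]
    return np.angle(np.exp(1j * p))

# Synthetic fringe signals: a smooth "height-like" phase modulating three
# sinusoidal patterns shifted by -2*pi/3, 0, +2*pi/3 (a common three-step scheme).
x = np.linspace(0, 1, 256)
phi_true = 2.0 * np.sin(2 * np.pi * x)   # ground-truth phase, |phi| < pi
A, B = 1.0, 0.5                          # background and modulation amplitudes
I1 = A + B * np.cos(phi_true - 2 * np.pi / 3)
I2 = A + B * np.cos(phi_true)
I3 = A + B * np.cos(phi_true + 2 * np.pi / 3)

# Demodulated (wrapped) phase from the three shifted patterns
phi = np.arctan2(np.sqrt(3) * (I1 - I3), 2 * I2 - I1 - I3)

err = np.max(np.abs(wrap(phi - phi_true)))
print(err < 1e-9)  # True: the formula recovers the phase exactly
```

    In the paper's pipeline this wrapped phase would then be unwrapped and converted to height via the phase-to-height calibration.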

  10. Investigation of a Cross-Correlation Based Optical Strain Measurement Technique for Detecting radial Growth on a Rotating Disk

    NASA Technical Reports Server (NTRS)

    Clem, Michelle M.; Woike, Mark R.

    2013-01-01

    The Aeronautical Sciences Project under NASA's Fundamental Aeronautics Program is extremely interested in the development of novel measurement technologies, such as optical surface measurements in the internal parts of a flow path, for in situ health monitoring of gas turbine engines. In situ health monitoring has the potential to detect flaws, i.e. cracks in key components, such as engine turbine disks, before the flaws lead to catastrophic failure. In the present study, a cross-correlation imaging technique is investigated in a proof-of-concept study as a possible optical technique to measure the radial growth and strain field on an already cracked sub-scale turbine engine disk under loaded conditions in the NASA Glenn Research Center's High Precision Rotordynamics Laboratory. The optical strain measurement technique under investigation offers potential fault detection using an applied high-contrast random speckle pattern and imaging the pattern under unloaded and loaded conditions with a CCD camera. Spinning the cracked disk at high speeds induces an external load, resulting in a radial growth of the disk of approximately 50.0 µm in the flawed region and hence, a localized strain field. When imaging the cracked disk under static conditions, the disk will be undistorted; however, during rotation the cracked region will grow radially, thus causing the applied particle pattern to be 'shifted'. The resulting particle displacements between the two images will then be measured using the two-dimensional cross-correlation algorithms implemented in standard Particle Image Velocimetry (PIV) software to track the disk growth, which facilitates calculation of the localized strain field. In order to develop and validate this optical strain measurement technique an initial proof-of-concept experiment is carried out in a controlled environment.
Using PIV optimization principles and guidelines, three potential speckle patterns, for future use on the rotating disk, are developed and investigated in the controlled experiment. A range of known shifts are induced on the patterns; reference and data images are acquired before and after the induced shift, respectively, and the images are processed using the cross-correlation algorithms in order to determine the particle displacements. The effectiveness of each pattern at resolving the known shift is evaluated and discussed in order to choose the most suitable pattern to be implemented onto a rotating disk in the Rotordynamics Lab. Although testing on the rotating disk has not yet been performed, the driving principles behind the development of the present optical technique are based upon critical aspects of the future experiment, such as the amount of expected radial growth, disk analysis, and experimental design and are therefore addressed in the paper.
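    The core cross-correlation measurement is easy to sketch with NumPy's FFT routines. The speckle image, shift, and window size below are invented for illustration and stand in for a single PIV interrogation window.

```python
import numpy as np

rng = np.random.default_rng(1)

# A random "speckle" reference image and a copy shifted by a known amount,
# standing in for the unloaded/loaded disk images.
ref = rng.random((64, 64))
dy, dx = 3, 5
img = np.roll(ref, (dy, dx), axis=(0, 1))

# Circular cross-correlation via FFT (as PIV software does per window);
# the correlation peak location gives the displacement estimate.
corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
peak = tuple(int(i) for i in np.unravel_index(np.argmax(corr), corr.shape))
print(peak)  # (3, 5)
```

    Real PIV codes refine this integer estimate to sub-pixel accuracy (e.g., by fitting a Gaussian to the peak), which is what makes micrometer-scale growth measurable.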

  11. A comparison of algorithms for inference and learning in probabilistic graphical models.

    PubMed

    Frey, Brendan J; Jojic, Nebojsa

    2005-09-01

    Research into methods for reasoning under uncertainty is currently one of the most exciting areas of artificial intelligence, largely because it has recently become possible to record, store, and process large amounts of data. While impressive achievements have been made in pattern classification problems such as handwritten character recognition, face detection, speaker identification, and prediction of gene function, it is even more exciting that researchers are on the verge of introducing systems that can perform large-scale combinatorial analyses of data, decomposing the data into interacting components. For example, computational methods for automatic scene analysis are now emerging in the computer vision community. These methods decompose an input image into its constituent objects, lighting conditions, motion patterns, etc. Two of the main challenges are finding effective representations and models in specific applications and finding efficient algorithms for inference and learning in these models. In this paper, we advocate the use of graph-based probability models and their associated inference and learning algorithms. We review exact techniques and various approximate, computationally efficient techniques, including iterated conditional modes, the expectation maximization (EM) algorithm, Gibbs sampling, the mean field method, variational techniques, structured variational techniques and the sum-product algorithm ("loopy" belief propagation). We describe how each technique can be applied in a vision model of multiple, occluding objects and contrast the behaviors and performances of the techniques using a unifying cost function, free energy.
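    As a concrete instance of one technique the review covers, here is a minimal EM fit of a two-component Gaussian mixture. The 1-D data and all parameters are synthetic and assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D data drawn from two Gaussian components with means -2 and +2
data = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])

# EM for a two-component Gaussian mixture: alternate computing soft
# assignments (E-step) and re-estimating parameters (M-step).
mu = np.array([-1.0, 1.0]); sigma = np.array([1.0, 1.0]); w = np.array([0.5, 0.5])
for _ in range(50):
    # E-step: responsibility of each component for each point
    dens = w * np.exp(-0.5 * ((data[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, standard deviations
    n_k = resp.sum(axis=0)
    w = n_k / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / n_k
    sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / n_k)

print(np.sort(mu))  # approximately [-2, 2]
```

    The paper's unifying view frames EM, Gibbs sampling, mean-field and loopy belief propagation as different ways of minimizing a free-energy cost; EM is the simplest member of that family.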

  12. High-dimensional inference with the generalized Hopfield model: principal component analysis and corrections.

    PubMed

    Cocco, S; Monasson, R; Sessak, V

    2011-05-01

    We consider the problem of inferring the interactions between a set of N binary variables from the knowledge of their frequencies and pairwise correlations. The inference framework is based on the Hopfield model, a special case of the Ising model where the interaction matrix is defined through a set of patterns in the variable space, and is of rank much smaller than N. We show that maximum likelihood inference is deeply related to principal component analysis when the amplitude of the pattern components ξ is negligible compared to √N. Using techniques from statistical mechanics, we calculate the corrections to the patterns to the first order in ξ/√N. We stress the need to generalize the Hopfield model and include both attractive and repulsive patterns in order to correctly infer networks with sparse and strong interactions. We present a simple geometrical criterion to decide how many attractive and repulsive patterns should be considered as a function of the sampling noise. We moreover discuss how many sampled configurations are required for a good inference, as a function of the system size N and of the amplitude ξ. The inference approach is illustrated on synthetic and biological data.
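    The small-ξ result, that maximum-likelihood inference reduces to principal component analysis, can be illustrated on synthetic data. The sampling scheme and field strength below are assumptions standing in for equilibrium samples of the model, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

N, B = 50, 20000                         # variables, sampled configurations
xi = rng.choice([-1.0, 1.0], size=N)     # planted +/-1 pattern

# Sample binary configurations whose pairwise correlations are driven by the
# planted pattern: each sample feels a field +/-0.4*xi (a crude stand-in for
# a Hopfield-model equilibrium sample).
h = 0.4 * xi * rng.choice([-1.0, 1.0], size=(B, 1))
s = np.where(rng.random((B, N)) < 1 / (1 + np.exp(-2 * h)), 1.0, -1.0)

# In the small-pattern regime, inference reduces to PCA of the correlation
# matrix: the top eigenvector estimates the pattern direction.
C = np.cov(s, rowvar=False)
evals, evecs = np.linalg.eigh(C)         # eigenvalues in ascending order
top = evecs[:, -1]

overlap = abs(top @ xi) / np.sqrt(N)     # 1.0 means perfect recovery
print(overlap > 0.9)
```

    The paper's corrections refine exactly this PCA estimate to first order in ξ/√N and add repulsive patterns from the low end of the spectrum.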

  13. Application of the empirical orthogonal function to study the rainfall pattern in Daerah Istimewa Yogyakarta province

    NASA Astrophysics Data System (ADS)

    Adi-Kusumo, Fajar; Gunardi; Utami, Herni; Nurjani, Emilya; Sopaheluwakan, Ardhasena; Aluicius, Irwan Endrayanto; Christiawan, Titus

    2016-02-01

    We apply the Empirical Orthogonal Function (EOF) method to study the rainfall pattern in Daerah Istimewa Yogyakarta (DIY) Province, Indonesia. The EOF is an important method for finding the dominant pattern of a data set through dimension reduction: it makes it possible to reduce the large dimension of the observed data to a much smaller one without losing the significant information that characterizes the whole data set. The method is also known as Principal Component Analysis (PCA) and is conducted here to find the pattern of the data. DIY Province is one of the provinces in Indonesia with special characteristics related to its rainfall pattern: it has an active volcano, karst, highlands, and also lower-lying areas including beaches. The province is bordered by the Indian Ocean, which is one of the important factors governing its rainfall. We use at least ten years of monthly rainfall data from all stations in the area and study the rainfall characteristics across the four regencies of the province. EOF analysis is conducted to identify groups of stations with similar characteristics.
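    In practice, EOF analysis of a month-by-station rainfall matrix is an SVD of the anomaly matrix. A minimal sketch on synthetic data (the station count, spatial modes, and amplitudes are invented, not the DIY data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic monthly rainfall: 120 months x 30 stations, built from two
# temporal modes (an annual cycle and a slower variation) plus noise.
t = np.arange(120)
mode1 = np.sin(2 * np.pi * t / 12)
mode2 = np.sin(2 * np.pi * t / 60)
loadings1, loadings2 = rng.random(30), rng.random(30)
rain = (100 + 30 * np.outer(mode1, loadings1)
            + 10 * np.outer(mode2, loadings2)
            + rng.normal(0, 1, (120, 30)))

# EOF analysis = SVD of the anomaly matrix; rows of Vt are spatial EOFs and
# squared singular values give the variance explained by each mode.
anom = rain - rain.mean(axis=0)
U, S, Vt = np.linalg.svd(anom, full_matrices=False)
explained = S**2 / (S**2).sum()

print(explained[0] > 0.5)  # True: the annual mode dominates
```

    Grouping stations by their loadings on the leading EOFs is one way to obtain the station groups with similar rainfall characteristics that the study seeks.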

  14. A Versatile Method of Patterning Proteins and Cells.

    PubMed

    Shrirao, Anil B; Kung, Frank H; Yip, Derek; Firestein, Bonnie L; Cho, Cheul H; Townes-Anderson, Ellen

    2017-02-26

    Substrate and cell patterning techniques are widely used in cell biology to study cell-to-cell and cell-to-substrate interactions. Conventional patterning techniques work well only with simple shapes, small areas and selected bio-materials. This article describes a method to distribute cell suspensions as well as substrate solutions into complex, long, closed (dead-end) polydimethylsiloxane (PDMS) microchannels using negative pressure. This method enables researchers to pattern multiple substrates including fibronectin, collagen, antibodies (Sal-1), poly-D-lysine (PDL), and laminin. Patterning of substrates allows one to indirectly pattern a variety of cells. We have tested C2C12 myoblasts, the PC12 neuronal cell line, embryonic rat cortical neurons, and amphibian retinal neurons. In addition, we demonstrate that this technique can directly pattern fibroblasts in microfluidic channels via brief application of a low vacuum on cell suspensions. The low vacuum does not significantly decrease cell viability as shown by cell viability assays. Modifications are discussed for application of the method to different cell and substrate types. This technique allows researchers to pattern cells and proteins in specific patterns without the need for exotic materials or equipment and can be done in any laboratory with a vacuum.

  15. Electrical Load Profile Analysis Using Clustering Techniques

    NASA Astrophysics Data System (ADS)

    Damayanti, R.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.

    2017-03-01

    Data mining is one of the data processing techniques for extracting information from a set of stored data. Every day the electrical load consumption is recorded by the electricity company, usually at intervals of 15 or 30 minutes. This paper uses clustering, one of the data mining techniques, to analyse the electrical load profiles during 2014. Three clustering methods were compared, namely K-Means (KM), Fuzzy C-Means (FCM), and K-Harmonic Means (KHM). The results show that KHM is the most appropriate method for classifying the electrical load profiles. The optimum number of clusters is determined using the Davies-Bouldin index. By grouping the load profiles, analysis of demand variation and estimation of energy losses can be performed for each group of load profiles with a similar pattern. From each group of electrical load profiles, the cluster load factor and a range of cluster loss factors can be determined, which helps to find the range of coefficient values for estimating energy losses without performing load-flow studies.
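    A minimal NumPy sketch of the KM-plus-Davies-Bouldin workflow (the load profiles are synthetic; the KHM and FCM variants differ from plain KM only in how points are weighted, and are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(5)

def kmeans(X, k, iters=50):
    # Plain Lloyd's algorithm (the KM method of the paper)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

def davies_bouldin(X, labels, centers):
    # DB index: mean over clusters of the worst (S_i + S_j) / d(c_i, c_j);
    # lower is better, so the k minimizing it is taken as optimal.
    k = len(centers)
    S = np.array([np.linalg.norm(X[labels == i] - centers[i], axis=1).mean()
                  for i in range(k)])
    db = 0.0
    for i in range(k):
        db += max((S[i] + S[j]) / np.linalg.norm(centers[i] - centers[j])
                  for j in range(k) if j != i)
    return db / k

# Toy daily load profiles: two obvious groups of 48-point curves
base = np.sin(np.linspace(0, 2 * np.pi, 48))
X = np.vstack([base + rng.normal(0, .05, (50, 48)),
               3 * base + rng.normal(0, .05, (50, 48))])

scores = {k: davies_bouldin(X, *kmeans(X, k)) for k in (2, 3, 4)}
print(min(scores, key=scores.get))  # 2
```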

  16. Evaluation of the environmental contamination at an abandoned mining site using multivariate statistical techniques--the Rodalquilar (Southern Spain) mining district.

    PubMed

    Bagur, M G; Morales, S; López-Chicano, M

    2009-11-15

    Unsupervised and supervised pattern recognition techniques such as hierarchical cluster analysis, principal component analysis, factor analysis and linear discriminant analysis have been applied to water samples collected in the Rodalquilar mining district (Southern Spain) in order to identify different sources of environmental pollution caused by the abandoned mining industry. The effect of the mining activity on the waters was monitored by determining the concentration of eleven elements (Mn, Ba, Co, Cu, Zn, As, Cd, Sb, Hg, Au and Pb) by inductively coupled plasma mass spectrometry (ICP-MS). The Box-Cox transformation has been used to transform the data set into normal form in order to minimize the non-normality of the geochemical data. The environmental impact is driven mainly by the mining activity carried out in the zone, by acid drainage, and finally by the chemical treatment used for gold beneficiation.

  17. Imaging patterns predict patient survival and molecular subtype in glioblastoma via machine learning techniques

    PubMed Central

    Macyszyn, Luke; Akbari, Hamed; Pisapia, Jared M.; Da, Xiao; Attiah, Mark; Pigrish, Vadim; Bi, Yingtao; Pal, Sharmistha; Davuluri, Ramana V.; Roccograndi, Laura; Dahmane, Nadia; Martinez-Lage, Maria; Biros, George; Wolf, Ronald L.; Bilello, Michel; O'Rourke, Donald M.; Davatzikos, Christos

    2016-01-01

    Background MRI characteristics of brain gliomas have been used to predict clinical outcome and molecular tumor characteristics. However, previously reported imaging biomarkers have not been sufficiently accurate or reproducible to enter routine clinical practice and often rely on relatively simple MRI measures. The current study leverages advanced image analysis and machine learning algorithms to identify complex and reproducible imaging patterns predictive of overall survival and molecular subtype in glioblastoma (GB). Methods One hundred five patients with GB were first used to extract approximately 60 diverse features from preoperative multiparametric MRIs. These imaging features were used by a machine learning algorithm to derive imaging predictors of patient survival and molecular subtype. Cross-validation ensured generalizability of these predictors to new patients. Subsequently, the predictors were evaluated in a prospective cohort of 29 new patients. Results Survival curves yielded a hazard ratio of 10.64 for predicted long versus short survivors. The overall, 3-way (long/medium/short survival) accuracy in the prospective cohort approached 80%. Classification of patients into the 4 molecular subtypes of GB achieved 76% accuracy. Conclusions By employing machine learning techniques, we were able to demonstrate that imaging patterns are highly predictive of patient survival. Additionally, we found that GB subtypes have distinctive imaging phenotypes. These results reveal that when imaging markers related to infiltration, cell density, microvascularity, and blood–brain barrier compromise are integrated via advanced pattern analysis methods, they form very accurate predictive biomarkers. These predictive markers rely solely on preoperative images; hence, they can significantly augment the diagnosis and treatment of GB patients. PMID:26188015

  18. Comparative study of the hemagglutinin and neuraminidase genes of influenza A virus H3N2, H9N2, and H5N1 subtypes using bioinformatics techniques.

    PubMed

    Ahn, Insung; Son, Hyeon S

    2007-07-01

    To investigate the genomic patterns of influenza A virus subtypes, such as H3N2, H9N2, and H5N1, we collected 1842 sequences of the hemagglutinin and neuraminidase genes from the NCBI database and parsed them into 7 categories: accession number, host species, sampling year, country, subtype, gene name, and sequence. The sequences that were isolated from the human, avian, and swine populations were extracted and stored in a MySQL database for intensive analysis. The GC content and relative synonymous codon usage (RSCU) values were calculated using JAVA codes. As a result, correspondence analysis of the RSCU values yielded the unique codon usage pattern (CUP) of each subtype and revealed no extreme differences among the human, avian, and swine isolates. H5N1 subtype viruses exhibited little variation in CUPs compared with other subtypes, suggesting that the H5N1 CUP has not yet undergone significant changes within each host species. Moreover, some observations may be relevant to CUP variation that has occurred over time among the H3N2 subtype viruses isolated from humans. All the sequences were divided into 3 groups over time, and each group seemed to have preferred synonymous codon patterns for each amino acid, especially for arginine, glycine, leucine, and valine. The bioinformatics technique we introduce in this study may be useful in predicting the evolutionary patterns of pandemic viruses.
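    The RSCU values analyzed here divide each codon's observed count by the count expected if all synonymous codons for that amino acid were used equally. A small sketch (only two codon families of the standard genetic code are listed, and the sequence is a toy):

```python
from collections import Counter

# Synonymous codon families for two amino acids (standard genetic code)
FAMILIES = {
    "Leu": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "Val": ["GTT", "GTC", "GTA", "GTG"],
}

def rscu(seq):
    """RSCU: observed codon count divided by the count expected under
    uniform usage of that amino acid's synonymous codons."""
    counts = Counter(seq[i:i + 3] for i in range(0, len(seq) - 2, 3))
    out = {}
    for aa, codons in FAMILIES.items():
        total = sum(counts[c] for c in codons)
        for c in codons:
            out[c] = len(codons) * counts[c] / total if total else 0.0
    return out

# A toy coding sequence using only CTG for Leu and GTT/GTC for Val
vals = rscu("CTGCTGCTGGTTGTC")
print(vals["CTG"], vals["GTT"])  # 6.0 2.0
```

    Correspondence analysis, as used in the study, would then be run on the matrix of such RSCU values across sequences.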

  19. Predicting and Detecting Emerging Cyberattack Patterns Using StreamWorks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Choudhury, Sutanay; Feo, John T.

    2014-06-30

    The number and sophistication of cyberattacks on industries and governments have dramatically grown in recent years. To counter this movement, new advanced tools and techniques are needed to detect cyberattacks in their early stages such that defensive actions may be taken to avert or mitigate potential damage. From a cybersecurity analysis perspective, detecting cyberattacks may be cast as a problem of identifying patterns in computer network traffic. Logically and intuitively, these patterns may take on the form of a directed graph that conveys how an attack or intrusion propagates through the computers of a network. Such cyberattack graphs could provide cybersecurity analysts with powerful conceptual representations that are natural to express and analyze. We have been researching and developing graph-centric approaches and algorithms for dynamic cyberattack detection. The advanced dynamic graph algorithms we are developing will be packaged into a streaming network analysis framework known as StreamWorks. With StreamWorks, a scientist or analyst may detect and identify precursor events and patterns as they emerge in complex networks. This analysis framework is intended to be used in a dynamic environment where network data is streamed in and is appended to a large-scale dynamic graph. Specific graphical query patterns are decomposed and collected into a graph query library. The individual decomposed subpatterns in the library are continuously and efficiently matched against the dynamic graph as it evolves to identify and detect early, partial subgraph patterns. The scalable emerging subgraph pattern algorithms will match on both structural and semantic network properties.

  20. Diagnosing Developmental Dyscalculia on the Basis of Reliable Single Case FMRI Methods: Promises and Limitations

    PubMed Central

    Dinkel, Philipp Johannes; Willmes, Klaus; Krinzinger, Helga; Konrad, Kerstin; Koten Jr, Jan Willem

    2013-01-01

    FMRI-studies are mostly based on a group study approach, either analyzing one group or comparing multiple groups, or on approaches that correlate brain activation with clinically relevant criteria or behavioral measures. In this study we investigate the potential of fMRI-techniques focusing on individual differences in brain activation within a test-retest reliability context. We employ a single-case analysis approach, which contrasts dyscalculic children with a control group of typically developing children. In a second step, a support-vector machine analysis and cluster analysis techniques served to investigate similarities in multivariate brain activation patterns. Children were confronted with a non-symbolic number comparison and a non-symbolic exact calculation task during fMRI acquisition. Conventional second level group comparison analysis only showed small differences around the angular gyrus bilaterally and the left parieto-occipital sulcus. Analyses based on single-case statistical procedures revealed that developmental dyscalculia is characterized by individual differences predominantly in visual processing areas. Dyscalculic children seemed to compensate for relative under-activation in the primary visual cortex through an upregulation in higher visual areas. However, overlap in deviant activation was low for the dyscalculic children, indicating that developmental dyscalculia is a disorder characterized by heterogeneous brain activation differences. Using support vector machine analysis and cluster analysis, we tried to group dyscalculic and typically developing children according to brain activation. Fronto-parietal systems seem to qualify for a distinction between the two groups. However, this was only effective when reliable brain activations of both tasks were employed simultaneously. 
Results suggest that deficits in number representation in the visual-parietal cortex get compensated for through finger related aspects of number representation in fronto-parietal cortex. We conclude that dyscalculic children show large individual differences in brain activation patterns. Nonetheless, the majority of dyscalculic children can be differentiated from controls employing brain activation patterns when appropriate methods are used. PMID:24349547

  1. Diagnosing developmental dyscalculia on the basis of reliable single case FMRI methods: promises and limitations.

    PubMed

    Dinkel, Philipp Johannes; Willmes, Klaus; Krinzinger, Helga; Konrad, Kerstin; Koten, Jan Willem

    2013-01-01

    FMRI-studies are mostly based on a group study approach, either analyzing one group or comparing multiple groups, or on approaches that correlate brain activation with clinically relevant criteria or behavioral measures. In this study we investigate the potential of fMRI-techniques focusing on individual differences in brain activation within a test-retest reliability context. We employ a single-case analysis approach, which contrasts dyscalculic children with a control group of typically developing children. In a second step, a support-vector machine analysis and cluster analysis techniques served to investigate similarities in multivariate brain activation patterns. Children were confronted with a non-symbolic number comparison and a non-symbolic exact calculation task during fMRI acquisition. Conventional second level group comparison analysis only showed small differences around the angular gyrus bilaterally and the left parieto-occipital sulcus. Analyses based on single-case statistical procedures revealed that developmental dyscalculia is characterized by individual differences predominantly in visual processing areas. Dyscalculic children seemed to compensate for relative under-activation in the primary visual cortex through an upregulation in higher visual areas. However, overlap in deviant activation was low for the dyscalculic children, indicating that developmental dyscalculia is a disorder characterized by heterogeneous brain activation differences. Using support vector machine analysis and cluster analysis, we tried to group dyscalculic and typically developing children according to brain activation. Fronto-parietal systems seem to qualify for a distinction between the two groups. However, this was only effective when reliable brain activations of both tasks were employed simultaneously. 
Results suggest that deficits in number representation in the visual-parietal cortex get compensated for through finger related aspects of number representation in fronto-parietal cortex. We conclude that dyscalculic children show large individual differences in brain activation patterns. Nonetheless, the majority of dyscalculic children can be differentiated from controls employing brain activation patterns when appropriate methods are used.

  2. Calibration and validation of projection lithography in chemically amplified resist systems using fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Mason, Michael D.; Ray, Krishanu; Feke, Gilbert D.; Grober, Robert D.; Pohlers, Gerd; Cameron, James F.

    2003-05-01

    Coumarin 6 (C6), a pH-sensitive fluorescent molecule, was doped into commercial resist systems to demonstrate a cost-effective fluorescence microscopy technique for detecting latent photoacid images in exposed chemically amplified resist films. The fluorescence image contrast is optimized by carefully selecting optical filters to match the spectroscopic properties of C6 in the resist matrices. We demonstrate the potential of this technique for two specific non-invasive applications. First, a fast, convenient fluorescence technique is demonstrated for determination of quantum yields of photoacid generation. Since the Ka of C6 in the 193 nm resist system lies within the range of acid concentrations that can be photogenerated, we have used this technique to evaluate the acid generation efficiency of various photoacid generators (PAGs). The technique is based on doping the resist formulations containing the candidate PAGs with C6, coating one wafer per PAG, patterning the wafer with a dose ramp and spectroscopically imaging the wafers. The fluorescence of each pattern in the dose ramp is measured as a single image and analyzed with the optical titration model. Second, a nondestructive in-line diagnostic technique is developed for the focus calibration and validation of a projection lithography system. Our experimental results show excellent correlation between the fluorescence images and scanning electron microscope analysis of developed features. This technique has successfully been applied in both deep UV resists (e.g., Shipley UVIIHS resist) and 193 nm resists (e.g., Shipley Vema-type resist). This method of focus calibration has also been extended to samples with feature sizes below the diffraction limit, where the pitch between adjacent features is on the order of 300 nm. Image capture, data analysis, and focus latitude verification are all computer controlled from a single hardware/software platform.
Typical focus calibration curves can be obtained within several minutes.

  3. Bio-speckle assessment of bruising in fruits

    NASA Astrophysics Data System (ADS)

    Pajuelo, M.; Baldwin, G.; Rabal, H.; Cap, N.; Arizaga, R.; Trivi, M.

    2003-07-01

    The dynamic speckle pattern, or bio-speckle, is a phenomenon produced by laser illumination of active materials, such as biological tissue. Fruits, even hard-peel ones, show a speckle activity that can be related to maturity, turgor, damage, aging, and mechanical properties. Here, we suggest the bio-speckle technique as a potential methodology for the study of impacts on apples and the analysis of the bruises they produce. The aim is to correlate physical properties of apples with quality factors using a non-contact and non-invasive technique.

  4. Bilateral symmetry aspects in computer-aided Alzheimer's disease diagnosis by single-photon emission-computed tomography imaging.

    PubMed

    Illán, Ignacio Alvarez; Górriz, Juan Manuel; Ramírez, Javier; Lang, Elmar W; Salas-Gonzalez, Diego; Puntonet, Carlos G

    2012-11-01

    This paper explores the importance of the latent symmetry of the brain in computer-aided systems for diagnosing Alzheimer's disease (AD). Symmetry and asymmetry are studied from two points of view: (i) the development of an effective classifier within the scope of machine learning techniques, and (ii) the assessment of its relevance to the AD diagnosis in the early stages of the disease. The proposed methodology is based on eigenimage decomposition of single-photon emission-computed tomography images, using an eigenspace extension to accommodate odd and even eigenvectors separately. This feature extraction technique allows for support-vector-machine classification and image analysis. Identification of AD patterns is improved when the latent symmetry of the brain is considered, with an estimated 92.78% accuracy (92.86% sensitivity, 92.68% specificity) using a linear kernel and a leave-one-out cross validation strategy. Also, asymmetries may be used to define a test for AD that is very specific (90.24% specificity) but not especially sensitive. Two main conclusions are derived from the analysis of the eigenimage spectrum. Firstly, the recognition of AD patterns is improved when considering only the symmetric part of the spectrum. Secondly, asymmetries in the hypo-metabolic patterns, when present, are more pronounced in subjects with AD. Copyright © 2012 Elsevier B.V. All rights reserved.
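    The odd/even eigenvector split rests on the elementary symmetric/antisymmetric decomposition of an image about its midline, which is easy to sketch (the "brain slice" below is synthetic and purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy axial slice: mirror one half to build a perfectly symmetric image,
# then add a lateralized perturbation (the asymmetric component).
half = rng.random((32, 16))
img = np.hstack([half, half[:, ::-1]])
img[:8, :8] += 0.5                      # one-sided "hypo-metabolic" patch

# Decompose about the midsagittal axis: the even part is invariant under
# the left-right flip, the odd part changes sign.
flip = img[:, ::-1]
even = 0.5 * (img + flip)
odd = 0.5 * (img - flip)

print(np.allclose(even + odd, img))     # True: exact decomposition
```

    Restricting the eigenimage basis to even (or odd) components, as the paper does, amounts to running PCA on these two parts separately.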

  5. Semi-supervised clustering for parcellating brain regions based on resting state fMRI data

    NASA Astrophysics Data System (ADS)

    Cheng, Hewei; Fan, Yong

    2014-03-01

    Many unsupervised clustering techniques have been adopted for parcellating brain regions of interest into functionally homogeneous subregions based on resting state fMRI data. However, the unsupervised clustering techniques are not able to take advantage of existing knowledge of the functional neuroanatomy readily available from studies of cytoarchitectonic parcellation or meta-analysis of the literature. In this study, we propose a semi-supervised clustering method for parcellating the amygdala into functionally homogeneous subregions based on resting state fMRI data. In particular, the semi-supervised clustering is implemented under the framework of graph partitioning, and adopts prior information and spatial consistency constraints to obtain a spatially contiguous parcellation result. The graph partitioning problem is solved using an efficient algorithm similar to the well-known weighted kernel k-means algorithm. Our method has been validated by parcellating the amygdala into 3 subregions based on resting state fMRI data of 28 subjects. The experimental results demonstrate that the proposed method is more robust than unsupervised clustering and able to parcellate the amygdala into centromedial, laterobasal, and superficial parts with improved functional homogeneity compared with the cytoarchitectonic parcellation result. The validity of the parcellation results is also supported by distinctive functional and structural connectivity patterns of the subregions and by high consistency between coactivation patterns derived from a meta-analysis and the functional connectivity patterns of the corresponding subregions.

  6. Short-term forecasts gain in accuracy. [Regression technique using ''Box-Jenkins'' analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Box-Jenkins time-series models offer accuracy for short-term forecasts that compares with large-scale macroeconomic forecasts. Utilities need to be able to forecast peak demand in order to plan their generating, transmitting, and distribution systems. This new method differs from conventional models by not assuming specific data patterns, but by fitting available data into a tentative pattern on the basis of autocorrelations. Three types of models (autoregressive, moving average, or mixed autoregressive/moving average) can be used according to which provides the most appropriate combination of autocorrelations and related derivatives. Major steps in choosing a model are identifying potential models, estimating the parameters of the problem, and running a diagnostic check to see if the model fits the parameters. The Box-Jenkins technique is well suited to seasonal patterns, which makes it possible to produce load-demand forecasts at intervals as short as hourly. Accurate at horizons of up to two years, the method will allow electricity price-elasticity forecasting that can be applied to facility planning and rate design. (DCK)
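    The autoregressive case can be sketched in a few lines: simulate an AR(1) series and estimate its coefficient by least squares. This is a toy stand-in for the full Box-Jenkins identification/estimation/diagnostics cycle; the series and coefficient are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate an hourly-load-like AR(1) series with a known coefficient
phi_true, n = 0.8, 5000
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal()

# Estimate the AR(1) coefficient by least squares on y[t] ~ phi * y[t-1],
# then issue a one-step-ahead forecast.
phi_hat = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
forecast = phi_hat * y[-1]

print(round(phi_hat, 1))  # ~0.8
```

    A full Box-Jenkins fit would also inspect the autocorrelation and partial autocorrelation functions to choose between AR, MA, and mixed models, and check the residuals diagnostically.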

  7. Fingerprinting of HLA class I genes for improved selection of unrelated bone marrow donors.

    PubMed

    Martinelli, G; Farabegoli, P; Buzzi, M; Panzica, G; Zaccaria, A; Bandini, G; Calori, E; Testoni, N; Rosti, G; Conte, R; Remiddi, C; Salvucci, M; De Vivo, A; Tura, S

    1996-02-01

    The degree of matching of HLA genes between the selected donor and recipient is an important aspect of the selection of unrelated donors for allogeneic bone marrow transplantation (UBMT). The most sensitive methods currently used are serological typing of HLA class I genes, mixed lymphocyte culture (MLC), IEF and molecular genotyping of HLA class II genes by direct sequencing of PCR products. Serological typing of class I antigens (A, B and C) fails to detect minor differences demonstrated by direct sequencing of DNA polymorphic regions. Molecular genotyping of HLA class I genes by DNA analysis is costly and labour-intensive. To improve compatibility between donor and recipient, we have set up a new rapid and non-radioisotopic application of the 'fingerprinting PCR' technique for the analysis of the polymorphic second exon of the HLA class I A, B and C genes. This technique is based on the formation of specific patterns (PCR fingerprints) of homoduplexes and heteroduplexes between heterologous amplified DNA sequences. After an electrophoretic run on non-denaturing polyacrylamide gel, different HLA class I types give allele-specific banding patterns. HLA class I matching is performed, after the gel has been soaked in ethidium bromide or silver-stained, by visual comparison of patients' fingerprints with those of donors. Identity can be confirmed by mixing donor and recipient DNAs in an amplification cross-match. To assess the technique, 10 normal samples, 22 related allogeneic bone marrow transplanted pairs and 10 unrelated HLA-A and HLA-B serologically matched patient-donor pairs were analysed for HLA class I polymorphic regions. In all the related pairs and in 1/10 unrelated pairs, matched donor-recipient patterns were identified. This new application of PCR fingerprinting may confirm the HLA class I serological selection of unrelated marrow donors.

  8. Z-Scan Analysis: a New Method to Determine the Oxidative State of Low-Density Lipoprotein and Its Association with Multiple Cardiometabolic Biomarkers

    NASA Astrophysics Data System (ADS)

    de Freitas, Maria Camila Pruper; Figueiredo Neto, Antonio Martins; Giampaoli, Viviane; da Conceição Quintaneiro Aubin, Elisete; de Araújo Lima Barbosa, Milena Maria; Damasceno, Nágila Raquel Teixeira

    2016-04-01

    The great atherogenic potential of oxidized low-density lipoprotein has been widely described in the literature. The objective of this study was to investigate whether the state of oxidized low-density lipoprotein in human plasma measured by the Z-scan technique has an association with different cardiometabolic biomarkers. Total cholesterol, high-density lipoprotein cholesterol, triacylglycerols, apolipoprotein A-I and apolipoprotein B, paraoxonase-1, and glucose were analyzed using standard commercial kits, and low-density lipoprotein cholesterol was estimated using the Friedewald equation. A sandwich enzyme-linked immunosorbent assay was used to detect electronegative low-density lipoprotein. Low-density lipoprotein and high-density lipoprotein sizes were determined by Lipoprint® system. The Z-scan technique was used to measure the non-linear optical response of low-density lipoprotein solution. Principal component analysis and correlations were used respectively to resize the data from the sample and test association between the θ parameter, measured with the Z-scan technique, and the principal component. A total of 63 individuals, from both sexes, with mean age 52 years (±11), being overweight and having high levels of total cholesterol and low levels of high-density lipoprotein cholesterol, were enrolled in this study. A positive correlation between the θ parameter and more anti-atherogenic pattern for cardiometabolic biomarkers together with a negative correlation for an atherogenic pattern was found. Regarding the parameters related with an atherogenic low-density lipoprotein profile, the θ parameter was negatively correlated with a more atherogenic pattern. By using Z-scan measurements, we were able to find an association between oxidized low-density lipoprotein state and multiple cardiometabolic biomarkers in samples from individuals with different cardiovascular risk factors.

  9. Highly selective creation of hydrophilic micro-craters on super hydrophobic surface using electrohydrodynamic jet printing

    NASA Astrophysics Data System (ADS)

    Lee, Jaehyun; Hwang, Sangyeon; Prasetyo, Fariza Dian; Nguyen, Vu Dat; Hong, Jungwoo; Shin, Jennifer H.; Byun, Doyoung

    2014-11-01

    Selective surface modification is considered an alternative to conventional printing techniques for high-resolution patterning. Here, we present the fabrication of hydrophilic patterns on a superhydrophobic surface, on which structures form in the hydrophilic regions. The superhydrophobic surface can be chemically converted to hydrophilic with alcohols. As a consecutive process, electrohydrodynamic (EHD) jet printing was utilized to fabricate local hydrophilic craters 30-200 μm in size. Three target liquids were deposited well on the hydrophilic regions: PEDOT (poly(3,4-ethylenedioxythiophene)), polystyrene nanoparticles, and Salmonella bacteria medium. Additionally, qualitative analyses of the modification mechanism and of the superhydrophobic/hydrophilic surface properties were presented, based on surface energy from contact angles, SEM (scanning electron microscopy) images, and SIMS (secondary ion mass spectrometry) analysis. This simple modification method offers potential for bio-patterning applications such as cell-culture microchips and lab-on-a-chip devices. This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) (Grant Number: 2014-023284).

  10. Techniques for Fault Detection and Visualization of Telemetry Dependence Relationships for Root Cause Fault Analysis in Complex Systems

    NASA Astrophysics Data System (ADS)

    Guy, Nathaniel

    This thesis explores new ways of looking at telemetry data from a time-correlative perspective, in order to see patterns within the data that may suggest root causes of system faults. It was initially thought that visualizing an animated Pearson correlation coefficient (PCC) matrix for telemetry channels would be sufficient to give new understanding; however, testing showed that the high dimensionality of this approach, and the inability to easily look at change over time, impeded understanding. Different correlative techniques, combined with the time curve visualization proposed by Bach et al. (2015), were adapted to visualize both raw telemetry and telemetry data correlations. Review revealed that these new techniques give insights into the data and an intuitive grasp of data families, demonstrating the effectiveness of this approach for enhancing system understanding and assisting with root cause analysis for complex aerospace systems.
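    The "animated PCC matrix" starting point can be sketched directly: one Pearson correlation matrix per sliding window of telemetry, which is exactly the frame sequence whose high dimensionality the thesis found hard to read. The function name and window convention are illustrative.

```python
import numpy as np

def rolling_correlation_matrices(telemetry, window):
    """Sliding-window Pearson correlation matrices over telemetry channels,
    one frame per time step of an 'animated' PCC matrix.

    telemetry: (T, n_channels) array; returns (T - window + 1, n, n)."""
    T, n = telemetry.shape
    frames = np.empty((T - window + 1, n, n))
    for t in range(T - window + 1):
        frames[t] = np.corrcoef(telemetry[t:t + window].T)
    return frames
```

For n channels each frame has n(n-1)/2 distinct off-diagonal entries, which makes clear why the thesis turned to lower-dimensional time-curve visualizations.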

  11. Geographic techniques and recent applications of remote sensing to landscape-water quality studies

    USGS Publications Warehouse

    Griffith, J.A.

    2002-01-01

    This article reviews recent advances in studies of landscape-water quality relationships using remote sensing techniques. With the increasing feasibility of using remotely sensed data, landscape-water quality studies can now be performed more easily at regional, multi-state scales. The traditional method of relating land use and land cover to water quality has been extended to include landscape pattern and other landscape information derived from satellite data. This article focuses on three items: 1) the increasing recognition of the importance of larger-scale studies of regional water quality that require a landscape perspective; 2) the increasing importance of remotely sensed data, such as the imagery-derived normalized difference vegetation index (NDVI) and vegetation phenological metrics derived from time-series NDVI data; and 3) landscape pattern. In some studies, landscape pattern metrics explained some of the variation in water quality not explained by land use/cover. In others, the NDVI metrics were even more highly correlated with certain water quality parameters than either landscape pattern metrics or land use/cover proportions. Although studies relating landscape pattern metrics to water quality have had mixed results, this recent body of work applying landscape measures and satellite-derived metrics to water quality analysis has demonstrated their potential usefulness in monitoring watershed conditions across large regions.
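    The NDVI and the kind of time-series metrics the article refers to are simple to compute. The index formula is standard; the metric names in the second function are illustrative stand-ins for the phenological metrics the article discusses.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index from near-infrared and red
    reflectance; eps guards against division by zero over dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

def phenology_metrics(ndvi_series):
    """Toy phenological metrics from a time series of NDVI values
    (names here are illustrative, not the article's exact metrics)."""
    s = np.asarray(ndvi_series, dtype=float)
    return {"peak": float(s.max()),
            "amplitude": float(s.max() - s.min()),
            "time_of_peak": int(s.argmax())}
```

Metrics like these, computed per watershed from time-series imagery, are the kind of predictors the reviewed studies correlated against water quality parameters.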

  12. Artificial intelligence in sports on the example of weight training.

    PubMed

    Novatchkov, Hristo; Baca, Arnold

    2013-01-01

    The overall goal of the present study was to illustrate the potential of artificial intelligence (AI) techniques in sports on the example of weight training. The research focused in particular on the implementation of pattern recognition methods for the evaluation of performed exercises on training machines. The data acquisition was carried out using way and cable force sensors attached to various weight machines, thereby enabling the measurement of essential displacement and force determinants during training. On the basis of the gathered data, it was consequently possible to deduce other significant characteristics like time periods or movement velocities. These parameters were applied for the development of intelligent methods adapted from conventional machine learning concepts, allowing an automatic assessment of the exercise technique and providing individuals with appropriate feedback. In practice, the implementation of such techniques could be crucial for the investigation of the quality of the execution, the assistance of athletes but also coaches, the training optimization and for prevention purposes. For the current study, the data was based on measurements from 15 rather inexperienced participants, performing 3-5 sets of 10-12 repetitions on a leg press machine. The initially preprocessed data was used for the extraction of significant features, on which supervised modeling methods were applied. Professional trainers were involved in the assessment and classification processes by analyzing the video recorded executions. The so far obtained modeling results showed good performance and prediction outcomes, indicating the feasibility and potency of AI techniques in assessing performances on weight training equipment automatically and providing sportsmen with prompt advice. 
Key points: Artificial intelligence is a promising field for sport-related analysis. Implementations integrating pattern recognition techniques enable the automatic evaluation of data measurements. Artificial neural networks applied to the analysis of weight training data show good performance and high classification rates.

  13. Artificial Intelligence in Sports on the Example of Weight Training

    PubMed Central

    Novatchkov, Hristo; Baca, Arnold

    2013-01-01

    The overall goal of the present study was to illustrate the potential of artificial intelligence (AI) techniques in sports on the example of weight training. The research focused in particular on the implementation of pattern recognition methods for the evaluation of performed exercises on training machines. The data acquisition was carried out using way and cable force sensors attached to various weight machines, thereby enabling the measurement of essential displacement and force determinants during training. On the basis of the gathered data, it was consequently possible to deduce other significant characteristics like time periods or movement velocities. These parameters were applied for the development of intelligent methods adapted from conventional machine learning concepts, allowing an automatic assessment of the exercise technique and providing individuals with appropriate feedback. In practice, the implementation of such techniques could be crucial for the investigation of the quality of the execution, the assistance of athletes but also coaches, the training optimization and for prevention purposes. For the current study, the data was based on measurements from 15 rather inexperienced participants, performing 3-5 sets of 10-12 repetitions on a leg press machine. The initially preprocessed data was used for the extraction of significant features, on which supervised modeling methods were applied. Professional trainers were involved in the assessment and classification processes by analyzing the video recorded executions. The so far obtained modeling results showed good performance and prediction outcomes, indicating the feasibility and potency of AI techniques in assessing performances on weight training equipment automatically and providing sportsmen with prompt advice. Key points Artificial intelligence is a promising field for sport-related analysis. 
Implementations integrating pattern recognition techniques enable the automatic evaluation of data measurements. Artificial neural networks applied for the analysis of weight training data show good performance and high classification rates. PMID:24149722

  14. Discrimination among populations of sockeye salmon fry with Fourier analysis of otolith banding patterns formed during incubation

    USGS Publications Warehouse

    Finn, James E.; Burger, Carl V.; Holland-Bartels, Leslie E.

    1997-01-01

    We used otolith banding patterns formed during incubation to discriminate among hatchery- and wild-incubated fry of sockeye salmon Oncorhynchus nerka from Tustumena Lake, Alaska. Fourier analysis of otolith luminance profiles was used to describe banding patterns; the amplitudes of individual Fourier harmonics were the discriminant variables. Correct classification of otoliths to either hatchery or wild origin was 83.1% (cross-validation) and 72.7% (test data) with the use of quadratic discriminant function analysis on 10 Fourier amplitudes. Overall classification rates among the six test groups (one hatchery and five wild groups) were 46.5% (cross-validation) and 39.3% (test data) with the use of linear discriminant function analysis on 16 Fourier amplitudes. Although classification rates for wild-incubated fry from any one site never exceeded 67% (cross-validation) or 60% (test data), location-specific information was evident for all groups because the probability of classifying an individual to its true incubation location was significantly greater than chance. Results indicate phenotypic differences in otolith microstructure among incubation sites separated by less than 10 km. Analysis of otolith luminance profiles is a potentially useful technique for discriminating between hatchery and wild populations of fish.
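    The feature-extraction step, turning a luminance profile into Fourier harmonic amplitudes for the discriminant analysis, can be sketched with a plain FFT. This is a generic sketch, not the authors' exact preprocessing.

```python
import numpy as np

def fourier_amplitudes(profile, n_harmonics):
    """Amplitudes of the first n Fourier harmonics of a (mean-removed)
    luminance profile, usable as discriminant variables.

    The 2/N scaling makes each value equal to the amplitude of the
    corresponding cosine component."""
    x = np.asarray(profile, dtype=float)
    spec = np.fft.rfft(x - x.mean())
    return 2.0 * np.abs(spec[1:n_harmonics + 1]) / len(x)
```

Each otolith then yields a fixed-length amplitude vector (10 or 16 values in the study), ready for discriminant function analysis.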

  15. Applications of Remote Sensing and GIS(Geographic Information System) in Crime Analysis of Gujranwala City.

    NASA Astrophysics Data System (ADS)

    Munawar, Iqra

    2016-07-01

    Crime mapping is a dynamic process that can be used to assist all stages of the problem-solving process. Mapping crime can help police protect citizens more effectively. The decision to utilize a certain type of map or design element may change based on the purpose of the map, the audience, or the available data. If the purpose of the crime analysis map is to assist in the identification of a particular problem, selected data may be mapped to identify patterns of activity that have previously gone undetected. The main objective of this research was to study the spatial distribution patterns of four common crimes, i.e., narcotics, arms, burglary, and robbery, in Gujranwala City, using spatial statistical techniques to identify hotspots. Hotspots, or locations of clusters, were identified using the Getis-Ord Gi* statistic. Crime analysis mapping can be used to conduct a comprehensive spatial analysis of the problem. Graphic presentations of such findings provide a powerful medium for communicating conditions, patterns, and trends, creating an avenue for analysts to bring about significant policy changes. Moreover, crime mapping also helps reduce crime rates.
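    The Getis-Ord Gi* statistic used for hotspot detection has a standard closed form; a compact version for a binary spatial weights matrix is sketched below (a textbook formulation, not the study's GIS implementation).

```python
import numpy as np

def getis_ord_gi_star(values, W):
    """Getis-Ord Gi* hotspot statistic.

    values: (n,) incident counts per cell.
    W: (n, n) binary spatial weights with w_ii = 1, since Gi* includes
    the focal cell in its own neighbourhood.
    Returns a z-score per cell; large positive values indicate hotspots."""
    x = np.asarray(values, dtype=float)
    n = len(x)
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)
    wsum = W.sum(axis=1)
    num = W @ x - xbar * wsum
    den = s * np.sqrt((n * (W ** 2).sum(axis=1) - wsum ** 2) / (n - 1))
    return num / den
```

Cells with Gi* above roughly 1.96 are conventionally flagged as statistically significant hotspots at the 95% level.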

  16. Decoding auditory spatial and emotional information encoding using multivariate versus univariate techniques.

    PubMed

    Kryklywy, James H; Macpherson, Ewan A; Mitchell, Derek G V

    2018-04-01

    Emotion can have diverse effects on behaviour and perception, modulating function in some circumstances and having little effect in others. Recently, it was identified that part of the heterogeneity of emotional effects could be due to a dissociable representation of emotion in dual-pathway models of sensory processing. Our previous fMRI experiment using traditional univariate analyses showed that emotion modulated processing in the auditory 'what' but not 'where' processing pathway. The current study further investigates this dissociation using a more recently emerging multi-voxel pattern analysis (MVPA) searchlight approach. While undergoing fMRI, participants localized sounds of varying emotional content. A searchlight multi-voxel pattern analysis was conducted to identify activity patterns predictive of sound location and/or emotion. Relative to the prior univariate analysis, MVPA indicated larger overlapping spatial and emotional representations of sound within early secondary regions associated with auditory localization. However, consistent with the univariate analysis, these two dimensions were increasingly segregated in late secondary and tertiary regions of the auditory processing streams. These results, while complementary to our original univariate analyses, highlight the utility of multiple analytic approaches for neuroimaging, particularly for neural processes whose representations are known to depend on population coding.

  17. Exploring the CAESAR database using dimensionality reduction techniques

    NASA Astrophysics Data System (ADS)

    Mendoza-Schrock, Olga; Raymer, Michael L.

    2012-06-01

    The Civilian American and European Surface Anthropometry Resource (CAESAR) database, containing over 40 anthropometric measurements on over 4000 humans, has been extensively explored for pattern recognition and classification purposes using the raw, original data [1-4]. However, some of the anthropometric variables would be impossible to collect in an uncontrolled environment. Here, we explore the use of dimensionality reduction methods in concert with a variety of classification algorithms for gender classification, using only those variables that are readily observable in an uncontrolled environment. Several dimensionality reduction techniques are employed to learn the underlying structure of the data. These techniques include linear projections, such as classical Principal Components Analysis (PCA), and non-linear (manifold learning) techniques, such as Diffusion Maps and Isomap. This paper briefly describes all three techniques and compares three different classifiers, Naïve Bayes, AdaBoost, and Support Vector Machines (SVM), for gender classification in conjunction with each of these three dimensionality reduction approaches.
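    The linear branch of this pipeline, PCA followed by a classifier in the reduced space, can be sketched compactly. The nearest-centroid classifier below is a deliberately minimal stand-in for the Naïve Bayes / AdaBoost / SVM comparison in the paper, and all names are illustrative.

```python
import numpy as np

def pca_project(X, n_components):
    """Classical PCA via SVD: centre the data and project onto the
    leading principal components."""
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    comps = Vt[:n_components]
    return (X - mean) @ comps.T, comps, mean

def nearest_centroid_predict(Z_train, y_train, Z_test):
    """Minimal classifier in the reduced space: assign each point to the
    class whose centroid is nearest."""
    classes = np.unique(y_train)
    cents = np.array([Z_train[y_train == c].mean(axis=0) for c in classes])
    d = ((Z_test[:, None, :] - cents[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]
```

The manifold-learning variants (Isomap, Diffusion Maps) replace `pca_project` with a nonlinear embedding while the classification stage stays the same.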

  18. Basic research planning in mathematical pattern recognition and image analysis

    NASA Technical Reports Server (NTRS)

    Bryant, J.; Guseman, L. F., Jr.

    1981-01-01

    Fundamental problems encountered while attempting to develop automated techniques for applications of remote sensing are discussed under the following categories: (1) geometric and radiometric preprocessing; (2) spatial, spectral, temporal, syntactic, and ancillary digital image representation; (3) image partitioning, proportion estimation, and error models in object scene inference; (4) parallel processing and image data structures; and (5) continuing studies in polarization; computer architectures and parallel processing; and the applicability of "expert systems" to interactive analysis.

  19. Macromolecular structure of coals. 6. Mass spectroscopic analysis of coal-derived liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hooker, D.T.; Lucht, L.M.; Peppas, N.A.

    1986-02-01

    The macromolecular structure of coal networks was analyzed by depolymerizing coal samples using the Sternberg reductive alkylation and the Miyake alkylation techniques. Electron impact mass spectra showed peaks of greater abundance of 125-132, 252-260, 383-391, and 511-520 m/z ratios. Based on analysis of the patterns of the spectra, the cluster size of the cross-linked structure of bituminous coals was determined as 126-130. Various chemical species were identified.

  20. How does modifying a DEM to reflect known hydrology affect subsequent terrain analysis?

    NASA Astrophysics Data System (ADS)

    Callow, John Nikolaus; Van Niel, Kimberly P.; Boggs, Guy S.

    2007-01-01

    Many digital elevation models (DEMs) have difficulty replicating hydrological patterns in flat landscapes. Efforts to improve DEM performance in replicating known hydrology have included a variety of soft (i.e. algorithm-based) and hard techniques, such as "stream burning" or "surface reconditioning" (e.g. Agree or ANUDEM). Using a representation of the known stream network, these methods trench or mathematically warp the original DEM to improve how accurately stream position, stream length, and catchment boundaries replicate known hydrological conditions. However, these techniques permanently alter the DEM and may affect further analyses (e.g. slope). This paper explores the impact that commonly used hydrological correction methods (stream burning, Agree.aml, ANUDEM v4.6.3, and ANUDEM v5.1) have on the overall nature of a DEM, finding that different methods produce non-convergent outcomes for catchment parameters (such as catchment boundaries, stream position, and length) and differentially compromise secondary terrain analysis. All hydrological correction methods improved the calculation of catchment area, stream position, and length compared with using the unmodified DEM, but all increased catchment slope, and no single method performed best across all categories. Different hydrological correction methods changed elevation and slope in different spatial patterns and magnitudes, compromising the ability to derive catchment parameters and conduct secondary terrain analysis from a single DEM. Modification of a DEM to better reflect known hydrology can be useful; however, knowledge of the magnitude and spatial pattern of the changes is required before using the modified DEM for subsequent analyses.
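    The simplest of the hard techniques, stream burning, amounts to trenching the DEM along the mapped channel network. A minimal sketch follows (a fixed-drop burn, not the Agree or ANUDEM algorithms); note it returns a copy, reflecting the paper's caution that the burned surface should not be reused for slope and other secondary derivatives.

```python
import numpy as np

def stream_burn(dem, stream_mask, drop=10.0):
    """'Stream burning' sketch: lower DEM cells under the mapped stream
    network by a fixed drop so that flow routing follows known channels.

    Returns a burned copy; keep the unmodified DEM for slope and other
    secondary terrain analysis, which burning can distort."""
    burned = np.array(dem, dtype=float)          # copy, never in place
    burned[np.asarray(stream_mask, dtype=bool)] -= drop
    return burned
```

Methods like Agree soften this by ramping the trench over a buffer distance rather than applying a uniform drop, which changes the spatial pattern of the introduced elevation error.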

  1. Applications of artificial neural network in AIDS research and therapy.

    PubMed

    Sardari, S; Sardari, D

    2002-01-01

    In recent years considerable effort has been devoted to applying pattern recognition techniques to the complex task of data analysis in drug research. Artificial neural network (ANN) methodology is a modeling method with a great ability to adapt to a new situation, or to control an unknown system, using data acquired in previous experiments. In this paper, a brief history of ANNs, the basic concepts behind the computing, the mathematical and algorithmic formulation of each of the techniques, and their developmental background are presented. Based on the abilities of ANNs in pattern recognition and in estimating system outputs from known inputs, the neural network can be considered a tool for molecular data analysis and interpretation. Analysis by neural networks improves classification accuracy and data quantification, and reduces the number of analogues necessary for correct classification of biologically active compounds. Conformational analysis, quantifying the components in mixtures using NMR spectra, aqueous solubility prediction, and structure-activity correlation are among the reported applications of ANNs as a new modeling method. Ranging from drug design and discovery to structure and dosage-form design, the potential pharmaceutical applications of the ANN methodology are significant. In the areas of clinical monitoring, utilization of molecular simulation, and design of bioactive structures, ANNs would make it possible to study the status of health and disease and bring predicted chemotherapeutic responses closer to reality.

  2. CpG PatternFinder: a Windows-based utility program for easy and rapid identification of the CpG methylation status of DNA.

    PubMed

    Xu, Yi-Hua; Manoharan, Herbert T; Pitot, Henry C

    2007-09-01

    The bisulfite genomic sequencing technique is one of the most widely used techniques to study sequence-specific DNA methylation because of its unambiguous ability to reveal DNA methylation status down to a single nucleotide. One characteristic feature of the bisulfite genomic sequencing technique is that a number of sample sequence files are produced from a single DNA sample. The PCR products of bisulfite-treated DNA samples cannot be sequenced directly because they are heterogeneous in nature; they must therefore be cloned into suitable plasmids and then sequenced. This procedure generates an enormous number of sample DNA sequence files and also adds extra bases belonging to the plasmids to the sequence, which causes problems in the final sequence comparison. Finding the methylation status of each CpG in each sample sequence is not easy, and CpG PatternFinder was developed for this purpose. The main functions of CpG PatternFinder are: (i) to analyze the reference sequence to obtain CpG and non-CpG-C residue position information; (ii) to tailor sample sequence files (delete insertions and mark deletions) based on a configuration of a ClustalW multiple alignment; (iii) to align sample sequence files with a reference file to obtain bisulfite conversion efficiency and CpG methylation status; and (iv) to produce graphics, highlighted aligned sequence text, and a summary report that can easily be exported to the Microsoft Office suite. CpG PatternFinder is designed to operate cooperatively with BioEdit, a freeware program available on the internet. It can handle up to 100 sample DNA sequence files simultaneously, and the entire CpG pattern analysis process can be finished in minutes. CpG PatternFinder is an ideal software tool for DNA methylation studies aiming to determine differential methylation patterns in a large number of individuals in a population.
Previously we developed the CpG Analyzer program; CpG PatternFinder is our further effort to create software tools for DNA methylation studies.
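    The core comparison the program automates rests on the chemistry of bisulfite conversion: an unmethylated cytosine reads as T after treatment, while a methylated CpG cytosine stays C. A toy sketch of that per-CpG call (assuming the sequences are already aligned and equal in length, i.e. the ClustalW tailoring step is done) looks like this; the function names are illustrative, not the program's API.

```python
def cpg_methylation_calls(reference, bisulfite_read):
    """Methylation call at each CpG position of a pre-aligned pair.

    After bisulfite treatment, an unmethylated C reads as T while a
    methylated CpG cytosine remains C."""
    assert len(reference) == len(bisulfite_read)
    ref = reference.upper()
    read = bisulfite_read.upper()
    calls = {}
    for i in range(len(ref) - 1):
        if ref[i:i + 2] == "CG":
            calls[i] = {"C": "methylated",
                        "T": "unmethylated"}.get(read[i], "unknown")
    return calls

def conversion_efficiency(reference, bisulfite_read):
    """Fraction of non-CpG cytosines read as T (ideally close to 1.0);
    low values indicate incomplete bisulfite conversion."""
    ref = reference.upper()
    read = bisulfite_read.upper()
    total = converted = 0
    for i, base in enumerate(ref):
        if base == "C" and ref[i:i + 2] != "CG":
            total += 1
            converted += read[i] == "T"
    return converted / total if total else 1.0
```

Running this over up to 100 clone sequences and tabulating the per-position calls is, in essence, the summary report the program produces.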

  3. Automating security monitoring and analysis for Space Station Freedom's electric power system

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han

    1990-01-01

    Operating a large, space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom will consume too much processing time. A new approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast; and thus, it can free the Space Station Freedom's power control computers for other tasks.

  4. Automating security monitoring and analysis for Space Station Freedom's electric power system

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han

    1990-01-01

    Operating a large, space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom will consume too much processing time. A novel approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast; and thus, it can free the Space Station Freedom's power control computers for other tasks.

  5. Different techniques of multispectral data analysis for vegetation fraction retrieval

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. For farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectral analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus value sum, chromaticity coordinates, and dominant wavelength). The objective is to reveal their potential, accuracy, and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
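    The first technique listed, linear spectral unmixing, models each pixel as a linear mixture of endmember spectra with fractions summing to one. A generic least-squares sketch (not the authors' code) follows; the sum-to-one constraint is enforced by appending it as a heavily weighted extra equation.

```python
import numpy as np

def unmix_fractions(pixel, endmembers, weight=1e3):
    """Linear spectral unmixing with a sum-to-one constraint.

    Solve pixel ≈ E @ f subject to sum(f) = 1 by appending the constraint
    as a heavily weighted equation and taking the least-squares solution.
    endmembers E: (n_bands, n_endmembers); returns fractions f."""
    E = np.asarray(endmembers, dtype=float)
    A = np.vstack([E, weight * np.ones((1, E.shape[1]))])
    b = np.append(np.asarray(pixel, dtype=float), weight)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f
```

With a vegetation and a soil endmember, the recovered vegetation fraction is exactly the canopy cover fraction the paper sets out to estimate.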

  6. Multivariate analysis of DSC-XRD simultaneous measurement data: a study of multistage crystalline structure changes in a linear poly(ethylene imine) thin film.

    PubMed

    Kakuda, Hiroyuki; Okada, Tetsuo; Otsuka, Makoto; Katsumoto, Yukiteru; Hasegawa, Takeshi

    2009-01-01

    A multivariate analytical technique has been applied to the analysis of simultaneous measurement data from differential scanning calorimetry (DSC) and X-ray diffraction (XRD) in order to study thermal changes in the crystalline structure of a linear poly(ethylene imine) (LPEI) film. A large number of XRD patterns generated from the simultaneous measurements were subjected to an augmented alternating least-squares (ALS) regression analysis, which readily decomposed the XRD patterns into chemically independent component patterns while simultaneously yielding their thermal profiles. The decomposed XRD patterns and profiles were useful for interpreting the minor peaks in the DSC curve. The analytical results revealed the following polymorphic changes in detail: an LPEI film prepared by casting an aqueous solution was composed of sesquihydrate and hemihydrate crystals. The sesquihydrate was lost at an early stage of heating, and the film changed into an amorphous state. Once the sesquihydrate was lost by heating, it was not recovered even when the film was cooled back to room temperature. When the sample was heated again, structural changes were found between the hemihydrate and the amorphous components. In this manner, simultaneous DSC-XRD measurements combined with ALS analysis proved powerful for obtaining a better understanding of thermally induced changes of the crystalline structure in a polymer film.
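    The decomposition step can be sketched as a bare-bones alternating least squares with nonnegativity clipping. This is only an illustration of the ALS idea, not the augmented scheme used in the paper: the temperature-by-angle data matrix X is factored as X ≈ C @ S, where C holds the thermal concentration profiles and S the pure-component XRD patterns.

```python
import numpy as np

def als_decompose(X, k, n_iter=200, seed=0):
    """Alternating least squares with nonnegativity clipping.

    X: (n_temperatures, n_angles) matrix of XRD patterns vs temperature.
    Returns C (n_temperatures, k) thermal profiles and
    S (k, n_angles) component patterns, both nonnegative."""
    rng = np.random.default_rng(seed)
    C = rng.random((X.shape[0], k))
    S = None
    for _ in range(n_iter):
        # fix C, solve for the component patterns, clip to nonnegative
        S = np.clip(np.linalg.lstsq(C, X, rcond=None)[0], 0.0, None)
        # fix S, solve for the thermal profiles, clip to nonnegative
        C = np.clip(np.linalg.lstsq(S.T, X.T, rcond=None)[0].T, 0.0, None)
    return C, S
```

For well-separated components, as in the sesquihydrate/hemihydrate case, the recovered profiles track each phase's rise and fall with temperature.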

  7. Analysis of respiratory and muscle activity by means of cross information function between ventilatory and myographic signals.

    PubMed

    Alonso, J F; Mañanas, M A; Hoyer, D; Topor, Z L; Bruce, E N

    2004-01-01

    Analysis of respiratory muscle activity is a promising technique for the study of pulmonary diseases such as obstructive sleep apnea syndrome (OSAS). Evaluation of interactions between muscles is very useful for determining the muscular pattern during exercise. These interactions have previously been assessed by means of different linear techniques such as the cross-spectrum, magnitude squared coherence, or cross-correlation. The aim of this work is to evaluate interactions between respiratory and myographic signals through nonlinear analysis by means of the cross mutual information function (CMIF), and to determine what information can be extracted from it. Several parameters are defined and calculated from the CMIF between the ventilatory signal and the myographic signals of three respiratory muscles. Finally, differences in certain parameters were obtained between OSAS patients and healthy subjects, indicating different respiratory muscle couplings.
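    A minimal sketch of the quantity underlying the CMIF: a histogram-based mutual information estimate between two signals (evaluating it over a range of lags yields the full CMIF). The synthetic "ventilatory" and "myographic" signals and the bin count are assumptions of this toy example.

```python
# Histogram-based estimate of the mutual information (in bits) between two
# signals, the core quantity behind a cross mutual information function (CMIF).
import math
import random

def mutual_information(x, y, bins=8):
    """Estimate I(X;Y) by discretizing both signals into equal-width bins."""
    def discretize(s):
        lo, hi = min(s), max(s)
        return [min(int((v - lo) / (hi - lo + 1e-12) * bins), bins - 1) for v in s]
    xd, yd = discretize(x), discretize(y)
    n = len(x)
    pxy, px, py = {}, {}, {}
    for a, b in zip(xd, yd):
        pxy[(a, b)] = pxy.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    mi = 0.0
    for (a, b), c in pxy.items():
        mi += (c / n) * math.log2(c * n / (px[a] * py[b]))
    return mi

random.seed(0)
t = [i / 100.0 for i in range(1000)]
ventilation = [math.sin(2 * math.pi * 0.25 * v) for v in t]    # slow breathing rhythm
coupled = [v + 0.1 * random.gauss(0, 1) for v in ventilation]  # muscle signal tracking it
independent = [random.gauss(0, 1) for _ in t]                  # unrelated noise

mi_coupled = mutual_information(ventilation, coupled)
mi_indep = mutual_information(ventilation, independent)
```

    Unlike cross-correlation, this measure also captures nonlinear dependence, which is the motivation for using the CMIF over the linear techniques listed above.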

  8. Sequential Pattern Analysis: Method and Application in Exploring How Students Develop Concept Maps

    ERIC Educational Resources Information Center

    Chiu, Chiung-Hui; Lin, Chien-Liang

    2012-01-01

    Concept mapping is a technique that represents knowledge in graphs. It has been widely adopted in science education and cognitive psychology to aid learning and assessment. To realize the sequential manner in which students develop concept maps, most research relies upon human-dependent, qualitative approaches. This article proposes a method for…

  9. Mining Interactions in Immersive Learning Environments for Real-Time Student Feedback

    ERIC Educational Resources Information Center

    Kennedy, Gregor; Ioannou, Ioanna; Zhou, Yun; Bailey, James; O'Leary, Stephen

    2013-01-01

    The analysis and use of data generated by students' interactions with learning systems or programs--learning analytics--has recently gained widespread attention in the educational technology community. Part of the reason for this interest is based on the potential of learning analytic techniques such as data mining to find hidden patterns in…

  10. Nonlinear behavior of the tarka flute's distinctive sounds.

    PubMed

    Gérard, Arnaud; Yapu-Quispe, Luis; Sakuma, Sachiko; Ghezzi, Flavio; Ramírez-Ávila, Gonzalo Marcelo

    2016-09-01

    The Andean tarka flute generates multiphonic sounds. Using spectral techniques, we verify two distinctive musical behaviors and the nonlinear nature of the tarka. Through nonlinear time series analysis, we determine chaotic and hyperchaotic behavior. Experimentally, we observe that by increasing the blow pressure on different fingerings, peculiar changes from linear to nonlinear patterns are produced, leading ultimately to quenching.
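    The spectral verification step mentioned here amounts to locating partials in a magnitude spectrum. A minimal sketch, using a direct DFT on a synthetic two-partial tone rather than recorded tarka sound; the partial frequencies and amplitudes are made up for illustration.

```python
# Minimal spectral analysis of the kind used to inspect multiphonic sounds:
# a direct DFT magnitude spectrum and the location of its dominant peak.
import math

def dft_magnitude(signal):
    """Magnitude of the discrete Fourier transform, computed directly (O(N^2))."""
    n = len(signal)
    mags = []
    for k in range(n // 2):  # keep the non-redundant half
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        mags.append(math.hypot(re, im))
    return mags

# Synthetic "multiphonic" tone: two partials, the lower one stronger.
n, f1, f2 = 256, 8, 21
tone = [math.sin(2 * math.pi * f1 * i / n) + 0.5 * math.sin(2 * math.pi * f2 * i / n)
        for i in range(n)]
spectrum = dft_magnitude(tone)
dominant_bin = max(range(len(spectrum)), key=spectrum.__getitem__)
```

    Two simultaneous, non-harmonically related peaks of this kind are the spectral signature of a multiphonic sound.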

  11. Adverbials of Result: Phraseology and Functions in the Problem-Solution Pattern

    ERIC Educational Resources Information Center

    Charles, Maggie

    2011-01-01

    This paper combines the use of corpus techniques with discourse analysis in order to investigate adverbials of result in the writing of advanced academic student writers. It focuses in detail on the phraseology and functions of "thus," "therefore," "then," "hence," "so" and "consequently." Two corpora of native-speaker theses are examined: 190,000…

  12. Methods for Identifying Object Class, Type, and Orientation in the Presence of Uncertainty

    DTIC Science & Technology

    1990-08-01

    on Range Finding Techniques for Computer Vision," IEEE Trans. on Pattern Analysis and Machine Intelligence PAMI-5 (2), pp 129-139, March 1983. 15. Yang... Artificial Intelligence Applications, pp 199-205, December 1984. 16. Flynn, P.J. and Jain, A.K., "On Reliable Curvature Estimation," Proceedings of the

  13. "Whoa! We're Going Deep in the Trees!": Patterns of Collaboration around an Interactive Information Visualization Exhibit

    ERIC Educational Resources Information Center

    Davis, Pryce; Horn, Michael; Block, Florian; Phillips, Brenda; Evans, E. Margaret; Diamond, Judy; Shen, Chia

    2015-01-01

    In this paper we present a qualitative analysis of natural history museum visitor interaction around a multi-touch tabletop exhibit called "DeepTree" that we designed around concepts of evolution and common descent. DeepTree combines several large scientific datasets and an innovative visualization technique to display a phylogenetic…

  14. Nonlinear behavior of the tarka flute's distinctive sounds

    NASA Astrophysics Data System (ADS)

    Gérard, Arnaud; Yapu-Quispe, Luis; Sakuma, Sachiko; Ghezzi, Flavio; Ramírez-Ávila, Gonzalo Marcelo

    2016-09-01

    The Andean tarka flute generates multiphonic sounds. Using spectral techniques, we verify two distinctive musical behaviors and the nonlinear nature of the tarka. Through nonlinear time series analysis, we determine chaotic and hyperchaotic behavior. Experimentally, we observe that by increasing the blow pressure on different fingerings, peculiar changes from linear to nonlinear patterns are produced, leading ultimately to quenching.

  15. Examining Response to a One-to-One Computer Initiative: Student and Teacher Voices

    ERIC Educational Resources Information Center

    Storz, Mark G.; Hoffman, Amy R.

    2013-01-01

    The impact of a one-to-one computing initiative at a Midwestern urban middle school was examined through phenomenological research techniques focusing on the voices of eighth grade students and their teachers. Analysis of transcripts from pre and post-implementation interviews of 47 students and eight teachers yielded patterns of responses to…

  16. An overview of computer vision

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1982-01-01

    An overview of computer vision is provided. Image understanding and scene analysis are emphasized, and pertinent aspects of pattern recognition are treated. The basic approach to computer vision systems, the techniques utilized, applications, the current existing systems and state-of-the-art issues and research requirements, who is doing it and who is funding it, and future trends and expectations are reviewed.

  17. A Comparative Study of Frequent and Maximal Periodic Pattern Mining Algorithms in Spatiotemporal Databases

    NASA Astrophysics Data System (ADS)

    Obulesu, O.; Rama Mohan Reddy, A., Dr; Mahendra, M.

    2017-08-01

    Detecting regular and efficient cyclic models is a demanding activity for data analysts due to the unstructured, dynamic, and enormous raw information produced from the web. Many existing approaches generate large numbers of candidate patterns in the presence of huge and complex databases. In this work, two novel algorithms are proposed and a comparative examination is performed with respect to scalability and performance. The first algorithm, EFPMA (Extended Regular Model Detection Algorithm), is used to find frequent sequential patterns from the spatiotemporal dataset, and the second, ETMA (Enhanced Tree-based Mining Algorithm), detects effective cyclic models with a symbolic database representation. EFPMA is an algorithm that grows patterns from both ends (prefixes and suffixes) of detected patterns, which results in faster pattern growth because fewer levels of database projection are needed compared to existing approaches such as PrefixSpan and SPADE. ETMA uses distinct notions to store and manage transaction data horizontally, such as segments, sequences, and individual symbols. ETMA exploits a partition-and-conquer method to find maximal patterns using symbolic notations. Using this algorithm, cyclic models can be mined in full-series sequential patterns, including subsection series. ETMA reduces memory consumption and makes use of efficient symbolic operations. Furthermore, ETMA records time-series instances dynamically in terms of character, series, and section approaches, respectively. Determining the extent of the patterns and proving the efficiency of the reduction and retrieval techniques on synthetic and actual datasets remains an open and challenging mining problem. These techniques are useful in data streams, traffic risk analysis, medical diagnosis, DNA sequence mining, and earthquake prediction applications. Extensive experimental results illustrate that the algorithms outperform the ECLAT, STNR, and MAFIA approaches in efficiency and scalability.
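    The basic operation shared by the sequential-pattern miners discussed above is support counting: how many sequences in the database contain a candidate pattern as an ordered (not necessarily contiguous) subsequence. A brute-force sketch on an invented toy database (not EFPMA or ETMA themselves, whose data structures are far more elaborate):

```python
# Toy support counting for sequential patterns, the basic operation shared by
# PrefixSpan/SPADE-style miners.

def contains(sequence, pattern):
    """True if `pattern` occurs in `sequence` as an ordered subsequence."""
    it = iter(sequence)
    return all(item in it for item in pattern)  # each `in` advances the iterator

def support(database, pattern):
    return sum(contains(seq, pattern) for seq in database)

def frequent_pairs(database, min_support):
    """All length-2 patterns meeting the support threshold (brute force)."""
    items = sorted({x for seq in database for x in seq})
    return {(a, b): support(database, (a, b))
            for a in items for b in items
            if support(database, (a, b)) >= min_support}

db = [list("abcd"), list("acbd"), list("abd"), list("bcd")]
result = frequent_pairs(db, min_support=3)
```

    Real miners avoid this exhaustive candidate enumeration: PrefixSpan grows only patterns actually present via database projection, which is exactly the cost that EFPMA's two-ended growth is claimed to reduce further.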

  18. Analyzing coastal environments by means of functional data analysis

    NASA Astrophysics Data System (ADS)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension-reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful for describing variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each of the density functions by its mean, sorting, skewness, and kurtosis. The second applied a centered-log-ratio (clr) transform to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
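    The second vector approach, a clr transform followed by PCA, can be sketched compactly. Below, hypothetical three-part grain-size compositions (sand/silt/clay fractions, invented for illustration) are clr-transformed, and the first principal component is extracted by power iteration on the covariance matrix.

```python
# Centred-log-ratio (clr) transform of compositional data, then the leading
# principal component of the transformed data via power iteration.
import math
import random

def clr(composition):
    """Centred log-ratio: log of each part minus the mean log of the sample."""
    logs = [math.log(p) for p in composition]
    mean = sum(logs) / len(logs)
    return [v - mean for v in logs]

def first_pc(rows, iters=500):
    """Leading eigenvector/eigenvalue of the sample covariance by power iteration."""
    d, n = len(rows[0]), len(rows)
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    centered = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)] for i in range(d)]
    v = [random.gauss(0, 1) for _ in range(d)]  # random start, off the null space
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    eigenvalue = sum(v[i] * sum(cov[i][j] * v[j] for j in range(d)) for i in range(d))
    return v, eigenvalue, cov

# Hypothetical sand/silt/clay fractions for 30 samples (rows sum to 1).
random.seed(1)
samples = []
for _ in range(30):
    sand = random.uniform(0.4, 0.8)
    silt = random.uniform(0.1, 1.0 - sand - 0.05)
    samples.append([sand, silt, 1.0 - sand - silt])

transformed = [clr(s) for s in samples]
pc1, lam1, cov = first_pc(transformed)
trace = sum(cov[i][i] for i in range(3))
```

    The clr transform removes the unit-sum constraint before PCA, which is why it is preferred to applying PCA to raw compositional percentages.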

  19. A morphometric analysis of vegetation patterns in dryland ecosystems

    PubMed Central

    Dekker, Stefan C.; Li, Mao; Mio, Washington; Punyasena, Surangi W.; Lenton, Timothy M.

    2017-01-01

    Vegetation in dryland ecosystems often forms remarkable spatial patterns. These range from regular bands of vegetation alternating with bare ground, to vegetated spots and labyrinths, to regular gaps of bare ground within an otherwise continuous expanse of vegetation. It has been suggested that spotted vegetation patterns could indicate that collapse into a bare ground state is imminent, and the morphology of spatial vegetation patterns, therefore, represents a potentially valuable source of information on the proximity of regime shifts in dryland ecosystems. In this paper, we have developed quantitative methods to characterize the morphology of spatial patterns in dryland vegetation. Our approach is based on algorithmic techniques that have been used to classify pollen grains on the basis of textural patterning, and involves constructing feature vectors to quantify the shapes formed by vegetation patterns. We have analysed images of patterned vegetation produced by a computational model and a small set of satellite images from South Kordofan (South Sudan), which illustrates that our methods are applicable to both simulated and real-world data. Our approach provides a means of quantifying patterns that are frequently described using qualitative terminology, and could be used to classify vegetation patterns in large-scale satellite surveys of dryland ecosystems. PMID:28386414

  20. A morphometric analysis of vegetation patterns in dryland ecosystems.

    PubMed

    Mander, Luke; Dekker, Stefan C; Li, Mao; Mio, Washington; Punyasena, Surangi W; Lenton, Timothy M

    2017-02-01

    Vegetation in dryland ecosystems often forms remarkable spatial patterns. These range from regular bands of vegetation alternating with bare ground, to vegetated spots and labyrinths, to regular gaps of bare ground within an otherwise continuous expanse of vegetation. It has been suggested that spotted vegetation patterns could indicate that collapse into a bare ground state is imminent, and the morphology of spatial vegetation patterns, therefore, represents a potentially valuable source of information on the proximity of regime shifts in dryland ecosystems. In this paper, we have developed quantitative methods to characterize the morphology of spatial patterns in dryland vegetation. Our approach is based on algorithmic techniques that have been used to classify pollen grains on the basis of textural patterning, and involves constructing feature vectors to quantify the shapes formed by vegetation patterns. We have analysed images of patterned vegetation produced by a computational model and a small set of satellite images from South Kordofan (South Sudan), which illustrates that our methods are applicable to both simulated and real-world data. Our approach provides a means of quantifying patterns that are frequently described using qualitative terminology, and could be used to classify vegetation patterns in large-scale satellite surveys of dryland ecosystems.

  1. A morphometric analysis of vegetation patterns in dryland ecosystems

    NASA Astrophysics Data System (ADS)

    Mander, Luke; Dekker, Stefan C.; Li, Mao; Mio, Washington; Punyasena, Surangi W.; Lenton, Timothy M.

    2017-02-01

    Vegetation in dryland ecosystems often forms remarkable spatial patterns. These range from regular bands of vegetation alternating with bare ground, to vegetated spots and labyrinths, to regular gaps of bare ground within an otherwise continuous expanse of vegetation. It has been suggested that spotted vegetation patterns could indicate that collapse into a bare ground state is imminent, and the morphology of spatial vegetation patterns, therefore, represents a potentially valuable source of information on the proximity of regime shifts in dryland ecosystems. In this paper, we have developed quantitative methods to characterize the morphology of spatial patterns in dryland vegetation. Our approach is based on algorithmic techniques that have been used to classify pollen grains on the basis of textural patterning, and involves constructing feature vectors to quantify the shapes formed by vegetation patterns. We have analysed images of patterned vegetation produced by a computational model and a small set of satellite images from South Kordofan (South Sudan), which illustrates that our methods are applicable to both simulated and real-world data. Our approach provides a means of quantifying patterns that are frequently described using qualitative terminology, and could be used to classify vegetation patterns in large-scale satellite surveys of dryland ecosystems.

  2. Monitoring of Microseismicity with Array Techniques in the Peach Tree Valley Region

    NASA Astrophysics Data System (ADS)

    Garcia-Reyes, J. L.; Clayton, R. W.

    2016-12-01

    This study is focused on the analysis of microseismicity along the San Andreas Fault in the Peach Tree Valley region. This zone is part of the transition zone between the locked portion to the south (Parkfield, CA) and the creeping section to the north (Jovilet et al., JGR, 2014). The data for the study come from a 2-week deployment of 116 ZLand nodes in a cross-shaped configuration along (8.2 km) and across (9 km) the fault. We analyze the distribution of microseismicity using a 3D backprojection technique, and we explore the use of Hidden Markov Models to identify different patterns of microseismicity (Hammer et al., GJI, 2013). The goal of the study is to relate the style of seismicity to the mechanical state of the fault. The results show the evolution of seismic activity as well as at least two different patterns of seismic signals.
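    A backprojection search of the kind mentioned here can be illustrated in miniature: shift each station's trace by the travel time predicted for a trial source, stack, and take the grid node with the highest stack power. The station geometry, velocity, and spike-like traces below are all hypothetical (a 2-D homogeneous-velocity toy, not the study's 3-D implementation).

```python
# Minimal 2-D travel-time backprojection over a grid of trial source positions.
import math

v = 3.0    # assumed propagation speed, km/s
dt = 0.01  # sample interval, s
stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0), (5.0, -3.0)]
source = (6.0, 4.0)  # true (to-be-recovered) epicentre, origin time 0

def travel_time(a, b):
    return math.dist(a, b) / v

# Synthetic traces: a unit spike at the arrival time at each station.
nsamp = 1000
traces = []
for st in stations:
    tr = [0.0] * nsamp
    tr[round(travel_time(source, st) / dt)] = 1.0
    traces.append(tr)

def stack_power(node):
    """Sum of trace amplitudes at the arrival times predicted for `node`."""
    total = 0.0
    for st, tr in zip(stations, traces):
        idx = round(travel_time(node, st) / dt)
        if idx < nsamp:
            total += tr[idx]
    return total

# Brute-force grid search over 1-km nodes: the stack peaks at the true source.
grid = [(float(x), float(y)) for x in range(11) for y in range(11)]
best = max(grid, key=stack_power)
```

    Because only the correct node aligns all five spikes, the stack power there equals the station count; noisy real traces blur but do not destroy this peak.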

  3. Detonation Failure Thickness Measurement in AN Annular Geometry

    NASA Astrophysics Data System (ADS)

    Mack, D. B.; Petel, O. E.; Higgins, A. J.

    2007-12-01

    The failure thickness of neat nitromethane in aluminum confinement was measured using a novel experimental technique. The thickness was approximated in an annular geometry by the gap between a concentric aluminum tube and rod. This technique was motivated by the desire to have a periodic boundary condition in the direction orthogonal to the annulus thickness, rather than a free surface occurring in typical rectangular geometry experiments. This results in a two-dimensional charge analogous to previous failure thickness setups but with infinite effective width (i.e. infinite aspect ratio). Detonation propagation or failure was determined by the observation of failure patterns engraved on the aluminum rod by the passing detonation. Analysis of these engraved patterns provides a statistical measurement of the spatial density of failure waves. Failure was observed as far as 180 thicknesses downstream. The failure thickness was measured to be 1.45 ± 0.15 mm.

  4. Template for preparation of papers for IEEE sponsored conferences & symposia.

    PubMed

    Sacchi, L; Dagliati, A; Tibollo, V; Leporati, P; De Cata, P; Cerra, C; Chiovato, L; Bellazzi, R

    2015-01-01

    To improve access to medical information, it is necessary to design and implement integrated informatics techniques aimed at gathering data from different and heterogeneous sources. This paper describes the technologies used to integrate data coming from the electronic medical record of the IRCCS Fondazione Maugeri (FSM) hospital of Pavia, Italy, and to combine them with administrative and pharmacy drug-purchase data coming from the local healthcare agency (ASL) of the Pavia area and environmental open data of the same region. The integration process focuses on data coming from a cohort of one thousand patients diagnosed with Type 2 Diabetes Mellitus (T2DM). Data analysis and temporal data mining techniques have been integrated to enrich the initial dataset, allowing patients to be stratified using further information coming from the mined data, such as behavioral patterns of prescription-related drug purchases and other frequent clinical temporal patterns, through an intuitive dashboard-controlled system.

  5. MTF Analysis of LANDSAT-4 Thematic Mapper

    NASA Technical Reports Server (NTRS)

    Schowengerdt, R.

    1984-01-01

    A research program to measure the LANDSAT 4 Thematic Mapper (TM) modulation transfer function (MTF) is described. Measurement of a satellite sensor's MTF requires the use of a calibrated ground target, i.e., the spatial radiance distribution of the target must be known to a resolution at least four to five times greater than that of the system under test. A small reflective mirror or a dark-light linear pattern, such as a line or edge, together with relatively high-resolution underflight imagery, is used to calibrate the target. A technique that utilizes an analytical model for the scene spatial frequency power spectrum will be investigated as an alternative to calibration of the scene. The test sites and analysis techniques are also described.
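    A common way to estimate MTF from such a calibrated edge target is to differentiate the edge-spread function (ESF) into a line-spread function (LSF) and take its normalized Fourier magnitude. The sketch below applies this to synthetic 1-D edges of two blur widths; the tanh edge model is an assumption of the example, not the paper's procedure.

```python
# Edge-based MTF estimation in miniature: ESF -> derivative -> |DFT|, normalized
# so that MTF(0) = 1.
import math

def mtf_from_edge(esf):
    lsf = [b - a for a, b in zip(esf, esf[1:])]  # derivative of the edge profile
    n = len(lsf)
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(lsf))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(lsf))
        mags.append(math.hypot(re, im))
    return [m / mags[0] for m in mags]  # normalize so MTF(0) = 1

def blurred_edge(n, width):
    """Hypothetical blurred edge: an ideal step smoothed by a tanh ramp."""
    return [0.5 * (1 + math.tanh((i - n / 2) / width)) for i in range(n)]

mtf_sharp = mtf_from_edge(blurred_edge(128, 1.0))
mtf_blurry = mtf_from_edge(blurred_edge(128, 4.0))
```

    The blurrier system rolls off faster at every spatial frequency, which is precisely the degradation a satellite-sensor MTF measurement is designed to quantify.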

  6. An introduction to metabolomics and its potential application in veterinary science.

    PubMed

    Jones, Oliver A H; Cheung, Victoria L

    2007-10-01

    Metabolomics has been found to be applicable to a wide range of fields, including the study of gene function, toxicology, plant sciences, environmental analysis, clinical diagnostics, nutrition, and the discrimination of organism genotypes. This approach combines high-throughput sample analysis with computer-assisted multivariate pattern-recognition techniques. It is increasingly being deployed in toxico- and pharmacokinetic studies in the pharmaceutical industry, especially during the safety assessment of candidate drugs in human medicine. However, despite the potential of this technique to reduce both costs and the numbers of animals used for research, examples of the application of metabolomics in veterinary research are, thus far, rare. Here we give an introduction to metabolomics and discuss its potential in the field of veterinary science.

  7. Optimizing tertiary storage organization and access for spatio-temporal datasets

    NASA Technical Reports Server (NTRS)

    Chen, Ling Tony; Rotem, Doron; Shoshani, Arie; Drach, Bob; Louis, Steve; Keating, Meridith

    1994-01-01

    We address in this paper data management techniques for efficiently retrieving requested subsets of large datasets stored on mass storage devices. This problem represents a major bottleneck that can negate the benefits of fast networks, because the time to access a subset from a large dataset stored on a mass storage system is much greater than the time to transmit that subset over a network. This paper focuses on very large spatial and temporal datasets generated by simulation programs in the area of climate modeling, but the techniques developed can be applied to other applications that deal with large multidimensional datasets. The main requirement we have addressed is the efficient access of subsets of information contained within much larger datasets, for the purpose of analysis and interactive visualization. We have developed data partitioning techniques that partition datasets into 'clusters' based on analysis of data access patterns and storage device characteristics. The goal is to minimize the number of clusters read from mass storage systems when subsets are requested. We emphasize in this paper proposed enhancements to current storage server protocols to permit control over physical placement of data on storage devices. We also discuss in some detail the aspects of the interface between the application programs and the mass storage system, as well as a workbench to help scientists design the best reorganization of a dataset for anticipated access patterns.
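    The clustering idea described above can be caricatured in a few lines: pack variables that are frequently requested together into the same fixed-capacity cluster, then count how many clusters each request touches. The workload, capacity, and greedy pairing rule below are illustrative assumptions, not the authors' actual partitioning algorithm.

```python
# Toy access-pattern-driven clustering: co-accessed variables share a cluster,
# so a request fetches fewer clusters than under a workload-oblivious layout.
from collections import Counter
from itertools import combinations

def clusters_read(layout, requests):
    """Total clusters that must be fetched to serve all requests."""
    where = {v: i for i, cluster in enumerate(layout) for v in cluster}
    return sum(len({where[v] for v in req}) for req in requests)

def coaccess_layout(requests, capacity):
    """Greedy packing: repeatedly merge the most frequently co-accessed pair."""
    pair_counts = Counter(p for req in requests for p in combinations(sorted(req), 2))
    groups = {v: [v] for req in requests for v in req}
    for (a, b), _ in pair_counts.most_common():
        ga, gb = groups[a], groups[b]
        if ga is not gb and len(ga) + len(gb) <= capacity:
            ga.extend(gb)
            for v in gb:
                groups[v] = ga
    seen, layout = set(), []
    for g in groups.values():
        if id(g) not in seen:
            seen.add(id(g))
            layout.append(g)
    return layout

# Hypothetical workload: subsets of climate variables fetched together.
requests = [{"temp", "humidity"}, {"temp", "humidity"}, {"wind", "pressure"},
            {"temp", "humidity"}, {"wind", "pressure"}, {"temp", "pressure"}]
naive = [["temp", "wind"], ["humidity", "pressure"]]  # workload-oblivious layout
smart = coaccess_layout(requests, capacity=2)
```

    Minimizing clusters read per request is the stated goal of the paper's partitioning; the greedy pairing here is just one simple heuristic for it.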

  8. Digital quantification of fibrosis in liver biopsy sections: description of a new method by Photoshop software.

    PubMed

    Dahab, Gamal M; Kheriza, Mohamed M; El-Beltagi, Hussien M; Fouda, Abdel-Motaal M; El-Din, Osama A Sharaf

    2004-01-01

    The precise quantification of fibrous tissue in liver biopsy sections is extremely important in the classification, diagnosis, and grading of chronic liver disease, as well as in evaluating the response to antifibrotic therapy. Because recently described methods of digital image analysis of fibrosis in liver biopsy sections have major flaws, including the use of outdated image-processing techniques, inadequate precision, and an inability to detect and quantify perisinusoidal fibrosis, we developed a new technique for computerized image analysis of liver biopsy sections based on Adobe Photoshop software. We prepared an experimental model of liver fibrosis involving treatment of rats with oral CCl4 for 6 weeks. After staining liver sections with Masson's trichrome, a series of computer operations were performed, including (i) reconstitution of seamless widefield images from a number of acquired fields of liver sections; (ii) image size and resolution adjustment; (iii) color correction; (iv) digital selection of a specified color range representing all fibrous tissue in the image; and (v) extraction and calculation. This technique is fully computerized, with no manual interference at any step, and thus could be very reliable for objectively quantifying any pattern of fibrosis in liver biopsy sections and for assessing the response to antifibrotic therapy. It could also be a valuable tool in the precise assessment of antifibrotic therapy in other tissues, regardless of the pattern of tissue or fibrosis.
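    The color-range selection step (iv) has a straightforward digital analogue: classify each pixel of a trichrome-stained image by channel dominance and report the collagen-positive area fraction. The blue-dominance rule and the 4-pixel patch below are hypothetical simplifications of the Photoshop workflow.

```python
# Minimal color-range quantification: with Masson's trichrome, collagen stains
# blue, so count pixels whose blue channel dominates and report the percentage.

def fibrosis_fraction(pixels):
    """pixels: list of (r, g, b) tuples; returns % classified as collagen (blue)."""
    fibrous = sum(1 for r, g, b in pixels if b > r and b > g)
    return 100.0 * fibrous / len(pixels)

# Hypothetical 4-pixel patch: two red-dominant (cytoplasm), two blue-dominant (collagen).
patch = [(200, 80, 60), (190, 70, 65), (60, 70, 180), (50, 60, 170)]
pct = fibrosis_fraction(patch)
```

    In practice the selection threshold would be tuned against the stain's actual color range rather than this crude channel comparison.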

  9. NC-TEST: noncontact thermal emissions screening technique for drug and alcohol detection

    NASA Astrophysics Data System (ADS)

    Prokoski, Francine J.

    1997-01-01

    Drug abuse is highly correlated with criminal behavior. The typical drug-using criminal commits hundreds of crimes per year. The crime rate cannot be significantly reduced without a reduction in the percentage of the population abusing drugs and alcohol. Accurate and timely estimation of that percentage is important for policy decisions concerning crime control, public health measures, allocation of intervention resources for prevention and treatment, projections of criminal justice needs, and the evaluation of policy effectiveness. Such estimation is particularly difficult because self reporting is unreliable; and physical testing has to date required blood or urine analysis which is expensive and invasive, with the result that too few people are tested. MIKOS Ltd. has developed a non-contact, passive technique with the potential for automatic, real- time screening for drug and alcohol use. The system utilizes thermal radiation which is spontaneously and continuously emitted by the human body. Facial thermal patterns and changes in patterns are correlated with standardized effects of specific drugs and alcohol. A portable system incorporating the collection and analysis technique can be used episodically to collect data for estimating drug and alcohol use by general unknown populations such as crowds at airports, or it can be used for repetitive routine screening of specific known groups such as airline pilots, military personnel, school children, or persons on probation or parole.

  10. Mobile assemblies of Bennett linkages from four-crease origami patterns

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao; Chen, Yan

    2018-02-01

    This paper deals with constructing mobile assemblies of Bennett linkages inspired by four-crease origami patterns. A transition technique has been proposed by taking the thick-panel form of an origami pattern as an intermediate bridge. A zero-thickness rigid origami pattern and its thick-panel form share the same sector angles and folding behaviours, while the thick-panel origami and the mobile assembly of linkages are kinematically equivalent with differences only in link profiles. Applying this transition technique to typical four-crease origami patterns, we have found that the Miura-ori and graded Miura-ori patterns lead to assemblies of Bennett linkages with identical link lengths. The supplementary-type origami patterns with different mountain-valley crease assignments correspond to different types of Bennett linkage assemblies with negative link lengths. And the identical linkage-type origami pattern generates a new mobile assembly. Hence, the transition technique offers a novel approach to constructing mobile assemblies of spatial linkages from origami patterns.

  11. Mobile assemblies of Bennett linkages from four-crease origami patterns.

    PubMed

    Zhang, Xiao; Chen, Yan

    2018-02-01

    This paper deals with constructing mobile assemblies of Bennett linkages inspired by four-crease origami patterns. A transition technique has been proposed by taking the thick-panel form of an origami pattern as an intermediate bridge. A zero-thickness rigid origami pattern and its thick-panel form share the same sector angles and folding behaviours, while the thick-panel origami and the mobile assembly of linkages are kinematically equivalent with differences only in link profiles. Applying this transition technique to typical four-crease origami patterns, we have found that the Miura-ori and graded Miura-ori patterns lead to assemblies of Bennett linkages with identical link lengths. The supplementary-type origami patterns with different mountain-valley crease assignments correspond to different types of Bennett linkage assemblies with negative link lengths. And the identical linkage-type origami pattern generates a new mobile assembly. Hence, the transition technique offers a novel approach to constructing mobile assemblies of spatial linkages from origami patterns.

  12. Sucrose lyophiles: a semi-quantitative study of residual water content by total X-ray diffraction analysis.

    PubMed

    Bates, S; Jonaitis, D; Nail, S

    2013-10-01

    Total X-ray Powder Diffraction Analysis (TXRPD) using transmission geometry was able to observe significant variance in measured powder patterns for sucrose lyophilizates with differing residual water contents. Integrated diffraction intensity corresponding to the observed variances was found to be linearly correlated to residual water content as measured by an independent technique. The observed variance was concentrated in two distinct regions of the lyophilizate powder pattern, corresponding to the characteristic sucrose matrix double halo and the high angle diffuse region normally associated with free-water. Full pattern fitting of the lyophilizate powder patterns suggested that the high angle variance was better described by the characteristic diffraction profile of a concentrated sucrose/water system rather than by the free-water diffraction profile. This suggests that the residual water in the sucrose lyophilizates is intimately mixed at the molecular level with sucrose molecules forming a liquid/solid solution. The bound nature of the residual water and its impact on the sucrose matrix gives an enhanced diffraction response between 3.0 and 3.5 beyond that expected for free-water. The enhanced diffraction response allows semi-quantitative analysis of residual water contents within the studied sucrose lyophilizates to levels below 1% by weight.
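    The reported linear correlation amounts to an ordinary least-squares calibration line relating integrated diffraction intensity to residual water content, which can then be inverted for an unknown sample. The calibration points below are invented for illustration, not the paper's measurements.

```python
# Ordinary least-squares calibration line, then inversion for an unknown sample.

def linear_fit(x, y):
    """Return slope and intercept of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical calibration: residual water content (% w/w) vs integrated intensity.
water = [0.2, 0.5, 1.0, 1.5, 2.0]
intensity = [1.04, 1.10, 1.20, 1.30, 1.40]  # made-up, perfectly linear response
slope, intercept = linear_fit(water, intensity)

# Invert the calibration for a new sample's measured intensity.
predicted_water = (1.25 - intercept) / slope
```

    Semi-quantitative analysis below 1% water then reduces to reading the unknown's integrated intensity off this calibration line.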

  13. Spectral feature extraction of EEG signals and pattern recognition during mental tasks of 2-D cursor movements for BCI using SVM and ANN.

    PubMed

    Bascil, M Serdar; Tesneli, Ahmet Y; Temurtas, Feyzullah

    2016-09-01

    Brain computer interface (BCI) is a new communication channel between man and machine. It identifies mental task patterns stored in the electroencephalogram (EEG): it extracts brain electrical activities recorded by EEG and transforms them into machine control commands. The main goal of BCI is to make assistive devices such as computers available to paralyzed people and thus make their lives easier. This study deals with feature extraction and mental task pattern recognition for 2-D cursor control from EEG as an offline analysis approach. The hemispherical power density changes are computed and compared in the alpha-beta frequency bands using only mental imagination of cursor movements. First, power spectral density (PSD) features of the EEG signals are extracted, and the high-dimensional data are reduced by principal component analysis (PCA) and independent component analysis (ICA), which are statistical algorithms. In the last stage, all features are classified with two types of support vector machine (SVM), linear and least squares (LS-SVM), and three different artificial neural network (ANN) structures, learning vector quantization (LVQ), multilayer neural network (MLNN), and probabilistic neural network (PNN), and mental task patterns are successfully identified via a k-fold cross-validation technique.
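    The front end of such a pipeline can be sketched briefly: band-power features from a periodogram in the alpha and beta bands, followed by a classifier (a nearest-mean rule stands in here for the paper's SVM/ANN stages). The sampling rate, synthetic epochs, and band edges are assumptions of this toy example.

```python
# PSD band-power features (alpha, beta) plus a nearest-mean classifier as a
# stand-in for the SVM/ANN back end.
import math

fs = 128  # assumed sampling rate, Hz

def band_power(signal, lo, hi):
    """Power in [lo, hi] Hz from a direct-DFT periodogram."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def features(signal):
    return [band_power(signal, 8, 13), band_power(signal, 13, 30)]  # alpha, beta

def epoch(f_main):
    """Synthetic one-second epoch dominated by a single rhythm."""
    return [math.sin(2 * math.pi * f_main * i / fs) for i in range(fs)]

# "Task A" is alpha-dominant, "task B" beta-dominant.
train = {"A": features(epoch(10)), "B": features(epoch(20))}

def classify(signal):
    f = features(signal)
    return min(train, key=lambda c: sum((a - b) ** 2 for a, b in zip(f, train[c])))

label = classify(epoch(11))  # alpha-band probe epoch
```

    Real EEG features are far noisier, which is where the paper's PCA/ICA reduction and SVM/ANN classifiers earn their keep over this nearest-mean toy.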

  14. Analysis and synthesis of abstract data types through generalization from examples

    NASA Technical Reports Server (NTRS)

    Wild, Christian

    1987-01-01

    The discovery of general patterns of behavior from a set of input/output examples can be a useful technique in the automated analysis and synthesis of software systems. These generalized descriptions of the behavior form a set of assertions which can be used for validation, program synthesis, program testing, and run-time monitoring. Describing the behavior is characterized as a learning process in which the set of inputs is mapped into an appropriate transform space such that general patterns can be easily characterized. The learning algorithm must chose a transform function and define a subset of the transform space which is related to equivalence classes of behavior in the original domain. An algorithm for analyzing the behavior of abstract data types is presented and several examples are given. The use of the analysis for purposes of program synthesis is also discussed.

  15. Recognition of anaerobic bacterial isolates in vitro using electronic nose technology.

    PubMed

    Pavlou, A; Turner, A P F; Magan, N

    2002-01-01

    An electronic nose (e.nose) system was used to differentiate between anaerobic bacteria grown in vitro on agar media. Cultures of Clostridium spp. (14 strains) and Bacteroides fragilis (12 strains) were grown on blood agar plates and incubated in sampling bags for 30 min before headspace analysis of the volatiles. Qualitative analysis of the volatile production patterns was carried out using an e.nose system with 14 conducting polymer sensors. Using data analysis techniques such as principal components analysis (PCA), genetic algorithms and neural networks, it was possible to differentiate between agar blanks and the individual species, accounting for all the data. A total of eight unknowns were correctly discriminated into the bacterial groups. This is the first report of in vitro complex volatile pattern recognition and differentiation of anaerobic pathogens. These results suggest the potential for applying e.nose technology to the early diagnosis of microbial pathogens of medical importance.
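
The PCA step of such an analysis can be sketched on synthetic sensor data. The 14-sensor responses and group "fingerprints" below are invented stand-ins for real e.nose readings; the point is that projecting onto the first two principal components separates the groups.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical 14-sensor e.nose responses: two bacterial groups plus
# agar blanks, each with its own volatile fingerprint plus sensor noise.
rng = np.random.default_rng(1)
fingerprints = {"blank": np.zeros(14),
                "clostridium": rng.normal(size=14) * 2,
                "bacteroides": rng.normal(size=14) * 2}
X, labels = [], []
for name, fp in fingerprints.items():
    for _ in range(10):
        X.append(fp + rng.normal(scale=0.2, size=14))
        labels.append(name)
X = np.array(X)

# Project onto the first two principal components; replicates of the
# same group cluster together, allowing visual differentiation.
pc_scores = PCA(n_components=2).fit_transform(X)

# Nearest-centroid check in PC space: every sample should sit closest
# to its own group's centroid.
centroids = {n: pc_scores[[l == n for l in labels]].mean(axis=0)
             for n in fingerprints}
correct = sum(min(centroids, key=lambda n: np.linalg.norm(s - centroids[n])) == l
              for s, l in zip(pc_scores, labels))
print(correct, "/", len(labels))
```

In practice the paper follows PCA with supervised methods (genetic algorithms, neural networks) to classify unknowns rather than relying on centroids.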

  16. Feature-space-based FMRI analysis using the optimal linear transformation.

    PubMed

    Sun, Fengrong; Morris, Drew; Lee, Wayne; Taylor, Margot J; Mills, Travis; Babyn, Paul S

    2010-09-01

    The optimal linear transformation (OLT), an image analysis technique of feature space, was first presented in the field of MRI. This paper proposes a method of extending OLT from MRI to functional MRI (fMRI) to improve the activation-detection performance over conventional approaches of fMRI analysis. In this method, first, ideal hemodynamic response time series for different stimuli were generated by convolving the theoretical hemodynamic response model with the stimulus timing. Second, constructing hypothetical signature vectors for different activity patterns of interest by virtue of the ideal hemodynamic responses, OLT was used to extract features of fMRI data. The resultant feature space had particular geometric clustering properties. It was then classified into different groups, each pertaining to an activity pattern of interest; the applied signature vector for each group was obtained by averaging. Third, using the applied signature vectors, OLT was applied again to generate fMRI composite images with high SNRs for the desired activity patterns. Simulations and a blocked fMRI experiment were employed to verify the method and compare it with the general linear model (GLM)-based analysis. The simulation studies and the experimental results indicated the superiority of the proposed method over the GLM-based analysis in detecting brain activities.
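
The first step, convolving a hemodynamic response model with the stimulus timing, can be sketched as below. The gamma-shaped response and the block timings are common illustrative choices, not the paper's exact model or paradigm.

```python
import numpy as np

# Canonical single-gamma hemodynamic response (an approximation, not
# the paper's exact model), sampled at a hypothetical TR of 2 s.
TR = 2.0
t = np.arange(0, 30, TR)          # HRF support, 30 s
hrf = t**5 * np.exp(-t)           # peaks a few seconds after onset

# Blocked stimulus timing: two "on" blocks in a 60-scan run.
n_scans = 60
stimulus = np.zeros(n_scans)
stimulus[5:15] = 1.0
stimulus[30:40] = 1.0

# Ideal hemodynamic response time series = stimulus (*) HRF, truncated
# to the run length and normalized for use as a signature vector.
ideal = np.convolve(stimulus, hrf)[:n_scans]
ideal /= ideal.max()
print(ideal.argmax())             # peak lags the block onset, as expected
```

The resulting time series is what the method turns into hypothetical signature vectors for the OLT feature extraction.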

  17. Gear Shifting of Quadriceps during Isometric Knee Extension Disclosed Using Ultrasonography.

    PubMed

    Zhang, Shu; Huang, Weijian; Zeng, Yu; Shi, Wenxiu; Diao, Xianfen; Wei, Xiguang; Ling, Shan

    2018-01-01

    Ultrasonography has been widely employed to estimate the morphological changes of muscle during contraction. To further investigate the motion pattern of quadriceps during isometric knee extensions, we studied the relative motion pattern between femur and quadriceps under ultrasonography. An interesting observation is that although the force of isometric knee extension can be controlled to change almost linearly, the femur in the simultaneously captured ultrasound video sequences shows several distinct piecewise moving patterns. The phenomenon suggests that the quadriceps has several forward gear ratios, like a car, as it starts from rest towards maximal voluntary contraction (MVC) and then returns to rest. To verify this assumption, we captured several ultrasound video sequences of isometric knee extension and collected the torque/force signal simultaneously. We then extracted the shapes of the femur from these ultrasound video sequences using video processing techniques and studied the motion pattern both qualitatively and quantitatively. The phenomenon is easier to see in a comparison between the torque signal and the relative spatial distance between femur and quadriceps. Furthermore, we used cluster analysis techniques to study the process, and the clustering results also provided preliminary support for the conclusion that, during both ramp increasing and decreasing phases, quadriceps contraction may have several forward gear ratios relative to the femur.
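
The cluster-analysis idea can be illustrated on a synthetic displacement trace: if each "gear" is a phase with a distinct slope, clustering local slope estimates recovers the phases. The piecewise-linear signal and k-means choice below are illustrative assumptions, not the paper's data or algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic piecewise-linear femur displacement: three phases, each
# with a constant "gear ratio" (slope), 100 frames per phase.
velocity = np.repeat([0.2, 1.0, 0.5], 100)
displacement = np.cumsum(velocity)

# Cluster local slope estimates; each cluster should correspond to
# one gear, with boundary frames as the only ambiguous points.
slopes = np.gradient(displacement).reshape(-1, 1)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(slopes)
print(len(set(labels[:90])), len(set(labels[110:190])))
```

With real ultrasound data the slope estimates are noisy, so a smoothing step before clustering would be needed.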

  18. New approach for cognitive analysis and understanding of medical patterns and visualizations

    NASA Astrophysics Data System (ADS)

    Ogiela, Marek R.; Tadeusiewicz, Ryszard

    2003-11-01

    This paper presents new opportunities for applying linguistic description of the picture merit content and AI methods to the task of automatic understanding of image semantics in intelligent medical information systems. Successfully obtaining the crucial semantic content of a medical image may contribute considerably to the creation of new intelligent multimedia cognitive medical systems. Thanks to the new idea of cognitive resonance between the stream of data extracted from the image using linguistic methods and expectations taken from the representation of medical knowledge, it is possible to understand the merit content of the image even if the form of the image is very different from any known pattern. This article proves that structural techniques of artificial intelligence may be applied to tasks of automatic classification and machine perception based on semantic pattern content in order to determine the semantic meaning of the patterns. The paper describes examples showing how such techniques can be applied in the creation of cognitive vision systems for selected classes of medical images. On the basis of the research described in the paper, we are trying to build new systems for collecting, storing, retrieving and intelligently interpreting selected medical images, especially those obtained in radiological and MRI examinations.

  19. Spectroscopy analysis of phenolic and sugar patterns in a food grade chestnut tannin.

    PubMed

    Ricci, A; Lagel, M-C; Parpinello, G P; Pizzi, A; Kilmartin, P A; Versari, A

    2016-07-15

    Tannin of chestnut (Castanea sativa Mill.) wood, commonly used in winemaking, was characterised with a qualitative spectroscopy approach that revealed its phenolic composition. Several vibrational diagnostic bands were assigned using Attenuated Total Reflectance Infrared Spectroscopy, and fragmentation patterns obtained using the Laser Desorption Ionization Time-of-Flight technique showed polygalloylglucose structures, e.g. castalagin/vescalagin-like molecules, as the most representative, together with sugar moieties. The implications of these findings for winemaking applications and the potential influence of the chemical structure on the sensory properties of wine are discussed.

  20. Pattern optimizing verification of self-align quadruple patterning

    NASA Astrophysics Data System (ADS)

    Yamato, Masatoshi; Yamada, Kazuki; Oyama, Kenichi; Hara, Arisa; Natori, Sakurako; Yamauchi, Shouhei; Koike, Kyohei; Yaegashi, Hidetami

    2017-03-01

    Lithographic scaling continues to advance by extending the life of 193nm immersion technology, and spacer-type multi-patterning is undeniably the driving force behind this trend. Multi-patterning techniques such as self-aligned double patterning (SADP) and self-aligned quadruple patterning (SAQP) have come to be used in memory devices, and they have also been adopted in logic devices to create constituent patterns in the formation of 1D layout designs. Multi-patterning has consequently become an indispensable technology in the fabrication of all advanced devices. In general, items that must be managed when using multi-patterning include critical dimension uniformity (CDU), line edge roughness (LER), and line width roughness (LWR). Recently, moreover, there has been increasing focus on judging and managing pattern resolution performance from a more detailed perspective and on making a right/wrong judgment from the perspective of edge placement error (EPE). To begin with, pattern resolution performance in spacer-type multi-patterning is affected by the process accuracy of the core (mandrel) pattern. Improving the controllability of CD and LER of the mandrel is most important, and to reduce LER, an appropriate smoothing technique should be carefully selected. In addition, the atomic layer deposition (ALD) technique is generally used to meet the need for high accuracy in forming the spacer film. Advances in scaling are accompanied by stricter requirements in the controllability of fine processing. In this paper, we first describe our efforts in improving controllability by selecting the most appropriate materials for the mandrel pattern and spacer film. Then, based on the materials selected, we present experimental results on a technique for improving etching selectivity.
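
The pitch-division arithmetic behind SADP and SAQP can be made concrete with a back-of-the-envelope sketch; the 76 nm example pitch is illustrative, not process data from the paper.

```python
# Spacer-type multi-patterning divides the lithographically printed
# pitch: each SADP pass halves it, so SAQP (two passes) quarters it.
def final_pitch(litho_pitch_nm: float, technique: str) -> float:
    """Final pattern pitch after self-aligned multi-patterning."""
    division = {"single": 1, "SADP": 2, "SAQP": 4}
    return litho_pitch_nm / division[technique]

# A hypothetical 76 nm pitch printed with 193i immersion lithography:
print(final_pitch(76, "SADP"), final_pitch(76, "SAQP"))  # 38.0 19.0
```

The division factor is why mandrel CD/LER control matters so much: any error in the core pattern propagates into every spacer-defined line.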
